What's Changed
- Fix issues around litellm to support the Gemini Flash Thinking model.
- Add support for o1.
Details
- Ryan marten patch 1 by @RyanMarten in #273
- Clean ups in llm.py by @madiator in #274
- Put the examples in respective folders and add requirements.txt everywhere by @madiator in #275
- Catch catch-all Exception since litellm doesn't throw a specific error. by @madiator in #281
- feat: add o1 model structured output support by @devin-ai-integration in #284
- Bump to 0.1.13 by @madiator in #285
- Merge dev into main for 0.1.13 release. by @madiator in #286
Full Changelog: v0.1.12...0.1.13