Changeset
📘 What's new in 0.25.0
- 🔥 API support
- 🔥 Optional batching support with backport to existing models
Main Changes
- ⚠️ Non-framework-based native batching support [non-caching mode] #9
- 🔧 Optional caching / no caching for the API #43
- Simplified API usage with schema
- 🔧 Check that added arguments were not mentioned before #47
🔧 Bugs fixed
- 🔧 Prompt limiting should not be a part of inference modes #48
- 🔥 Optional prompt logging #28
- CSV and TSV separation #29
- ✨ Default target path -- implement a replace-extension function #30
- 🔧 BaseLM -- remove naming #31
- Setup LLM adapter via JSON schema #38
- Bash script inference: --output may cause an exception when the file type is not SQLite #34
- 🐛 delimiter parameter, which is unrelated to the model, is not supported by OpenAI #40
- args.limit is expected to be outside the iter_content call #45
- 🔥 🐛 data[c] records and duplicated call on parameter update [DEBUGGING] #42
- 🐛 Cache target should not be None for cached inference #50
- 🐛 API: no need for name parameter in schema #51
- Make source-iter an optional package #53
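The schema-related items above (simplified API usage with schema #7, adapter setup via JSON schema #38, and dropping the name parameter from the schema #51) suggest a JSON-driven configuration flow. As a minimal sketch of what a schema-declared adapter might look like: all field names below (`provider`, `model`, `params`) are illustrative assumptions, not the project's actual schema keys.

```python
import json

# Hypothetical adapter declaration. Field names are assumptions for
# illustration only; note there is no "name" field, per #51.
adapter_schema = {
    "provider": "replicate",   # e.g. the Replicate backend, now the FlanT5 default (#52)
    "model": "flan-t5",
    "params": {
        "temperature": 0.1,
        "max_tokens": 512,
    },
}

# Round-tripping through JSON shows how such an adapter could live in a
# standalone config file and be loaded at inference time.
config_text = json.dumps(adapter_schema, indent=2)
loaded = json.loads(config_text)
```

A declaration like this could then be passed to whatever setup entry point the release introduces, keeping model wiring out of the inference code.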
Minor changes
- 🔥 Switch FlanT5 to the Replicate API by default #52
Full Changelog: 0.24.2...0.25.0