Releases: nicolay-r/bulk-chain
bulk-chain-0.25.0
📘 What's new in 0.25.0
🔥 API support
🔥 Optional batching support with backport to existing models
Main Changes
- ⚠️ Non-framework-based native batching support [non-caching mode] #9
- 🔧 Optional caching / no caching for API #43
- Simplified API usage with schema
- 🔧 Check that added arguments were not mentioned before #47
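The schema-driven setup mentioned above can be sketched as a small Python dict serialized to JSON. Note this is a hypothetical illustration: the key names (`name`, `schema`, `prompt`, `out`) and the placeholder syntax are assumptions, not necessarily bulk-chain's actual format.

```python
# Hypothetical sketch of a chain-of-thought schema (key names are assumptions,
# not bulk-chain's documented format). Each step's output slot can be
# referenced as a placeholder in later prompts.
import json

schema = {
    "name": "sentiment_cot",
    "schema": [
        {"prompt": "What is the sentiment of: {text}?", "out": "sentiment"},
        {"prompt": "Explain why the sentiment is {sentiment}.", "out": "reason"},
    ],
}

# Serialize to JSON so it can be stored alongside the model configuration.
print(json.dumps(schema, indent=2))
```

Keeping the chain definition in a plain JSON file like this separates prompt engineering from inference code, which is the motivation behind a schema-based setup.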
🔧 Bugs fixed
- 🔧 Prompt limiting should not be a part of inference modes #48
- 🔥 Optional prompt logging #28
- csv and tsv separation #29
- ✨ Default target path -- implement replace-extension function #30
- 🔧 BaseLM -- remove naming #31
- Setup LLM adapter via JSON schema #38
- Bash script inference: --output may cause an exception when the file type is not SQLite #34
- 🐛 delimiter parameter is unrelated to the model and is not supported by OpenAI #40
- args.limit is expected to be outside of the iter_content call #45
- 🔥 🐛 data[c] records and duplicated call on parameters update [DEBUGGING] #42
- 🐛 Cache target should not be None for cached inference #50
- 🐛 API: no need for name parameter in schema #51
- Make source-iter an optional package #53
Minor changes
- 🔥 Switch FlanT5 to the Replicate API by default #52
Full Changelog: 0.24.2...0.25.0
bulk-chain-0.24.2
📘 The complete list of implemented issues: #33
- #32 Adapt readers from the source_iter project
- #35 Support re-attempting inference after any exception raised during inference
- chat_with_llm logger initializes only when the chat mode is launched
Full Changelog: 0.24.1...0.24.2
bulk-chain-0.24.1
This release contains bug fixes for version 0.24.0.
Details: #26
Full Changelog: 0.24.0...0.24.1
bulk-chain-0.24.0
This is the initial version of the project:
https://pypi.org/project/bulk-chain/0.24.0/
See more details: #8