A no-strings-attached, tiny Chain-of-Thought framework for your Large Language Model (LLM) that saves you time ⏰ and money 💰
The goal of this framework is to serve a chain of prompts (a.k.a. Chain-of-Thought), formed into a schema, to your LLM.
It iterates through your data stored in CSV, JSONL, or sqlite.
- Provides an iterator over an unlimited number of input contexts served in CSV or JSONL.
- Caches progress: withstands exceptions during LLM calls by using the sqlite3 engine to cache LLM answers.
- Supports schema descriptions for the Chain-of-Thought concept.
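The caching idea can be sketched in plain Python. This is a minimal illustration of the concept (not the library's actual API, whose class and table names are unknown here): every answer is persisted in sqlite keyed by its prompt, so an interrupted run resumes without re-paying for prompts already answered.

```python
import sqlite3

class AnswerCache:
    """Illustrative sqlite-backed cache for LLM answers (hypothetical helper)."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS answers (prompt TEXT PRIMARY KEY, answer TEXT)")

    def get_or_infer(self, prompt, infer_fn):
        row = self.conn.execute(
            "SELECT answer FROM answers WHERE prompt = ?", (prompt,)).fetchone()
        if row is not None:
            return row[0]          # cache hit: no LLM call, no cost
        answer = infer_fn(prompt)  # cache miss: pay for the call once
        self.conn.execute("INSERT INTO answers VALUES (?, ?)", (prompt, answer))
        self.conn.commit()         # survives a later crash
        return answer

calls = []
def fake_llm(prompt):
    calls.append(prompt)
    return prompt.upper()

cache = AnswerCache()
assert cache.get_or_infer("hello", fake_llm) == "HELLO"
assert cache.get_or_infer("hello", fake_llm) == "HELLO"  # served from cache
assert len(calls) == 1  # the model was queried only once
```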
Installation: the PyPI package is a work in progress. For now, install directly from the repository:

```bash
pip install git+https://github.com/nicolay-r/quick_cot
```
Just two simple steps:

1. Define your sequence of prompts with their dependencies (schema).
   - For example: Three-hop Reasoning in Implicit CoT for sentiment analysis, at `data/thor_cot_schema.json`.
2. Launch inference:
```bash
python infer.py \
    --model "dynamic:ext/flan_t5.py:FlanT5" \
    --schema "data/default.json" \
    --device "cpu" \
    --temp 0.1 \
    --output "data/output.csv" \
    --max-length 512 \
    --api-token "<API_TOKEN>" \
    --limit 10000 \
    --limit-prompt 10000 \
    --bf16 \
    --l4b
```
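The exact schema format is the one shipped with the repository (e.g. `data/default.json`, `data/thor_cot_schema.json`). As a purely illustrative sketch of the chained-prompt idea behind three-hop sentiment reasoning, a schema could look like the following, where `{...}` placeholders refer to the input context and to answers from earlier hops. All field names here are assumptions for illustration, not the repository's actual keys:

```json
{
  "schema": [
    {"name": "aspect",   "prompt": "Given the sentence '{context}', which aspect is being discussed?"},
    {"name": "opinion",  "prompt": "Given '{context}' and the aspect '{aspect}', what is the implied opinion?"},
    {"name": "polarity", "prompt": "Based on the opinion '{opinion}', what is the sentiment polarity towards '{aspect}'?"}
  ]
}
```

Each hop consumes the answers of the previous ones, which is what makes the sequence a Chain-of-Thought rather than independent prompts.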
TODO. To be updated.