v0.4.10
## Installation

```bash
pip install openllm==0.4.10
```

To upgrade from a previous version, use the following command:

```bash
pip install --upgrade openllm==0.4.10
```
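To confirm which version ended up in your environment, here is a quick check using only the Python standard library (no OpenLLM-specific API assumed):

```python
# Print the installed openllm package version via the standard library.
from importlib.metadata import version

print(version("openllm"))  # expected to print 0.4.10 after the upgrade
```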
## Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.10 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.10`
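Once a server is running (for example via `openllm start opt` above), you can send generation requests to it over HTTP. The sketch below is a minimal example, assuming the default local port 3000 and a `/v1/generate` endpoint that accepts a JSON body with a `prompt` field; the host, port, route, and payload shape here are assumptions, so check the OpenAPI docs served by your instance (e.g. at `/docs`) for the exact schema of your model.

```python
# Minimal sketch: query a locally running OpenLLM server over HTTP.
# Assumptions (verify against the server's own API docs):
#   - the server listens on localhost:3000 (the default),
#   - a POST /v1/generate endpoint accepts {"prompt": ...} JSON.
import json
import urllib.request

payload = json.dumps({"prompt": "What is the meaning of life?"}).encode("utf-8")
request = urllib.request.Request(
    "http://localhost:3000/v1/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    # Print the raw JSON response returned by the server.
    print(json.loads(response.read().decode("utf-8")))
```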
Find more information about this release in CHANGELOG.md.
## What's Changed
- fix(runner): remove keyword args for attrs.get() by @jeffwang0516 in #661
- fix: update notebook by @xianml in #662
- feat(type): provide structured annotations stubs by @aarnphm in #663
- feat(llm): respect warnings environment for dtype warning by @aarnphm in #664
- infra: makes huggingface-hub requirements on fine-tune by @aarnphm in #665
- types: update stubs for remaining entrypoints by @aarnphm in #667
- perf: reduce footprint by @aarnphm in #668
- perf(build): locking and improve build speed by @aarnphm in #669
- docs: add LlamaIndex integration by @aarnphm in #646
- infra: remove codegolf by @aarnphm in #671
- feat(models): Phi 1.5 by @aarnphm in #672
- fix(docs): chatglm support on vLLM by @aarnphm in #673
- chore(loading): include verbose warning about trust_remote_code by @aarnphm in #674
- perf: potentially reduce image size by @aarnphm in #675
## New Contributors
- @jeffwang0516 made their first contribution in #661
Full Changelog: v0.4.9...v0.4.10