Releases: bentoml/OpenLLM
v0.4.18
Installation
pip install openllm==0.4.18
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.18
Usage
All available models: openllm models
To start an LLM: python -m openllm start HuggingFaceH4/zephyr-7b-beta
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.18 start HuggingFaceH4/zephyr-7b-beta
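Once the server is running, it exposes an OpenAI-compatible API (see the dynamic model_type registration in #704 below). The snippet below is an illustrative sketch rather than part of the release notes: it assumes the server listens on the default http://localhost:3000 address and that the model is addressed by its Hugging Face id (confirm the exact name via GET /v1/models).

```python
# Illustrative sketch, not from the release notes: query a running
# `openllm start` server through its OpenAI-compatible endpoint.
# Assumptions: the server listens on http://localhost:3000 (the default
# BentoML port) and the model is registered under its Hugging Face id.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")  # no real key needed locally
reply = client.chat.completions.create(
    model="HuggingFaceH4/zephyr-7b-beta",
    messages=[{"role": "user", "content": "What is OpenLLM?"}],
)
print(reply.choices[0].message.content)
```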
Find more information about this release in the CHANGELOG.md
What's Changed
- chore: update lower bound version of bentoml to avoid breakage by @aarnphm in #703
- feat(openai): dynamic model_type registration by @aarnphm in #704
Full Changelog: v0.4.17...v0.4.18
v0.4.17
Installation
pip install openllm==0.4.17
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.17
Usage
All available models: openllm models
To start an LLM: python -m openllm start HuggingFaceH4/zephyr-7b-beta
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.17 start HuggingFaceH4/zephyr-7b-beta
Find more information about this release in the CHANGELOG.md
What's Changed
- infra: update generate notes and better local handle by @aarnphm in #701
- fix(backend): correct use variable for backend when initialisation by @aarnphm in #702
Full Changelog: v0.4.16...v0.4.17
v0.4.16
Installation
pip install openllm==0.4.16
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.16
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.16 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.16
Find more information about this release in the CHANGELOG.md
What's Changed
- feat(ctranslate): initial infrastructure support by @aarnphm in #694
- feat(vllm): bump to 0.2.2 by @aarnphm in #695
- feat(engine): CTranslate2 by @aarnphm in #698 (see the sketch after this list)
- chore: update documentation about runtime by @aarnphm in #699
- chore: update changelog [skip ci] by @aarnphm in #700
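The CTranslate2 entries above (#694, #698) add a new inference engine. As a rough sketch of how it might be selected from the Python API, with the caveat that the `backend` keyword and the "ctranslate" value are assumptions not spelled out in these notes:

```python
# Rough sketch, assumptions noted: selecting the CTranslate2 engine added in
# #694/#698 via the Python API. The `backend` keyword and "ctranslate" value
# are assumed; check `openllm start --help` for the equivalent CLI flag.
import asyncio

import openllm

llm = openllm.LLM("facebook/opt-1.3b", backend="ctranslate")

async def main() -> None:
    # `generate` is assumed to be a coroutine returning a vLLM-style output object.
    result = await llm.generate("What is the meaning of life?")
    print(result.outputs[0].text)

asyncio.run(main())
```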
Full Changelog: v0.4.15...v0.4.16
v0.4.15
Installation
pip install openllm==0.4.15
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.15
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.15 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.15
Find more information about this release in the CHANGELOG.md
What's Changed
- fix(cattrs): strictly lock <23.2 until we upgrade validation logic by @aarnphm in #690
- fix(annotations): check library through find_spec by @aarnphm in #691
- feat: heuristics logprobs by @aarnphm in #692
- chore: update documentation by @aarnphm in #693
Full Changelog: v0.4.14...v0.4.15
v0.4.14
Installation
pip install openllm==0.4.14
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.14
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.14 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.14
Find more information about this release in the CHANGELOG.md
What's Changed
Full Changelog: v0.4.13...v0.4.14
v0.4.13
Installation
pip install openllm==0.4.13
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.13
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.13 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.13
Find more information about this release in the CHANGELOG.md
What's Changed
- fix(llm): remove unnecessary check by @aarnphm in #683
- examples: improve instructions and cleanup simple API server by @aarnphm in #684
- fix(build): lock lower version based on each release and update infra by @aarnphm in #686
Full Changelog: v0.4.12...v0.4.13
v0.4.12
Installation
pip install openllm==0.4.12
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.12
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.12 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.12
Find more information about this release in the CHANGELOG.md
What's Changed
- fix(envvar): explicitly set NVIDIA_DRIVER_CAPABILITIES by @aarnphm in #681
- fix(torch_dtype): correctly infer based on options by @aarnphm in #682
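#682 above concerns inferring torch_dtype from the configured options. As an assumption-heavy sketch of the related knob (the keyword name and accepted values are not confirmed by these notes), a dtype can also be pinned explicitly when constructing an LLM:

```python
# Assumption-heavy sketch: pinning the torch dtype explicitly instead of
# relying on automatic inference. The `dtype` keyword and the "float16"
# value are assumed, not taken from these release notes.
import openllm

llm = openllm.LLM("facebook/opt-1.3b", dtype="float16")
```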
Full Changelog: v0.4.11...v0.4.12
v0.4.11
Installation
pip install openllm==0.4.11
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.11
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.11 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.11
Find more information about this release in the CHANGELOG.md
What's Changed
- infra: update cbfmt options by @aarnphm in #676
- fix(examples): add support for streaming feature by @aarnphm in #677 (see the sketch after this list)
- fix: correct set item for attrs >23.1 by @aarnphm in #678
- fix(build): correctly parse default env for container by @aarnphm in #679
- fix(env): correct format environment on docker by @aarnphm in #680
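#677 above touches the streaming examples. A minimal sketch of consuming a token stream over the OpenAI-compatible endpoint of a locally running server; the address and model id are assumptions, and the endpoint itself is not specific to this release:

```python
# Minimal sketch, assumptions noted: streaming tokens from a local OpenLLM
# server over its OpenAI-compatible endpoint. Assumes the default
# http://localhost:3000 address; the model id shown is an assumption
# (list the registered names via GET /v1/models).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")
stream = client.chat.completions.create(
    model="facebook/opt-1.3b",
    messages=[{"role": "user", "content": "Stream a short greeting."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```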
Full Changelog: v0.4.10...v0.4.11
v0.4.10
Installation
pip install openllm==0.4.10
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.10
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.10 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.10
Find more information about this release in the CHANGELOG.md
What's Changed
- fix(runner): remove keyword args for attrs.get() by @jeffwang0516 in #661
- fix: update notebook by @xianml in #662
- feat(type): provide structured annotations stubs by @aarnphm in #663
- feat(llm): respect warnings environment for dtype warning by @aarnphm in #664
- infra: makes huggingface-hub requirements on fine-tune by @aarnphm in #665
- types: update stubs for remaining entrypoints by @aarnphm in #667
- perf: reduce footprint by @aarnphm in #668
- perf(build): locking and improve build speed by @aarnphm in #669
- docs: add LlamaIndex integration by @aarnphm in #646
- infra: remove codegolf by @aarnphm in #671
- feat(models): Phi 1.5 by @aarnphm in #672
- fix(docs): chatglm support on vLLM by @aarnphm in #673
- chore(loading): include verbose warning about trust_remote_code by @aarnphm in #674 (see the sketch after this list)
- perf: potentially reduce image size by @aarnphm in #675
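Two entries above relate to model loading: Phi 1.5 support (#672) and a more verbose trust_remote_code warning (#674). A hedged sketch of loading such a remote-code model from Python; the model id and keyword are assumptions rather than something documented in these notes:

```python
# Hedged sketch, not from the release notes: loading a remote-code model such
# as Phi 1.5. Both the model id and the `trust_remote_code` keyword are
# assumptions; only opt into remote code for repositories you trust.
import openllm

llm = openllm.LLM("microsoft/phi-1_5", trust_remote_code=True)
```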
New Contributors
- @jeffwang0516 made their first contribution in #661
Full Changelog: v0.4.9...v0.4.10
v0.4.9
Installation
pip install openllm==0.4.9
To upgrade from a previous version, use the following command:
pip install --upgrade openllm==0.4.9
Usage
All available models: openllm models
To start an LLM: python -m openllm start opt
To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.9 start opt
To run the OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.9
Find more information about this release in the CHANGELOG.md
What's Changed
- infra: update scripts to run update readme automatically by @aarnphm in #658
- chore: update requirements in README.md by @aarnphm in #659
- fix(falcon): remove early_stopping default arguments by @aarnphm in #660
Full Changelog: v0.4.8...v0.4.9