Releases: bentoml/OpenLLM

v0.4.18

20 Nov 05:25

Installation

pip install openllm==0.4.18

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.18

Usage

All available models: openllm models

To start an LLM: python -m openllm start HuggingFaceH4/zephyr-7b-beta

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.18 start HuggingFaceH4/zephyr-7b-beta
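
Once the server is running (locally or in the container above), you can send it requests over HTTP. Below is a minimal sketch that assumes the server listens on the default localhost:3000 and exposes the OpenAI-compatible /v1/chat/completions and /v1/models routes (the area that #704 below extends with dynamic model_type registration); adjust the host, port, and model id to match your deployment.

  # Query the running server via its OpenAI-compatible endpoint.
  # Assumes the default bind of localhost:3000; use GET /v1/models to
  # confirm the model id the server actually registered.
  curl -s http://localhost:3000/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d '{
          "model": "HuggingFaceH4/zephyr-7b-beta",
          "messages": [{"role": "user", "content": "What is OpenLLM?"}],
          "max_tokens": 128
        }'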

Find more information about this release in the CHANGELOG.md

What's Changed

  • chore: update lower bound version of bentoml to avoid breakage by @aarnphm in #703
  • feat(openai): dynamic model_type registration by @aarnphm in #704

Full Changelog: v0.4.17...v0.4.18

v0.4.17

20 Nov 03:54

Installation

pip install openllm==0.4.17

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.17

Usage

All available models: openllm models

To start an LLM: python -m openllm start HuggingFaceH4/zephyr-7b-beta

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.17 start HuggingFaceH4/zephyr-7b-beta

Find more information about this release in the CHANGELOG.md

What's Changed

  • infra: update generate notes and better local handle by @aarnphm in #701
  • fix(backend): correctly use the backend variable during initialisation by @aarnphm in #702

Full Changelog: v0.4.16...v0.4.17

v0.4.16

19 Nov 15:51

Installation

pip install openllm==0.4.16

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.16

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.16 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.16

Find more information about this release in the CHANGELOG.md

What's Changed

Full Changelog: v0.4.15...v0.4.16

v0.4.15

19 Nov 00:56

Installation

pip install openllm==0.4.15

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.15

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.15 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.15

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix(cattrs): strictly lock <23.2 until we upgrade validation logic by @aarnphm in #690
  • fix(annotations): check library through find_spec by @aarnphm in #691
  • feat: heuristics logprobs by @aarnphm in #692 (see the sketch after this list)
  • chore: update documentation by @aarnphm in #693
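
For the heuristics logprobs change in #692, the OpenAI-style completions route accepts a "logprobs" parameter; whether and how the server returns token log-probabilities depends on the backend, so treat this as a sketch under those assumptions (localhost:3000, default route, served model id "opt"):

  # Ask for the top log-probabilities alongside the generated text.
  curl -s http://localhost:3000/v1/completions \
    -H 'Content-Type: application/json' \
    -d '{"model": "opt", "prompt": "Hello", "max_tokens": 16, "logprobs": 1}'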

Full Changelog: v0.4.14...v0.4.15

v0.4.14

17 Nov 22:04

Installation

pip install openllm==0.4.14

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.14

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.14 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.14

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix(dependencies): ignore broken cattrs release by @aarnphm in #689

Full Changelog: v0.4.13...v0.4.14

v0.4.13

17 Nov 21:17

Installation

pip install openllm==0.4.13

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.13

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.13 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.13

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix(llm): remove unnecessary check by @aarnphm in #683
  • examples: improve instructions and cleanup simple API server by @aarnphm in #684
  • fix(build): lock lower version based on each release and update infra by @aarnphm in #686

Full Changelog: v0.4.12...v0.4.13

v0.4.12

17 Nov 16:04

Installation

pip install openllm==0.4.12

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.12

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.12 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.12

Find more information about this release in the CHANGELOG.md

What's Changed

  • fix(envvar): explicitly set NVIDIA_DRIVER_CAPABILITIES by @aarnphm in #681 (see the example after this list)
  • fix(torch_dtype): correctly infer based on options by @aarnphm in #682
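
Related to #681: when the container cannot see the GPUs even with --gpus all, NVIDIA_DRIVER_CAPABILITIES is a common culprit. This release sets it explicitly, but you can also pass it yourself at docker run time. A sketch, assuming the NVIDIA Container Toolkit is installed on the host:

  # "compute,utility" enables CUDA workloads plus nvidia-smi inside the
  # container; adjust the image tag to the release you are running.
  docker run --gpus all \
    -e NVIDIA_DRIVER_CAPABILITIES=compute,utility \
    -it -P ghcr.io/bentoml/openllm:0.4.12 start opt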

Full Changelog: v0.4.11...v0.4.12

v0.4.11

17 Nov 15:04

Installation

pip install openllm==0.4.11

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.11

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.11 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.11

Find more information about this release in the CHANGELOG.md

What's Changed

  • infra: update cbfmt options by @aarnphm in #676
  • fix(examples): add support for streaming feature by @aarnphm in #677 (see the sketch after this list)
  • fix: correct set item for attrs >23.1 by @aarnphm in #678
  • fix(build): correctly parse default env for container by @aarnphm in #679
  • fix(env): correct format environment on docker by @aarnphm in #680
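
To try the streaming feature referenced in #677, you can ask the running server for a streamed response. A sketch, assuming the server started via the Usage commands above is listening on localhost:3000 and honours the OpenAI-style "stream" flag; replace the model id with whatever the server reports:

  # "stream": true asks for incremental server-sent events instead of a
  # single JSON body; -N stops curl from buffering the output.
  curl -sN http://localhost:3000/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d '{"model": "opt", "messages": [{"role": "user", "content": "Hello"}], "stream": true}'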

Full Changelog: v0.4.10...v0.4.11

v0.4.10

17 Nov 06:32

Installation

pip install openllm==0.4.10

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.10

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.10 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.10

Find more information about this release in the CHANGELOG.md

What's Changed

New Contributors

Full Changelog: v0.4.9...v0.4.10

v0.4.9

15 Nov 08:01

Installation

pip install openllm==0.4.9

To upgrade from a previous version, use the following command:

pip install --upgrade openllm==0.4.9

Usage

All available models: openllm models

To start an LLM: python -m openllm start opt

To run OpenLLM within a container environment (requires GPUs): docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.9 start opt

To run OpenLLM Clojure UI (community-maintained): docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.9

Find more information about this release in the CHANGELOG.md

What's Changed

  • infra: update scripts to run update readme automatically by @aarnphm in #658
  • chore: update requirements in README.md by @aarnphm in #659
  • fix(falcon): remove early_stopping default arguments by @aarnphm in #660

Full Changelog: v0.4.8...v0.4.9