OpenLLM

Latest version: v0.6.23

Page 10 of 24

0.4.22

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.22 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
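Once `openllm start` is running, the server exposes an OpenAI-compatible HTTP API. As a minimal sketch, the snippet below builds a chat-completion request payload for the started model; the port (3000) and endpoint path are assumptions about this release line, so check `openllm start --help` for your install before using them.

```python
import json

# Build an OpenAI-style chat-completion request for the model started above.
payload = {
    "model": "HuggingFaceH4/zephyr-7b-beta",
    "messages": [
        {"role": "user", "content": "Summarize OpenLLM in one sentence."}
    ],
    "max_tokens": 64,
}

# Serialize the payload to a JSON request body.
body = json.dumps(payload).encode("utf-8")

# To send it against a running server (assumed default: localhost:3000):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:3000/v1/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The request shape mirrors the OpenAI chat-completions schema, so existing OpenAI client code can usually be pointed at the local server by overriding the base URL.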



What's Changed
* refactor: update runner helpers and add max_model_len by aarnphm in https://github.com/bentoml/OpenLLM/pull/712


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.21...v0.4.22

0.4.21

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.21 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/711
* chore(deps): bump taiki-e/install-action from 2.21.11 to 2.21.17 by dependabot in https://github.com/bentoml/OpenLLM/pull/709
* chore(deps): bump docker/build-push-action from 5.0.0 to 5.1.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/708
* chore(deps): bump github/codeql-action from 2.22.5 to 2.22.7 by dependabot in https://github.com/bentoml/OpenLLM/pull/707


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.20...v0.4.21

0.4.20

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.20 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.19...v0.4.20

0.4.19

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.19 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.18...v0.4.19

0.4.18

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.18 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore: update lower bound version of bentoml to avoid breakage by aarnphm in https://github.com/bentoml/OpenLLM/pull/703
* feat(openai): dynamic model_type registration by aarnphm in https://github.com/bentoml/OpenLLM/pull/704


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.17...v0.4.18

0.4.17

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.17 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* infra: update generate notes and better local handle by aarnphm in https://github.com/bentoml/OpenLLM/pull/701
* fix(backend): use the correct variable for the backend during initialisation by aarnphm in https://github.com/bentoml/OpenLLM/pull/702


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.16...v0.4.17

© 2025 Safety CLI Cybersecurity Inc. All Rights Reserved.