OpenLLM

Latest version: v0.6.20


0.4.32

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.32 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
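Once the server is up, it can be queried over HTTP. Below is a minimal sketch of building such a request; the port (3000) and the OpenAI-compatible route are assumptions for illustration, so check your server's startup log for the actual address:

```python
import json
from urllib import request

# Assumed defaults for illustration: OpenLLM serving on localhost:3000
# with an OpenAI-compatible chat endpoint. Verify against your server logs.
url = "http://localhost:3000/v1/chat/completions"
payload = {
    "model": "HuggingFaceH4/zephyr-7b-beta",
    "messages": [{"role": "user", "content": "Hello!"}],
    "max_tokens": 64,
}
req = request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Send only when a server is actually running:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url, req.get_method())
```

Any OpenAI-compatible client can be pointed at the same base URL instead of hand-building requests.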



What's Changed
* chore(deps): bump taiki-e/install-action from 2.21.17 to 2.21.19 by dependabot in https://github.com/bentoml/OpenLLM/pull/735
* chore(deps): bump github/codeql-action from 2.22.7 to 2.22.8 by dependabot in https://github.com/bentoml/OpenLLM/pull/734
* chore: revert back previous backend support PyTorch by aarnphm in https://github.com/bentoml/OpenLLM/pull/739


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.31...v0.4.32

0.4.31

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.31 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(docs): remove invalid options by aarnphm in https://github.com/bentoml/OpenLLM/pull/733


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.30...v0.4.31

0.4.30

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.30 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.29...v0.4.30

0.4.29

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.29 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.28...v0.4.29

0.4.28

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.28 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(baichuan): supported from baichuan 2 from now on. by MingLiangDai in https://github.com/bentoml/OpenLLM/pull/728


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.27...v0.4.28

0.4.26

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.26 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(infra): setup higher timer for building container images by aarnphm in https://github.com/bentoml/OpenLLM/pull/723
* fix(client): correct schemas parser from correct response output by aarnphm in https://github.com/bentoml/OpenLLM/pull/724
* feat(openai): chat templates and complete control of prompt generation by aarnphm in https://github.com/bentoml/OpenLLM/pull/725
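PR #725 above concerns chat templates, which turn a list of role-tagged messages into the single prompt string a model was trained on. As an illustration only (this is not OpenLLM's internal code, and the exact markup is an assumption modeled on Zephyr-style chat formatting), such a render step might look like:

```python
# Illustrative sketch of chat-template rendering -- NOT OpenLLM's implementation.
# The <|role|> ... </s> markup below is assumed for demonstration purposes.
def render_chat(messages):
    """Flatten role-tagged chat messages into one prompt string."""
    parts = [f"<|{m['role']}|>\n{m['content']}</s>" for m in messages]
    parts.append("<|assistant|>\n")  # cue the model to respond
    return "\n".join(parts)

prompt = render_chat([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Exposing this step is what gives clients "complete control of prompt generation": they can send pre-rendered prompts instead of relying on the server's default template.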


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.25...v0.4.26
