OpenLLM

Latest version: v0.6.23




0.2.11

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`
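The two commands above can be run back to back; this is a minimal sketch assuming OpenLLM is already installed in the current Python environment (e.g. via `pip install openllm`):

```shell
# List all model architectures supported by this OpenLLM release
openllm models

# Start a server for the OPT model; the `python -m` form works even when
# the `openllm` console script is not on PATH
python -m openllm start opt
```

Using `python -m openllm` ties the invocation to a specific interpreter, which is useful when several virtual environments each carry their own OpenLLM install.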

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(ci): correct tag for checkout by aarnphm in https://github.com/bentoml/OpenLLM/pull/150
* fix: disable auto fixes by aarnphm in https://github.com/bentoml/OpenLLM/pull/151
* chore: add nous to example default id as non-gated Llama by aarnphm in https://github.com/bentoml/OpenLLM/pull/152
* feat: supports embeddings for T5 and ChatGLM family generation by aarnphm in https://github.com/bentoml/OpenLLM/pull/153


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.10...v0.2.11

0.2.10

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* feat(ci): automatic release semver + git archival installation by aarnphm in https://github.com/bentoml/OpenLLM/pull/143
* docs: remove extraneous whitespace by aarnphm in https://github.com/bentoml/OpenLLM/pull/144
* docs: update fine tuning model support by aarnphm in https://github.com/bentoml/OpenLLM/pull/145
* fix(build): running from container choosing models correctly by aarnphm in https://github.com/bentoml/OpenLLM/pull/141
* feat(client): embeddings by aarnphm in https://github.com/bentoml/OpenLLM/pull/146


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.9...v0.2.10

0.2.9

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* ci: release python earlier than building binary wheels by aarnphm in https://github.com/bentoml/OpenLLM/pull/138
* docs: Update README.md by parano in https://github.com/bentoml/OpenLLM/pull/139


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.8...v0.2.9

0.2.8

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* feat(service): provisional API by aarnphm in https://github.com/bentoml/OpenLLM/pull/133
* chore(deps): update bitsandbytes requirement from <0.40 to <0.42 by dependabot in https://github.com/bentoml/OpenLLM/pull/137
* feat: vLLM integration for PagedAttention by aarnphm in https://github.com/bentoml/OpenLLM/pull/134


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.7...v0.2.8

0.2.7

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.6...v0.2.7

0.2.6

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore(ci): better release flow by aarnphm in https://github.com/bentoml/OpenLLM/pull/131
* perf(serialisation): implement wrapper to reduce callstack by aarnphm in https://github.com/bentoml/OpenLLM/pull/132


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.5...v0.2.6

