OpenLLM

Latest version: v0.6.23


0.3.6

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.6 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.6`
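Once a server is running, it can be queried over its HTTP API. A minimal sketch, assuming the `/v1/generate` endpoint and payload keys (`prompt`, `llm_config`) used by the 0.3.x line on the default port 3000; names and paths may differ in other releases:

```python
# Sketch of querying a locally running OpenLLM server started with
# `python -m openllm start opt`. The /v1/generate path and the payload
# keys below are assumptions based on the 0.3.x line.
import json
from urllib import request


def build_payload(prompt: str, **llm_config) -> dict:
    """Build the JSON body for a generate request."""
    return {
        "prompt": prompt,
        "llm_config": llm_config or {"max_new_tokens": 64},
    }


def generate(prompt: str, host: str = "http://localhost:3000") -> dict:
    """POST the prompt to the server and return the decoded JSON response."""
    body = json.dumps(build_payload(prompt)).encode()
    req = request.Request(
        f"{host}/v1/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The same request can be issued with `curl` by POSTing the JSON body to the endpoint.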

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/374
* chore(deps): bump peter-evans/create-pull-request from 4.2.4 to 5.0.2 by dependabot in https://github.com/bentoml/OpenLLM/pull/373
* feat: support continuous batching on `generate` by aarnphm in https://github.com/bentoml/OpenLLM/pull/375


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.5...v0.3.6

0.3.5

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.5 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.5`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix: set default serialisation methods by aarnphm in https://github.com/bentoml/OpenLLM/pull/355


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.4...v0.3.5

0.3.4

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.4 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.4`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* docs: fix typo by Sherlock113 in https://github.com/bentoml/OpenLLM/pull/305
* fix(serving): vllm bad num_gpus by alanpoulain in https://github.com/bentoml/OpenLLM/pull/326
* fix(serialisation): vllm ignore by aarnphm in https://github.com/bentoml/OpenLLM/pull/324
* feat: continuous batching with vLLM by aarnphm in https://github.com/bentoml/OpenLLM/pull/349
* fix(prompt): correct export extra objects items by aarnphm in https://github.com/bentoml/OpenLLM/pull/351

New Contributors
* alanpoulain made their first contribution in https://github.com/bentoml/OpenLLM/pull/326

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.3...v0.3.4

0.3.3

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.3 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.3`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.2...v0.3.3

0.3.2

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.2 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.1...v0.3.2

0.3.1

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.1 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.1`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* docs: Update the readme by Sherlock113 in https://github.com/bentoml/OpenLLM/pull/302
* revert: disable compiled wheels for now by aarnphm in https://github.com/bentoml/OpenLLM/pull/304

New Contributors
* Sherlock113 made their first contribution in https://github.com/bentoml/OpenLLM/pull/302

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.0...v0.3.1
