OpenLLM


0.4.4

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.4 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.4`
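
Once the server from `openllm start opt` is up, it can be queried from Python. A minimal sketch, assuming the server listens on the default port 3000 and that the client exposes a `generate` method taking a prompt plus sampling options (method and parameter names may vary between releases):

```python
# Minimal sketch: query a running OpenLLM server from Python.
# Assumes the server started by `openllm start opt` listens on
# localhost:3000 and that HTTPClient.generate accepts a prompt
# plus keyword sampling options; names may differ per release.
import openllm

client = openllm.client.HTTPClient('http://localhost:3000')
result = client.generate('What is the meaning of life?', max_new_tokens=128)
print(result)
```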

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore: no need compat workaround for setting cell_contents by aarnphm in https://github.com/bentoml/OpenLLM/pull/616
* chore(llm): expose quantise and lazy load heavy imports by aarnphm in https://github.com/bentoml/OpenLLM/pull/617
* feat(llm): update warning envvar and add embedded mode by aarnphm in https://github.com/bentoml/OpenLLM/pull/618
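
PR #618 above adds an embedded mode, i.e. running the model in-process instead of through a separate server. A hedged sketch of what that could look like, assuming the 0.4.x `openllm.LLM` constructor and its async `generate` method (neither is confirmed by this changelog):

```python
# Hedged sketch of embedded (in-process) usage with no server.
# openllm.LLM and its async generate method are assumed from the
# 0.4.x Python API; the model id is purely illustrative.
import asyncio
import openllm

async def main():
    llm = openllm.LLM('facebook/opt-1.3b')
    output = await llm.generate('Hello, my name is')
    print(output)

asyncio.run(main())
```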


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.3...v0.4.4

0.4.3

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.3 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.3`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* feat(server): helpers endpoints for conversation format by aarnphm in https://github.com/bentoml/OpenLLM/pull/613
* feat(client): support return response_cls to string by aarnphm in https://github.com/bentoml/OpenLLM/pull/614
* feat(client): add helpers subclass by aarnphm in https://github.com/bentoml/OpenLLM/pull/615
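
PRs #613-#615 above add server helper endpoints and a client-side helpers subclass for turning chat-style messages into a model's conversation format. A speculative sketch of how such a helper might be used; the `helpers.messages` name and the message schema are inferred from the PR titles, not from documented API:

```python
# Hypothetical sketch: render a chat conversation into a prompt via
# the client helpers added in #613-#615. client.helpers.messages and
# the role/content dict schema are illustrative assumptions.
import openllm

client = openllm.client.HTTPClient('http://localhost:3000')
prompt = client.helpers.messages([
    {'role': 'system', 'content': 'You are a helpful assistant.'},
    {'role': 'user', 'content': 'Summarize OpenLLM in one sentence.'},
])
print(client.generate(prompt))
```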


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.2...v0.4.3

0.4.2

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.2 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* refactor(cli): cleanup API by aarnphm in https://github.com/bentoml/OpenLLM/pull/592
* infra: move out clojure to external by aarnphm in https://github.com/bentoml/OpenLLM/pull/593
* infra: using ruff formatter by aarnphm in https://github.com/bentoml/OpenLLM/pull/594
* infra: remove tsconfig by aarnphm in https://github.com/bentoml/OpenLLM/pull/595
* revert: configuration not to dump flatten by aarnphm in https://github.com/bentoml/OpenLLM/pull/597
* package: add openllm core dependencies to labels by aarnphm in https://github.com/bentoml/OpenLLM/pull/600
* fix: loading correct local models by aarnphm in https://github.com/bentoml/OpenLLM/pull/599
* fix: correct importmodules locally by aarnphm in https://github.com/bentoml/OpenLLM/pull/601
* fix: overload flattened dict by aarnphm in https://github.com/bentoml/OpenLLM/pull/602
* feat(client): support authentication token and shim implementation by aarnphm in https://github.com/bentoml/OpenLLM/pull/605 (see the sketch after this list)
* fix(client): check for should retry header by aarnphm in https://github.com/bentoml/OpenLLM/pull/606
* chore(client): remove unused state enum by aarnphm in https://github.com/bentoml/OpenLLM/pull/609
* chore: remove generated stubs for now by aarnphm in https://github.com/bentoml/OpenLLM/pull/610
* refactor(config): simplify configuration and update start CLI output by aarnphm in https://github.com/bentoml/OpenLLM/pull/611
* docs: update supported feature set by aarnphm in https://github.com/bentoml/OpenLLM/pull/612
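
PR #605 above adds authentication-token support to the client. A minimal sketch, assuming the token is supplied at construction time; the `api_key` keyword and the environment-variable name are illustrative, not confirmed names:

```python
# Hedged sketch: connect to a token-protected OpenLLM server.
# The api_key keyword and OPENLLM_AUTH_TOKEN variable are assumed
# names for the authentication support added in PR #605.
import os
import openllm

client = openllm.client.HTTPClient(
    'http://localhost:3000',
    api_key=os.environ['OPENLLM_AUTH_TOKEN'],
)
print(client.generate('ping'))
```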


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.1...v0.4.2

0.4.1

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.1 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.1`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore(runner): yield the outputs directly by aarnphm in https://github.com/bentoml/OpenLLM/pull/573
* chore(openai): simplify client examples by aarnphm in https://github.com/bentoml/OpenLLM/pull/574 (see the sketch after this list)
* fix(examples): correct dependencies in requirements.txt [skip ci] by aarnphm in https://github.com/bentoml/OpenLLM/pull/575
* refactor: cleanup typing to expose correct API by aarnphm in https://github.com/bentoml/OpenLLM/pull/576
* fix(stubs): update initialisation types by aarnphm in https://github.com/bentoml/OpenLLM/pull/577
* refactor(strategies): move logics into openllm-python by aarnphm in https://github.com/bentoml/OpenLLM/pull/578
* chore(service): cleanup API by aarnphm in https://github.com/bentoml/OpenLLM/pull/579
* infra: disable npm updates and correct python packages by aarnphm in https://github.com/bentoml/OpenLLM/pull/580
* chore(deps): bump aquasecurity/trivy-action from 0.13.1 to 0.14.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/583
* chore(deps): bump taiki-e/install-action from 2.21.7 to 2.21.8 by dependabot in https://github.com/bentoml/OpenLLM/pull/581
* chore(deps): bump sigstore/cosign-installer from 3.1.2 to 3.2.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/582
* fix: device imports using strategies by aarnphm in https://github.com/bentoml/OpenLLM/pull/584
* fix(gptq): update config fields by aarnphm in https://github.com/bentoml/OpenLLM/pull/585
* fix: unbound variable for completion client by aarnphm in https://github.com/bentoml/OpenLLM/pull/587
* fix(awq): correct awq detection for support by aarnphm in https://github.com/bentoml/OpenLLM/pull/586
* feat(vllm): squeezellm by aarnphm in https://github.com/bentoml/OpenLLM/pull/588
* docs: update quantization notes by aarnphm in https://github.com/bentoml/OpenLLM/pull/589
* fix(cli): append model-id instruction to build by aarnphm in https://github.com/bentoml/OpenLLM/pull/590
* container: update tracing dependencies by aarnphm in https://github.com/bentoml/OpenLLM/pull/591
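
Several entries here (#568, #574) update the OpenAI client examples. A sketch of talking to an OpenLLM server through the `openai` Python package (v1 API), assuming OpenAI-compatible routes are served under `/v1` on the default port; the model id and dummy API key are illustrative:

```python
# Sketch: use the openai package (v1 API) against an OpenLLM server.
# Assumes OpenAI-compatible endpoints under /v1; the model name must
# match one the server actually loaded (see `openllm models`).
from openai import OpenAI

client = OpenAI(base_url='http://localhost:3000/v1', api_key='na')
completion = client.completions.create(
    model='facebook/opt-1.3b',  # illustrative model id
    prompt='Once upon a time',
    max_tokens=64,
)
print(completion.choices[0].text)
```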


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.0...v0.4.1

0.4.0

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.0 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.0`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/563
* chore(deps): bump aquasecurity/trivy-action from 0.13.0 to 0.13.1 by dependabot in https://github.com/bentoml/OpenLLM/pull/562
* chore(deps): bump taiki-e/install-action from 2.21.3 to 2.21.7 by dependabot in https://github.com/bentoml/OpenLLM/pull/561
* chore(deps-dev): bump eslint from 8.47.0 to 8.53.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/558
* chore(deps): bump vercel/og from 0.5.18 to 0.5.20 by dependabot in https://github.com/bentoml/OpenLLM/pull/556
* chore(deps-dev): bump types/react from 18.2.20 to 18.2.35 by dependabot in https://github.com/bentoml/OpenLLM/pull/559
* chore(deps-dev): bump typescript-eslint/eslint-plugin from 6.9.0 to 6.10.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/564
* fix: updated client to toggle TLS verification by ABHISHEK03312 in https://github.com/bentoml/OpenLLM/pull/532
* perf: unify LLM interface by aarnphm in https://github.com/bentoml/OpenLLM/pull/518
* fix(stop): stop is not available in config by aarnphm in https://github.com/bentoml/OpenLLM/pull/566
* infra: update docs on serving fine-tuning layers by aarnphm in https://github.com/bentoml/OpenLLM/pull/567
* fix: update build dependencies and format chat prompt by aarnphm in https://github.com/bentoml/OpenLLM/pull/569
* chore(examples): update openai client by aarnphm in https://github.com/bentoml/OpenLLM/pull/568
* fix(client): one-shot generation construction by aarnphm in https://github.com/bentoml/OpenLLM/pull/570
* feat: Mistral support by aarnphm in https://github.com/bentoml/OpenLLM/pull/571
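
With Mistral support landing in #571 and the stop-sequence fix in #566, generating from a Mistral model should follow the same pattern as other architectures. A hedged sketch; the model id and the `stop` keyword are assumptions based on the PR titles:

```python
# Hedged sketch: embedded generation with a Mistral model and a stop
# sequence. The model id and the stop keyword are illustrative; #566
# suggests stop sequences are passed per request, not via config.
import asyncio
import openllm

async def main():
    llm = openllm.LLM('mistralai/Mistral-7B-v0.1')
    out = await llm.generate('List three prime numbers:', stop=['\n\n'])
    print(out)

asyncio.run(main())
```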

New Contributors
* ABHISHEK03312 made their first contribution in https://github.com/bentoml/OpenLLM/pull/532

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.14...v0.4.0

0.3.14

Usage

To list all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.14 start opt`

To run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.14`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore(deps): bump taiki-e/install-action from 2.20.15 to 2.21.3 by dependabot in https://github.com/bentoml/OpenLLM/pull/546
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/548
* chore(deps): bump aquasecurity/trivy-action from 0.12.0 to 0.13.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/545
* chore(deps): bump github/codeql-action from 2.22.4 to 2.22.5 by dependabot in https://github.com/bentoml/OpenLLM/pull/544
* fix: update llama2 notebook example by xianml in https://github.com/bentoml/OpenLLM/pull/516
* chore(deps-dev): bump types/react from 18.2.20 to 18.2.33 by dependabot in https://github.com/bentoml/OpenLLM/pull/542
* chore(deps-dev): bump typescript-eslint/eslint-plugin from 6.8.0 to 6.9.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/537
* chore(deps-dev): bump edge-runtime/vm from 3.1.4 to 3.1.6 by dependabot in https://github.com/bentoml/OpenLLM/pull/540
* chore(deps-dev): bump eslint from 8.47.0 to 8.52.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/541
* fix: Max new tokens by XunchaoZ in https://github.com/bentoml/OpenLLM/pull/550
* chore(inference): update vllm to 0.2.1.post1 and update config parsing by aarnphm in https://github.com/bentoml/OpenLLM/pull/554
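
PR #550 above fixes `max_new_tokens` handling. As a usage reminder, the generation-length cap is passed per request; a sketch, with the caveat that `max_new_tokens` follows the Hugging Face naming in the PR title and its exact plumbing through the client in this release is an assumption:

```python
# Sketch: cap generation length per request. max_new_tokens follows
# the Hugging Face naming referenced by PR #550; how it maps onto
# the client call in this release is an assumption.
import openllm

client = openllm.client.HTTPClient('http://localhost:3000')
short = client.generate('Explain BentoML briefly.', max_new_tokens=32)
print(short)
```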


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.13...v0.3.14
