OpenLLM

Latest version: v0.6.19

0.4.12

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.12 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.12`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
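Once a server started with the commands above is running, it can also be queried over plain HTTP. The sketch below is a minimal illustration, not official client code: it assumes the server listens on the default port 3000 and exposes a `/v1/generate` endpoint accepting a JSON body with `prompt` and `llm_config` fields. Those names, the port, and the response shape are assumptions to verify against your installed version; the Python client bundled with OpenLLM is the supported interface.

```python
# Minimal sketch of calling a locally running OpenLLM server over HTTP.
# ASSUMPTIONS (verify against your version's API docs): default port 3000,
# a /v1/generate endpoint, and a {"prompt": ..., "llm_config": {...}} body.
import json
import urllib.request


def build_payload(prompt: str, **llm_config) -> dict:
    """Assemble the JSON request body (pure data, no network I/O)."""
    return {"prompt": prompt, "llm_config": llm_config}


def generate(prompt: str, host: str = "http://localhost:3000", **llm_config) -> dict:
    """POST a generation request and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{host}/v1/generate",
        data=json.dumps(build_payload(prompt, **llm_config)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # raises URLError if no server is up
        return json.load(resp)
```

With a server up, `generate("What is an LLM?", max_new_tokens=64)` would send the request; generation options such as `max_new_tokens` are passed through under `llm_config`.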



What's Changed
* fix(envvar): explicitly set NVIDIA_DRIVER_CAPABILITIES by aarnphm in https://github.com/bentoml/OpenLLM/pull/681
* fix(torch_dtype): correctly infer based on options by aarnphm in https://github.com/bentoml/OpenLLM/pull/682


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.11...v0.4.12

0.4.11

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.11 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.11`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* infra: update cbfmt options by aarnphm in https://github.com/bentoml/OpenLLM/pull/676
* fix(examples): add support for streaming feature by aarnphm in https://github.com/bentoml/OpenLLM/pull/677
* fix: correct set item for attrs >23.1 by aarnphm in https://github.com/bentoml/OpenLLM/pull/678
* fix(build): correctly parse default env for container by aarnphm in https://github.com/bentoml/OpenLLM/pull/679
* fix(env): correct format environment on docker by aarnphm in https://github.com/bentoml/OpenLLM/pull/680


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.10...v0.4.11

0.4.10

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.10 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.10`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(runner): remove keyword args for attrs.get() by jeffwang0516 in https://github.com/bentoml/OpenLLM/pull/661
* fix: update notebook by xianml in https://github.com/bentoml/OpenLLM/pull/662
* feat(type): provide structured annotations stubs by aarnphm in https://github.com/bentoml/OpenLLM/pull/663
* feat(llm): respect warnings environment for dtype warning by aarnphm in https://github.com/bentoml/OpenLLM/pull/664
* infra: makes huggingface-hub requirements on fine-tune by aarnphm in https://github.com/bentoml/OpenLLM/pull/665
* types: update stubs for remaining entrypoints by aarnphm in https://github.com/bentoml/OpenLLM/pull/667
* perf: reduce footprint by aarnphm in https://github.com/bentoml/OpenLLM/pull/668
* perf(build): locking and improve build speed by aarnphm in https://github.com/bentoml/OpenLLM/pull/669
* docs: add LlamaIndex integration by aarnphm in https://github.com/bentoml/OpenLLM/pull/646
* infra: remove codegolf by aarnphm in https://github.com/bentoml/OpenLLM/pull/671
* feat(models): Phi 1.5 by aarnphm in https://github.com/bentoml/OpenLLM/pull/672
* fix(docs): chatglm support on vLLM by aarnphm in https://github.com/bentoml/OpenLLM/pull/673
* chore(loading): include verbose warning about trust_remote_code by aarnphm in https://github.com/bentoml/OpenLLM/pull/674
* perf: potentially reduce image size by aarnphm in https://github.com/bentoml/OpenLLM/pull/675

New Contributors
* jeffwang0516 made their first contribution in https://github.com/bentoml/OpenLLM/pull/661

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.9...v0.4.10

0.4.9

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.9 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.9`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* infra: update scripts to run update readme automatically by aarnphm in https://github.com/bentoml/OpenLLM/pull/658
* chore: update requirements in README.md by aarnphm in https://github.com/bentoml/OpenLLM/pull/659
* fix(falcon): remove early_stopping default arguments by aarnphm in https://github.com/bentoml/OpenLLM/pull/660


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.8...v0.4.9

0.4.8

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.8 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.8`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* docs: update instruction adding new models and remove command docstring by aarnphm in https://github.com/bentoml/OpenLLM/pull/654
* chore(cli): move playground to CLI components by aarnphm in https://github.com/bentoml/OpenLLM/pull/655
* perf: improve build logics and cleanup speed by aarnphm in https://github.com/bentoml/OpenLLM/pull/657


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.7...v0.4.8

0.4.7

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.7 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.7`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* refactor: use DEBUG env-var instead of OPENLLMDEVDEBUG by aarnphm in https://github.com/bentoml/OpenLLM/pull/647
* fix(cli): update context name parsing correctly by aarnphm in https://github.com/bentoml/OpenLLM/pull/652
* feat: Yi models by aarnphm in https://github.com/bentoml/OpenLLM/pull/651
* fix: correct OPENLLM_DEV_BUILD check by xianml in https://github.com/bentoml/OpenLLM/pull/653


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.6...v0.4.7
