OpenLLM


0.4.10

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.10 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.10`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
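Once started, the server is queried over plain HTTP. A minimal sketch of the workflow, assuming the BentoML default port (3000) and the `/v1/generate` endpoint shape of the 0.4.x releases; adjust both for your deployment:

```shell
# List every model family OpenLLM can serve
openllm models

# Start an OPT server in the foreground (listens on port 3000 by default)
openllm start opt

# From another shell: send a generation request to the running server.
# The endpoint path and payload shape follow the 0.4.x HTTP API and may
# differ in other releases.
curl -s http://localhost:3000/v1/generate \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "What is OpenLLM?", "llm_config": {"max_new_tokens": 64}}'
```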



What's Changed
* fix(runner): remove keyword args for attrs.get() by jeffwang0516 in https://github.com/bentoml/OpenLLM/pull/661
* fix: update notebook by xianml in https://github.com/bentoml/OpenLLM/pull/662
* feat(type): provide structured annotations stubs by aarnphm in https://github.com/bentoml/OpenLLM/pull/663
* feat(llm): respect warnings environment for dtype warning by aarnphm in https://github.com/bentoml/OpenLLM/pull/664
* infra: makes huggingface-hub requirements on fine-tune by aarnphm in https://github.com/bentoml/OpenLLM/pull/665
* types: update stubs for remaining entrypoints by aarnphm in https://github.com/bentoml/OpenLLM/pull/667
* perf: reduce footprint by aarnphm in https://github.com/bentoml/OpenLLM/pull/668
* perf(build): locking and improve build speed by aarnphm in https://github.com/bentoml/OpenLLM/pull/669
* docs: add LlamaIndex integration by aarnphm in https://github.com/bentoml/OpenLLM/pull/646
* infra: remove codegolf by aarnphm in https://github.com/bentoml/OpenLLM/pull/671
* feat(models): Phi 1.5 by aarnphm in https://github.com/bentoml/OpenLLM/pull/672
* fix(docs): chatglm support on vLLM by aarnphm in https://github.com/bentoml/OpenLLM/pull/673
* chore(loading): include verbose warning about trust_remote_code by aarnphm in https://github.com/bentoml/OpenLLM/pull/674
* perf: potentially reduce image size by aarnphm in https://github.com/bentoml/OpenLLM/pull/675

New Contributors
* jeffwang0516 made their first contribution in https://github.com/bentoml/OpenLLM/pull/661

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.9...v0.4.10

0.4.9

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.9 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.9`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* infra: update scripts to run update readme automatically by aarnphm in https://github.com/bentoml/OpenLLM/pull/658
* chore: update requirements in README.md by aarnphm in https://github.com/bentoml/OpenLLM/pull/659
* fix(falcon): remove early_stopping default arguments by aarnphm in https://github.com/bentoml/OpenLLM/pull/660


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.8...v0.4.9

0.4.8

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.8 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.8`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* docs: update instruction adding new models and remove command docstring by aarnphm in https://github.com/bentoml/OpenLLM/pull/654
* chore(cli): move playground to CLI components by aarnphm in https://github.com/bentoml/OpenLLM/pull/655
* perf: improve build logics and cleanup speed by aarnphm in https://github.com/bentoml/OpenLLM/pull/657


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.7...v0.4.8

0.4.7

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.7 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.7`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* refactor: use DEBUG env-var instead of OPENLLMDEVDEBUG by aarnphm in https://github.com/bentoml/OpenLLM/pull/647
* fix(cli): update context name parsing correctly by aarnphm in https://github.com/bentoml/OpenLLM/pull/652
* feat: Yi models by aarnphm in https://github.com/bentoml/OpenLLM/pull/651
* fix: correct OPENLLM_DEV_BUILD check by xianml in https://github.com/bentoml/OpenLLM/pull/653


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.6...v0.4.7

0.4.6

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.6 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.6`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore: cleanup unused code path by aarnphm in https://github.com/bentoml/OpenLLM/pull/633
* perf(model): update mistral inference parameters and prompt format by larme in https://github.com/bentoml/OpenLLM/pull/632
* infra: remove unused postprocess_generate by aarnphm in https://github.com/bentoml/OpenLLM/pull/634
* docs: update README.md by aarnphm in https://github.com/bentoml/OpenLLM/pull/635
* fix(client): correct destructor for the httpx object, both sync and async by aarnphm in https://github.com/bentoml/OpenLLM/pull/636
* doc: update adding new model guide by larme in https://github.com/bentoml/OpenLLM/pull/637
* fix(generation): compatibility dtype with CPU by aarnphm in https://github.com/bentoml/OpenLLM/pull/638
* fix(cpu): more verbose definition for dtype casting by aarnphm in https://github.com/bentoml/OpenLLM/pull/639
* fix(service): to yield out correct JSON objects by aarnphm in https://github.com/bentoml/OpenLLM/pull/640
* fix(cli): set default dtype to auto infer by aarnphm in https://github.com/bentoml/OpenLLM/pull/642
* fix(dependencies): lock build < 1 for now by aarnphm in https://github.com/bentoml/OpenLLM/pull/643
* chore(openapi): unify inject param by aarnphm in https://github.com/bentoml/OpenLLM/pull/645


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.5...v0.4.6

0.4.5

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.5 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.5`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* refactor(cli): move out to its own packages by aarnphm in https://github.com/bentoml/OpenLLM/pull/619
* fix(cli): correct set working_dir by aarnphm in https://github.com/bentoml/OpenLLM/pull/620
* chore(cli): always show available models by aarnphm in https://github.com/bentoml/OpenLLM/pull/621
* fix(sdk): make sure build to quiet out stdout by aarnphm in https://github.com/bentoml/OpenLLM/pull/622
* chore: update jupyter notebooks with new API by aarnphm in https://github.com/bentoml/OpenLLM/pull/623
* fix(ruff): correct consistency between isort and formatter by aarnphm in https://github.com/bentoml/OpenLLM/pull/624
* feat(vllm): support passing specific dtype by aarnphm in https://github.com/bentoml/OpenLLM/pull/626
* chore(deps): bump taiki-e/install-action from 2.21.8 to 2.21.11 by dependabot in https://github.com/bentoml/OpenLLM/pull/625
* feat(cli): `--dtype` arguments by aarnphm in https://github.com/bentoml/OpenLLM/pull/627
* fix(cli): make sure to pass the dtype to subprocess service by aarnphm in https://github.com/bentoml/OpenLLM/pull/628
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/629
* infra: removing clojure frontend from infra cycle by aarnphm in https://github.com/bentoml/OpenLLM/pull/630
* fix(torch_dtype): load eagerly by aarnphm in https://github.com/bentoml/OpenLLM/pull/631


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.4...v0.4.5
