## Usage
All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.2.25 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.2.25`
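Once a server started with `openllm start opt` is running, you can query it from Python. The snippet below is a minimal sketch, assuming the `openllm.client.HTTPClient` interface from the 0.2.x README and the default BentoML port of 3000; adjust the address and prompt to your setup.

```python
# Minimal sketch: query a locally running OpenLLM server.
# Assumes the server from `openllm start opt` listens on the default
# BentoML address (http://localhost:3000); change it if yours differs.
import openllm

client = openllm.client.HTTPClient("http://localhost:3000")
response = client.query("Explain the difference between a llama and an alpaca.")
print(response)
```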
Find more information about this release in [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md).
## What's Changed
* chore: upload nightly wheels to test.pypi.org by aarnphm in https://github.com/bentoml/OpenLLM/pull/215
* feat(contrib): ClojureScript UI by GutZuFusss in https://github.com/bentoml/OpenLLM/pull/89
* fix(ci): remove broken build hooks by aarnphm in https://github.com/bentoml/OpenLLM/pull/216
* chore(ci): add dependabot and fix vllm release container by aarnphm in https://github.com/bentoml/OpenLLM/pull/217
* feat(models): add vLLM support for Falcon by aarnphm in https://github.com/bentoml/OpenLLM/pull/223
* chore(readme): update nightly badge [skip ci] by aarnphm in https://github.com/bentoml/OpenLLM/pull/224
## New Contributors
* GutZuFusss made their first contribution in https://github.com/bentoml/OpenLLM/pull/89
**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.24...v0.2.25