## Usage
List all available models: `openllm models`

Start an LLM: `python -m openllm start opt`

Run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.6 start opt`

Run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.6`
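Once a server is started with one of the commands above, it can be queried over HTTP. A minimal sketch is shown below; the default port (3000), endpoint path, and payload shape are assumptions for illustration and may differ in this release — consult the server's generated API docs for the exact schema.

```shell
# Hypothetical request to a locally running `openllm start opt` server.
# Port 3000, the /v1/generate path, and the JSON payload are assumptions.
curl -s -X POST "http://localhost:3000/v1/generate" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "What is a large language model?"}'
```

The same server also exposes an interactive OpenAPI page in the browser, which is the quickest way to confirm the actual endpoint and request schema.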
Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md).
## What's Changed
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/374
* chore(deps): bump peter-evans/create-pull-request from 4.2.4 to 5.0.2 by dependabot in https://github.com/bentoml/OpenLLM/pull/373
* feat: support continuous batching on `generate` by aarnphm in https://github.com/bentoml/OpenLLM/pull/375
**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.5...v0.3.6