## Usage
- List all available models: `openllm models`
- Start an LLM: `python -m openllm start opt` (see the query example below)
- Run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.4 start opt`
- Run the community-maintained OpenLLM Clojure UI: `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.4`
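Once a model server is running (by default it listens on port 3000), you can send it a prompt over HTTP. The snippet below is a minimal sketch that assumes the default port and the `/v1/generate` JSON endpoint; the exact route and request schema can vary between releases, so check the server's generated API docs at `http://localhost:3000` for the authoritative schema.

```bash
# Sketch: query a locally running OpenLLM server.
# Assumes the default port (3000) and the /v1/generate JSON endpoint;
# adjust the host, port, route, and payload for your deployment.
curl -X POST http://localhost:3000/v1/generate \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "What is the meaning of life?"}'
```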
Find more information about this release in [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md).
## What's Changed
* chore: no need compat workaround for setting cell_contents by aarnphm in https://github.com/bentoml/OpenLLM/pull/616
* chore(llm): expose quantise and lazy load heavy imports by aarnphm in https://github.com/bentoml/OpenLLM/pull/617
* feat(llm): update warning envvar and add embedded mode by aarnphm in https://github.com/bentoml/OpenLLM/pull/618
**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.3...v0.4.4