OpenLLM

Latest version: v0.6.14


0.4.19

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.19 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
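Once a server is started, it can be queried over HTTP via OpenLLM's OpenAI-compatible API. The sketch below only builds a completion request body; the base URL, default port (3000), and the `/v1/completions` route are assumptions about this release line, so verify them against `openllm start --help` for your installed version before sending anything:

```python
import json

# Assumed endpoint for a locally started server; adjust host/port as needed.
BASE_URL = "http://localhost:3000/v1/completions"

def build_completion_payload(model: str, prompt: str, max_tokens: int = 128) -> str:
    """Serialize an OpenAI-style completion request body."""
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return json.dumps(payload)

body = build_completion_payload(
    "HuggingFaceH4/zephyr-7b-beta",
    "Explain BentoML in one sentence.",
)
# To send it against a running server (hypothetical route, see note above):
#   curl -X POST $BASE_URL -H 'Content-Type: application/json' -d "$body"
print(body)
```

Keeping payload construction separate from transport makes the request easy to inspect or log before it ever reaches the server.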



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.18...v0.4.19

0.4.18

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.18 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore: update lower bound version of bentoml to avoid breakage by aarnphm in https://github.com/bentoml/OpenLLM/pull/703
* feat(openai): dynamic model_type registration by aarnphm in https://github.com/bentoml/OpenLLM/pull/704


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.17...v0.4.18

0.4.17

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start HuggingFaceH4/zephyr-7b-beta`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P -v $PWD/data:$HOME/.cache/huggingface/ ghcr.io/bentoml/openllm:0.4.17 start HuggingFaceH4/zephyr-7b-beta`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* infra: update generate notes and better local handle by aarnphm in https://github.com/bentoml/OpenLLM/pull/701
* fix(backend): correct use variable for backend when initialisation by aarnphm in https://github.com/bentoml/OpenLLM/pull/702


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.16...v0.4.17

0.4.16

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.16 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.16`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* feat(ctranslate): initial infrastructure support by aarnphm in https://github.com/bentoml/OpenLLM/pull/694
* feat(vllm): bump to 0.2.2 by aarnphm in https://github.com/bentoml/OpenLLM/pull/695
* feat(engine): CTranslate2 by aarnphm in https://github.com/bentoml/OpenLLM/pull/698
* chore: update documentation about runtime by aarnphm in https://github.com/bentoml/OpenLLM/pull/699
* chore: update changelog [skip ci] by aarnphm in https://github.com/bentoml/OpenLLM/pull/700
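With CTranslate2 landing as an engine in this release, a backend can be chosen explicitly at start time. The sketch below only assembles the command line; the `--backend` flag and the `ctranslate` value are assumptions based on this release's engine support, so confirm them with `openllm start --help` on your version:

```python
import shlex

def start_command(model: str, backend: str) -> list[str]:
    """Assemble an `openllm start` invocation with an explicit backend
    (flag name and value are assumptions, see note above)."""
    return shlex.split(f"python -m openllm start {model} --backend {backend}")

cmd = start_command("opt", "ctranslate")
print(" ".join(cmd))
```

Building the argument list with `shlex.split` keeps it safe to hand to `subprocess.run(cmd)` without invoking a shell.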


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.15...v0.4.16

0.4.15

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.15 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.15`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(cattrs): strictly lock <23.2 until we upgrade validation logic by aarnphm in https://github.com/bentoml/OpenLLM/pull/690
* fix(annotations): check library through find_spec by aarnphm in https://github.com/bentoml/OpenLLM/pull/691
* feat: heuristics logprobs by aarnphm in https://github.com/bentoml/OpenLLM/pull/692
* chore: update documentation by aarnphm in https://github.com/bentoml/OpenLLM/pull/693


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.14...v0.4.15

0.4.14

Usage

List all available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.14 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.14`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(dependencies): ignore broken cattrs release by aarnphm in https://github.com/bentoml/OpenLLM/pull/689


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.13...v0.4.14

