OpenLLM

Latest version: v0.6.14



0.2.16

Fixes a regression introduced between 0.2.13 and 0.2.15 that prevented vLLM from running correctly within the Docker container.

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.13...v0.2.16

0.2.13

What's Changed

* Fixes the auto-gptq CUDA kernel within the base container.
* Adds support for all vLLM models and updates vLLM to the latest stable commit.


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.12...v0.2.13

0.2.12

News

OpenLLM now releases a base container containing all compiled kernels, removing the need to build kernels with `openllm build` when using vLLM or auto-gptq.

vLLM support (experimental)

Currently, only OPT and Llama 2 support vLLM. Set `OPENLLM_LLAMA_FRAMEWORK=vllm` to start OpenLLM runners with vLLM.
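As a sketch, the environment variable above can be combined with the `python -m openllm start` invocation used elsewhere in these notes; the `llama` model name is an assumption here:

```shell
# Select the vLLM backend for the Llama runner via the environment variable,
# then start the server (model name `llama` is assumed).
OPENLLM_LLAMA_FRAMEWORK=vllm python -m openllm start llama
```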

Installation

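Assuming the package is published on PyPI under the name `openllm`, installation is a single pip command:

```shell
# Install OpenLLM from PyPI (package name `openllm` assumed)
pip install openllm
```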

0.2.11

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* fix(ci): correct tag for checkout by aarnphm in https://github.com/bentoml/OpenLLM/pull/150
* fix: disable auto fixes by aarnphm in https://github.com/bentoml/OpenLLM/pull/151
* chore: add nous to example default id as non-gated Llama by aarnphm in https://github.com/bentoml/OpenLLM/pull/152
* feat: supports embeddings for T5 and ChatGLM family generation by aarnphm in https://github.com/bentoml/OpenLLM/pull/153


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.10...v0.2.11

0.2.10

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* feat(ci): automatic release semver + git archival installation by aarnphm in https://github.com/bentoml/OpenLLM/pull/143
* docs: remove extraneous whitespace by aarnphm in https://github.com/bentoml/OpenLLM/pull/144
* docs: update fine tuning model support by aarnphm in https://github.com/bentoml/OpenLLM/pull/145
* fix(build): running from container choosing models correctly by aarnphm in https://github.com/bentoml/OpenLLM/pull/141
* feat(client): embeddings by aarnphm in https://github.com/bentoml/OpenLLM/pull/146


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.9...v0.2.10

0.2.9

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* ci: release python earlier than building binary wheels by aarnphm in https://github.com/bentoml/OpenLLM/pull/138
* docs: Update README.md by parano in https://github.com/bentoml/OpenLLM/pull/139


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.2.8...v0.2.9

