OpenLLM

Latest version: v0.6.14




0.2.0

LLaMA, Baichuan, and GPT-NeoX supported!

LLaMA 2 is also supported:

```bash
openllm start llama --model-id meta-llama/Llama-2-13b-hf
```

What's Changed
* feat: GPTNeoX by aarnphm in https://github.com/bentoml/OpenLLM/pull/106
* feat(test): snapshot testing by aarnphm in https://github.com/bentoml/OpenLLM/pull/107
* fix(resource): correctly parse CUDA_VISIBLE_DEVICES by aarnphm in https://github.com/bentoml/OpenLLM/pull/114
* feat(models): Baichuan by hetaoBackend in https://github.com/bentoml/OpenLLM/pull/115
* fix: add the requirements for baichuan by hetaoBackend in https://github.com/bentoml/OpenLLM/pull/117
* fix: build isolation by aarnphm in https://github.com/bentoml/OpenLLM/pull/116
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/119
* feat: GPTQ + vLLM and LlaMA by aarnphm in https://github.com/bentoml/OpenLLM/pull/113

New Contributors
* hetaoBackend made their first contribution in https://github.com/bentoml/OpenLLM/pull/115

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.20...v0.2.0

0.1.20

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* fix: running MPT on CPU by aarnphm in https://github.com/bentoml/OpenLLM/pull/92
* tests: add sanity check for openllm.client by aarnphm in https://github.com/bentoml/OpenLLM/pull/93
* feat: custom dockerfile templates by aarnphm in https://github.com/bentoml/OpenLLM/pull/95
* feat(llm): fine-tuning Falcon by aarnphm in https://github.com/bentoml/OpenLLM/pull/98
* feat: add citation by aarnphm in https://github.com/bentoml/OpenLLM/pull/103
* peft: improve speed and quality by aarnphm in https://github.com/bentoml/OpenLLM/pull/102
* chore: fix mpt loading on single GPU by aarnphm in https://github.com/bentoml/OpenLLM/pull/105


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.19...v0.1.20

0.1.19

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.18...v0.1.19

0.1.17

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)


What's Changed
* feat(start): `openllm start bento` by aarnphm in https://github.com/bentoml/OpenLLM/pull/80


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.16...v0.1.17

0.1.15

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* chore: better gif quality by aarnphm in https://github.com/bentoml/OpenLLM/pull/71
* [pre-commit.ci] pre-commit autoupdate by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/74
* feat: cascading resource strategies by aarnphm in https://github.com/bentoml/OpenLLM/pull/72


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.14...v0.1.15

0.1.14

Usage

All available models: `python -m openllm.models`

To start an LLM: `python -m openllm start dolly-v2`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)


What's Changed
* models: migrate away from pipelines by aarnphm in https://github.com/bentoml/OpenLLM/pull/60
* fix(test): robustness by aarnphm in https://github.com/bentoml/OpenLLM/pull/64
* fix: converting envvar to string by aarnphm in https://github.com/bentoml/OpenLLM/pull/68
* chore: add more test matrices by aarnphm in https://github.com/bentoml/OpenLLM/pull/70
* feat: release binary distribution by aarnphm in https://github.com/bentoml/OpenLLM/pull/66


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.1.13...v0.1.14

