OpenLLM

Latest version: v0.6.14


0.4.1

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.1 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.1`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)
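The server started above speaks an OpenAI-compatible HTTP API (the client examples for it were reworked in this release cycle, see PRs #574 and #568). The sketch below shows what building a completion request against a locally running server might look like; the port 3000, the `/v1/completions` route, and the `opt` model name are illustrative assumptions, not guarantees of this release.

```python
import json

def build_completion_request(prompt, model="opt", max_tokens=64):
    """Return (url, JSON body) for an OpenAI-style completion call
    against a locally started OpenLLM server. The host, port, and
    route below are assumptions for illustration."""
    url = "http://localhost:3000/v1/completions"
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return url, json.dumps(payload)

url, body = build_completion_request("What is OpenLLM?")
print(url)
print(body)
```

The body can then be POSTed with any HTTP client (or via the official `openai` Python package pointed at the local base URL).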



What's Changed
* chore(runner): yield the outputs directly by aarnphm in https://github.com/bentoml/OpenLLM/pull/573
* chore(openai): simplify client examples by aarnphm in https://github.com/bentoml/OpenLLM/pull/574
* fix(examples): correct dependencies in requirements.txt [skip ci] by aarnphm in https://github.com/bentoml/OpenLLM/pull/575
* refactor: cleanup typing to expose correct API by aarnphm in https://github.com/bentoml/OpenLLM/pull/576
* fix(stubs): update initialisation types by aarnphm in https://github.com/bentoml/OpenLLM/pull/577
* refactor(strategies): move logics into openllm-python by aarnphm in https://github.com/bentoml/OpenLLM/pull/578
* chore(service): cleanup API by aarnphm in https://github.com/bentoml/OpenLLM/pull/579
* infra: disable npm updates and correct python packages by aarnphm in https://github.com/bentoml/OpenLLM/pull/580
* chore(deps): bump aquasecurity/trivy-action from 0.13.1 to 0.14.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/583
* chore(deps): bump taiki-e/install-action from 2.21.7 to 2.21.8 by dependabot in https://github.com/bentoml/OpenLLM/pull/581
* chore(deps): bump sigstore/cosign-installer from 3.1.2 to 3.2.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/582
* fix: device imports using strategies by aarnphm in https://github.com/bentoml/OpenLLM/pull/584
* fix(gptq): update config fields by aarnphm in https://github.com/bentoml/OpenLLM/pull/585
* fix: unbound variable for completion client by aarnphm in https://github.com/bentoml/OpenLLM/pull/587
* fix(awq): correct awq detection for support by aarnphm in https://github.com/bentoml/OpenLLM/pull/586
* feat(vllm): squeezellm by aarnphm in https://github.com/bentoml/OpenLLM/pull/588
* docs: update quantization notes by aarnphm in https://github.com/bentoml/OpenLLM/pull/589
* fix(cli): append model-id instruction to build by aarnphm in https://github.com/bentoml/OpenLLM/pull/590
* container: update tracing dependencies by aarnphm in https://github.com/bentoml/OpenLLM/pull/591


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.4.0...v0.4.1

0.4.0

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.4.0 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.4.0`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)

What's Changed
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/563
* chore(deps): bump aquasecurity/trivy-action from 0.13.0 to 0.13.1 by dependabot in https://github.com/bentoml/OpenLLM/pull/562
* chore(deps): bump taiki-e/install-action from 2.21.3 to 2.21.7 by dependabot in https://github.com/bentoml/OpenLLM/pull/561
* chore(deps-dev): bump eslint from 8.47.0 to 8.53.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/558
* chore(deps): bump vercel/og from 0.5.18 to 0.5.20 by dependabot in https://github.com/bentoml/OpenLLM/pull/556
* chore(deps-dev): bump types/react from 18.2.20 to 18.2.35 by dependabot in https://github.com/bentoml/OpenLLM/pull/559
* chore(deps-dev): bump typescript-eslint/eslint-plugin from 6.9.0 to 6.10.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/564
* fix : updated client to toggle tls verification by ABHISHEK03312 in https://github.com/bentoml/OpenLLM/pull/532
* perf: unify LLM interface by aarnphm in https://github.com/bentoml/OpenLLM/pull/518
* fix(stop): stop is not available in config by aarnphm in https://github.com/bentoml/OpenLLM/pull/566
* infra: update docs on serving fine-tuning layers by aarnphm in https://github.com/bentoml/OpenLLM/pull/567
* fix: update build dependencies and format chat prompt by aarnphm in https://github.com/bentoml/OpenLLM/pull/569
* chore(examples): update openai client by aarnphm in https://github.com/bentoml/OpenLLM/pull/568
* fix(client): one-shot generation construction by aarnphm in https://github.com/bentoml/OpenLLM/pull/570
* feat: Mistral support by aarnphm in https://github.com/bentoml/OpenLLM/pull/571

New Contributors
* ABHISHEK03312 made their first contribution in https://github.com/bentoml/OpenLLM/pull/532

**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.14...v0.4.0

0.3.14

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.14 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.14`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore(deps): bump taiki-e/install-action from 2.20.15 to 2.21.3 by dependabot in https://github.com/bentoml/OpenLLM/pull/546
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/548
* chore(deps): bump aquasecurity/trivy-action from 0.12.0 to 0.13.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/545
* chore(deps): bump github/codeql-action from 2.22.4 to 2.22.5 by dependabot in https://github.com/bentoml/OpenLLM/pull/544
* fix: update llama2 notebook example by xianml in https://github.com/bentoml/OpenLLM/pull/516
* chore(deps-dev): bump types/react from 18.2.20 to 18.2.33 by dependabot in https://github.com/bentoml/OpenLLM/pull/542
* chore(deps-dev): bump typescript-eslint/eslint-plugin from 6.8.0 to 6.9.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/537
* chore(deps-dev): bump edge-runtime/vm from 3.1.4 to 3.1.6 by dependabot in https://github.com/bentoml/OpenLLM/pull/540
* chore(deps-dev): bump eslint from 8.47.0 to 8.52.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/541
* fix: Max new tokens by XunchaoZ in https://github.com/bentoml/OpenLLM/pull/550
* chore(inference): update vllm to 0.2.1.post1 and update config parsing by aarnphm in https://github.com/bentoml/OpenLLM/pull/554


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.13...v0.3.14

0.3.13

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.13 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.13`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.12...v0.3.13

0.3.12

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.12 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.12`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.11...v0.3.12

0.3.10

Usage

All available models: `openllm models`

To start an LLM: `python -m openllm start opt`

To run OpenLLM within a container environment (requires GPUs): `docker run --gpus all -it -P ghcr.io/bentoml/openllm:0.3.10 start opt`

To run the OpenLLM Clojure UI (community-maintained): `docker run -p 8420:80 ghcr.io/bentoml/openllm-ui-clojure:0.3.10`

Find more information about this release in the [CHANGELOG.md](https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md)



What's Changed
* chore(deps-dev): bump typescript-eslint/parser from 6.7.5 to 6.8.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/511
* chore(deps-dev): bump next/eslint-plugin-next from 13.5.4 to 13.5.5 by dependabot in https://github.com/bentoml/OpenLLM/pull/510
* chore(deps-dev): bump typescript-eslint/eslint-plugin from 6.7.5 to 6.8.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/509
* ci: pre-commit autoupdate [pre-commit.ci] by pre-commit-ci in https://github.com/bentoml/OpenLLM/pull/529
* chore(deps): bump actions/checkout from 4.1.0 to 4.1.1 by dependabot in https://github.com/bentoml/OpenLLM/pull/528
* chore(deps): bump github/codeql-action from 2.22.3 to 2.22.4 by dependabot in https://github.com/bentoml/OpenLLM/pull/527
* chore(deps): bump taiki-e/install-action from 2.20.3 to 2.20.15 by dependabot in https://github.com/bentoml/OpenLLM/pull/526
* chore(deps-dev): bump eslint-plugin-import from 2.28.1 to 2.29.0 by dependabot in https://github.com/bentoml/OpenLLM/pull/525
* chore(deps-dev): bump turbo from 1.10.15 to 1.10.16 by dependabot in https://github.com/bentoml/OpenLLM/pull/521
* chore(deps-dev): bump types/dedent from 0.7.0 to 0.7.1 by dependabot in https://github.com/bentoml/OpenLLM/pull/524
* chore(falcon): Use official implementation by aarnphm in https://github.com/bentoml/OpenLLM/pull/530
* chore(deps-dev): bump types/node from 20.5.3 to 20.8.7 by dependabot in https://github.com/bentoml/OpenLLM/pull/522
* feat: Conversation template by XunchaoZ in https://github.com/bentoml/OpenLLM/pull/519


**Full Changelog**: https://github.com/bentoml/OpenLLM/compare/v0.3.9...v0.3.10
