BentoML

Latest version: v1.4.7


1.1.7

Not secure
What's Changed

This release updates the OpenTelemetry dependencies to 0.41b0 to address the CVE affecting 0.39b0, alongside general documentation and client updates.

* docs: Add the SDXL deployment quickstart by Sherlock113 in https://github.com/bentoml/BentoML/pull/4175
* Update pytorch.rst by piercus in https://github.com/bentoml/BentoML/pull/4176
* chore(deps): bump actions/checkout from 3 to 4 by dependabot in https://github.com/bentoml/BentoML/pull/4177
* fix: parse tag from multiline output by frostming in https://github.com/bentoml/BentoML/pull/4178
* docs: Update the user management docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4186
* fix(config): set default runner timeout to 15min by sauyon in https://github.com/bentoml/BentoML/pull/4184
* docs: Add observability to the BentoCloud overview docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4187
* fix(framework): add args and kwargs to sklearn and xgboost methods by jianshen92 in https://github.com/bentoml/BentoML/pull/4189
* docs: fix typo in bento.rst and model.rst by seedspirit in https://github.com/bentoml/BentoML/pull/4192
* fix: Rename ASGIHTTPSender to BufferedASGISender for Ray compatibility. by HamzaFarhan in https://github.com/bentoml/BentoML/pull/4191
* fix(client): make get_client raise instead of logging by sauyon in https://github.com/bentoml/BentoML/pull/4181
* fix(cloud-client): delete unused field of schema by Haivilo in https://github.com/bentoml/BentoML/pull/4196
* chore(deps): bump docker/setup-buildx-action from 2 to 3 by dependabot in https://github.com/bentoml/BentoML/pull/4195
* chore(deps): bump docker/setup-qemu-action from 2 to 3 by dependabot in https://github.com/bentoml/BentoML/pull/4194
* chore: client_request_hook type fix by sauyon in https://github.com/bentoml/BentoML/pull/4199
* docs: Add docs for the new bentoml.Server API by Sherlock113 in https://github.com/bentoml/BentoML/pull/4198
* docs: Add the OneDiffusion Google Colab task by Sherlock113 in https://github.com/bentoml/BentoML/pull/4202
* docs: Add best practices doc for cost optimization by Sherlock113 in https://github.com/bentoml/BentoML/pull/4200
* docs: Update the Manage Models and Bentos docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4203
* fix: do not use UDS on WSL by frostming in https://github.com/bentoml/BentoML/pull/4204
* docs: fix typos in help messages by smidm in https://github.com/bentoml/BentoML/pull/4206
* fix: subprocess not using same python as main process causing `bentoml.bentos.build` to crash by nickolasrm in https://github.com/bentoml/BentoML/pull/4209
* fix: allow WSL in the condition by frostming in https://github.com/bentoml/BentoML/pull/4210
* docs: Update manage access token docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4215
* ci: pre-commit autoupdate [skip ci] by pre-commit-ci in https://github.com/bentoml/BentoML/pull/4216
* fix: EasyOCR integration docs mistake by jianshen92 in https://github.com/bentoml/BentoML/pull/4214
* fix: include mounted FastAPI app's OpenAPI components by RobbieFernandez in https://github.com/bentoml/BentoML/pull/4212
* UPDATE: model.py -> fix Model class Exception message. by JminJ in https://github.com/bentoml/BentoML/pull/4219
* docs: Remove private access mention by Sherlock113 in https://github.com/bentoml/BentoML/pull/4221
* docs: Change to sentence case by Sherlock113 in https://github.com/bentoml/BentoML/pull/4222
* docs: Fix dead link by Sherlock113 in https://github.com/bentoml/BentoML/pull/4225
* feat: support ipv6 addresses for serve by sauyon in https://github.com/bentoml/BentoML/pull/3914
* docs: Fix all dead links in BentoML docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4229
* docs: Add the BYOC doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4223
* docs: Update the Services doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4231
* fix(client): type fixes by sauyon in https://github.com/bentoml/BentoML/pull/4182
* fix: correct the bento size to include the size of models by frostming in https://github.com/bentoml/BentoML/pull/4226
* fix: use httpx for usage tracking by sauyon in https://github.com/bentoml/BentoML/pull/4228
* fix(deps): bump otel for CVE by aarnphm in https://github.com/bentoml/BentoML/pull/4233
* feat: separate and optimize async and sync clients by judahrand in https://github.com/bentoml/BentoML/pull/4116

New Contributors
* piercus made their first contribution in https://github.com/bentoml/BentoML/pull/4176
* seedspirit made their first contribution in https://github.com/bentoml/BentoML/pull/4192
* HamzaFarhan made their first contribution in https://github.com/bentoml/BentoML/pull/4191
* nickolasrm made their first contribution in https://github.com/bentoml/BentoML/pull/4209
* JminJ made their first contribution in https://github.com/bentoml/BentoML/pull/4219

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.6...v1.1.7

1.1.6

Not secure
What's Changed
* fix(exception): catch exception for users' runners code by aarnphm in https://github.com/bentoml/BentoML/pull/4150
* docs: Add the streaming docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4164
* ci: pre-commit autoupdate [skip ci] by pre-commit-ci in https://github.com/bentoml/BentoML/pull/4167
* fix(httpclient): take into account trailing slash in from_url by sauyon in https://github.com/bentoml/BentoML/pull/4169
* docs: fix typo by Sherlock113 in https://github.com/bentoml/BentoML/pull/4173
* fix: apply env map for distributed runner workers by bojiang in https://github.com/bentoml/BentoML/pull/4174

New Contributors
* pre-commit-ci made their first contribution in https://github.com/bentoml/BentoML/pull/4167

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.5...v1.1.6

1.1.5

Not secure
What's Changed
* fix(type): explicit init for attrs Runner by aarnphm in https://github.com/bentoml/BentoML/pull/4140
* fix: typo in ALLOWED_CUDA_VERSION_ARGS by thomasjo in https://github.com/bentoml/BentoML/pull/4156
* chore(deps): open Starlette version, to allow latest by alexeyshockov in https://github.com/bentoml/BentoML/pull/4100
* chore: lower bound for cloudpickle by aarnphm in https://github.com/bentoml/BentoML/pull/4098
* docs: Add embedded runners docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4157
* fix cloud client types by sauyon in https://github.com/bentoml/BentoML/pull/4160
* fix: use closer-integrated callbackwrapper by sauyon in https://github.com/bentoml/BentoML/pull/4161
* chore(annotations): cleanup compat and fix ModelSignatureDict type by aarnphm in https://github.com/bentoml/BentoML/pull/4162
* fix(pull): correct use `cloud_context` for models pull by aarnphm in https://github.com/bentoml/BentoML/pull/4163

New Contributors
* thomasjo made their first contribution in https://github.com/bentoml/BentoML/pull/4156
* alexeyshockov made their first contribution in https://github.com/bentoml/BentoML/pull/4100

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.4...v1.1.5

1.1.4

Not secure
🍱 To better support LLM serving through response streaming, we are proud to introduce experimental server-sent events (SSE) streaming support in this release of BentoML `v1.1.4` and OpenLLM `v0.2.27`. See an example [service definition](https://gist.github.com/ssheng/38e59e475f3ac5b0f9299c71f7dc3185) for SSE streaming with Llama 2.

- Added response streaming through SSE to the `bentoml.io.Text` IO descriptor type.
- Added async generator support to both the API Server and Runner to `yield` incremental text responses (see the sketch after this list).
- Added support in ☁️ BentoCloud to natively handle SSE streaming.
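As a rough sketch of the new pattern (not the linked Llama 2 gist; the service name and word-by-word echo below are placeholders), declaring an API function as an async generator with a `bentoml.io.Text` output lets the API server stream each yielded chunk to the client as a server-sent event:

```python
import bentoml
from bentoml.io import Text

svc = bentoml.Service("sse_text_stream")

@svc.api(input=Text(), output=Text())
async def stream(prompt: str):
    # Each yielded chunk is sent to the client incrementally over SSE
    # instead of being buffered into a single response. A real service
    # would yield tokens produced by an LLM runner; here we simply echo
    # the prompt word by word for illustration.
    for word in prompt.split():
        yield word + " "
```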

🦾 OpenLLM added token streaming capabilities to support streaming responses from LLMs.

- Added `/v1/generate_stream` endpoint for streaming responses from LLMs.

```bash
curl -N -X 'POST' 'http://0.0.0.0:3000/v1/generate_stream' -H 'accept: application/json' -H 'Content-Type: application/json' -d '{
  "prompt": " Instruction:\n What is the definition of time (200 words essay)?\n\n Response:",
  "llm_config": {
    "use_llama2_prompt": false,
    "max_new_tokens": 4096,
    "early_stopping": false,
    "num_beams": 1,
    "num_beam_groups": 1,
    "use_cache": true,
    "temperature": 0.89,
    "top_k": 50,
    "top_p": 0.76,
    "typical_p": 1,
    "epsilon_cutoff": 0,
    "eta_cutoff": 0,
    "diversity_penalty": 0,
    "repetition_penalty": 1,
    "encoder_repetition_penalty": 1,
    "length_penalty": 1,
    "no_repeat_ngram_size": 0,
    "renormalize_logits": false,
    "remove_invalid_values": false,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_scores": false,
    "encoder_no_repeat_ngram_size": 0,
    "n": 1,
    "best_of": 1,
    "presence_penalty": 0.5,
    "frequency_penalty": 0,
    "use_beam_search": false,
    "ignore_eos": false
  },
  "adapter_name": null
}'
```


What's Changed
* docs: Update the models doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4145
* docs: Add more workflows to the GitHub Actions doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4146
* docs: Add text embedding example to readme by Sherlock113 in https://github.com/bentoml/BentoML/pull/4151
* fix: bento build cache miss by xianml in https://github.com/bentoml/BentoML/pull/4153
* fix(buildx): parsing attestation on docker desktop by aarnphm in https://github.com/bentoml/BentoML/pull/4155

New Contributors
* xianml made their first contribution in https://github.com/bentoml/BentoML/pull/4153

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.3...v1.1.4

1.1.2

Not secure
Patch releases

BentoML now provides a new diffusers integration, `bentoml.diffusers_simple`.

This introduces two integrations, for the `stable_diffusion` and `stable_diffusion_xl` models.

```python
import bentoml

# Create a Runner for a Stable Diffusion model
runner = bentoml.diffusers_simple.stable_diffusion.create_runner("CompVis/stable-diffusion-v1-4")

# Create a Runner for a Stable Diffusion XL model
runner_xl = bentoml.diffusers_simple.stable_diffusion_xl.create_runner("stabilityai/stable-diffusion-xl-base-1.0")
```
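For context, such a runner is wired into a Service like any other BentoML runner. The sketch below is illustrative only: the service name and API signature are made up, and the way the runner is invoked (shown here as `async_run` returning a diffusers pipeline output) is an assumption, so consult the `bentoml.diffusers_simple` documentation for the actual runner interface.

```python
import bentoml
from bentoml.io import Image, Text

# Same example model id as above
runner = bentoml.diffusers_simple.stable_diffusion.create_runner("CompVis/stable-diffusion-v1-4")

# Standard BentoML 1.1 wiring: attach the runner to a Service
svc = bentoml.Service("stable_diffusion_service", runners=[runner])

@svc.api(input=Text(), output=Image())
async def txt2img(prompt: str):
    # Assumption: the runner proxies the underlying diffusers pipeline call
    # and returns its output object; check the docs for the exact method.
    output = await runner.async_run(prompt)
    return output.images[0]
```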


General bug fixes and documentation improvements.

What's Changed
* docs: Add the Overview and Quickstarts sections by Sherlock113 in https://github.com/bentoml/BentoML/pull/4088
* chore(type): makes ModelInfo mypy-compatible by aarnphm in https://github.com/bentoml/BentoML/pull/4094
* feat(store): update annotations by aarnphm in https://github.com/bentoml/BentoML/pull/4092
* docs: Fix some relative links by Sherlock113 in https://github.com/bentoml/BentoML/pull/4097
* docs: Add the Iris quickstart doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4096
* docs: Add the yolo quickstart by Sherlock113 in https://github.com/bentoml/BentoML/pull/4099
* docs: Code format fix by Sherlock113 in https://github.com/bentoml/BentoML/pull/4101
* fix: respect environment during `bentoml.bentos.build` by aarnphm in https://github.com/bentoml/BentoML/pull/4081
* docs: replaced deprecated save to save_model in pytorch.rst by EgShes in https://github.com/bentoml/BentoML/pull/4102
* fix: Make the install command shorter by frostming in https://github.com/bentoml/BentoML/pull/4103
* docs: Update the BentoCloud Build doc by Sherlock113 in https://github.com/bentoml/BentoML/pull/4104
* docs: Add quickstart repo link and move torch import in Yolo by Sherlock113 in https://github.com/bentoml/BentoML/pull/4106
* docs: fix typo by zhangwm404 in https://github.com/bentoml/BentoML/pull/4108
* docs: fix typo by zhangwm404 in https://github.com/bentoml/BentoML/pull/4109
* fix: calculate Pandas DataFrame batch size correctly by judahrand in https://github.com/bentoml/BentoML/pull/4110
* fix(cli): fix CLI output to BentoCloud by Haivilo in https://github.com/bentoml/BentoML/pull/4114
* Fix sklearn example docs by jianshen92 in https://github.com/bentoml/BentoML/pull/4121
* docs: Add the BentoCloud Deployment creation and update page property explanations by Sherlock113 in https://github.com/bentoml/BentoML/pull/4105
* fix: disable pyright for being too strict by frostming in https://github.com/bentoml/BentoML/pull/4113
* refactor(cli): change prompt of cloud cli to unify Yatai and BentoCloud by Haivilo in https://github.com/bentoml/BentoML/pull/4124
* fix(cli): change model to lower case by Haivilo in https://github.com/bentoml/BentoML/pull/4126
* chore(ci): remove codestyle jobs by aarnphm in https://github.com/bentoml/BentoML/pull/4125
* fix: don't pass column names twice by judahrand in https://github.com/bentoml/BentoML/pull/4120
* feat: SSE (Experimental) by jianshen92 in https://github.com/bentoml/BentoML/pull/4083
* docs: Restructure the get started section in BentoCloud docs by Sherlock113 in https://github.com/bentoml/BentoML/pull/4129
* docs: change monitoring image by Haivilo in https://github.com/bentoml/BentoML/pull/4133
* feat: Rust gRPC client by aarnphm in https://github.com/bentoml/BentoML/pull/3368
* feature(framework): diffusers lora and textual inversion support by larme in https://github.com/bentoml/BentoML/pull/4086
* feat(buildx): support for attestation and sbom with buildx by aarnphm in https://github.com/bentoml/BentoML/pull/4132

New Contributors
* EgShes made their first contribution in https://github.com/bentoml/BentoML/pull/4102
* zhangwm404 made their first contribution in https://github.com/bentoml/BentoML/pull/4108

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.1...v1.1.2

1.1.1

Not secure
- Added more extensive cloud config options for the `bentoml deployment` CLI. Thanks Haivilo.
Note that `bentoml deployment update` now takes the deployment name as an optional positional argument instead of the previous `--name` option:
```bash
bentoml deployment update DEPLOYMENT_NAME
```
See 4087
- Added documentation about the Bento release GitHub Action. Thanks frostming. See 4071

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.1.0...v1.1.1
