BentoML

Latest version: v1.4.3


1.4.3

What's Changed
* docs: Update examples to use new APIs by Sherlock113 in https://github.com/bentoml/BentoML/pull/5252
* Add alt text to all images in documentation by devin-ai-integration in https://github.com/bentoml/BentoML/pull/5253
* ci: pre-commit autoupdate [skip ci] by pre-commit-ci in https://github.com/bentoml/BentoML/pull/5254
* docs: update deprecated links by aarnphm in https://github.com/bentoml/BentoML/pull/5256
* docs: Add root input usage by Sherlock113 in https://github.com/bentoml/BentoML/pull/5257
* fix: reformat the code by frostming in https://github.com/bentoml/BentoML/pull/5258
* fix: forbid requests with pickle encoding at the entry service

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.4.2...v1.4.3

1.4.2

What's Changed
* Fix RunPod unhashable model error by converting svc.models to hashable ids by rivaon in https://github.com/bentoml/BentoML/pull/5244
* fix(io): OpenAPI schema for multipart form request body by frostming in https://github.com/bentoml/BentoML/pull/5249
* fix: drop uv as a hard dependency by frostming in https://github.com/bentoml/BentoML/pull/5238
* docs: Update examples to use new HF API by Sherlock113 in https://github.com/bentoml/BentoML/pull/5242
* refactor: drop deepmerge dependency by frostming in https://github.com/bentoml/BentoML/pull/5250
* refactor: unify logic of loading service by frostming in https://github.com/bentoml/BentoML/pull/5232
* fix: collect requirements from image spec for codespaces by frostming in https://github.com/bentoml/BentoML/pull/5251

New Contributors
* rivaon made their first contribution in https://github.com/bentoml/BentoML/pull/5244

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.4.1...v1.4.2

1.4.1

What's Changed
* docs: use diff for runtime image by parano in https://github.com/bentoml/BentoML/pull/5236
* docs: Update runtime explanations by Sherlock113 in https://github.com/bentoml/BentoML/pull/5240
* fix(regression): call error when context parameter is present by frostming in https://github.com/bentoml/BentoML/pull/5247


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.4.0...v1.4.1

1.4.0

We are thrilled to announce the release of BentoML 1.4! This version introduces several new features and improvements to accelerate your iteration cycle and enhance the overall developer experience.

Below are the key highlights of 1.4, and you can find more details in [the release blog post](https://www.bentoml.com/blog/announcing-bentoml-1-4).

🚀 20x faster iteration with Codespaces

- Introduced [BentoML Codespaces](https://docs.bentoml.com/en/latest/scale-with-bentocloud/codespaces.html), a development platform built on BentoCloud
- Added the `bentoml code` command for creating a Codespace
- Auto-sync of local changes to the cloud environment
- Access to a variety of powerful cloud GPUs
- Real-time logs and debugging through the cloud dashboard
- Eliminates dependency headaches and ensures consistency between dev and prod environments

🐍 New Python SDK for runtime configurations

- Added `bentoml.images.PythonImage` for defining the [Bento runtime environment](https://docs.bentoml.com/en/latest/build-with-bentoml/runtime-environment.html) in Python instead of using `bentofile.yaml` or `pyproject.toml` (see the sketch after this list)
- Support customizing runtime configurations (e.g., Python version, system packages, and dependencies) directly in the `service.py` file
- Introduced a context-sensitive `run()` method for running custom build commands
- Backward compatible with existing `bentofile.yaml` and `pyproject.toml` configurations
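
The snippet below is a minimal sketch of this workflow, assuming the `PythonImage` builder methods (`system_packages`, `python_packages`, `run`) described in the linked documentation; the Python version, package names, and build command are placeholders.

```python
import bentoml

# Runtime environment declared in service.py instead of bentofile.yaml;
# all versions and package names below are illustrative.
image = (
    bentoml.images.PythonImage(python_version="3.11")
    .system_packages("curl", "git")
    .run("echo 'runs before Python packages are installed'")  # context-sensitive build step
    .python_packages("torch", "transformers")
)


@bentoml.service(image=image)
class Summarization:
    @bentoml.api
    def summarize(self, text: str) -> str:
        return text[:100]
```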

⚡ Accelerated model loading with safetensors

- Implemented build-time model downloads and parallel loading of model weights using safetensors to reduce cold start time and improve scaling performance. See [the documentation](https://docs.bentoml.com/en/latest/build-with-bentoml/model-loading-and-management.html#load-a-model) to learn more.
- Added `bentoml.models.HuggingFaceModel` for loading models from Hugging Face. It supports private model repositories and custom endpoints
- Added `bentoml.models.BentoModel` for loading models from BentoCloud and the Model Store; a combined sketch of both helpers follows this list
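
A minimal sketch, assuming the class-level declaration pattern shown in the linked documentation; the repository ID and model tag are placeholders.

```python
import bentoml
from bentoml.models import BentoModel, HuggingFaceModel


@bentoml.service
class ModelService:
    # Declared at the class level so weights can be downloaded at build time
    # and loaded in parallel; both identifiers below are placeholders.
    hf_model = HuggingFaceModel("openai-community/gpt2")
    bento_model = BentoModel("my_model:latest")

    def __init__(self) -> None:
        # Accessing the attribute resolves to a local model reference/path
        # that can be handed to the framework loader of your choice.
        self.model_path = self.hf_model
```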

🌍 External deployment dependencies

- Extended `bentoml.depends()` to [support external deployments](https://docs.bentoml.com/en/latest/build-with-bentoml/distributed-services.html#depend-on-an-external-deployment)
- Added support for calling BentoCloud Deployments via name or URL
- Added support for calling self-hosted HTTP AI services outside BentoCloud (sketched below)
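
A hedged sketch of both call styles, assuming the `deployment=` and `url=` keyword arguments described in the linked documentation; the deployment name, URL, and remote endpoint name are placeholders.

```python
import bentoml


@bentoml.service
class Gateway:
    # Depend on a BentoCloud Deployment by name, or on a self-hosted HTTP
    # AI service by URL; both values are placeholders.
    cloud_backend = bentoml.depends(deployment="my-deployment")
    http_backend = bentoml.depends(url="http://localhost:3000")

    @bentoml.api
    def relay(self, text: str) -> str:
        # The remote endpoint name here is illustrative.
        return self.cloud_backend.summarize(text=text)
```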

⚠️ Legacy Service API deprecation

- The legacy `bentoml.Service` API (with runners) is now officially deprecated and scheduled for removal in a future release. We recommend migrating to the `@bentoml.service` decorator; a minimal example follows.
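
For reference, a minimal new-style service using the decorator; the configuration values are illustrative.

```python
import bentoml


# A plain Python class decorated with @bentoml.service replaces the legacy
# bentoml.Service(...) + runners pattern.
@bentoml.service(resources={"cpu": "2"}, traffic={"timeout": 30})
class Echo:
    @bentoml.api
    def echo(self, text: str) -> str:
        return text
```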

---

Note that:

- `1.4` remains fully compatible with Bentos created by `1.3`.
- The [BentoML documentation](https://docs.bentoml.com/en/latest/index.html) has been updated with examples and guides for `1.4`.

🙏 As always, we appreciate your continued support!

What's Changed
* feat: support bentoml serve without service name by frostming in https://github.com/bentoml/BentoML/pull/5208
* feat(service): expose service-level labels definition by aarnphm in https://github.com/bentoml/BentoML/pull/5211
* fix: restore path after import by frostming in https://github.com/bentoml/BentoML/pull/5214
* fix: compile bytecode when installing python packages by frostming in https://github.com/bentoml/BentoML/pull/5212
* fix: IO descriptor honor validators by frostming in https://github.com/bentoml/BentoML/pull/5213
* feat(image): add support for chaining `.pyproject.toml` by aarnphm in https://github.com/bentoml/BentoML/pull/5218
* feat: support root input spec using positional-only argument by frostming in https://github.com/bentoml/BentoML/pull/5217
* fix: gradio error when uploading file by frostming in https://github.com/bentoml/BentoML/pull/5220
* fix: input data validation for root input by frostming in https://github.com/bentoml/BentoML/pull/5221
* fix: don't restore model store after importing service by frostming in https://github.com/bentoml/BentoML/pull/5223
* feat(metrics): extend histogram buckets to support LLM latencies by devin-ai-integration in https://github.com/bentoml/BentoML/pull/5222
* fix: always add bentoml req unless it is specified as a url dependency by frostming in https://github.com/bentoml/BentoML/pull/5225
* docs: update links to examples by aarnphm in https://github.com/bentoml/BentoML/pull/5224
* docs: add environment variable authentication documentation by devin-ai-integration in https://github.com/bentoml/BentoML/pull/5231
* docs: Update docs to use new runtime API by Sherlock113 in https://github.com/bentoml/BentoML/pull/5177
* fix: add files under env/docker by frostming in https://github.com/bentoml/BentoML/pull/5234


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.3.22...v1.4.0

1.4.0a2

What's Changed
* fix: restore path after import by frostming in https://github.com/bentoml/BentoML/pull/5214


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.4.0a1...v1.4.0a2

1.4.0a1

What's Changed
* feat: support bentoml serve without service name by frostming in https://github.com/bentoml/BentoML/pull/5208
* feat(service): expose service-level labels definition by aarnphm in https://github.com/bentoml/BentoML/pull/5211


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.3.22...v1.4.0a1
