BentoML


0.8.2

What's New?

* Support Debian-slim docker images for containerizing the model server (#822, by jackyzha0). Users can opt in with:
```python
@env(
    auto_pip_dependencies=True,
    docker_base_image="bentoml/model-server:0.8.2-slim-py37"
)
```


* New `bentoml retrieve` command for downloading a saved bundle from a remote YataiService model registry (#810, by iancoffey)
```bash
bentoml retrieve ModelServe:20200610145522_D08399 --target_dir /tmp/modelserve
```


* Added a `--print-location` option to the `bentoml get` command to print the saved path (#825, by jackyzha0)
```bash
$ bentoml get IrisClassifier:20200625114130_F3480B --print-location
/Users/chaoyu/bentoml/repository/IrisClassifier/20200625114130_F3480B
```


* Support the JSON `orient` parameter for DataFrame input. DataframeInput now supports all pandas JSON orient options: records, columns, values, split, index (#809, #815, by bojiang)

For example, with `orient="records"`:
```python
@api(input=DataframeInput(orient="records"))
def predict(self, df):
    ...
```

The API endpoint will then expect HTTP requests with a JSON payload in the following format:
```json
[{"col 1":"a","col 2":"b"},{"col 1":"c","col 2":"d"}]
```

Or with `orient="index"`:
```json
{"row 1":{"col 1":"a","col 2":"b"},"row 2":{"col 1":"c","col 2":"d"}}
```

See the pandas documentation on the `orient` option of `to_json`/`read_json` for more detail: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_json.html
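
For example, such a payload could be sent to a locally served model with curl (assuming the default development server on port 5000 and an API function named `predict`; adjust host, port, and endpoint to your setup):
```bash
# Send a records-oriented DataFrame payload to the predict endpoint of a local server
curl -X POST \
     -H "Content-Type: application/json" \
     -d '[{"col 1":"a","col 2":"b"},{"col 1":"c","col 2":"d"}]' \
     http://localhost:5000/predict
```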

* Support Azure Functions deployment (beta), a new fully automated cloud deployment option that BentoML provides in addition to AWS SageMaker and AWS Lambda. See the usage documentation here: https://docs.bentoml.org/en/latest/deployment/azure_functions.html
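
A rough sketch of the deployment flow (the deployment name, bundle tag, and exact subcommands/flags below are illustrative assumptions; refer to the linked documentation for the authoritative CLI usage):
```bash
# Deploy a saved BentoService bundle to Azure Functions (names and flags are illustrative)
bentoml azure-functions deploy my-azure-deployment -b IrisClassifier:20200625114130_F3480B

# Check the status of the deployment
bentoml azure-functions get my-azure-deployment
```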


* ModelServer API Swagger schema improvements, including the ability to specify an example HTTP request (#807, by Korusuke)
* Added prediction logging when deploying with AWS Lambda (#790, by jackyzha0)
* Added artifact string name validation (#817, by AlexDut)
* Fixed micro-batching parameters (max latency and max batch size) not being applied (#818, by bojiang)
* Fixed an issue with handling CSV file input by following RFC 4180 (#814, by bojiang)
* Fixed TfTensorOutput casting floats as ints (#813, fixed in #823 by bojiang)

Announcements:

* The BentoML team has created a new [mailing list](https://groups.google.com/forum/#!forum/bentoml) for future announcements and community discussions. Join now [here](https://groups.google.com/forum/#!forum/bentoml)!
* For those interested in contributing to BentoML, there is now a [contributing guide](https://github.com/bentoml/BentoML/blob/master/CONTRIBUTING.md); be sure to check it out.
* We are starting a bi-weekly community meeting for community members to demo new features they are building, discuss the roadmap and gather feedback, etc. More details will be announced soon.

0.8.1

What's New?

* Service API Input/Output adapter (#783, #784, #789, by bojiang)
* A new API for defining service input and output data types and configs
* The new `InputAdapter` is essentially the `API Handler` concept in BentoML prior to the 0.8.x releases
* The old `API Handler` syntax is deprecated; it will continue to be supported until version 1.0
* The main motivation for this change is to enable building features such as new API output types (e.g. file/image as service output), gRPC support, better OpenAPI support, and more online-serving performance optimizations down the line (see the sketch below)
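
To illustrate the new adapter-based API, here is a minimal sketch (module paths such as `bentoml.adapters` and `bentoml.artifact` are assumed from the 0.8.x documentation):
```python
# Minimal sketch of a BentoService using the new input adapter API (0.8.x-era syntax).
import bentoml
from bentoml.adapters import DataframeInput
from bentoml.artifact import SklearnModelArtifact

@bentoml.env(auto_pip_dependencies=True)
@bentoml.artifacts([SklearnModelArtifact("model")])
class IrisClassifier(bentoml.BentoService):

    @bentoml.api(input=DataframeInput())
    def predict(self, df):
        # DataframeInput converts the JSON/CSV request payload into a pandas DataFrame
        return self.artifacts.model.predict(df)
```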

* Model server docker image build improvements (#761)
* Reduced docker build time by using a pre-built BentoML model server docker image as the base image
* Removed the dependency on `apt-get` and `conda` from the custom docker base image
* Added an Alpine-based docker image for model server deployment

* Improved Image Input handling:
* Added micro-batching support for ImageInput (formerly ImageHandler) (#717, by bojiang)
* Added support for using a list of images as input in CLI prediction runs (#731, by bojiang)
* In the new Input Adapter API introduced in 0.8.0, the `LegacyImageInput` is identical to the previous `ImageHandler`
* The new `ImageInput` works only for single image input, unlike the old `ImageHandler`
* For users of the old `ImageHandler`, we recommend migrating to the new `ImageInput` if it is only used to handle single-image input (see the sketch below)
* Users relying on `ImageHandler` for multiple-image input should wait until `MultiImageInput` is added as a separate input adapter type
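
As an illustration only (the adapter import path and the decoded-image type below are assumptions based on the 0.8.x docs, not a verified 0.8.1 API), single-image prediction with the new `ImageInput` could look like:
```python
# Sketch: single-image prediction with the new ImageInput adapter (0.8.x-era syntax).
# Import paths and the image-array input type are assumptions, not verified 0.8.1 API.
import bentoml
from bentoml.adapters import ImageInput
from bentoml.artifact import KerasModelArtifact

@bentoml.env(auto_pip_dependencies=True)
@bentoml.artifacts([KerasModelArtifact("classifier")])
class ImageClassifier(bentoml.BentoService):

    @bentoml.api(input=ImageInput())
    def predict(self, image_array):
        # ImageInput decodes the uploaded image file into a single image array per request
        return self.artifacts.classifier.predict(image_array)
```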

* Added CORS support for AWS Lambda serving (#752, by omrihar)
* Added JsonArtifact for storing configuration and JsonSerializable data (#746, by lemontheme)

Bug Fixes & Improvements:
* Fixed SageMaker deployment `ModuleNotFoundError` due to a wrong gevent version (#785, by flosincapite)
* Fixed SpacyModelArtifact not being exposed in `bentoml.artifacts` (#782, by docteurZ)
* Fixed errors when inheriting a handler (#767, by bojiang)
* Removed `future` statements for Python 2 support (#756, by jjmachan)
* Fixed `bundled_pip_dependencies` installation on AWS Lambda deployment (#794)
* Removed the `aws.region` config; use the AWS CLI's own config instead (#740)
* Fixed SageMaker deployment CLI: delete deployment with namespace specified (#741)
* Removed `pandas` from the BentoML dependencies list; it is only required when using DataframeInput (#738)


Internal, CI, Testing:
* Added a docs watch script for Linux (#781, by akainth015)
* Improved build bash scripts (#774, by akainth015, flosincapite)
* Fixed YataiService end-to-end tests (#773)
* Added PyTorch integration tests (#762, by jjmachan)
* Added ONNX integration tests (#726, by yubozhao)
* Added linter and formatting checks to Travis CI
* Codebase cleanup: reorganized the deployment and repository modules (#768, #769, #771)


Announcements:

* The BentoML team is planning to start a bi-weekly community meeting to demo new features, discuss the roadmap and gather feedback. Join the BentoML slack channel for more details: [click to join BentoML slack](https://join.slack.com/t/bentoml/shared_invite/enQtNjcyMTY3MjE4NTgzLTU3ZDc1MWM5MzQxMWQxMzJiNTc1MTJmMzYzMTYwMjQ0OGEwNDFmZDkzYWQxNzgxYWNhNjAxZjk4MzI4OGY1Yjg).
* There were a few issues with the PyPI release `0.8.0` that made it unusable. The newer `0.8.1` release has those issues fixed. Please do not use version `0.8.0`.

0.7.8

What's New?
* ONNX model support with the onnxruntime backend (a rough usage sketch follows this list). More example notebooks and tutorials are coming soon!
* Added Python 3.8 support
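
The following sketch is illustrative only; the `OnnxModelArtifact` class name, its `backend` parameter, and the session-style prediction call are assumptions based on later BentoML releases, not a confirmed 0.7.8 API:
```python
# Illustrative only: packaging an ONNX model behind a BentoML prediction API (0.7.x-era syntax).
# OnnxModelArtifact and its backend parameter are assumed names, not verified 0.7.8 API.
import bentoml
from bentoml.artifact import OnnxModelArtifact
from bentoml.handlers import DataframeHandler

@bentoml.env(pip_dependencies=["onnxruntime"])
@bentoml.artifacts([OnnxModelArtifact("model", backend="onnxruntime")])
class OnnxService(bentoml.BentoService):

    @bentoml.api(DataframeHandler)
    def predict(self, df):
        # Assumes the packed artifact exposes a loaded onnxruntime InferenceSession
        input_name = self.artifacts.model.get_inputs()[0].name
        return self.artifacts.model.run(None, {input_name: df.to_numpy().astype("float32")})
```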

Documentation:
* BentoML API Server architecture overview https://docs.bentoml.org/en/latest/guides/micro_batching.html
* Deploying YataiService behind Nginx https://docs.bentoml.org/en/latest/guides/yatai_service.html

Internal:
* [benchmark] Moved benchmark notebooks to a separate repo: https://github.com/bentoml/benchmark
* [CI] Enabled Linting style check test on Travis CI, contributed by kautukkundan
* [CI] Fixed all existing linting errors in bentoml and tests module, contributed by kautukkundan
* [CI] Enabled Python 3.8 on Travis CI

Announcements:
* There will be breaking changes in the coming 0.8.0 release, around ImageHandler, custom Handler and custom Artifacts. If you're using those features in production, please reach out.
* Help us promote BentoML on [Twitter bentomlai](https://twitter.com/bentomlai) and [Linkedin Page](https://www.linkedin.com/company/bentoml/)!
* Be sure to join the BentoML slack channel for roadmap discussions and development updates, [click to join BentoML slack](https://join.slack.com/t/bentoml/shared_invite/enQtNjcyMTY3MjE4NTgzLTU3ZDc1MWM5MzQxMWQxMzJiNTc1MTJmMzYzMTYwMjQ0OGEwNDFmZDkzYWQxNzgxYWNhNjAxZjk4MzI4OGY1Yjg).

0.7.7

What's New?
* Support custom docker base image, contributed by withsmilo
* Improved model saving & loading with YataiService backed by S3 storage, contributed by withsmilo. BentoML now works with custom S3-like services such as a MinIO deployment.

Improvements & Bug Fixes
* Fixed a number of issues that were breaking Windows OS support, contributed by bojiang
* [YataiService] Fixed an issue where the deployment namespace configured on the server side would be ignored

Internal:
* [CI] Added Windows test environment in BentoML's CI test setup on Travis

Announcements:
* Help us promote BentoML on [Twitter bentomlai](https://twitter.com/bentomlai) and [Linkedin Page](https://www.linkedin.com/company/bentoml/)!
* Be sure to join the BentoML slack channel for roadmap discussions and development updates, [click to join BentoML slack](https://join.slack.com/t/bentoml/shared_invite/enQtNjcyMTY3MjE4NTgzLTU3ZDc1MWM5MzQxMWQxMzJiNTc1MTJmMzYzMTYwMjQ0OGEwNDFmZDkzYWQxNzgxYWNhNjAxZjk4MzI4OGY1Yjg).

0.7.6

What's New?
* Added spaCy support, contributed by spotter (#641)
* Support custom `s3_endpoint_url` in BentoML’s model registry component (YataiService) (#656)
* The YataiService client can now connect via secure gRPC (#650)

Improvements & Bug Fixes
* Micro-batching server performance optimization & back-pressure troubleshooting (#630)
* [YataiService] Included the required PostgreSQL dependency in the YataiService docker image by default
* [Documentation] New fastest example project
* [Bug Fix] Fixed overwriting of `pip_dependencies` specified through `env` (#657, #642)

Internal:
* [Benchmark] Released a newly updated benchmark notebook with the latest changes in the micro-batching server
* [Benchmark] Notebook updates and counting of dropped requests (#645)
* [e2e test] Added an e2e test using a dockerized YataiService gRPC server

0.7.5

What's New?
* Added FastAI2 support, contributed by HenryDashwood

Bug fixes:
* Fixed S3 bucket creation in the us-east-1 region: https://github.com/bentoml/BentoML/issues/631
* Fixed an issue with fastcore and ruamel-yaml: https://github.com/bentoml/BentoML/pull/637

Documentation updates:
* Added Kubeflow deployment guide
* Added Kubernetes deployment guide
* Added Knative deployment guide
