BentoML

Latest version: v1.2.19


0.5.4

* Prometheus metrics improvements in the API server
  * Metrics now work in multi-process mode when running with gunicorn
  * Added 3 default metrics to BentoAPIServer:
    * request latency
    * request count total, labeled by status code
    * request gauge for monitoring concurrent prediction requests
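The three default metric types above (a latency histogram, a labeled counter, and an in-progress gauge) can be sketched in plain Python; class and attribute names here are hypothetical, and the real server uses the Prometheus client library rather than this hand-rolled version:

```python
import time
from collections import defaultdict


class RequestMetrics:
    """Minimal sketch of the three default API-server metrics."""

    def __init__(self):
        self.request_total = defaultdict(int)  # counter, labeled by status code
        self.in_progress = 0                   # gauge: concurrent requests
        self.latencies = []                    # histogram-style latency samples

    def observe(self, handler):
        """Run a request handler while recording all three metrics."""
        self.in_progress += 1
        start = time.perf_counter()
        try:
            status = handler()
        finally:
            self.latencies.append(time.perf_counter() - start)
            self.in_progress -= 1
        self.request_total[status] += 1
        return status
```

In multi-process gunicorn workers, each process would hold its own copy of these values, which is why the real implementation relies on the Prometheus client's multi-process mode to aggregate them.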

* New Tensorflow TensorHandler!
  * Receive tf.Tensor data structures within your API function for your Tensorflow and Keras models
  * TfSavedModel can now automatically transform the input tensor shape based on the tf.Function input signature
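The shape transformation described above can be illustrated with a dependency-free sketch: if a client sends a single sample but the tf.Function signature expects a batch dimension, the input is wrapped in one. The function name and rank-based heuristic are assumptions for illustration, not BentoML's actual implementation:

```python
def conform_to_signature(data, expected_rank):
    """Wrap a single sample in a batch dimension when the input's rank
    is one less than the signature's expected rank (a sketch of the
    auto-transform idea, using nested lists in place of tensors)."""

    def rank(x):
        r = 0
        while isinstance(x, list):
            r += 1
            x = x[0] if x else None
        return r

    r = rank(data)
    if r == expected_rank - 1:
        return [data]  # add the missing batch dimension
    if r == expected_rank:
        return data
    raise ValueError(f"cannot conform rank-{r} input to rank {expected_rank}")
```

For example, a single feature vector `[1, 2, 3, 4]` sent against a signature of shape `(None, 4)` (rank 2) would be transformed into the one-element batch `[[1, 2, 3, 4]]`.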

* Largely improved error handling in the API server
  * Proper HTTP error codes and messages
  * Prevents user error details from being leaked to the client side
  * Introduced Exception classes for users to use when customizing BentoML handlers
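A common pattern behind the bullets above is an exception hierarchy that carries an HTTP status and a safe, client-facing message, with everything else collapsed to a generic 500. The class and function names below are hypothetical, not BentoML's actual Exception classes:

```python
class ServiceException(Exception):
    """Hypothetical base class: carries an HTTP status code and a message
    that is safe to return to clients."""
    status_code = 500

    def __init__(self, message="Internal Server Error"):
        super().__init__(message)
        self.message = message


class BadInput(ServiceException):
    """Raised by a custom handler when the request payload is invalid."""
    status_code = 400


def to_http_response(exc):
    """Map an exception to (status_code, body). Only the declared message
    of a ServiceException reaches the client; any other exception is
    reported as a generic 500 so internal details are never leaked."""
    if isinstance(exc, ServiceException):
        return exc.status_code, exc.message
    return 500, "Internal Server Error"
```

This is why a stray `KeyError` from user code would surface to clients as a plain "Internal Server Error" rather than a traceback.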

* Deployment guide on Google Cloud Run

* Deployment guide on AWS Fargate ECS

* AWS Lambda deployment improvements

0.5.3

* New LightGBM support, contributed by 7lagrange
  * LightGBM example notebook: https://github.com/bentoml/gallery/blob/master/lightbgm/titanic-survival-prediction/lightbgm-titanic-survival-prediction.ipynb

* Minor AWS Lambda deployment improvements
  * Improved error message when docker or sam-cli is not available
  * Pinned aws-sam-cli version to 0.33.1
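Checking for required CLI tools before a deployment starts can be sketched with the standard library's `shutil.which`; the function name and message wording here are assumptions, not BentoML's actual code:

```python
import shutil


def check_cli_dependencies(required=("docker", "sam")):
    """Return a helpful error message for each required CLI tool that is
    missing from PATH (a sketch of failing fast with a clear message
    instead of erroring deep inside the deployment)."""
    errors = []
    for tool in required:
        if shutil.which(tool) is None:
            errors.append(
                f"'{tool}' was not found on PATH; please install it "
                f"before deploying to AWS Lambda"
            )
    return errors
```

Running this check up front lets the deployment command print one actionable message per missing tool instead of a confusing failure partway through.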

0.5.2

* New and improved AWS Lambda support!
  * Support uploading large model files to S3 when deploying to AWS Lambda
  * Support trimming down the size of bundled Python dependencies
  * Support setting the memory size up to 3008 MB for the Lambda function
  * Support updating a Lambda deployment to a newer version of a saved BentoService bundle
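Validating the memory-size setting before calling AWS can be sketched as below, assuming the Lambda limits in effect at the time (128 to 3008 MB, in 64 MB increments); the function name is hypothetical:

```python
def validate_lambda_memory(memory_mb):
    """Check a Lambda memory setting against the service limits assumed
    above (128-3008 MB in 64 MB increments) before attempting a deploy."""
    if not 128 <= memory_mb <= 3008:
        raise ValueError("memory size must be between 128 and 3008 MB")
    if memory_mb % 64 != 0:
        raise ValueError("memory size must be a multiple of 64 MB")
    return memory_mb
```

Validating locally turns an opaque AWS API rejection into an immediate, readable error.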

* Fixed an issue where installing a BentoService saved bundle as a PyPI package failed because the setup.py file could not parse requirements.txt as the install_requires field.
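The fix above concerns the common pattern of reading requirements.txt into setuptools' `install_requires`. A minimal sketch of such a parser (not BentoML's actual generated setup.py):

```python
def parse_requirements(text):
    """Parse the body of a requirements.txt file into a list suitable
    for setuptools' install_requires, skipping comments and blank lines."""
    requirements = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            requirements.append(line)
    return requirements
```

In a generated setup.py, the result would be passed as `setup(install_requires=parse_requirements(...), ...)`.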

0.5.0

* Improved Clipper.ai deployment support
  * Works seamlessly with the clipper v0.4.1 release; updated deployment guide:
  * https://github.com/bentoml/BentoML/blob/master/guides/deployment/deploy-with-clipper/bentoml-clipper-deployment-guide.ipynb
* New S3-based repository
  * BentoML users can now save to and load from BentoService bundles on S3 storage, and deploy those bundles directly
* Deployment Python APIs are now available in Beta
  * `from bentoml.yatai.python_api import create_deployment`

0.4.9

* Added Tensorflow SavedModel format support
* Added support for an S3-based model repository
* New syntax for BentoService pack, making it easier to work with multiple models
* Fixed a REST API server docker image build issue with the new release of gunicorn

0.4.8

* Fixed an issue with loading a Fastai model in FastaiModelArtifact when the basic_learn submodule has not already been imported
* Fixed an issue with creating AWS SageMaker deployments, which previously failed with a KeyError under certain conditions


© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.