BentoML

Latest version: v1.2.18


0.3.1

This is a minor release with mostly bug fixes:

* Added `bentoml config` CLI command for configuring local BentoML preferences and configuration
* Fixed an issue when serving a Keras model with the API server in Docker
* Fixed a missing-dependency issue in the Docker environment when using ImageHandler

0.3.0

* Fast.ai support, find example notebooks here: https://github.com/bentoml/gallery/tree/master/fast-ai

* PyTorch support - fixed a number of issues related to PyTorch model serialization and updated example notebook here: https://github.com/bentoml/BentoML/blob/master/examples/pytorch-fashion-mnist/pytorch-fashion-mnist.ipynb

* Keras support - fixed a number of issues related to serving Keras models as an API server

* Clipper deployment support - easily deploy BentoML service to Clipper cluster, read more about it here: https://github.com/bentoml/BentoML/blob/master/examples/deploy-with-clipper/deploy-iris-classifier-to-clipper.ipynb

* ImageHandler improvements - the API server's web UI now supports posting images to the API server for testing API endpoints:
![image](https://user-images.githubusercontent.com/489344/61393491-eb125d00-a875-11e9-9edf-ee0f50edcf36.png)

0.2.2beta

* Fast.ai support is in beta now, check out the example notebook here: https://colab.research.google.com/github/bentoml/gallery/blob/master/fast-ai/pet-classification/notebook.ipynb

* Improved OpenAPI docs endpoint

* DataframeHandler now allows specifying input types. Users can also generate an API client library that respects the expected input format for each user-defined BentoML API service, e.g.:

```python
class MyClassifier(BentoService):

    @api(DataframeHandler, input_types=['int8', 'int8', 'float', 'str', 'bool'])
    def predict(self, df):
        ...
```

or specifying both column name & type:

```python
@api(DataframeHandler, input_types={'id': 'string', 'age': 'int'})
def predict(self, df):
    ...
```
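For reference, a request matching the column-name/type spec above would carry a DataFrame shaped like the following sketch. This is illustrative only; the exact dtype coercion BentoML applies is an assumption, and the sample values are made up:

```python
import pandas as pd

# Build a DataFrame matching the {'id': 'string', 'age': 'int'} spec.
# (Illustrative; BentoML's actual dtype handling may differ.)
df = pd.DataFrame({"id": ["a1", "b2"], "age": [34, 52]})
df = df.astype({"id": "str", "age": "int64"})
print(list(df.columns), df.dtypes["age"])
```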


* The API server index page now provides a web UI for testing API endpoints and shows instructions for generating a client API library:

![image](https://user-images.githubusercontent.com/489344/61011095-e60d5500-a32d-11e9-8856-d9a6abe6d2fc.png)

![image](https://user-images.githubusercontent.com/489344/61011104-ec9bcc80-a32d-11e9-8d2e-ea3bd1a8b28c.png)

0.2.1beta

* Improved inline Python docs; new documentation site launched: https://bentoml.readthedocs.io
* Support for running examples on Google Colab
* OpenAPI docs endpoint beta
* Configurable prediction and feedback logging in API server
* Serverless deployment improved
* Bug fixes

0.2

> bentoml get IrisClassifier:latest


3. Separated deployment commands to sub-commands

AWS Lambda model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_lambda.html

AWS Sagemaker model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_sagemaker.html


4. Breaking Change: Improved `bentoml run` command for running inference from the CLI

Changing from:

> bentoml {API_NAME} {saved_path} {run_args}
> bentoml predict {saved_path} --input=my_test_data.csv

To:

> bentoml run {BENTO/saved_path} {API_NAME} {run_args}
> bentoml run IrisClassifier:latest predict --input='[[1,2,3,4]]'

Previously, users could use the API name directly as the command to load and run a model API from the CLI, e.g. `bentoml predict {saved_path} --input=my_test_data.csv`. The problem is that API names are loaded dynamically, which makes it hard for the `bentoml` command to provide useful `--help` docs, and the default-command workaround with Click makes it very confusing when the user types a wrong command. So we decided to make this change.

5. Breaking Change: `--quiet` and `--verbose` options position

Previously, both `--quiet` and `--verbose` had to follow immediately after the `bentoml` command; now they are available as options on every subcommand.

If you are using these two options, you will need to change your CLI from:

> bentoml --verbose serve ...

To:

> bentoml serve ... --verbose
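The new option placement can be sketched with a minimal `argparse`-based parser. This is a simplified stand-in for BentoML's actual Click-based CLI, not its real implementation; it only shows how per-subcommand flags parse after the subcommand name:

```python
import argparse

# Stand-in for the BentoML CLI: --verbose/--quiet are attached to each
# subcommand's parser instead of the top-level parser.
parser = argparse.ArgumentParser(prog="bentoml")
sub = parser.add_subparsers(dest="command")

serve = sub.add_parser("serve")
serve.add_argument("bento")
serve.add_argument("--verbose", action="store_true")
serve.add_argument("--quiet", action="store_true")

# Flags now come after the subcommand and its arguments.
args = parser.parse_args(["serve", "IrisClassifier:latest", "--verbose"])
print(args.command, args.verbose)  # serve True
```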

0.2.0

* Support for H2O models
* Support for deploying BentoArchive to Amazon SageMaker endpoint
* Support for querying deployment status and delete deployment created with BentoML
* Fixed Kubernetes ingress configuration for the Gunicorn server: https://github.com/bentoml/BentoML/issues/136
