BentoML

Latest version: v1.3.14


0.2.1beta

* Improved inline Python docs; new documentation site launched at https://bentoml.readthedocs.io
* Support for running examples on Google Colab
* OpenAPI docs endpoint beta
* Configurable prediction and feedback logging in API server
* Serverless deployment improved
* Bug fixes

0.2

> bentoml get IrisClassifier:latest


3. Separated deployment commands into sub-commands

AWS Lambda model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_lambda.html

AWS Sagemaker model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_sagemaker.html


4. Breaking Change: Improved `bentoml run` command for running inference from the CLI

Changing from:

> bentoml {API_NAME} {saved_path} {run_args}
> bentoml predict {saved_path} --input=my_test_data.csv

To:

> bentoml run {BENTO/saved_path} {API_NAME} {run_args}
> bentoml run IrisClassifier:latest predict --input='[[1,2,3,4]]'

Previously, users could use the API name directly as the command to load and run a model API from the CLI, e.g. `bentoml predict {saved_path} --input=my_test_data.csv`. The problem is that API names are loaded dynamically, which makes it hard for the `bentoml` command to provide useful `--help` docs, and the `default command` workaround with Click becomes very confusing when the user types a wrong command. So we decided to make this change.
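
For context, the `{API_NAME}` argument refers to a method decorated with `@api` on the saved BentoService. Below is a minimal, hedged sketch of what a matching service definition looked like with the 0.2-era decorator API; the `IrisClassifier` name and the `PickleArtifact` are illustrative assumptions, not part of this release note.

```python
# Hedged sketch of a 0.2-era BentoService; class and artifact names are illustrative.
from bentoml import BentoService, api, artifacts
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])          # assumed pickled scikit-learn model
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        # 'predict' is the {API_NAME} that `bentoml run` now takes explicitly
        return self.artifacts.model.predict(df)
```

With such a service saved, `bentoml run IrisClassifier:latest predict --input='[[1,2,3,4]]'` resolves `predict` against the service's declared APIs, which is what lets the CLI provide stable `--help` output.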

5. Breaking Change: `--quiet` and `--verbose` options position

Previously, both the `--quiet` and `--verbose` options had to follow immediately after the `bentoml` command; now they are added to the options list of all subcommands.

If you are using these two options, you will need to change your CLI from:

> bentoml --verbose serve ...

To:

> bentoml serve ... --verbose

0.2.0

* Support for H2O models (a packaging sketch follows this list)
* Support for deploying BentoArchive to Amazon SageMaker endpoint
* Support for querying deployment status and delete deployment created with BentoML
* Fixed kubernetes ingress configuration for gunicorn server https://github.com/bentoml/BentoML/issues/136
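
As a rough illustration of the H2O support mentioned above, here is a hedged sketch assuming the era's `H2oModelArtifact` and `DataframeHandler`; the service name and the DataFrame-to-H2OFrame conversion inside the API are illustrative assumptions.

```python
# Hedged sketch: packaging an H2O model with the pre-1.0 artifact API.
import h2o

from bentoml import BentoService, api, artifacts
from bentoml.artifact import H2oModelArtifact
from bentoml.handlers import DataframeHandler

@artifacts([H2oModelArtifact('model')])
class H2oModelService(BentoService):  # illustrative name

    @api(DataframeHandler)
    def predict(self, df):
        # Convert the incoming pandas DataFrame to an H2OFrame before scoring
        hf = h2o.H2OFrame(df)
        return self.artifacts.model.predict(hf).as_data_frame()
```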

0.2.0beta

0.1.2beta

* _Breaking Change_: updated the format of the config file `bentoml.yml` in the generated archive to include more information. Newer versions of BentoML won't be able to load archives generated before 0.1.2
* Added support for deploying to serverless platforms, including AWS Lambda and Google Cloud Platform functions
* Added a REST API index page documenting available endpoints

0.1.1beta

* Added XGBoost support
* Added DataframeHandler options, including 'typ', 'input_columns', and 'orient' (see the sketch after this list)
* BentoML CLI command now supports a JSON string as input, e.g. --input='{"col": {"0": "bc"}}'
* BentoML CLI command now supports input files from S3, e.g. --input=s3://my-bucket/test.csv
* Added gunicorn support; `serve-gunicorn` is now the default in the generated Docker image
* Added `ver` decorator for specifying versions with [semantic versioning](https://semver.org/)
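
A hedged sketch tying together the DataframeHandler options and the `ver` decorator listed above; treating the options as keyword arguments to `@api`, and the `major`/`minor` signature of `ver`, are assumptions, and the class and column names are illustrative.

```python
# Hedged sketch: DataframeHandler options and the ver decorator (pre-1.0 API).
from bentoml import BentoService, api, artifacts, ver
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@ver(major=1, minor=0)                         # assumed semantic-version decorator
@artifacts([PickleArtifact('model')])
class ExampleService(BentoService):            # illustrative name

    # 'typ', 'input_columns' and 'orient' are the options named in this release;
    # passing them to the handler via @api is an assumption about where they apply.
    @api(DataframeHandler, typ='frame', orient='records',
         input_columns=['feature_a', 'feature_b'])
    def predict(self, df):
        return self.artifacts.model.predict(df)
```

Declaring the input options on the API keeps the expected DataFrame shape next to the prediction code, so the same definition serves both the REST endpoint and the CLI `--input` examples above.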
