Detailed Changelog: https://github.com/bentoml/BentoML/compare/v0.11.0...v0.12.0
New Features
- **Breaking Change:** Default Model Worker count is set to **one** #1454
- Please use the `--worker` CLI argument to specify the number of workers for your deployment
- For heavy production workloads, we recommend experimenting with different worker counts and benchmarking your BentoML API server on your target hardware to get a better understanding of the model server's performance
- **Breaking Change:** The micro-batching layer (Marshal Server) is now enabled by default #1498
- For Inference APIs defined with `batch=True`, this enables micro-batching behavior when serving. Users can disable it with the `--disable-microbatch` flag (a minimal sketch of both modes follows below)
- For Inference APIs with `batch=False`, API requests are now queued in Marshal and then forwarded to the model backend server
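As a rough illustration of the two modes, here is a minimal service sketch; the service name, artifact, and handler logic are placeholders rather than code from this release, and only the `@api(..., batch=...)` flag reflects the behavior described above:
```python
from bentoml import BentoService, api, artifacts, env
from bentoml.adapters import DataframeInput, JsonInput
from bentoml.frameworks.sklearn import SklearnModelArtifact


@env(infer_pip_packages=True)
@artifacts([SklearnModelArtifact('model')])
class ExampleService(BentoService):

    # batch=True: the Marshal server micro-batches requests, so `df` may
    # contain rows collected from several concurrent client calls
    @api(input=DataframeInput(), batch=True)
    def predict(self, df):
        return self.artifacts.model.predict(df)

    # batch=False: one request per call; since 0.12 the request is still
    # queued in Marshal and then forwarded to the model backend server.
    # Placeholder logic that assumes a payload like {"features": [...]}
    @api(input=JsonInput(), batch=False)
    def predict_one(self, parsed_json):
        return self.artifacts.model.predict([parsed_json["features"]])
```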
- **New:** Use non-root user in BentoML's API server docker image
- **New:** API/CLI for bulk delete of BentoML bundles in Yatai #1313
- Easier dependency management for PyPI and conda
- Support all pip install options via a user-provided `requirements.txt` file
- **Breaking Change:** when the `requirements_txt_file` option is in use, all other pip package options will be ignored
- New `conda_override_channels` option for installing conda dependencies only from explicitly listed conda channels (a short sketch follows below): https://docs.bentoml.org/en/latest/concepts.html#conda-packages
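A minimal sketch of how this could look, assuming `conda_override_channels` is passed to `@env` alongside the existing `conda_channels`/`conda_dependencies` options; the channel, package, and service names are made up, and the docs link above is the authoritative reference:
```python
from bentoml import BentoService, env


# Hypothetical service: the channel and package names are placeholders.
# conda_override_channels=True asks conda to resolve dependencies only
# from the channels listed in conda_channels, instead of the defaults.
@env(
    conda_channels=['conda-forge'],
    conda_dependencies=['libgdal'],
    conda_override_channels=True,
)
class MyCondaService(BentoService):
    ...
```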
---
- Better support for pip install options and remote Python dependencies #1421
1. Let BentoML do it for you:
```python
@bentoml.env(infer_pip_packages=True)
```
2. Use the existing `pip_packages` API to specify a list of dependencies:
```python
@bentoml.env(
    pip_packages=[
        'scikit-learn',
        'pandas @ https://github.com/pypa/pip/archive/1.3.1.zip',
    ]
)
```
3. Use a `requirements.txt` file to specify all dependencies:
```python
@bentoml.env(requirements_txt_file='./requirements.txt')
```
In the `./requirements.txt` file, all pip install options can be used:
```
# These requirements were autogenerated by pipenv
# To regenerate from the project's Pipfile, run:
#    pipenv lock --requirements
-i https://pypi.org/simple

scikit-learn==0.20.3
aws-sam-cli==0.33.1
psycopg2-binary
azure-cli
bentoml
pandas @ https://github.com/pypa/pip/archive/1.3.1.zip
https://[username[:password]@]pypi.company.com/simple
https://user:he%2F%2Fo@pypi.company.com
git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
```
- API/CLI for bulk delete #1313
CLI commands for delete:
```bash
# Delete all saved Bento with a specific name
bentoml delete --name IrisClassifier
bentoml delete --name IrisClassifier -y  # do it without confirming with the user
bentoml delete --name IrisClassifier --yatai-url=yatai.mycompany.com  # delete in a remote Yatai

# Delete all saved Bento matching specific label selectors
bentoml delete --labels "env=dev"
bentoml delete --labels "env=dev, user=foobar"
bentoml delete --labels "key1=value1, key2!=value2, key3 In (value3, value3a), key4 DoesNotExist"

# Delete multiple saved Bento by their name:version tag
bentoml delete --tag "IrisClassifier:v1, MyService:v3, FooBar:20200103_Lkj81a"

# Delete all
bentoml delete --all
```
Yatai Client Python API:
```python
from bentoml.yatai.client import get_yatai_client

yc = get_yatai_client()  # local Yatai
yc = get_yatai_client('remote.yatai.com:50051')  # remote Yatai
yc.repository.delete(prune, labels, bento_tag, bento_name, bento_version, require_confirm)
"""
Params:
prune: boolean, set to True to delete all Bento services
bento_tag: string, Bento tag
labels: string, label selector to filter the Bento services to delete
bento_name: string, Bento name
bento_version: string, Bento version
require_confirm: boolean, require the user to confirm interactively in the CLI
"""
```
- #1334 Customize the route of an API endpoint
```python
@env(infer_pip_packages=True)
@artifacts([...])
class MyPredictionService(BentoService):

    @api(route="/my_url_route/foo/bar", batch=True, input=DataframeInput())
    def predict(self, df):
        # instead of "/predict", the URL for this API endpoint will be "/my_url_route/foo/bar"
        ...
```
- #1416 Support custom authentication header in Yatai gRPC server
- #1284 Add health check endpoint to Yatai web server
- #1409 Fix Postgres disconnect issue with Yatai server