> bentoml get IrisClassifier:latest
3. Separated deployment commands to sub-commands
AWS Lambda model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_lambda.html
AWS Sagemaker model serving deployment:
https://docs.bentoml.org/en/latest/deployment/aws_sagemaker.html
4. Breaking Change: Improved `bentoml run` command for running inference from the CLI
Changing from:
> bentoml {API_NAME} {saved_path} {run_args}
> bentoml predict {saved_path} --input=my_test_data.csv
To:
> bentoml run {BENTO/saved_path} {API_NAME} {run_args}
> bentoml run IrisClassifier:latest predict --input='[[1,2,3,4]]'
Previously, users could invoke an API directly by using its name as the command, e.g. `bentoml predict {saved_path} --input=my_test_data.csv`. The problem is that API names are loaded dynamically, which makes it hard for the `bentoml` command to provide useful `--help` docs. In addition, the "default command" workaround in Click produces confusing errors when the user mistypes a command. So we decided to make this change.
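The reasoning above can be illustrated with a minimal sketch. BentoML's CLI is built on Click, but the same idea is easy to show with stdlib `argparse`: once `run` is a fixed subcommand and the API name is just a positional argument, the parser can generate complete `--help` text statically. All names below are illustrative, not BentoML's actual internals.

```python
import argparse

# A fixed `run` subcommand: the parser knows its full signature up front,
# so `bentoml run --help` can describe every argument and option.
parser = argparse.ArgumentParser(prog="bentoml")
sub = parser.add_subparsers(dest="command")

run = sub.add_parser("run", help="Run a model API from the CLI")
run.add_argument("bento", help="Bento name:version or saved bundle path")
run.add_argument("api_name", help="Name of the API to invoke")
run.add_argument("--input", help="Input data for the API")

# Mirrors: bentoml run IrisClassifier:latest predict --input='[[1,2,3,4]]'
args = parser.parse_args(
    ["run", "IrisClassifier:latest", "predict", "--input", "[[1,2,3,4]]"]
)
print(args.bento, args.api_name, args.input)
```

With dynamically registered API-name commands, none of this help text could be generated without first loading the saved bundle.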
5. Breaking Change: `--quiet` and `--verbose` options position
Previously, both `--quiet` and `--verbose` had to follow immediately after the `bentoml` command; now they are added to the options list of every subcommand.
If you use either option, update your CLI invocation from:
> bentoml --verbose serve ...
To:
> bentoml serve ... --verbose
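A minimal sketch of the new option placement, again using stdlib `argparse` rather than BentoML's actual Click implementation: the flags are defined on each subcommand instead of the top-level parser, so they are accepted after the subcommand and its arguments.

```python
import argparse

parser = argparse.ArgumentParser(prog="bentoml")
sub = parser.add_subparsers(dest="command")

# Define --verbose/--quiet on the subcommand, not the top-level parser.
serve = sub.add_parser("serve")
serve.add_argument("bento")
serve.add_argument("--verbose", action="store_true")
serve.add_argument("--quiet", action="store_true")

# New style: bentoml serve ... --verbose
args = parser.parse_args(["serve", "IrisClassifier:latest", "--verbose"])
print(args.verbose)
```

With this layout, `bentoml --verbose serve ...` (the old style) is no longer accepted, since the top-level parser does not define the flag.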