BentoML

Latest version: v1.2.18


1.0.0rc3

We have just released BentoML `1.0.0rc3` with a number of highly anticipated features and improvements. Check it out with the following command!

```bash
$ pip install -U bentoml --pre
```

⚠️ BentoML will release the official `1.0.0` version next week and remove the need to use the `--pre` flag to install BentoML versions after `1.0.0`. If you wish to stay on the `0.13.1` LTS version, please lock the dependency with `bentoml==0.13.1`.

- Added support for framework runners in the following ML frameworks.
  - [fast.ai](https://www.fast.ai/)
  - [CatBoost](https://catboost.ai/)
  - [ONNX](https://onnx.ai/)
- Added support for Huggingface Transformers custom pipelines.
- Fixed a logging issue causing the API server and runners to not generate error logs.
- Optimized the TensorFlow inference procedure.
- Improved resource request configuration for runners.
  - Resource requests can now be configured in the BentoML configuration. If unspecified, runners will be scheduled to best utilize the available system resources.

```yaml
runners:
  resources:
    cpu: 8.0
    nvidia.com/gpu: 4.0
```

- Updated the API for custom runners to declare the types of supported resources.

```python
import bentoml

class MyRunnable(bentoml.Runnable):
    SUPPORTS_CPU_MULTI_THREADING = True  # replaces the deprecated SUPPORT_CPU_MULTI_THREADING
    SUPPORTED_RESOURCES = ("nvidia.com/gpu", "cpu")  # replaces the deprecated SUPPORT_NVIDIA_GPU
    ...

my_runner = bentoml.Runner(
    MyRunnable,
    runnable_init_params={"foo": foo, "bar": bar},
    name="custom_runner_name",
    ...
)
```


- Deprecated the APIs for specifying resources in the framework `to_runner()` and custom Runner APIs. For better flexibility at runtime, it is recommended to specify resources through configuration.

What's Changed
* fix(dependencies): require pyyaml>=5 by sauyon in https://github.com/bentoml/BentoML/pull/2626
* refactor(server): merge contexts; add yatai headers by bojiang in https://github.com/bentoml/BentoML/pull/2621
* chore(pylint): update pylint configuration by sauyon in https://github.com/bentoml/BentoML/pull/2627
* fix: Transformers NVIDIA_VISIBLE_DEVICES value type casting by ssheng in https://github.com/bentoml/BentoML/pull/2624
* fix: Server silently crash without logging exceptions by ssheng in https://github.com/bentoml/BentoML/pull/2635
* fix(framework): some GPU related fixes by larme in https://github.com/bentoml/BentoML/pull/2637
* tests: minor e2e test cleanup by sauyon in https://github.com/bentoml/BentoML/pull/2643
* docs: Add model in bentoml.pytorch.save_model() pytorch integration example by AlexandreNap in https://github.com/bentoml/BentoML/pull/2644
* chore(ci): always enable actions on PR by sauyon in https://github.com/bentoml/BentoML/pull/2646
* chore: updates ci by aarnphm in https://github.com/bentoml/BentoML/pull/2650
* fix(docker): templates bash heredoc should pass `-ex` by aarnphm in https://github.com/bentoml/BentoML/pull/2651
* feat: CatBoost integration by yetone in https://github.com/bentoml/BentoML/pull/2615
* feat: FastAI by aarnphm in https://github.com/bentoml/BentoML/pull/2571
* feat: Support Transformers custom pipeline by ssheng in https://github.com/bentoml/BentoML/pull/2640
* feat(framework): onnx support by larme in https://github.com/bentoml/BentoML/pull/2629
* chore(tensorflow): optimize inference procedure by bojiang in https://github.com/bentoml/BentoML/pull/2567
* fix(runner): validate runner names by sauyon in https://github.com/bentoml/BentoML/pull/2588
* fix(runner): lowercase runner names and add tests by sauyon in https://github.com/bentoml/BentoML/pull/2656
* style: github naming by aarnphm in https://github.com/bentoml/BentoML/pull/2659
* tests(framework): add new framework tests by sauyon in https://github.com/bentoml/BentoML/pull/2660
* docs: missing code annotation by jjmachan in https://github.com/bentoml/BentoML/pull/2654
* perf(templates): cache python installation via conda by aarnphm in https://github.com/bentoml/BentoML/pull/2662
* fix(ci): destroy the runner after init_local by bojiang in https://github.com/bentoml/BentoML/pull/2665
* fix(conda): python installation order by aarnphm in https://github.com/bentoml/BentoML/pull/2668
* fix(tensorflow): casting error on kwargs by bojiang in https://github.com/bentoml/BentoML/pull/2664
* feat(runner): implement resource configuration by sauyon in https://github.com/bentoml/BentoML/pull/2632

New Contributors
* AlexandreNap made their first contribution in https://github.com/bentoml/BentoML/pull/2644

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.0-rc2...v1.0.0-rc3

1.0.0rc2

We have just released BentoML 1.0.0rc2 with an exciting lineup of improvements. Check it out with the following command!

```bash
$ pip install -U bentoml --pre
```

- Standardized logging configuration and improved logging performance.
  - If imported as a library, BentoML will no longer configure logging explicitly and will respect the logging configuration of the importing Python process. To customize BentoML logging as a library, configurations can be added for the `bentoml` logger.

```yaml
formatters:
  ...
handlers:
  ...
loggers:
  ...
  bentoml:
    handlers: [...]
    level: INFO
    ...
```

  - If started as a server, BentoML will continue to configure the logging format and output to `stdout` at the `INFO` level. All third-party libraries will be configured to log at the `WARNING` level.
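When using BentoML as a library, the `bentoml` logger section shown above can also be configured programmatically with the standard `logging` module. A minimal sketch (the formatter and handler choices here are placeholders, not BentoML defaults):

```python
import logging.config

# Route BentoML's library logs through a custom console handler at INFO level.
logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "simple": {"format": "%(asctime)s %(levelname)s [%(name)s] %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "simple"},
    },
    "loggers": {
        # Only the `bentoml` logger is touched; the rest of the process keeps
        # its own logging configuration.
        "bentoml": {"handlers": ["console"], "level": "INFO", "propagate": False},
    },
})

logger = logging.getLogger("bentoml")
```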
- **Added LightGBM framework support.**
- Updated model and bento creation timestamps in the CLI display to use the local timezone for a better user experience, while timestamps in metadata remain in the UTC timezone.
- Improved the reliability of bento builds with advanced options, including `base_image` and `dockerfile_template`.
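The timestamp handling above can be illustrated with the standard library: metadata keeps a timezone-aware UTC datetime, while display converts it to the local zone without changing the instant (a minimal sketch with a made-up timestamp value):

```python
from datetime import datetime, timezone

# Metadata timestamp: stored as timezone-aware UTC (made-up example value)
created_at = datetime(2022, 6, 29, 12, 0, 0, tzinfo=timezone.utc)

# CLI-style display: convert to the local timezone for readability
display_time = created_at.astimezone()

# The instant in time is unchanged; only the representation differs
assert display_time == created_at
```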

Besides all the exciting product work, we also started a blog at [modelserving.com](https://modelserving.com/) sharing the learnings gained from building BentoML and supporting the MLOps community. Check out our latest blog post, *Breaking up with Flask & FastAPI: Why ML model serving requires a specialized framework*, and share your thoughts with us on our [LinkedIn post](https://www.linkedin.com/posts/activity-6943273635138740224-6jm0/).

Lastly, a big shoutout to Mike Kuhlen for adding the LightGBM framework support. 🥂

What's Changed
* feat(cli): output times in the local timezone by sauyon in https://github.com/bentoml/BentoML/pull/2572
* fix(store): use >= for time checking by sauyon in https://github.com/bentoml/BentoML/pull/2574
* fix(build): use subprocess to call pip-compile by sauyon in https://github.com/bentoml/BentoML/pull/2573
* docs: fix wrong variable name in comment by kim-sardine in https://github.com/bentoml/BentoML/pull/2575
* feat: improve logging by sauyon in https://github.com/bentoml/BentoML/pull/2568
* fix(service): JsonIO doesn't return a pydantic model by bojiang in https://github.com/bentoml/BentoML/pull/2578
* fix: update conda env yaml file name and default channel by parano in https://github.com/bentoml/BentoML/pull/2580
* chore(runner): add shcedule shortcuts to runners by bojiang in https://github.com/bentoml/BentoML/pull/2576
* fix(cli): cli encoding error on Windows by bojiang in https://github.com/bentoml/BentoML/pull/2579
* fix(bug): Make `model.with_options()` additive by ssheng in https://github.com/bentoml/BentoML/pull/2519
* feat: dockerfile templates advanced guides by aarnphm in https://github.com/bentoml/BentoML/pull/2548
* docs: add setuptools to docs dependencies by parano in https://github.com/bentoml/BentoML/pull/2586
* test(frameworks): minor test improvements by sauyon in https://github.com/bentoml/BentoML/pull/2590
* feat: Bring LightGBM back by mqk in https://github.com/bentoml/BentoML/pull/2589
* fix(runner): pass init params to runnable by sauyon in https://github.com/bentoml/BentoML/pull/2587
* fix: propagate should be false by aarnphm in https://github.com/bentoml/BentoML/pull/2594
* fix: Remove starlette request log by ssheng in https://github.com/bentoml/BentoML/pull/2595
* fix: Bug fix for 2596 by timc in https://github.com/bentoml/BentoML/pull/2597
* chore(frameworks): update framework template with new checks and remove old framework code by sauyon in https://github.com/bentoml/BentoML/pull/2592
* docs: Update streaming.rst by ssheng in https://github.com/bentoml/BentoML/pull/2605
* bug: Fix Yatai client push bentos with model options by ssheng in https://github.com/bentoml/BentoML/pull/2604
* docs: allow running tutorial from docker by parano in https://github.com/bentoml/BentoML/pull/2611
* fix(model): lock attrs to >=21.1.0 by bojiang in https://github.com/bentoml/BentoML/pull/2610
* docs: Fix documentation links and formats by ssheng in https://github.com/bentoml/BentoML/pull/2612
* fix(model): load ModelOptions lazily by sauyon in https://github.com/bentoml/BentoML/pull/2608
* feat: install.sh for python packages by aarnphm in https://github.com/bentoml/BentoML/pull/2555
* fix/routing path by aarnphm in https://github.com/bentoml/BentoML/pull/2606
* qa: build config by aarnphm in https://github.com/bentoml/BentoML/pull/2581
* fix: invalid build option python_version="None" when base_image is used by parano in https://github.com/bentoml/BentoML/pull/2623

New Contributors
* kim-sardine made their first contribution in https://github.com/bentoml/BentoML/pull/2575
* timc made their first contribution in https://github.com/bentoml/BentoML/pull/2597

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.0-rc1...v1.0.0-rc2

1.0.0rc1

We are very excited to share that BentoML 1.0.0rc1 has just been released with a number of dev experience improvements and bug fixes.

- Enabled users to run just `bentoml serve` from a project directory containing a bentofile.yaml build file.
- Added request contexts and opened access to request and response headers.
- Introduced new runner design to [simplify creation of custom runners](https://docs.bentoml.org/en/latest/concepts/runner.html#custom-runner) and framework `to_runner` API to [simplify runner creation from model](https://docs.bentoml.org/en/latest/concepts/model.html#using-model-runner).

```python
import numpy as np
import bentoml
from bentoml.io import NumpyNdarray

iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()

svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
def classify(input_series: np.ndarray) -> np.ndarray:
    result = iris_clf_runner.predict.run(input_series)
    return result
```

- Introduced framework `save_model`, `load_model`, and `to_runnable` APIs to complement the new `to_runner` API in the following frameworks. Other ML frameworks are still being migrated to the new Runner API; coming in the next release are ONNX, fast.ai, MLflow, and CatBoost.
  - PyTorch (TorchScript, PyTorch Lightning)
  - TensorFlow
  - Keras
  - Scikit-Learn
  - XGBoost
  - Huggingface Transformers
- Introduced a refreshed documentation website with more content, see [https://docs.bentoml.org/](https://docs.bentoml.org/en/latest/index.html).
- Enhanced the `bentoml containerize` command with the following capabilities.
  - Support for multi-platform docker image builds with [Docker Buildx](https://docs.docker.com/buildx/working-with-buildx/).
  - Support for defining environment variables in generated docker images.
  - Support for installing system packages via `bentofile.yaml`.
  - Support for customizing the generated Dockerfile via user-provided templates.
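As an illustrative sketch, a `bentofile.yaml` exercising these build options might look like the following (field names follow the 1.0 build-options schema; all values are placeholders, not defaults):

```yaml
service: "service:svc"          # import path of the bentoml.Service instance
include:
  - "*.py"
docker:
  base_image: "ubuntu:20.04"    # custom base image
  env:
    FOO: "bar"                  # environment variables baked into the image
  system_packages:
    - git                       # installed via the distro package manager
  dockerfile_template: "./Dockerfile.template"  # user-provided template
```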

A big shout out to all the contributors for getting us a step closer to the BentoML 1.0 release. 🎉

What's Changed
* docs: update readme installation --pre flag by parano in https://github.com/bentoml/BentoML/pull/2515
* chore(ci): quit immediately for errors e2e tests by bojiang in https://github.com/bentoml/BentoML/pull/2517
* fix(ci): cover sync endpoints; cover cors by bojiang in https://github.com/bentoml/BentoML/pull/2520
* docs: fix cuda_version string value by rapidrabbit76 in https://github.com/bentoml/BentoML/pull/2523
* fix(framework): fix tf2 and keras class variable names by larme in https://github.com/bentoml/BentoML/pull/2525
* chore(ci): add more edge cases; boost e2e tests by bojiang in https://github.com/bentoml/BentoML/pull/2521
* fix(docker): remove backslash in comments by aarnphm in https://github.com/bentoml/BentoML/pull/2527
* fix(runner): sync remote runner uri schema with runner_app by larme in https://github.com/bentoml/BentoML/pull/2531
* fix: major bugs fixes about serving and GPU placement by bojiang in https://github.com/bentoml/BentoML/pull/2535
* chore(sdk): allowed single int value as the batch_dim by bojiang in https://github.com/bentoml/BentoML/pull/2536
* chore(ci): cover add_asgi_middleware in e2e tests by bojiang in https://github.com/bentoml/BentoML/pull/2537
* chore(framework): Add api_version for current implemented frameworks by larme in https://github.com/bentoml/BentoML/pull/2522
* doc(server): remove unnecessary `svc.asgi` lines by bojiang in https://github.com/bentoml/BentoML/pull/2543
* chore(server): lazy load meters; cover asgi app mounting in e2e test by bojiang in https://github.com/bentoml/BentoML/pull/2542
* feat: push runner to yatai by yetone in https://github.com/bentoml/BentoML/pull/2528
* style(runner): revert b14919db(factor out batching) by bojiang in https://github.com/bentoml/BentoML/pull/2549
* chore(ci): skip unsupported frameworks for now by bojiang in https://github.com/bentoml/BentoML/pull/2550
* doc: fix github action CI badge link by parano in https://github.com/bentoml/BentoML/pull/2554
* doc(server): fix header div by bojiang in https://github.com/bentoml/BentoML/pull/2557
* fix(metrics): filter out non-API endpoints in metrics by parano in https://github.com/bentoml/BentoML/pull/2559
* fix: Update SwaggerUI config by parano in https://github.com/bentoml/BentoML/pull/2560
* fix(server): wrong status code format in metrics by bojiang in https://github.com/bentoml/BentoML/pull/2561
* fix(server): metrics name issue under specify service names by bojiang in https://github.com/bentoml/BentoML/pull/2556
* fix: path for custom dockerfile templates by aarnphm in https://github.com/bentoml/BentoML/pull/2547
* feat: include env build options in bento.yaml by parano in https://github.com/bentoml/BentoML/pull/2562
* chore: minor fixes and docs change from QA by parano in https://github.com/bentoml/BentoML/pull/2564
* fix(qa): allow cuda_version when distro is None with default by aarnphm in https://github.com/bentoml/BentoML/pull/2565
* fix(qa): bento runner resource should limit to user provided configs by parano in https://github.com/bentoml/BentoML/pull/2566

New Contributors
* rapidrabbit76 made their first contribution in https://github.com/bentoml/BentoML/pull/2523

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.0-rc0...v1.0.0-rc1

1.0.0rc0

This is a preview release for BentoML 1.0, check out the quick start guide here: https://docs.bentoml.org/en/latest/quickstart.html and documentation at http://docs.bentoml.org/

Key changes

What's Changed
* chore(server): pass runner map through envvar by bojiang in https://github.com/bentoml/BentoML/pull/2396
* fix(server): init prometheus dir for standalone running by bojiang in https://github.com/bentoml/BentoML/pull/2397
* fix(2316): --quiet should set logger level by parano in https://github.com/bentoml/BentoML/pull/2399
* feat: allow serving from project dir using import str from bentofile.yaml by parano in https://github.com/bentoml/BentoML/pull/2398
* chore(server): default values for entrypoints by bojiang in https://github.com/bentoml/BentoML/pull/2401
* fix(ci): use local bentoml in e2e test by bojiang in https://github.com/bentoml/BentoML/pull/2403
* docs: update advanced guide on building bentos by splch in https://github.com/bentoml/BentoML/pull/2346
* freeze model info and validate metadata entries by sauyon in https://github.com/bentoml/BentoML/pull/2363
* feat: store runners in bento manifest by yetone in https://github.com/bentoml/BentoML/pull/2407
* docs: fix readthedocs build issue by parano in https://github.com/bentoml/BentoML/pull/2422
* docs: update fossa license scan badge by parano in https://github.com/bentoml/BentoML/pull/2420
* fix(server): ensure distributed serving / serving on all platforms by bojiang in https://github.com/bentoml/BentoML/pull/2414
* Docs/core and guides by timliubentoml in https://github.com/bentoml/BentoML/pull/2417
* feat(internal): implement request contexts and check inference API types by sauyon in https://github.com/bentoml/BentoML/pull/2375
* fix: ensure compatibility with attrs 20.1.0 by sauyon in https://github.com/bentoml/BentoML/pull/2423
* chore(server): resource utils by bojiang in https://github.com/bentoml/BentoML/pull/2370
* fix: consistent naming accross docker and build config by aarnphm in https://github.com/bentoml/BentoML/pull/2426
* refactor: runner/runnable interface by bojiang in https://github.com/bentoml/BentoML/pull/2432
* feat(internal): add signature to Model and remove bentoml_version by sauyon in https://github.com/bentoml/BentoML/pull/2433
* runner refactor: Model to_runner/to_runnable interface by parano in https://github.com/bentoml/BentoML/pull/2435
* runnablehandle proposal by sauyon in https://github.com/bentoml/BentoML/pull/2438
* Runnable refactors and Model info update by sauyon in https://github.com/bentoml/BentoML/pull/2439
* Runner Resources implementation by bojiang in https://github.com/bentoml/BentoML/pull/2436
* refactor(runner): clean runner handle by bojiang in https://github.com/bentoml/BentoML/pull/2441
* chore(runner): make runnable scheduling traits constant by bojiang in https://github.com/bentoml/BentoML/pull/2442
* fix(runner): async run by bojiang in https://github.com/bentoml/BentoML/pull/2443
* added details for each paramters in options by timliubentoml in https://github.com/bentoml/BentoML/pull/2429
* Runners refactor: service & bento build changes by parano in https://github.com/bentoml/BentoML/pull/2440
* refactor: runner app by bojiang in https://github.com/bentoml/BentoML/pull/2445
* fix(internal): remove unused response_code field by sauyon in https://github.com/bentoml/BentoML/pull/2444
* Fix ModelInfo cattrs serialization issue by parano in https://github.com/bentoml/BentoML/pull/2446
* feat(internal): File I/O descriptor (re-)implementation by sauyon in https://github.com/bentoml/BentoML/pull/2272
* docs: Update Development.md by kakokat in https://github.com/bentoml/BentoML/pull/2424
* docs: Update DEVELOPMENT.md by parano in https://github.com/bentoml/BentoML/pull/2452
* refactor: datacontainer api changes with ndarray draft by larme in https://github.com/bentoml/BentoML/pull/2449
* feat(server): implement runner app by sauyon in https://github.com/bentoml/BentoML/pull/2451
* chore(runner): use low level nvml API by bojiang in https://github.com/bentoml/BentoML/pull/2450
* fix(server): fix container in runner app IPC by sauyon in https://github.com/bentoml/BentoML/pull/2454
* feat(runner): scheduling strategy by bojiang in https://github.com/bentoml/BentoML/pull/2453
* Fix: attribute error runner_type in bento serve by parano in https://github.com/bentoml/BentoML/pull/2457
* refactor(runner): update Pandas and Default DataContainer by larme in https://github.com/bentoml/BentoML/pull/2455
* chore(yatai): add version and org_uid to tracking by aarnphm in https://github.com/bentoml/BentoML/pull/2458
* chore(internal): fix typing by sauyon in https://github.com/bentoml/BentoML/pull/2460
* tests: fix runner1.0 branch unit tests by parano in https://github.com/bentoml/BentoML/pull/2462
* docs(model): update ModelSignature documentation by sauyon in https://github.com/bentoml/BentoML/pull/2463
* feat(xgboost): 1.0 XGBoost implementation by sauyon in https://github.com/bentoml/BentoML/pull/2459
* feat(frameworks): update framework template by sauyon in https://github.com/bentoml/BentoML/pull/2461
* fix(framework): fix Runnable closing over loop variable bug by larme in https://github.com/bentoml/BentoML/pull/2466
* chore: fix types by sauyon in https://github.com/bentoml/BentoML/pull/2468
* chore: make ModelInfo yaml backwards compatible by parano in https://github.com/bentoml/BentoML/pull/2470
* fix(runner): fix bugs in runner batching by sauyon in https://github.com/bentoml/BentoML/pull/2469
* docs: re-organize docs for 1.0rc release by parano in https://github.com/bentoml/BentoML/pull/2474
* chore: add furo to docs-requirements.txt by aarnphm in https://github.com/bentoml/BentoML/pull/2475
* feat(ci): re-enable e2e tests by bojiang in https://github.com/bentoml/BentoML/pull/2456
* chore: add runners-1.0 to CI by aarnphm in https://github.com/bentoml/BentoML/pull/2431
* fix(runner): remove unnecessary runnable_self arugment by larme in https://github.com/bentoml/BentoML/pull/2482
* docs: update for xgboost doc by kakokat in https://github.com/bentoml/BentoML/pull/2481
* test(runner): update DataContainer tests by larme in https://github.com/bentoml/BentoML/pull/2476
* feat: buildx backend for `bentoml containerize` by aarnphm in https://github.com/bentoml/BentoML/pull/2483
* refactor(runner): simplify batch dim by bojiang in https://github.com/bentoml/BentoML/pull/2484
* fix(runner): removing inspect by bojiang in https://github.com/bentoml/BentoML/pull/2485
* fix(server): fix development_mode in the config by bojiang in https://github.com/bentoml/BentoML/pull/2488
* fix(server): fix containerize subcommand by bojiang in https://github.com/bentoml/BentoML/pull/2490
* fix(tests): update model unit tests for new batch_dim type by sauyon in https://github.com/bentoml/BentoML/pull/2487
* refactor(server): supervise dev server with circus by bojiang in https://github.com/bentoml/BentoML/pull/2489
* fix(server): correctly use starlette APIs by sauyon in https://github.com/bentoml/BentoML/pull/2486
* fix(internal): revert typing strictness changes by sauyon in https://github.com/bentoml/BentoML/pull/2494
* feat: Transformers framework runner implementation 1.0 by ssheng in https://github.com/bentoml/BentoML/pull/2479
* Runners 1.0 tensorflow_v2 impl by larme in https://github.com/bentoml/BentoML/pull/2430
* Testing framework and runner app update by sauyon in https://github.com/bentoml/BentoML/pull/2500
* refactor(framework): update keras to runners-1.0 branch by larme in https://github.com/bentoml/BentoML/pull/2498
* fix: swagger UI bundle update by parano in https://github.com/bentoml/BentoML/pull/2501
* refactor: Dockerfile generation by aarnphm in https://github.com/bentoml/BentoML/pull/2473
* feat(internal): add save_format_version for BentoML model by larme in https://github.com/bentoml/BentoML/pull/2502
* Revert "feat(internal): add save_format_version for BentoML model" by larme in https://github.com/bentoml/BentoML/pull/2504
* docs: Update documentation for 1.0 by parano in https://github.com/bentoml/BentoML/pull/2506
* fix(framework): adapt changes for Tensorflow DataContainer by larme in https://github.com/bentoml/BentoML/pull/2507
* feat(framework): pytorch by bojiang in https://github.com/bentoml/BentoML/pull/2499
* docs: misc docs updates by parano in https://github.com/bentoml/BentoML/pull/2511
* refactor(framework): move `_mapping` for tf2 and keras by larme in https://github.com/bentoml/BentoML/pull/2510
* chore: unify circus logs to bentoml + fix circus config parsing for api_server by aarnphm in https://github.com/bentoml/BentoML/pull/2509
* chore: add release candidate backwards compatibility warnings by parano in https://github.com/bentoml/BentoML/pull/2512
* fix: revert pining pip version for tests by bojiang in https://github.com/bentoml/BentoML/pull/2514
* Merge 1.0 development branch by parano in https://github.com/bentoml/BentoML/pull/2513

New Contributors
* splch made their first contribution in https://github.com/bentoml/BentoML/pull/2346
* kakokat made their first contribution in https://github.com/bentoml/BentoML/pull/2424

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.0-a7...v1.0.0-rc0

1.0.0a7

This is a preview release for BentoML 1.0, check out the quick start guide here: https://docs.bentoml.org/en/latest/quickstart.html and documentation at http://docs.bentoml.org/

Key changes

* **BREAKING CHANGE**: Default serving port has been changed to 3000
  * This is due to an issue with newer macOS versions, where port 5000 is always in use.
  * This will affect the default serving port when deploying with Docker. Existing 1.0 preview release users will need to either change their deployment config to use port 3000, or pass `--port 5000` to the container command, in order to keep the previous default port setting.
* New import/export API
  * Users can now export models and bentos from the local store to a standalone file
  * Learn more via `bentoml export --help` and `bentoml models export --help`

What's Changed
* docs(cli): clean up cli docstrings by larme in https://github.com/bentoml/BentoML/pull/2342
* fix: YataiClientContext initialization missing email argument by yetone in https://github.com/bentoml/BentoML/pull/2348
* chore(ci): run e2e tests in docker by bojiang in https://github.com/bentoml/BentoML/pull/2349
* style: minor typing fixes by bojiang in https://github.com/bentoml/BentoML/pull/2350
* Refactor model save to include labels, metadata and custom_objects by larme in https://github.com/bentoml/BentoML/pull/2351
* fix: better error message in python < 3.9 by larme in https://github.com/bentoml/BentoML/pull/2352
* refactor(internal): move Tag out of types by sauyon in https://github.com/bentoml/BentoML/pull/2358
* fix(frameworks): use bentoml.models.create instead of Model.create by sauyon in https://github.com/bentoml/BentoML/pull/2360
* fix: add change_global_cwd params to bentoml.load by parano in https://github.com/bentoml/BentoML/pull/2356
* fix: import model from S3 by almirb in https://github.com/bentoml/BentoML/pull/2361
* fix: extract correct desired Python version by matheusMoreno in https://github.com/bentoml/BentoML/pull/2362
* fix(service): fix `load_bento` arguments position when retrying after `import_service` failed by larme in https://github.com/bentoml/BentoML/pull/2369
* fix: cgroups for cpu should be 1 when <= 0 by aarnphm in https://github.com/bentoml/BentoML/pull/2372
* chore: lock rich to be >=11.2.0 by aarnphm in https://github.com/bentoml/BentoML/pull/2378
* internal: usage tracking by aarnphm in https://github.com/bentoml/BentoML/pull/2318
* feat(internal): try to correct missing latest files by sauyon in https://github.com/bentoml/BentoML/pull/2383
* chore: cleanup 3.6 metadata by aarnphm in https://github.com/bentoml/BentoML/pull/2388
* chore: remove unecessary model_store by aarnphm in https://github.com/bentoml/BentoML/pull/2384
* fix: not lock typing_extensions to fix rich and pytorch lightning requirements by aarnphm in https://github.com/bentoml/BentoML/pull/2390
* bug: fix CLI command delete with latest tag by parano in https://github.com/bentoml/BentoML/pull/2391
* feat: improve list CLI command output by parano in https://github.com/bentoml/BentoML/pull/2392
* fix: update yatai client to work with BentoInfo changes by parano in https://github.com/bentoml/BentoML/pull/2393
* fix(server): duplicate metrics by bojiang in https://github.com/bentoml/BentoML/pull/2394

New Contributors
* almirb made their first contribution in https://github.com/bentoml/BentoML/pull/2361
* matheusMoreno made their first contribution in https://github.com/bentoml/BentoML/pull/2362

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.0-a6...v1.0.0-a7

1.0.0a6

This is a preview release for BentoML 1.0, check out the quick start guide here: https://docs.bentoml.org/en/latest/quickstart.html and documentation at http://docs.bentoml.org/
