BentoML

1.0.13

🍱 BentoML `v1.0.13` is released featuring a preview of [batch inference with Spark](https://docs.bentoml.org/en/latest/integrations/spark.html).

- Run the batch inference job using the `bentoml.batch.run_in_spark()` method. This method takes the API name, the Spark DataFrame containing the input data, and the Spark session itself as parameters, and it returns a DataFrame containing the results of the batch inference job.

```python
import bentoml

# Import the bento from a repository or get the bento from the bento store
bento = bentoml.import_bento("s3://bentoml/quickstart")

# Run the run_in_spark function with the bento, API name, DataFrame, and Spark session
results_df = bentoml.batch.run_in_spark(bento, "classify", df, spark)
```

- Internally, what happens when you run `run_in_spark` is as follows:
- First, the bento is distributed to the cluster. Note that if the bento has already been distributed, i.e. you have already run a computation with that bento, this step is skipped.
- Next, a process function is created, which starts a BentoML server on each of the Spark workers, then uses a client to process all the data. This is done so that the workers take advantage of the batch processing features of the BentoML server. PySpark pickles this process function and dispatches it, along with the relevant data, to the workers.
- Finally, the function is evaluated on the given dataframe. Once all methods that the user defined in the script have been executed, the data is returned to the master node.
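The per-worker batching described above can be sketched in plain Python. This is an illustrative stand-in, not BentoML's implementation: `predict` plays the role of the client call to the local BentoML server, and `make_process_fn` mimics the process function that PySpark pickles and dispatches to the workers:

```python
from typing import Callable, Iterable, Iterator

def make_process_fn(predict: Callable[[list], list], batch_size: int = 32):
    """Build a function that consumes one partition of rows and yields
    predictions, grouping rows into batches before calling predict()."""
    def process(partition: Iterable) -> Iterator:
        batch = []
        for row in partition:
            batch.append(row)
            if len(batch) == batch_size:
                yield from predict(batch)
                batch = []
        if batch:  # flush the final partial batch
            yield from predict(batch)
    return process

# Stand-in for the BentoML client call a real worker would make.
def predict(batch):
    return [x * 2 for x in batch]

process = make_process_fn(predict, batch_size=3)
results = list(process(range(7)))  # rows 0..6 processed in batches of 3
```

Grouping rows into batches before each call is what lets the in-worker BentoML server exercise its batch processing features, as described above.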

⚠️ The `bentoml.batch` API may undergo incompatible changes until general availability is announced in a later minor version release.
🥂 Shout out to [jeffthebear](https://github.com/jeffthebear), [KimSoungRyoul](https://github.com/KimSoungRyoul), [Robert Fernandez](https://github.com/RobbieFernandez), [Marco Vela](https://github.com/characat0), [Quan Nguyen](https://github.com/qu8n), and [y1450](https://github.com/y1450) from the community for their contributions in this release.

What's Changed
* docs: add inline notes and better exception by bojiang in https://github.com/bentoml/BentoML/pull/3296
* chore(deps): bump pytest-asyncio from 0.20.2 to 0.20.3 by dependabot in https://github.com/bentoml/BentoML/pull/3334
* feat: bentoserver client by qu8n in https://github.com/bentoml/BentoML/pull/3321
* fix(transformers): check for task aliases by jeffthebear in https://github.com/bentoml/BentoML/pull/3337
* chore(framework): add partial_kwargs to picklable and pytorch runners by bojiang in https://github.com/bentoml/BentoML/pull/3338
* feat: protobuf shim by aarnphm in https://github.com/bentoml/BentoML/pull/3333
* fix: CI breakage by aarnphm in https://github.com/bentoml/BentoML/pull/3350
* chore(deps): bump black[jupyter] from 22.10.0 to 22.12.0 by dependabot in https://github.com/bentoml/BentoML/pull/3354
* chore(deps): bump isort from 5.10.1 to 5.11.1 by dependabot in https://github.com/bentoml/BentoML/pull/3355
* feat(http server): pass-through openapi of mounted apps by bojiang in https://github.com/bentoml/BentoML/pull/3358
* fix(pytorch): runnable method collision by bojiang in https://github.com/bentoml/BentoML/pull/3357
* fix(torchscript): runnable method collision by bojiang in https://github.com/bentoml/BentoML/pull/3364
* chore(deps): bump isort from 5.11.1 to 5.11.2 by dependabot in https://github.com/bentoml/BentoML/pull/3361
* chore(deps): bump isort from 5.11.2 to 5.11.3 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3374
* chore(deps): bump bufbuild/buf-setup-action from 1.9.0 to 1.10.0 by dependabot in https://github.com/bentoml/BentoML/pull/3370
* chore(deps): bump coverage[toml] from 6.5.0 to 7.0.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3373
* chore(deps): bump pylint from 2.15.8 to 2.15.9 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3372
* chore(deps): bump imageio from 2.22.4 to 2.23.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3371
* fix: make sure to handle relative path for templates by aarnphm in https://github.com/bentoml/BentoML/pull/3375
* fix(containerize): fs path format on windows by bojiang in https://github.com/bentoml/BentoML/pull/3378
* chore(deps): bump isort from 5.11.3 to 5.11.4 by dependabot in https://github.com/bentoml/BentoML/pull/3380
* docs: tracing and configuration by aarnphm in https://github.com/bentoml/BentoML/pull/3067
* fix: use relative urls in swagger UI by sauyon in https://github.com/bentoml/BentoML/pull/3381
* chore(deps): bump bufbuild/buf-setup-action from 1.10.0 to 1.11.0 by dependabot in https://github.com/bentoml/BentoML/pull/3382
* chore(deps): bump coverage[toml] from 7.0.0 to 7.0.1 by dependabot in https://github.com/bentoml/BentoML/pull/3383
* chore(config): ignore blank lines in bentoml config options by bojiang in https://github.com/bentoml/BentoML/pull/3385
* chore(deps): bump coverage[toml] from 7.0.1 to 7.0.2 by dependabot in https://github.com/bentoml/BentoML/pull/3386
* fix: log error when runnable instantiation fails by sauyon in https://github.com/bentoml/BentoML/pull/3388
* chore(deps): bump coverage[toml] from 7.0.2 to 7.0.3 by dependabot in https://github.com/bentoml/BentoML/pull/3390
* fix: don't use logger for CLI output by sauyon in https://github.com/bentoml/BentoML/pull/3395
* fix: allow passing server URLs with paths by sauyon in https://github.com/bentoml/BentoML/pull/3394
* fix(sdk): handling container platform from CLI separately by aarnphm in https://github.com/bentoml/BentoML/pull/3366
* fix: wrong self annotations by aarnphm in https://github.com/bentoml/BentoML/pull/3397
* chore(deps): bump imageio from 2.23.0 to 2.24.0 by dependabot in https://github.com/bentoml/BentoML/pull/3410
* chore(deps): bump coverage[toml] from 7.0.3 to 7.0.4 by dependabot in https://github.com/bentoml/BentoML/pull/3409
* chore(deps): bump pylint from 2.15.9 to 2.15.10 by dependabot in https://github.com/bentoml/BentoML/pull/3407
* fix: serve missing logic from 3321 by aarnphm in https://github.com/bentoml/BentoML/pull/3336
* chore(deps): bump coverage[toml] from 7.0.4 to 7.0.5 by dependabot in https://github.com/bentoml/BentoML/pull/3413
* chore(deps): bump yamllint from 1.28.0 to 1.29.0 by dependabot in https://github.com/bentoml/BentoML/pull/3414
* fix: regression f-string by aarnphm in https://github.com/bentoml/BentoML/pull/3416
* fix(runner): log correct error types during model validation by characat0 in https://github.com/bentoml/BentoML/pull/3421
* fix(client): make sure tags is available in specs by KimSoungRyoul in https://github.com/bentoml/BentoML/pull/3359
* fix: handling KeyError when accessing IODescriptor spec by aarnphm in https://github.com/bentoml/BentoML/pull/3398
* chore(deps): bump build[virtualenv] from 0.9.0 to 0.10.0 by dependabot in https://github.com/bentoml/BentoML/pull/3419
* feat: support bentos and tags in bentoml.bentos.serve by sauyon in https://github.com/bentoml/BentoML/pull/3424
* feat: add endpoints list to client by sauyon in https://github.com/bentoml/BentoML/pull/3423
* fix: 3399 during `containerize` by aarnphm in https://github.com/bentoml/BentoML/pull/3400
* feat: add context manager support for `bentoml.client` by y1450 in https://github.com/bentoml/BentoML/pull/3402
* chore: migrate to newer API in docstring by KimSoungRyoul in https://github.com/bentoml/BentoML/pull/3429
* chore(deps): bump bufbuild/buf-setup-action from 1.11.0 to 1.12.0 by dependabot in https://github.com/bentoml/BentoML/pull/3430
* chore(deps): bump pytest from 7.2.0 to 7.2.1 by dependabot in https://github.com/bentoml/BentoML/pull/3433
* feat: openapi_components method for Multipart by RobbieFernandez in https://github.com/bentoml/BentoML/pull/3438
* ci: disable 3.10 e2e for gRPC on Mac X86 by aarnphm in https://github.com/bentoml/BentoML/pull/3441
* chore(exportable): update exception message and errors imports by aarnphm in https://github.com/bentoml/BentoML/pull/3435
* feat: make `load_bento` take Tag and Bento by sauyon in https://github.com/bentoml/BentoML/pull/3444
* chore: add setuptools-scm as dev deps by aarnphm in https://github.com/bentoml/BentoML/pull/3443
* fix: load_bento Tag import by sauyon in https://github.com/bentoml/BentoML/pull/3445
* feat: support batch inference with Spark by sauyon in https://github.com/bentoml/BentoML/pull/3425
* chore: add pandas-stubs as dev-dependencies by aarnphm in https://github.com/bentoml/BentoML/pull/3442
* fix: raise more specific error in `from_spec` by sauyon in https://github.com/bentoml/BentoML/pull/3447
* fix(cli): overriding memoized options via `--opt` by aarnphm in https://github.com/bentoml/BentoML/pull/3401
* fix(exception): wrong variable reference by aarnphm in https://github.com/bentoml/BentoML/pull/3450
* fix: make sure to run migration for envvar by aarnphm in https://github.com/bentoml/BentoML/pull/3339
* feat: YataiClient context to communicate with multiple Yatai instances by ssheng in https://github.com/bentoml/BentoML/pull/3448

New Contributors
* characat0 made their first contribution in https://github.com/bentoml/BentoML/pull/3421
* y1450 made their first contribution in https://github.com/bentoml/BentoML/pull/3402
* RobbieFernandez made their first contribution in https://github.com/bentoml/BentoML/pull/3438

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.12...v1.0.13

1.0.12

Important bug fixes.
- Fixed runner call failures with keyword arguments.
- Fixed incorrect user base image override.

What's Changed
* fix(runner): content-type error by aarnphm in https://github.com/bentoml/BentoML/pull/3302
* feat: grpc servicer implementation per version by aarnphm in https://github.com/bentoml/BentoML/pull/3316
* feat(grpc): adding service metadata by aarnphm in https://github.com/bentoml/BentoML/pull/3278
* docs: Update monitoring docs format by ssheng in https://github.com/bentoml/BentoML/pull/3324
* fix(runner): remote run_method with kwargs by larme in https://github.com/bentoml/BentoML/pull/3326
* fix: don't overwrite user base image by aarnphm in https://github.com/bentoml/BentoML/pull/3329
* fix: add upper bound for packaging version by aarnphm in https://github.com/bentoml/BentoML/pull/3331
* fix(container): podman health result string parsing by aarnphm in https://github.com/bentoml/BentoML/pull/3330
* fix: io descriptor backward compatibility by sauyon in https://github.com/bentoml/BentoML/pull/3327


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.11...v1.0.12

1.0.11

🍱 BentoML `v1.0.11` is here featuring the introduction of an [inference collection and model monitoring API](https://docs.bentoml.org/en/latest/guides/monitoring.html) that can be easily integrated with any model monitoring frameworks.

![image](https://user-images.githubusercontent.com/861225/206288329-7d13f261-a45a-47b7-8598-3b24b3d29421.png)

- Introduced the `bentoml.monitor` API for monitoring any features, predictions, and target data in numerical, categorical, and numerical sequence types.

```python
import numpy as np

import bentoml
from bentoml.io import Text
from bentoml.io import NumpyNdarray

CLASS_NAMES = ["setosa", "versicolor", "virginica"]

iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()
svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

@svc.api(
    input=NumpyNdarray.from_sample(np.array([4.9, 3.0, 1.4, 0.2], dtype=np.double)),
    output=Text(),
)
async def classify(features: np.ndarray) -> str:
    with bentoml.monitor("iris_classifier_prediction") as mon:
        mon.log(features[0], name="sepal length", role="feature", data_type="numerical")
        mon.log(features[1], name="sepal width", role="feature", data_type="numerical")
        mon.log(features[2], name="petal length", role="feature", data_type="numerical")
        mon.log(features[3], name="petal width", role="feature", data_type="numerical")

        results = await iris_clf_runner.predict.async_run([features])
        result = results[0]
        category = CLASS_NAMES[result]

        mon.log(category, name="pred", role="prediction", data_type="categorical")
        return category
```

- Enabled monitoring data collection through log file forwarding using any forwarders (fluentbit, filebeat, logstash) or OTLP exporter implementations.
- Configuration for monitoring data collection through log files.

```yaml
monitoring:
  enabled: true
  type: default
  options:
    log_path: path/to/log/file
```

- Configuration for monitoring data collection through an OTLP exporter.

```yaml
monitoring:
  enabled: true
  type: otlp
  options:
    endpoint: http://localhost:5000
    insecure: true
    credentials: null
    headers: null
    timeout: 10
    compression: null
    meta_sample_rate: 1.0
```


- Supported third-party monitoring data collector integrations through BentoML Plugins. See [bentoml/plugins](https://github.com/bentoml/plugins) repository for more details.

🐳 Improved containerization SDK and CLI options, read more in [3164](https://github.com/bentoml/BentoML/pull/3164).

- Added support for multiple backend builder options (Docker, nerdctl, Podman, Buildah, Buildx) in addition to buildctl (standalone buildkit builder).
- Improved Python SDK for containerization with different backend builder options.

```python
import bentoml

# Additional backend-specific options may also be passed as keyword arguments.
bentoml.container.build(
    "iris_classifier:latest",
    backend="podman",
    features=["grpc", "grpc-reflection"],
)
```

- Improved CLI to include the newly added options.

```bash
bentoml containerize --help
```

- Standardized the generated Dockerfile in bentos to be compatible with all build tools for use cases that require building from a Dockerfile directly.

💡 We continue to update the documentation and examples on every release to help the community unlock the full power of BentoML.

- Learn more about [inference data collection and model monitoring](https://docs.bentoml.org/en/latest/guides/monitoring.html) capabilities in BentoML.
- Learn more about the [default metrics](https://docs.bentoml.org/en/latest/guides/metrics.html#default-metrics) that come out-of-the-box and how to add [custom metrics](https://docs.bentoml.org/en/latest/guides/metrics.html#custom-metrics) in BentoML.

What's Changed
* chore: add framework utils functions directory by larme in https://github.com/bentoml/BentoML/pull/3203
* fix: missing f-string in tag validation error message by csh3695 in https://github.com/bentoml/BentoML/pull/3205
* chore(build_config): bypass exception when cuda and conda is specified by aarnphm in https://github.com/bentoml/BentoML/pull/3188
* docs: Update asynchronous API documentation by ssheng in https://github.com/bentoml/BentoML/pull/3204
* style: use relative import inside _internal/ by larme in https://github.com/bentoml/BentoML/pull/3209
* style: fix `monitoring` type error by aarnphm in https://github.com/bentoml/BentoML/pull/3208
* chore(build): add dependabot for pyproject.toml by aarnphm in https://github.com/bentoml/BentoML/pull/3139
* chore(deps): bump black[jupyter] from 22.8.0 to 22.10.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3217
* chore(deps): bump pylint from 2.15.3 to 2.15.5 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3212
* chore(deps): bump pytest-asyncio from 0.19.0 to 0.20.1 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3216
* chore(deps): bump imageio from 2.22.1 to 2.22.4 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3211
* fix: don't index ContextVar at runtime by sauyon in https://github.com/bentoml/BentoML/pull/3221
* chore(deps): bump pyarrow from 9.0.0 to 10.0.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3214
* chore: configuration check for development by aarnphm in https://github.com/bentoml/BentoML/pull/3223
* fix bento create by quandollar in https://github.com/bentoml/BentoML/pull/3220
* fix(docs): missing `table` tag by nyongja in https://github.com/bentoml/BentoML/pull/3231
* docs: grammar corrections by tbazin in https://github.com/bentoml/BentoML/pull/3234
* chore(deps): bump pytest-asyncio from 0.20.1 to 0.20.2 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3238
* chore(deps): bump pytest-xdist[psutil] from 2.5.0 to 3.0.2 by dependabot in https://github.com/bentoml/BentoML/pull/3245
* chore(deps): bump pytest from 7.1.3 to 7.2.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3237
* chore(deps): bump build[virtualenv] from 0.8.0 to 0.9.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3240
* deps: bumping gRPC and OTLP dependencies by aarnphm in https://github.com/bentoml/BentoML/pull/3228
* feat(file): support custom mime type for file proto by aarnphm in https://github.com/bentoml/BentoML/pull/3095
* fix: multipart for client by sauyon in https://github.com/bentoml/BentoML/pull/3253
* fix(json): make sure to parse a list of dict for_sample by aarnphm in https://github.com/bentoml/BentoML/pull/3229
* chore: move test proto to internal tests only by aarnphm in https://github.com/bentoml/BentoML/pull/3255
* fix(framework): external_modules for loading pytorch by bojiang in https://github.com/bentoml/BentoML/pull/3254
* feat(container): builder implementation by aarnphm in https://github.com/bentoml/BentoML/pull/3164
* feat(sdk): implement otlp monitoring exporter by bojiang in https://github.com/bentoml/BentoML/pull/3257
* chore(grpc): add missing __init__.py by aarnphm in https://github.com/bentoml/BentoML/pull/3259
* docs(metrics): Update docs for the default metrics by ssheng in https://github.com/bentoml/BentoML/pull/3262
* chore: generate plain dockerfile without buildkit syntax by aarnphm in https://github.com/bentoml/BentoML/pull/3261
* style: remove ` type: ignore` by aarnphm in https://github.com/bentoml/BentoML/pull/3265
* fix: lazy load ONNX utils by aarnphm in https://github.com/bentoml/BentoML/pull/3266
* fix(pytorch): pickle is the unpickler of cloudpickle by bojiang in https://github.com/bentoml/BentoML/pull/3269
* fix: instructions for missing sklearn dependency by benjamintanweihao in https://github.com/bentoml/BentoML/pull/3271
* docs: ONNX signature docs by larme in https://github.com/bentoml/BentoML/pull/3272
* chore(deps): bump pyarrow from 10.0.0 to 10.0.1 by dependabot in https://github.com/bentoml/BentoML/pull/3273
* chore(deps): bump pylint from 2.15.5 to 2.15.6 by dependabot in https://github.com/bentoml/BentoML/pull/3274
* fix(pandas): only set columns when `apply_column_names` is set by mqk in https://github.com/bentoml/BentoML/pull/3275
* feat: configuration versioning by aarnphm in https://github.com/bentoml/BentoML/pull/3052
* fix(container): support comma in docker env by larme in https://github.com/bentoml/BentoML/pull/3285
* chore(stub): `import filetype` by aarnphm in https://github.com/bentoml/BentoML/pull/3260
* fix(container): ensure to stream logs when `DOCKER_BUILDKIT=0` by aarnphm in https://github.com/bentoml/BentoML/pull/3294
* docs: update instructions for containerize message by aarnphm in https://github.com/bentoml/BentoML/pull/3289
* fix: unset `NVIDIA_VISIBLE_DEVICES` when cuda image is used by aarnphm in https://github.com/bentoml/BentoML/pull/3298
* fix: multipart logic by sauyon in https://github.com/bentoml/BentoML/pull/3297
* chore(deps): bump pylint from 2.15.6 to 2.15.7 by dependabot in https://github.com/bentoml/BentoML/pull/3291
* docs: wrong arguments when saving by KimSoungRyoul in https://github.com/bentoml/BentoML/pull/3306
* chore(deps): bump pylint from 2.15.7 to 2.15.8 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3308
* chore(deps): bump pytest-xdist[psutil] from 3.0.2 to 3.1.0 in /requirements by dependabot in https://github.com/bentoml/BentoML/pull/3309
* chore(pyproject): bumping python version typeshed to 3.11 by aarnphm in https://github.com/bentoml/BentoML/pull/3281
* fix(monitor): disable validate for Formatter by bojiang in https://github.com/bentoml/BentoML/pull/3317
* doc(monitoring): monitoring guide by bojiang in https://github.com/bentoml/BentoML/pull/3300
* feat: parsing path for env by aarnphm in https://github.com/bentoml/BentoML/pull/3314
* fix: remove assertion for dtype by aarnphm in https://github.com/bentoml/BentoML/pull/3320
* feat: client lazy load by aarnphm in https://github.com/bentoml/BentoML/pull/3323
* chore: provides shim for bentoctl by aarnphm in https://github.com/bentoml/BentoML/pull/3322

New Contributors
* csh3695 made their first contribution in https://github.com/bentoml/BentoML/pull/3205
* nyongja made their first contribution in https://github.com/bentoml/BentoML/pull/3231
* tbazin made their first contribution in https://github.com/bentoml/BentoML/pull/3234
* KimSoungRyoul made their first contribution in https://github.com/bentoml/BentoML/pull/3306

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.10...v1.0.11

1.0.10

🍱 BentoML `v1.0.10` is released to address a recurring `broken pipe` error reported by the community, along with a list of improvements we’d like to share.

- Fixed an `aiohttp.client_exceptions.ClientOSError` caused by asymmetrical keep-alive timeout settings between the API server and runner.

```bash
aiohttp.client_exceptions.ClientOSError: [Errno 32] Broken pipe
```

- Added multi-output support for ONNX and TensorFlow frameworks.
- Added `from_sample` support to all [IO Descriptors](https://docs.bentoml.org/en/latest/reference/api_io_descriptors.html), in addition to just `bentoml.io.NumpyNdarray`; the provided sample is reflected in the Swagger UI.

```python
# Pandas example
@svc.api(
    input=PandasDataFrame.from_sample(
        pd.DataFrame([1, 2, 3, 4])
    ),
    output=PandasDataFrame(),
)

# JSON example
@svc.api(
    input=JSON.from_sample(
        {"foo": 1, "bar": 2}
    ),
    output=JSON(),
)
```

![image](https://user-images.githubusercontent.com/861225/200722853-e46cbe94-88f5-47e4-82eb-c56f620082e6.png)


💡 We continue to update the documentation and examples on every release to help the community unlock the full power of BentoML.

- Check out the updated [multi-model inference graph guide](https://docs.bentoml.org/en/latest/guides/graph.html) and [example](https://github.com/bentoml/BentoML/tree/main/examples/inference_graph) to learn how to compose multiple models in the same Bento service.
- Did you know BentoML support [OpenTelemetry tracing](https://opentelemetry.io/docs/concepts/signals/traces/) out-of-the-box? Checkout the [Tracing guide](https://docs.bentoml.org/en/latest/guides/tracing.html) for tracing support for OTLP, Jaeger, and Zipkin.

What's Changed
* feat(cli): log conditional environment variables by aarnphm in https://github.com/bentoml/BentoML/pull/3156
* fix: ensure conda not use pipefail and unset variables by aarnphm in https://github.com/bentoml/BentoML/pull/3171
* fix(templates): ensure to use python3 and pip3 by aarnphm in https://github.com/bentoml/BentoML/pull/3170
* fix(sdk): montioring log output by bojiang in https://github.com/bentoml/BentoML/pull/3175
* feat: make quickstart batchable by sauyon in https://github.com/bentoml/BentoML/pull/3172
* fix: lazy check for stubs via path when install local wheels by aarnphm in https://github.com/bentoml/BentoML/pull/3180
* fix(openapi): remove summary field under Info by aarnphm in https://github.com/bentoml/BentoML/pull/3178
* docs: Inference graph example by ssheng in https://github.com/bentoml/BentoML/pull/3183
* docs: remove whitespaces in migration guides by wellshs in https://github.com/bentoml/BentoML/pull/3185
* fix(build_config): validation when NoneType by aarnphm in https://github.com/bentoml/BentoML/pull/3187
* fix(docs): indentation in migration.rst by aarnphm in https://github.com/bentoml/BentoML/pull/3186
* doc(example): monitoring example for classification tasks by bojiang in https://github.com/bentoml/BentoML/pull/3176
* refactor(sdk): separate default monitoring impl by bojiang in https://github.com/bentoml/BentoML/pull/3189
* fix(ssl): provide default values in configuration by aarnphm in https://github.com/bentoml/BentoML/pull/3191
* fix: don't ignore logging conf by sauyon in https://github.com/bentoml/BentoML/pull/3192
* feat: tensorflow multi outputs support by larme in https://github.com/bentoml/BentoML/pull/3115
* docs: cleanup whitespace and typo by aarnphm in https://github.com/bentoml/BentoML/pull/3195
* chore: cleanup deadcode by aarnphm in https://github.com/bentoml/BentoML/pull/3196
* fix(runner): set uvicorn keep-alive by sauyon in https://github.com/bentoml/BentoML/pull/3198
* perf: refine onnx implementation by larme in https://github.com/bentoml/BentoML/pull/3166
* feat: `from_sample` for IO descriptor by aarnphm in https://github.com/bentoml/BentoML/pull/3143

New Contributors
* wellshs made their first contribution in https://github.com/bentoml/BentoML/pull/3185

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.8...v1.0.9

What's Changed
* fix: from_sample override logic by aarnphm in https://github.com/bentoml/BentoML/pull/3202


**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.9...v1.0.10

1.0.8

🍱 BentoML `v1.0.8` is released with a list of improvements we hope you’ll find useful.

- Introduced Bento Client for easy access to the BentoML service over HTTP. Both sync and async calls are supported. See the [Bento Client Guide](https://bentoml--3154.org.readthedocs.build/en/3154/guides/client.html) for more details.

```python
import numpy as np

from bentoml.client import Client

client = Client.from_url("http://localhost:3000")

# Sync call
response = client.classify(np.array([[4.9, 3.0, 1.4, 0.2]]))

# Async call
response = await client.async_classify(np.array([[4.9, 3.0, 1.4, 0.2]]))
```

- Introduced custom metrics support for easy instrumentation of custom metrics over Prometheus. See [Metrics Guide](https://bentoml--3154.org.readthedocs.build/en/3154/guides/metrics.html) for more details.

```python
import bentoml

# Histogram metric
inference_duration = bentoml.metrics.Histogram(
    name="inference_duration",
    documentation="Duration of inference",
    labelnames=["nltk_version", "sentiment_cls"],
)

# Counter metric
polarity_counter = bentoml.metrics.Counter(
    name="polarity_total",
    documentation="Count total number of analysis by polarity scores",
    labelnames=["polarity"],
)
```


Full Prometheus style syntax is supported for instrumenting custom metrics inside API and Runner definitions.

```python
# Histogram
inference_duration.labels(
    nltk_version=nltk.__version__, sentiment_cls=self.sia.__class__.__name__
).observe(time.perf_counter() - start)

# Counter
polarity_counter.labels(polarity=is_positive).inc()
```

- Improved health checking to also cover the status of runners to avoid returning a healthy status before runners are ready.
- Added SSL/TLS support to gRPC serving.

```bash
bentoml serve-grpc --ssl-certfile=credentials/cert.pem --ssl-keyfile=credentials/key.pem --production --enable-reflection
```

- Added channelz support for easier debugging of gRPC serving.
- Allowed nested requirements with the `-r` syntax.

```bash
# requirements.txt
-r nested/requirements.txt

pydantic
Pillow
fastapi
```

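The `-r` lines resolve the way pip resolves them: each referenced file is expanded recursively, relative to the file that includes it. A minimal sketch of that resolution, not BentoML's actual code:

```python
import pathlib
import tempfile

def read_requirements(path, seen=None):
    """Expand a requirements file, following nested -r references."""
    path = pathlib.Path(path)
    seen = set() if seen is None else seen
    if path.resolve() in seen:  # guard against circular includes
        return []
    seen.add(path.resolve())
    reqs = []
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("-r "):
            reqs += read_requirements(path.parent / line[3:].strip(), seen)
        else:
            reqs.append(line)
    return reqs

with tempfile.TemporaryDirectory() as d:
    root = pathlib.Path(d)
    (root / "nested").mkdir()
    (root / "nested" / "requirements.txt").write_text("numpy\n")
    (root / "requirements.txt").write_text(
        "-r nested/requirements.txt\n\npydantic\nPillow\nfastapi\n"
    )
    resolved = read_requirements(root / "requirements.txt")
```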

- Improved the [adaptive batching](https://docs.bentoml.org/en/latest/guides/batching.html) dispatcher's auto-tuning to avoid sporadic request failures due to batching at the beginning of the runner lifecycle.
- Fixed a bug where runners would raise a `TypeError` when overloaded. An `HTTP 503 Service Unavailable` is now returned instead.

```prolog
File "python3.9/site-packages/bentoml/_internal/runner/runner_handle/remote.py", line 188, in async_run_method
  return tuple(AutoContainer.from_payload(payload) for payload in payloads)
TypeError: 'Response' object is not iterable
```


💡 We continue to update the documentation and examples on every release to help the community unlock the full power of BentoML.

- Check out the updated [PyTorch Framework Guide](https://docs.bentoml.org/en/latest/frameworks/pytorch.html#saving-a-trained-model) on how to use `external_modules` to save classes or utility functions required by the model.
- See the [Metrics Guide](https://docs.bentoml.org/en/latest/guides/metrics.html) on how to add custom metrics to your API and custom Runners.
- Learn more about how to use the [Bento Client](https://docs.bentoml.org/en/latest/guides/client.html) to call your BentoML service with Python easily.
- Check out the latest blog post on [why model serving over gRPC matters to data scientists](https://modelserving.com/blog/3-reasons-for-grpc).

🥂 We’d like to thank the community for your continued support and engagement.

- Shout out to judahrand for multiple contributions to BentoML and bentoctl.
- Shout out to phildamore-phdata, quandollar, 2JooYeon, and fortunto2 for their first contribution to BentoML.

1.0.7

🍱 BentoML released `v1.0.7` as a patch to quickly fix a critical module import issue introduced in `v1.0.6`. The import error manifests in the import of any modules under `io.*` or `models.*`. The following is an example of a typical error message and traceback. Please upgrade to `v1.0.7` to address this import issue.

```bash
packages/anyio/_backends/_asyncio.py", line 21, in <module>
    from io import IOBase
ImportError: cannot import name 'IOBase' from 'bentoml.io'
```

What's Changed
* test(grpc): e2e + unit tests by aarnphm in https://github.com/bentoml/BentoML/pull/2984
* feat: support multipart upload for large bento and model by yetone in https://github.com/bentoml/BentoML/pull/3044
* fix(config): respect `api_server.workers` by judahrand in https://github.com/bentoml/BentoML/pull/3049
* chore(lint): remove unused import by aarnphm in https://github.com/bentoml/BentoML/pull/3051
* fix(import): namespace collision by aarnphm in https://github.com/bentoml/BentoML/pull/3058

New Contributors
* judahrand made their first contribution in https://github.com/bentoml/BentoML/pull/3049

**Full Changelog**: https://github.com/bentoml/BentoML/compare/v1.0.6...v1.0.7
