Inference

Latest version: v0.30.0


0.29.0

Not secure
🚀 Added

📧 Slack and Twilio notifications in Workflows

We've just added two notification blocks to the Workflows ecosystem - [Slack](https://inference.roboflow.com/workflows/blocks/slack_notification/) and [Twilio](https://inference.roboflow.com/workflows/blocks/twilio_sms_notification/). Now, there is nothing stopping you from sending notifications from your Workflows!

https://github.com/user-attachments/assets/52ac8a94-69e4-4304-a0b8-8c77695e688f
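Under the hood, a Slack notification of this kind boils down to an HTTP POST against an incoming-webhook URL. Below is a minimal stdlib sketch of that idea - an illustration, not the block's actual implementation; the webhook URL and message text are placeholders:

```python
import json
import urllib.request


def build_slack_payload(text: str) -> bytes:
    """Build the JSON body a Slack incoming webhook expects."""
    return json.dumps({"text": text}).encode("utf-8")


def send_slack_message(webhook_url: str, text: str) -> int:
    """POST the payload to the webhook and return the HTTP status code."""
    request = urllib.request.Request(
        webhook_url,
        data=build_slack_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status


# Example (placeholder URL):
# send_slack_message("https://hooks.slack.com/services/...", "Workflow finished!")
```

The Workflows block handles retries, templating, and credential management for you; the snippet only shows the core exchange.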

`inference-cli` 🤝 Workflows

We are happy to share that `inference-cli` now has a new command - `inference workflows` - that makes it possible to process data with Workflows without any additional Python scripts 😄

🎥 Video file processing
* Input a video path, specify an output directory, and run any workflow.
* Frame-by-frame results saved as CSV or JSONL.
* If your Workflow outputs images, you can optionally assemble them into an output video.

🖼️ Process images and directories of images 📂
* Outputs stored in subdirectories, with JSONL/CSV aggregation available.
* Fault-tolerant processing:
  * ✅ Resume after failure (tracked in logs).
  * 🔄 Option to force re-processing.
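Since results come as JSONL (one JSON object per line), downstream analysis stays simple. A small stdlib sketch of loading such a file - the field names in the comment are illustrative, not the CLI's guaranteed schema:

```python
import json
from typing import Iterable, List


def load_jsonl(lines: Iterable[str]) -> List[dict]:
    """Parse JSONL content: one JSON object per non-empty line."""
    return [json.loads(line) for line in lines if line.strip()]


# Typical use with a file produced by `inference workflows`:
# with open("results.jsonl") as handle:
#     results = load_jsonl(handle)
```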

Review our [📖 docs](https://inference.roboflow.com/inference_helpers/cli_commands/workflows/) to discover all options!

<details>
<summary>👉 <b>Try the command</b></summary>

To try the command, simply run:

```bash
pip install inference

inference workflows process-images-directory \
    -i {your_input_directory} \
    -o {your_output_directory} \
    --workspace_name {your-roboflow-workspace-url} \
    --workflow_id {your-workflow-id} \
    --api-key {your_roboflow_api_key}
```

</details>


https://github.com/user-attachments/assets/383e5300-da44-4526-b99f-9a301d944557

🔑 Secrets provider block in Workflows

Many Workflows blocks require credentials to work correctly, but so far the ecosystem provided only one secure option for passing those credentials - workflow parameters - forcing client applications to manipulate secret values.

Since this is not a handy solution, we created the [Environment Secrets Store block](https://inference.roboflow.com/workflows/blocks/environment_secrets_store/), which fetches credentials from the environment variables of the `inference` server. Thanks to that, admins can set up the server once and client code does not need to handle secrets ✨

⚠️ Security Notice:
For enhanced security, always use secret providers or Workflow parameters to handle credentials. Hardcoding secrets into your Workflows is strongly discouraged.

🔒 Limitations:
This block is designed for self-hosted inference servers only. Due to security concerns, exporting environment variables is not supported on the hosted Roboflow Platform.
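The general pattern the block implements - resolving a secret from the server's environment instead of shipping it through client code - can be illustrated with a few lines of stdlib Python. The variable name here is a made-up example, not the block's actual naming convention:

```python
import os


def resolve_secret(env_var_name: str) -> str:
    """Fetch a credential from the server's environment, failing loudly if absent."""
    value = os.environ.get(env_var_name)
    if value is None:
        raise KeyError(f"Expected environment variable {env_var_name!r} to be set")
    return value


# An admin sets the variable once when starting the server, e.g.:
#   export MY_SLACK_TOKEN=xoxb-...
# and the Workflow references it by name instead of carrying the raw value.
```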

🌐 OPC Workflow block 📡

The OPC Writer block provides a versatile set of integration options that enable enterprises to seamlessly connect with OPC-compliant systems and incorporate real-time data transfer into their workflows. Here's how you can leverage the block's flexibility for the various integration scenarios that industry-class solutions require.

✨ Key features
* **Seamless OPC Integration:** Easily send data to OPC servers, whether on local networks or cloud environments, ensuring your workflows can interface with industrial control systems, IoT devices, and SCADA systems.
* **Cross-Platform Connectivity**: Built with [asyncua](https://github.com/FreeOpcUa/opcua-asyncio), the block enables smooth communication across multiple platforms, enabling integration with existing infrastructure and ensuring compatibility with a wide range of OPC standards.

> [!IMPORTANT]
> This Workflow block is released under [Roboflow Enterprise License](https://github.com/roboflow/inference/blob/main/inference/enterprise/LICENSE.txt) and is not available by default on Roboflow Hosted Platform.
> Anyone interested in integrating Workflows with industry systems through OPC - please [contact Roboflow Sales](https://roboflow.com/sales)

See grzegorz-roboflow's change in https://github.com/roboflow/inference/pull/842

πŸ› οΈ Fixed

0.28.2

Not secure
🔧 Fixed issue with `inference` package installation

On 26.11.2024, release `0.20.4` of the `tokenizers` library - a transitive dependency of `inference` - introduced a breaking change for `inference` clients using Python 3.8, causing the following errors during installation of recent (and older) versions of `inference`:

<details>
<summary>👉 macOS</summary>


```
Downloading tokenizers-0.20.4.tar.gz (343 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error

× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [6 lines of output]

    Cargo, the Rust package manager, is not installed or is not on PATH.
    This package requires Rust and Cargo to compile extensions. Install it through
    the system's package manager or via https://rustup.rs/

    Checking for Rust toolchain....
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details
```


</details>


<details>
<summary>👉 Linux</summary>

After installation, the following error was presented:

```
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1778: in _get_module
    return importlib.import_module("." + module_name, self.__name__)
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1014: in _gcd_import
    ???
<frozen importlib._bootstrap>:991: in _find_and_load
    ???
<frozen importlib._bootstrap>:961: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
    ???
<frozen importlib._bootstrap>:1014: in _gcd_import
    ???
<frozen importlib._bootstrap>:991: in _find_and_load
    ???
<frozen importlib._bootstrap>:975: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:671: in _load_unlocked
    ???
<frozen importlib._bootstrap_external>:843: in exec_module
    ???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
    ???
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/__init__.py:15: in <module>
    from . import (
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/mt5/__init__.py:36: in <module>
    from ..t5.tokenization_t5_fast import T5TokenizerFast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/models/t5/tokenization_t5_fast.py:23: in <module>
    from ...tokenization_utils_fast import PreTrainedTokenizerFast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py:26: in <module>
    import tokenizers.pre_tokenizers as pre_tokenizers_fast
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/__init__.py:78: in <module>
    from .tokenizers import (
E ImportError: /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/tokenizers.abi3.so: undefined symbol: PyInterpreterState_Get

The above exception was the direct cause of the following exception:
tests/inference/models_predictions_tests/test_owlv2.py:4: in <module>
    from inference.models.owlv2.owlv2 import OwlV2
inference/models/owlv2/owlv2.py:11: in <module>
    from transformers import Owlv2ForObjectDetection, Owlv2Processor
<frozen importlib._bootstrap>:1039: in _handle_fromlist
    ???
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1766: in __getattr__
    module = self._get_module(self._class_to_module[name])
/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/transformers/utils/import_utils.py:1780: in _get_module
    raise RuntimeError(
E RuntimeError: Failed to import transformers.models.owlv2 because of the following error (look up to see its traceback):
E /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/tokenizers/tokenizers.abi3.so: undefined symbol: PyInterpreterState_Get
```


</details>

> [!CAUTION]
> **We are fixing the problem in `inference` 0.28.2**, but it is not possible to fix older releases. If you need a fix in your environment, modify your build so that installing `inference` also installs `tokenizers<=0.20.3`:
> ```bash
> pip install inference "tokenizers<=0.20.3"
> ```
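To confirm that an environment actually satisfies the pin, it is enough to compare version tuples. A small helper of our own (hypothetical, for illustration) that checks a version string against the `<=0.20.3` bound:

```python
def parse_version(version: str) -> tuple:
    """Turn a 'MAJOR.MINOR.PATCH' string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))


def satisfies_pin(installed: str, upper_bound: str = "0.20.3") -> bool:
    """True if installed <= upper_bound, mirroring `tokenizers<=0.20.3`."""
    return parse_version(installed) <= parse_version(upper_bound)
```

Python compares tuples element by element, so `(0, 20, 4) <= (0, 20, 3)` correctly evaluates to `False`.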

🔧 Fixed issue with CUDA and stream management API

While running the `inference` server and using the [stream management API](https://inference.roboflow.com/workflows/video_processing/overview/) to run Workflows against video inside a Docker container, it was not possible to use CUDA due to a bug present from the very start of the feature. This is fixed now.



**Full Changelog**: https://github.com/roboflow/inference/compare/v0.28.1...v0.28.2

0.28.1

Not secure
🔧 Fixed broken Workflows loader

> [!CAUTION]
> In `0.28.0` we had a bug causing this error:
>
> ```
> ModuleNotFoundError: No module named 'inference.core.workflows.core_steps.sinks.roboflow.model_monitoring_inference_aggregator'
> ```
>
> We've yanked version `0.28.0` of [`inference`](https://pypi.org/project/inference/), [`inference-core`](https://pypi.org/project/inference-core/), [`inference-cpu`](https://pypi.org/project/inference-cpu/) and [`inference-gpu`](https://pypi.org/project/inference-gpu) and **we recommend that our clients upgrade**.

What's Changed
* Add init.py to fix docs generation by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/830
* Add missing static landing page outputs by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/832
* Release ARM CPU builds by alexnorell in https://github.com/roboflow/inference/pull/831
* Remove debug print from owlv2 by alexnorell in https://github.com/roboflow/inference/pull/833
* Bump version to 0.28.1 by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/835


**Full Changelog**: https://github.com/roboflow/inference/compare/v0.28.0...v0.28.1

0.28.0

Not secure
🚀 Added

🎥 New Video Processing Cookbook! 💪

We're excited to introduce a new cookbook showcasing a custom video-processing use case: **Creating a Video-Based Fitness Trainer!** 🚀 This is not only a really nice example of how to use Roboflow tools, but also **a great open-source community contribution from Matvezy** 🥹. Just take a look at the [notebook](https://github.com/roboflow/inference/blob/main/examples/community/fitness-gpt-coach/Squat_Supervision_Inference_Cookbook.ipynb).

https://github.com/user-attachments/assets/650dd512-8aae-4887-a9de-86ab0f939e59


🎯 Purpose
This cookbook demonstrates how `inference` enhances foundational models like **GPT-4o** by adding powerful vision capabilities for accurate, data-driven insights. Perfect for exploring fitness applications or custom video processing workflows.

πŸ” What’s inside?
* πŸƒ **Body Keypoint Tracking:** Use `inference` to detect and track body keypoints in real time.
* πŸ“ **Joint Angle Calculation:** Automatically compute and annotate joint angles on video frames.
* πŸ€– **AI-Powered Fitness Advice:** Integrates GPT to analyze movements and provide personalized fitness tips based on video data.
* πŸ› οΈ **Built with [supervision](https://github.com/roboflow/supervision):** for efficient annotation and processing.

✨ New Workflows Block for Model Monitoring! 📊

We're thrilled to announce a new block that takes inference data reporting to the next level by integrating seamlessly with [Roboflow Model Monitoring](https://docs.roboflow.com/deploy/model-monitoring) - all thanks to robiscoding 🚀

Take a look at the [📖 documentation](https://inference.roboflow.com/workflows/blocks/model_monitoring_inference_aggregator/) to learn more.


🏋️ Why use it?

* 🏭 Monitor your model as it processes video
* ⏱️ Track and validate model performance effortlessly over time
* 🔧 Understand how to improve your models over time
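Conceptually, an inference aggregator for monitoring just accumulates per-prediction statistics between reporting intervals. A toy stdlib illustration of that idea - not the block's actual implementation or the Model Monitoring API:

```python
from collections import Counter
from typing import Dict, List


class PredictionAggregator:
    """Accumulate class counts between reports, as a monitoring sink might."""

    def __init__(self) -> None:
        self._counts: Counter = Counter()

    def add(self, predictions: List[dict]) -> None:
        """Record one frame's worth of predictions."""
        self._counts.update(p["class"] for p in predictions)

    def flush(self) -> Dict[str, int]:
        """Return the aggregated counts and reset for the next interval."""
        report = dict(self._counts)
        self._counts.clear()
        return report
```

Aggregating before reporting keeps the monitoring traffic bounded even when predictions arrive at video frame rates.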

🔧 Fixed
* Change the platform tests assertions to compensate for PR 798 by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/816
* Set hosted to True when running on dedicated deployment by grzegorz-roboflow in https://github.com/roboflow/inference/pull/817
* Fix issue with Workflows blocks for Roboflow models v2 not using base64 by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/823
* Bug which turned out not to be bug by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/824
* Fix bug with primitive types parsing in Workflows by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/825
* Bump cross-spawn from 7.0.3 to 7.0.6 in /inference/landing in the npm_and_yarn group by dependabot in https://github.com/roboflow/inference/pull/828

πŸ—οΈ Changed

* Handle internal roboflow service name env by grzegorz-roboflow in https://github.com/roboflow/inference/pull/826
* Always include internal envs if set by grzegorz-roboflow in https://github.com/roboflow/inference/pull/827
* Add support for preloading models by alexnorell in https://github.com/roboflow/inference/pull/822
* Make TURN server config optional by grzegorz-roboflow in https://github.com/roboflow/inference/pull/829


πŸ… New Contributors
* Matvezy made their first contribution in https://github.com/roboflow/inference/pull/820

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.27.0...v0.28.0

0.27.0

Not secure
🚀 Added

🧠 Your own fine-tuned Florence 2 in Workflows 🔥

Have you been itching to dive into the world of Vision-Language Models (VLMs)? Maybe you've explored SkalskiP's incredible tutorial on training [your own VLM](https://blog.roboflow.com/fine-tune-florence-2-object-detection/). Well, now you can take it a step further: train your own VLM directly on the Roboflow platform!

But that's not all: thanks to probicheaux, you can seamlessly integrate your VLM into Workflows for real-world applications.

Check out the [📖 docs](https://inference.roboflow.com/workflows/blocks/florence2_model/#version-v2) and try it yourself!


<div align="center">
<img src="https://github.com/user-attachments/assets/8f30dbe9-9d76-4ec8-9a7c-a6924311c08d" width="50%" />
</div>

> [!NOTE]
> This Workflow block is not available on the Roboflow platform - you need to run an inference server on your own machine (preferably with a GPU):
> ```bash
> pip install inference-cli
> inference server start
> ```

🎨 Classification results visualization in Workflows

The Workflows ecosystem offers a variety of blocks to visualize model predictions, but we've been missing a dedicated option for classification - until now! 🎉

Thanks to the incredible work of reiffd7, we're excited to introduce the [Classification Label Visualization](https://inference.roboflow.com/workflows/blocks/classification_label_visualization/) block to the ecosystem.

Dive in and bring your classification results to life! 🚀
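Before drawing anything, a visualization block like this has to turn a raw classification result into readable label text. A hedged sketch of that formatting step - the prediction structure below is a simplified stand-in for the block's real input, not its actual schema:

```python
from typing import List


def format_labels(predictions: List[dict], top_k: int = 3) -> List[str]:
    """Render the top-k class/confidence pairs as display strings."""
    ranked = sorted(predictions, key=lambda p: p["confidence"], reverse=True)
    return [f"{p['class']}: {p['confidence']:.0%}" for p in ranked[:top_k]]
```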

<table>
<tr>
<td><img src="https://github.com/user-attachments/assets/477c7203-e552-477d-897c-1cb7b917fac2"/></td>
<td><img src="https://github.com/user-attachments/assets/9bfddf14-6f08-4ba9-909b-a39be8505e86"/></td>
<td><img src="https://github.com/user-attachments/assets/bc9a1167-ad6b-4c64-9e6d-3151abb8e799"/></td>
</tr>
</table>

0.26.1

Not secure
What's Changed
* Make skypilot optional for inference-cli by sberan in https://github.com/roboflow/inference/pull/792
* Add usage_billable to BaseRequest by grzegorz-roboflow in https://github.com/roboflow/inference/pull/793
* Handle malformed usage_fps by grzegorz-roboflow in https://github.com/roboflow/inference/pull/795
* Feature/extend line counter block outputs by grzegorz-roboflow in https://github.com/roboflow/inference/pull/797
* Add turn server configuration to webrtc connection by grzegorz-roboflow in https://github.com/roboflow/inference/pull/799


**Full Changelog**: https://github.com/roboflow/inference/compare/v0.26.0...v0.26.1

