Inference


0.12.0

πŸ”¨ Fixed

πŸ”₯ Pre- and post-processing issues for `YOLOv10` in `inference` are now solved

Thanks to jameslahm, inconsistencies in results from the `YOLOv10` model in the `inference` package have been sorted out. PR https://github.com/roboflow/inference/pull/437

<p align="center">
<img src="https://github.com/roboflow/inference/assets/146137186/beaafb53-8c99-433c-8a53-45ca002238e6" width="70%"/>
</p>
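If you want to try the fixed model, below is a minimal sketch using the `inference` Python API; the `yolov10n-640` model alias is an assumption made for illustration, so substitute the model ID you actually use:

```python
import os

from inference import get_model

# Load a YOLOv10 model; "yolov10n-640" is an assumed alias - replace with your model ID.
model = get_model(
    model_id="yolov10n-640",
    api_key=os.environ["ROBOFLOW_API_KEY"],
)

# Run inference on a local image; with this release, results should no longer
# show the pre-/post-processing inconsistencies described above.
results = model.infer("image.jpg")
print(results)
```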

🌱 Changed

❗`breaking change`❗Inference from PaliGemma
PaliGemma models have moved from the foundation-model category to the Roboflow-model category. That implies the following change in the way they are exposed by the `inference` server:

**Before:**
```python
def do_gemma_request(prompt: str, image_path: str):
    infer_payload = {
        "image": {
            "type": "base64",
            "value": encode_bas64(image_path),
        },
        "api_key": "<ROBOFLOW-API-KEY>",
        "prompt": prompt,
    }
    response = requests.post(
        f'http://localhost:{PORT}/llm/paligemma',
        json=infer_payload,
    )
    resp = response.json()
```


**Now:**
```python
def do_gemma_request(prompt: str, image_path: str):
    infer_payload = {
        "image": {
            "type": "base64",
            "value": encode_bas64(image_path),
        },
        "prompt": prompt,
        "model_id": "paligemma-3b-mix-224",
    }
    response = requests.post(
        f'http://localhost:{PORT}/infer/lmm',
        json=infer_payload,
    )
    resp = response.json()
```


PR https://github.com/roboflow/inference/pull/436
Other changes
* Replaced `sv.BoxAnnotator` with `sv.BoundingBoxAnnotator` combined with `sv.LabelAnnotator` to prepare for the `sv.BoxAnnotator` deprecation by grzegorz-roboflow in https://github.com/roboflow/inference/pull/434
* Add PaliGemma documentation, update table of contents by capjamesg in https://github.com/roboflow/inference/pull/429
* Add http get support for legacy model inference by PacificDou in https://github.com/roboflow/inference/pull/449
* Fix dead supported blocks link by LinasKo in https://github.com/roboflow/inference/pull/448
* Docs: Remove banner saying Sv Keypoint annotators are experimental by LinasKo in https://github.com/roboflow/inference/pull/450

πŸ₯‡ New Contributors
* jameslahm made their first contribution in https://github.com/roboflow/inference/pull/437

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.11.2...v0.12.0

0.11.2

What's Changed
* Add YOLOv10 Object Detection Support by NickHerrig and probicheaux in https://github.com/roboflow/inference/pull/431

New Contributors
* NickHerrig made their first contribution in https://github.com/roboflow/inference/pull/431

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.11.1...v0.11.2

0.11.1

πŸ”¨ Fixed

❗ `setuptools>=70.0.0` breaks `CLIP` and `YoloWorld` models in `inference`

Using `setuptools` version `70.0.0` and above breaks the CLIP and YoloWorld models. This impacts historical versions of the `inference` package installed in Python environments with the newest `setuptools`. The problem may affect clients using `inference` as a Python package in their own environments; Docker builds are not impacted.

**Symptoms of the problem:**
* `ImportError` while attempting `from inference.models import YOLOWorld`, despite previous `pip install inference[yolo-world]`
* `ImportError` while attempting `from inference.models import Clip`


This release pins `setuptools` to compatible versions. This is not the ultimate solution to the problem (at some point in the future we may need to unpin `setuptools` again), so further action will be needed in future releases - stay tuned.

As a workaround for now, we recommend enforcing `setuptools<70.0.0` in all environments using `inference`; if you are impacted, restrict `setuptools` in your build:

```bash
pip install "setuptools>=65.5.1,<70.0.0"
```


πŸ—οΈ docker image for Jetson with Jetpack 4.5 is now fixed
We had issues with builds on Jetpack 4.5 which should be solved now. Details: https://github.com/roboflow/inference/pull/393

🌱 Changed
* In `workflows`, one can now reference runtime inputs (`$inputs.<name>`) via selectors in output definitions, making it possible to pass input data through the `workflow` - see the sketch below.
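For illustration, here is a minimal sketch of a `workflow` definition whose outputs pass a runtime input straight through to the response; the block type and field names are assumptions made for this example, so check the `workflows` docs for the exact manifests of the blocks you use:

```python
# Hypothetical workflow definition - block and field names are illustrative only.
WORKFLOW_DEFINITION = {
    "version": "1.0",
    "inputs": [
        {"type": "InferenceImage", "name": "image"},
        {"type": "InferenceParameter", "name": "confidence"},
    ],
    "steps": [
        {
            "type": "ObjectDetectionModel",
            "name": "detection",
            "image": "$inputs.image",
            "model_id": "yolov8n-640",
            "confidence": "$inputs.confidence",
        },
    ],
    "outputs": [
        # Output coming from a step, as before.
        {"type": "JsonField", "name": "predictions", "selector": "$steps.detection.predictions"},
        # New in this release: an output may point at a runtime input,
        # so the value supplied at execution time is passed through to the result.
        {"type": "JsonField", "name": "confidence_used", "selector": "$inputs.confidence"},
    ],
}
```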

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.11.0...v0.11.1

0.11.0

πŸš€ Added

πŸŽ‰ PaliGemma in `inference`! πŸŽ‰

You've probably heard about the [new PaliGemma model](https://blog.roboflow.com/paligemma-multimodal-vision/), right? It is now supported in this release of `inference`, thanks to probicheaux.

To run the model, you need to build an `inference` server on your GPU machine using the following commands:
```bash
# clone the inference repo
git clone https://github.com/roboflow/inference.git

# navigate into repository root
cd inference

# build inference server with PaliGemma dependencies
docker build -t roboflow/roboflow-inference-server-paligemma -f docker/dockerfiles/Dockerfile.paligemma .

# run server
docker run -p 9001:9001 roboflow/roboflow-inference-server-paligemma
```


<details>
<summary>πŸ‘‰ To prompt the model, visit our <a href="https://github.com/roboflow/inference/blob/main/examples/paligemma/paligemma_client.py">examples πŸ“–</a> or use the following code snippet:</summary>

```python
import base64
import requests
import os

PORT = 9001
API_KEY = os.environ["ROBOFLOW_API_KEY"]
IMAGE_PATH = "<PATH-TO-YOUR>/image.jpg"


def encode_bas64(image_path: str):
    with open(image_path, "rb") as image:
        x = image.read()
        image_string = base64.b64encode(x)
    return image_string.decode("ascii")


def do_gemma_request(image_path: str, prompt: str):
    infer_payload = {
        "image": {
            "type": "base64",
            "value": encode_bas64(image_path),
        },
        "api_key": API_KEY,
        "prompt": prompt,
    }
    response = requests.post(
        f'http://localhost:{PORT}/llm/paligemma',
        json=infer_payload,
    )
    return response.json()


print(do_gemma_request(
    image_path=IMAGE_PATH,
    prompt="Describe the image"
))
```


</details>

🌱 Changed
* Documentation updates:
  * document the `source_id` parameter of `VideoFrame` by sberan in https://github.com/roboflow/inference/pull/395
  * fix `workflows` specification URL and other docs updates by SolomonLake in https://github.com/roboflow/inference/pull/398
  * add link to Roboflow licensing by capjamesg in https://github.com/roboflow/inference/pull/403

πŸ”¨ Fixed
* Bug introduced into `InferencePipeline.init_with_workflow(...)` in `v0.10.0` that caused import errors and yielded a misleading error message about broken dependencies:

  ```
  inference.core.exceptions.CannotInitialiseModelError: Could not initialise workflow processing due to lack of dependencies required. Please provide an issue report under https://github.com/roboflow/inference/issues
  ```

  Fixed with PR https://github.com/roboflow/inference/pull/407


**Full Changelog**: https://github.com/roboflow/inference/compare/v0.10.0...v0.11.0

0.10.0

πŸš€ Added

🎊 Core modules of `workflows` are `Apache-2.0` now

We're excited to announce that the core of `workflows` is now open-source under the Apache-2.0 license! We invite the community to explore the `workflows` ecosystem and contribute to its growth. We have plenty of ideas for improvements and would love to hear your feedback.

Feel free to check out our [examples](https://github.com/roboflow/inference/blob/main/examples/notebooks/workflows.ipynb) and [docs πŸ“– ](https://inference.roboflow.com/workflows/about/).

πŸ—οΈ Roboflow `workflows` are changing before our eyes

We've undergone a major refactor of the `workflows` Execution Engine to make it more robust:
* `blocks` can now be stand-alone modules, which keeps them separate from the Execution Engine
* `blocks` now expose OpenAPI manifests for automatic parsing and validation
* custom `plugins` with `blocks` can be created, installed via pip, and integrated with our core library `blocks`

Thanks to SkalskiP and stellasphere, the documentation is now much better. Relying on the blocks' new self-describing capabilities, we can automatically generate `workflows` docs - you can see exactly how to connect different blocks and what JSON definitions should look like.

![image](https://github.com/roboflow/inference/assets/146137186/aa4ff885-189d-4a34-9860-6112d0d425cc)

**Visit our [docs πŸ“– ](https://inference.roboflow.com/workflows/about/) to discover more**

❗ There are minor breaking changes in the manifests of some steps (`DetectionsFilter`, `DetectionsConsensus`, `ActiveLearningDataCollector`), as we needed to fix shortcuts made in the initial version. Migration requires plugging the output of another `step` into the `image_metadata` and `prediction_type` fields of the mentioned blocks, as sketched below.
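For example, a migrated `DetectionsFilter` step manifest might wire those fields to the outputs of the upstream detection step. The selector and field names below are assumptions made for illustration; consult the generated block docs for the exact manifests:

```python
# Hypothetical step manifest - selector names are illustrative only.
{
    "type": "DetectionsFilter",
    "name": "filter",
    "predictions": "$steps.detection.predictions",
    # Fields that now must be plugged with outputs of another step:
    "image_metadata": "$steps.detection.image",
    "prediction_type": "$steps.detection.prediction_type",
    # ...plus the block's own configuration fields (omitted here).
}
```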

πŸ”§ `inference --version`

Thanks to Griffin-Sullivan, there is now a new command in `inference-cli` that shows which versions of the `inference*` packages are installed.

```bash
inference --version
```


🌱 Changed
* Huge general docs upgrade by LinasKo (https://github.com/roboflow/inference/pull/385, https://github.com/roboflow/inference/pull/378, https://github.com/roboflow/inference/pull/372) fixing broken links, general structure, and aliases for keypoint COCO models

πŸ”¨ Fixed
* Inconsistency in builds due to release of `fastapi` package by grzegorz-roboflow https://github.com/roboflow/inference/pull/374
* Middleware error in the `inference` server - turning every response that did not return `HTTP 2xx` into `HTTP 500` 😒 - introduced in [v0.9.23](https://github.com/roboflow/inference/releases/tag/v0.9.23) - thanks probicheaux for taking the effort to fix it
* Bug in the post-processing of all `instance-segmentation` models that made batch inference faulty when an image yielded zero predictions - huge kudos to grzegorz-roboflow for spotting the problem and [fixing it](https://github.com/roboflow/inference/pull/387).

πŸ… New Contributors
* Griffin-Sullivan made their first contribution in https://github.com/roboflow/inference/pull/339

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.9.23...v0.10.0

0.9.23

What's Changed
* Improve benchmark output; fix exception handling by grzegorz-roboflow in https://github.com/roboflow/inference/pull/354
* Minor docs update, API key in InferenceHTTPClient by LinasKo in https://github.com/roboflow/inference/pull/357
* Add api key fallback for model monitoring by hansent in https://github.com/roboflow/inference/pull/366
* Downgrade transformers to avoid faulty release of that package by PawelPeczek-Roboflow in https://github.com/roboflow/inference/pull/363
* Upped skypilot version by bigbitbus in https://github.com/roboflow/inference/pull/367
* Lock Grounding DINO package version to 0.2.0 by skylargivens in https://github.com/roboflow/inference/pull/368

New Contributors
* LinasKo made their first contribution in https://github.com/roboflow/inference/pull/357

**Full Changelog**: https://github.com/roboflow/inference/compare/v0.9.22...v0.9.23
