🚀 Added
🎉 PaliGemma in `inference`! 🎉
You've probably heard about the [new PaliGemma model](https://blog.roboflow.com/paligemma-multimodal-vision/), right? It is now supported in the new release of `inference` thanks to probicheaux.

To run the model, you need to build an `inference` server on your GPU machine using the following commands:
```bash
# clone the inference repo
git clone https://github.com/roboflow/inference.git

# navigate into repository root
cd inference

# build inference server with PaliGemma dependencies
docker build -t roboflow/roboflow-inference-server-paligemma -f docker/dockerfiles/Dockerfile.paligemma .

# run server
docker run -p 9001:9001 roboflow/roboflow-inference-server-paligemma
```
<details>
<summary>👉 To prompt the model, visit our <a href="https://github.com/roboflow/inference/blob/main/examples/paligemma/paligemma_client.py">examples 📖</a> or use the following code snippet:</summary>
```python
import base64
import os

import requests

PORT = 9001
API_KEY = os.environ["ROBOFLOW_API_KEY"]
IMAGE_PATH = "<PATH-TO-YOUR>/image.jpg"


def encode_base64(image_path: str) -> str:
    # read the image from disk and encode it as an ASCII base64 string
    with open(image_path, "rb") as image:
        image_bytes = image.read()
    return base64.b64encode(image_bytes).decode("ascii")


def do_gemma_request(image_path: str, prompt: str) -> dict:
    infer_payload = {
        "image": {
            "type": "base64",
            "value": encode_base64(image_path),
        },
        "api_key": API_KEY,
        "prompt": prompt,
    }
    response = requests.post(
        f"http://localhost:{PORT}/llm/paligemma",
        json=infer_payload,
    )
    return response.json()


print(do_gemma_request(
    image_path=IMAGE_PATH,
    prompt="Describe the image"
))
```
</details>
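If you want to sanity-check the payload format without a running server, the base64 step from the snippet can be exercised on its own. This is a minimal sketch; `encode_payload_value` is an illustrative helper, not part of `inference`, and the dummy bytes stand in for a real image file:

```python
import base64


def encode_payload_value(data: bytes) -> str:
    # Encode raw image bytes into the ASCII base64 string expected
    # in the "value" field of the request payload.
    return base64.b64encode(data).decode("ascii")


# Dummy bytes stand in for real image bytes read from disk.
fake_image = b"\x89PNG fake image bytes"
payload_value = encode_payload_value(fake_image)

# Round-trip: decoding the payload value recovers the original bytes.
assert base64.b64decode(payload_value) == fake_image
```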
🌱 Changed
* documentation updates:
  * document `source_id` parameter of `VideoFrame` by sberan in https://github.com/roboflow/inference/pull/395
  * fix workflows specification URL and other docs updates by SolomonLake in https://github.com/roboflow/inference/pull/398
  * add link to Roboflow licensing by capjamesg in https://github.com/roboflow/inference/pull/403
🔨 Fixed
* Bug introduced into `InferencePipeline.init_with_workflow(...)` in `v0.10.0`, causing import errors that yielded a misleading error message about broken dependencies:
```
inference.core.exceptions.CannotInitialiseModelError: Could not initialise workflow processing due to lack of dependencies required. Please provide an issue report under https://github.com/roboflow/inference/issues
```
Fixed in https://github.com/roboflow/inference/pull/407
**Full Changelog**: https://github.com/roboflow/inference/compare/v0.10.0...v0.11.0