Oshepherd

Latest version: v0.0.16

0.0.9

Commits

- [f1a198b5] Merge pull request 12 from mnemonica-ai/development
- [40f1d046] version bumped to 0.0.9
- [0ca227f8] Merge pull request 11 from mnemonica-ai/only_redis
- [222a7529] Update README.md
- [97da6b1f] better endpoints parity section
- [21c5ebd2] typo fixed
- [77cfd289] api parity section added
- [e4d55590] pytest added to requirements
- [6eb20d16] direct reference to rabbitmq protocol removed
- [26d75491] rabbit references removed from readme, using redis only
- [6612b98e] broker and backend references changed to be general instead of direct references to rabbit and redis
- [ecd3d762] Smol typo fixes on README file
- [5e117fa9] Merge pull request 7 from mnemonica-ai/feature_embeddings
- [15e31d2f] basic test added for raw http request
- [2651fdca] e2e basic test for embeddings endpoint added
- [6c481dea] blueprint available in flask server
- [0dc98864] worker support for embeddings execution added
- [3cab2505] embeddings endpoint implementation added, including request and response types, along with endpoint blueprint for flask
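
The embeddings work above adds request/response types plus a Flask blueprint. As a rough illustration, a stdlib-only sketch of what such payload types might look like (field names are assumptions modeled on Ollama's `/api/embeddings` endpoint, not oshepherd's actual definitions):

```python
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class EmbeddingsRequest:
    # Modeled on Ollama's /api/embeddings payload: model name plus input text.
    model: str
    prompt: str
    options: dict = field(default_factory=dict)

@dataclass
class EmbeddingsResponse:
    # Ollama returns a single embedding vector for the prompt.
    embedding: List[float]

req = EmbeddingsRequest(model="nomic-embed-text", prompt="hello")
resp = EmbeddingsResponse(embedding=[0.1, 0.2, 0.3])
print(asdict(req)["model"], len(resp.embedding))
```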

0.0.6

Commits

- [7e8a0527] Merge pull request 6 from mnemonica-ai/development
- [5785191e] version bumped to 0.0.6
- [7517bbc3] Merge pull request 5 from mnemonica-ai/feature_chat
- [7da46073] linter applied
- [ad05ac05] better style for readme, formatted
- [573ce58a] comments improved in ollama endpoints definitions
- [c91e5509] readme improved
- [5184bcc5] better default values for redis backend configuration
- [adcf0a15] better definition of default values for chat and generate request payloads types
- [4ff05cff] tests added for chat completion basic cases using ollama python package and http requests
- [4854abb6] ollama chat response type added, default values for chat request fixed according to what is expected by server
- [bbeb9781] completions execution generalized into a single task (oshepherd.worker.tasks.exec_completion), ready for the generate and chat endpoints
- [0282b9ad] basic generalization of generate completion endpoint implemented
- [8ae0fe3a] chat completion endpoint added to main api definitions
- [15955d9c] basic chat completion flask endpoint implemented
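
The chat work above mirrors Ollama's `/api/chat` payload with sensible defaults. A minimal stdlib sketch of such request types (field names are assumptions modeled on Ollama's API, not oshepherd's actual code):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Message:
    role: str      # "system", "user", or "assistant"
    content: str

@dataclass
class ChatRequest:
    # Modeled on Ollama's /api/chat payload; defaults ensure the fields
    # the server expects are always present in the serialized request.
    model: str
    messages: List[Message] = field(default_factory=list)
    stream: bool = False
    options: dict = field(default_factory=dict)

req = ChatRequest(model="mistral", messages=[Message("user", "hello")])
print(req.messages[0].role, req.stream)
```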

0.0.5

Commits

- [e454f17f] Merge pull request 4 from mnemonica-ai/development
- [a369a097] version bumped to 0.0.5
- [c21acace] Merge pull request 3 from mnemonica-ai/fix_connections
- [077255e1] better comments for ollama tasks
- [cb58ca44] ollama celery task abstracted to its own module
- [202d3ef9] Merge branch 'development' into fix_connections

0.0.4

Commits

- [407b7449] Merge pull request 2 from mnemonica-ai/development
- [37a5f265] version bumped to 0.0.4
- [6b3b51c7] Merge pull request 1 from mnemonica-ai/versioning
- [ac14f548] bump & release gh action added pointing to main branch

0.0.3

First Release: https://pypi.org/project/oshepherd/0.0.3/

* Basic **API** behavior: serves one HTTP endpoint mirroring the original Ollama server `generate()` [endpoint](https://github.com/ollama/ollama/blob/114c932a8e872846fc714353c65d041feb886027/server/routes.go#L996). It receives the incoming request parameters, queues a message with those parameters in RabbitMQ through Celery, waits for the Celery task to finish, extracts the returned response from the Redis backend, and then relays the remote Ollama server's response to the HTTP Ollama client.
* Basic **WORKER** behavior: consumes messages queued in RabbitMQ via Celery, fires a `generate()` request against the local Ollama server inside the worker instance (using the Ollama Python client), and returns the response to the Redis backend.
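
The round trip above (API queues a message, worker executes it, result comes back through the backend) can be sketched with the stdlib alone; here a `queue.Queue` stands in for the RabbitMQ broker, a dict for the Redis backend, and a stub for the Ollama `generate()` call, so the Celery plumbing is deliberately elided:

```python
import json
import queue
import threading

broker = queue.Queue()   # stand-in for RabbitMQ
backend = {}             # stand-in for the Redis result backend
done = threading.Event()

def worker_exec_generate(params):
    # WORKER side: in oshepherd this would call the local Ollama
    # server's generate() via the Ollama Python client.
    return {"model": params["model"], "response": f"echo: {params['prompt']}"}

def worker_loop():
    # Consume one queued message, run it, store the result in the backend.
    task_id, params = broker.get()
    backend[task_id] = worker_exec_generate(params)
    done.set()

def api_generate(params, timeout=5.0):
    # API side: queue the message, wait for the task, read the backend.
    task_id = "task-1"
    broker.put((task_id, params))
    done.wait(timeout)
    return backend.pop(task_id)

threading.Thread(target=worker_loop, daemon=True).start()
result = api_generate({"model": "mistral", "prompt": "hi"})
print(json.dumps(result))
```

In the real package, Celery replaces both hand-rolled pieces: the API calls the task's `delay()` and blocks on its result, while a Celery worker process executes it and writes to Redis.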
