Tuneapi

Latest version: v8.0.7

8.0.7
-----

- Replace ``requests.Session`` with ``httpx.Client`` in all the ``tuneapi.apis`` models
- ``tuneapi.types.chats.ModelInterface`` is now a class instead of a protocol, so subclasses must be initialised with
  ``super().__init__()``
- Share one client per instance of ``tuneapi.types.chats.ModelInterface`` instead of creating multiple clients. As the
  httpx documentation suggests, never create clients inside a hot loop. This increased the speed of
  ``distributed_chat`` by 8.6% in our benchmarks, and it reduces the chance of API failures when running hundreds of
  prompts in parallel on a server (see the sketch below)
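
A minimal sketch of the shared-client pattern described above; the names (``ModelInterfaceSketch``, ``MyModel``, the ``/chat`` route) are illustrative stand-ins, not tuneapi's actual signatures::

    import httpx


    class ModelInterfaceSketch:
        """Base class that owns one ``httpx.Client`` per model instance."""

        def __init__(self, base_url: str):
            # One client per instance, reused across requests, instead of
            # creating a new client (or ``requests.Session``) on every call.
            self.client = httpx.Client(base_url=base_url, timeout=60.0)


    class MyModel(ModelInterfaceSketch):
        def __init__(self, base_url: str, api_key: str):
            # The interface is now a class, not a protocol, so subclasses
            # must initialise the parent to get the shared client.
            super().__init__(base_url)
            self.api_key = api_key

        def chat(self, prompt: str) -> str:
            # Every call reuses ``self.client``; nothing is built in the hot loop.
            resp = self.client.post(
                "/chat",
                headers={"Authorization": f"Bearer {self.api_key}"},
                json={"prompt": prompt},
            )
            resp.raise_for_status()
            return resp.json().get("output", "")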

8.0.6
-----

- Standardise all the implementations of the ``tuneapi.types.chats.ModelInterface``

8.0.5
-----

- Add support for the Batches API in the OpenAI and Anthropic models (a sketch of the underlying HTTP flow follows
  this list)

  - Function calling has not been tested on the Batches API
  - Structured generation has not been tested on the Batches API

- Remove the dependency on the ``openai`` package for audio transcription by implementing the file upload directly
- Add a new ``Ollama`` class which works with ``OpenAIProtocol``
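
Below is a minimal sketch of the raw OpenAI Batches flow, assuming an illustrative helper (``submit_openai_batch``) rather than tuneapi's own API; the plain multipart upload is what removes the need for the ``openai`` package, and Anthropic's batch endpoint follows a similar submit-and-poll pattern::

    import json

    import httpx

    OPENAI_BASE = "https://api.openai.com/v1"


    def submit_openai_batch(api_key: str, prompts: list[str], model: str = "gpt-4o-mini") -> str:
        """Upload a JSONL batch file and create a batch job; returns the batch id."""
        headers = {"Authorization": f"Bearer {api_key}"}

        # One JSONL line per request, as the Batches API expects.
        lines = [
            json.dumps({
                "custom_id": f"req-{i}",
                "method": "POST",
                "url": "/v1/chat/completions",
                "body": {"model": model, "messages": [{"role": "user", "content": p}]},
            })
            for i, p in enumerate(prompts)
        ]
        payload = "\n".join(lines).encode()

        with httpx.Client(base_url=OPENAI_BASE, headers=headers, timeout=120.0) as client:
            # Plain multipart file upload, done with httpx instead of the ``openai`` package.
            upload = client.post(
                "/files",
                files={"file": ("batch.jsonl", payload, "application/jsonl")},
                data={"purpose": "batch"},
            )
            upload.raise_for_status()

            # Create the batch job against the uploaded file.
            batch = client.post(
                "/batches",
                json={
                    "input_file_id": upload.json()["id"],
                    "endpoint": "/v1/chat/completions",
                    "completion_window": "24h",
                },
            )
            batch.raise_for_status()
            return batch.json()["id"]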

8.0.4
-----

- Bug fixes

8.0.3
-----

- Fix a bug in ``tools`` that caused an ever-increasing number of tools to accumulate in the ``Thread`` object
- Abstract the OpenAI protocol into an ``OpenAIProtocol`` class in ``tuneapi.apis.openai``, making it easier to add
  new endpoints in the future (see the sketch below)
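
The value of this abstraction is that any OpenAI-compatible endpoint can be added by subclassing and changing only the connection details, which is also how the 8.0.5 ``Ollama`` class can reuse the protocol. The class names below are illustrative, not tuneapi's actual implementation::

    import httpx


    class OpenAIProtocolSketch:
        """Shared chat-completions logic for any OpenAI-compatible endpoint."""

        def __init__(self, base_url: str, api_key: str = ""):
            headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
            self.client = httpx.Client(base_url=base_url, headers=headers, timeout=120.0)

        def chat(self, model: str, prompt: str) -> str:
            resp = self.client.post(
                "/chat/completions",
                json={"model": model, "messages": [{"role": "user", "content": prompt}]},
            )
            resp.raise_for_status()
            return resp.json()["choices"][0]["message"]["content"]


    class OpenAISketch(OpenAIProtocolSketch):
        def __init__(self, api_key: str):
            super().__init__("https://api.openai.com/v1", api_key)


    class OllamaSketch(OpenAIProtocolSketch):
        def __init__(self):
            # Ollama serves an OpenAI-compatible endpoint locally, so only
            # the base URL changes.
            super().__init__("http://localhost:11434/v1")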

8.0.2
-----

- Added usage tracking for OpenAI and Anthropic (see the sketch below)
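
A minimal sketch of what such tracking involves: normalising the ``usage`` block that both providers return into one counter. The ``Usage`` dataclass and helper are illustrative, not tuneapi's actual types::

    from dataclasses import dataclass


    @dataclass
    class Usage:
        input_tokens: int = 0
        output_tokens: int = 0

        def __add__(self, other: "Usage") -> "Usage":
            # Allows accumulating usage across many calls.
            return Usage(
                self.input_tokens + other.input_tokens,
                self.output_tokens + other.output_tokens,
            )


    def usage_from_response(body: dict) -> Usage:
        """Read the ``usage`` block of an OpenAI- or Anthropic-style response."""
        u = body.get("usage", {})
        return Usage(
            # OpenAI reports prompt_tokens/completion_tokens,
            # Anthropic reports input_tokens/output_tokens.
            input_tokens=u.get("prompt_tokens", u.get("input_tokens", 0)),
            output_tokens=u.get("completion_tokens", u.get("output_tokens", 0)),
        )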
