friendli-client

Latest version: v1.5.8


1.1.0

- The features related to Friendli Dedicated Endpoints are temporarily removed from the client package. Please use [`periflow-client`](https://pypi.org/project/periflow-client/) to access these features.

> [!NOTE]
> We are actively integrating the features in Friendli Dedicated Endpoints (previously known as PeriFlow Cloud) to Friendli Suite.
> Please use the `periflow-client` package instead of `friendli-client` for Friendli Dedicated Endpoints features.
> Those features will be supported with `friendli-client` very shortly.

- CLI commands are updated.
  - Commands related to Friendli Dedicated Endpoints are temporarily removed.
  - Commands for API calls to the serverless endpoints are added. Check the example usage below:

```sh
friendli api chat-completions create \
  -g "user Tell me how to make a delicious pancake" \
  -m llama-2-13b-chat
```

1.0.0

🌟 Exciting Major Version Update: Introducing Friendli Suite! 🌈

We're thrilled to announce the official release of Friendli Suite, bringing a wealth of enhancements and features to your fingertips.
With this major update, we've given our package and GitHub repository a facelift, transitioning from `periflow-client` to the all-new and improved `friendli-client`.

**Here's a rundown of the key changes:**

🔄 CLI Command Prefix Update
The CLI command prefix has undergone a transformation! Say goodbye to the old `pf` and embrace the fresh `friendli`. Now, to sign in, simply use `friendli login` instead of the previous `pf login`.

🐍 Python SDK Breaking Change
In the Python SDK, we've introduced a breaking change that aligns its semantics with the OpenAI Python SDK v1. Fear not, as this update brings more consistency and compatibility. Dive into the details and explore examples on our [documentation page](https://docs.periflow.ai/guides/serverless_endpoints/text_generation).

```python
import os

from friendli import Friendli

client = Friendli(api_key=os.environ.get("FRIENDLI_API_KEY"))

chat_completion = client.chat.completions.create(
    model="llama-2-13b-chat",
    messages=[
        {
            "role": "user",
            "content": "Tell me how to make a delicious pancake",
        }
    ],
    stream=False,
)

print(chat_completion.choices[0].message.content)
```
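
A streaming variant may help illustrate the OpenAI-aligned semantics. This is a minimal sketch assuming the SDK mirrors the OpenAI v1 streaming interface (`stream=True` yields chunks carrying `choices[0].delta.content`); the `stream_chat` helper name is ours, not part of the SDK:

```python
import os


def stream_chat(prompt: str, model: str = "llama-2-13b-chat") -> str:
    """Stream a chat completion and return the concatenated text.

    Assumes the SDK mirrors the OpenAI v1 streaming shape, where each
    chunk carries an incremental ``choices[0].delta.content``.
    """
    # Imported lazily so this sketch can be read without the SDK installed.
    from friendli import Friendli

    client = Friendli(api_key=os.environ.get("FRIENDLI_API_KEY"))
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks (e.g. the final one) may carry no text
            parts.append(delta)
    return "".join(parts)
```

Calling `print(stream_chat("Tell me how to make a delicious pancake"))` would print the full reply once streaming finishes; in an interactive setting you would typically print each `delta` as it arrives instead.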


🚀 Upgrade now to unlock a world of possibilities with Friendli Suite! If you encounter any challenges or have questions, don't hesitate to reach out. Happy coding! 🤖💬

0.1.13

- Support checkpoints with safetensors format.

**Full Changelog**: https://github.com/friendliai/periflow-client/compare/0.1.12...0.1.13

0.1.12

- Now you can convert Hugging Face adapter models to the PeriFlow-compliant format. Run `pf convert-adapter --help` to see details.

0.1.11

- Minor: Strict package version check is disabled.
- The "offloading" option is added to the quantization config file. This option enables GPU-to-CPU offloading to reduce GPU memory usage.
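
For illustration, a quantization config using this option might look like the sketch below. Only the `offloading` key is named in this release note; the surrounding keys are hypothetical placeholders, not the actual schema:

```yaml
# Hypothetical quantization config sketch — only "offloading" is
# confirmed by this release note; the other keys are placeholders.
mode: awq            # placeholder: quantization mode
device: cuda:0       # placeholder: device used for quantization
offloading: true     # offload tensors from GPU to CPU to save GPU memory
```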

0.1.10

This release includes a hotfix for 0.1.9.
