Friendli-client

Latest version: v1.5.3

1.2.1

- Update package dependencies (no longer pinned to exact versions).
- Add Mixtral model type.
- Add a `stop` option to the completions and chat completions SDK/CLI (see the sketch below).
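
As an illustration of the new `stop` option, here is a minimal sketch with the Python SDK. It assumes the option follows OpenAI-style semantics (generation halts at the first matching sequence); the model name and stop sequence are placeholders.

```python
import os
from friendli import Friendli

client = Friendli(api_key=os.environ.get("FRIENDLI_API_KEY"))

# Ask for a short list and cut the response off at a blank line.
# The stop sequence below is purely illustrative.
chat_completion = client.chat.completions.create(
    model="llama-2-13b-chat",
    messages=[{"role": "user", "content": "List three pancake toppings."}],
    stop=["\n\n"],
)
print(chat_completion.choices[0].message.content)
```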

1.2.0

Features 🌟

1. Distinguish Merged QKV for more precise and efficient handling of query, key, and value transformations.
2. LoRA can now be applied to the MPT model.
3. Introduced support for Mixtral model checkpoint conversion (see the sketch below).
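
The exact conversion interface depends on the installed version, so the invocation below is only a hypothetical sketch: it assumes a checkpoint-conversion subcommand that takes a HuggingFace model path and an output directory, and the flag names are illustrative (check the CLI's `--help` output for the real options).

```sh
# Hypothetical sketch; flag names are illustrative, not authoritative.
friendli checkpoint convert \
  --model-name-or-path mistralai/Mixtral-8x7B-Instruct-v0.1 \
  --output-dir ./mixtral-converted \
  --data-type fp16
```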

Bug Fixes and Improvements 🐛🔨

1. Fixed an AWQ bug when using GPT-J.
2. Addressed a critical issue where CUDA Out of Memory (OOM) errors occurred while using AWQ.
3. Minor update to phi model type.

1.1.0

- The features related to Friendli Dedicated Endpoints are temporarily removed from the client package. Please use [`periflow-client`](https://pypi.org/project/periflow-client/) for those features in the meantime.

> [!NOTE]
> We are actively integrating the features of Friendli Dedicated Endpoints (previously known as PeriFlow Cloud) into Friendli Suite.
> Please use the `periflow-client` package instead of `friendli-client` for Friendli Dedicated Endpoints features.
> Those features will be supported by `friendli-client` again very shortly.

- CLI commands are updated.
  - Commands related to the Friendli Dedicated Endpoints are removed temporarily.
  - Commands for API calls to the serverless endpoints are added. Check the example usage below:
```sh
friendli api chat-completions create \
  -g "user Tell me how to make a delicious pancake" \
  -m llama-2-13b-chat
```

1.0.0

🌟 Exciting Major Version Update: Introducing Friendli Suite! 🌈

We're thrilled to announce the official release of Friendli Suite, bringing a wealth of enhancements and features to your fingertips.
With this major update, we've given our package and GitHub repository a facelift, transitioning from `periflow-client` to the all-new and improved `friendli-client`.

**Here's a rundown of the key changes:**

🔄 CLI Command Prefix Update
The CLI command prefix has undergone a transformation! Say goodbye to the old `pf` and embrace the fresh `friendli`. Now, to sign in, simply use `friendli login` instead of the previous `pf login`.
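
For example, signing in changes from the old command to the new one:

```sh
# Before (periflow-client)
pf login

# After (friendli-client)
friendli login
```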

🐍 Python SDK Breaking Change
In the Python SDK, we've introduced a breaking change that aligns its semantics with the OpenAI Python SDK v1. Fear not, as this update brings more consistency and compatibility. Dive into the details and explore examples on our [documentation page](https://docs.periflow.ai/guides/serverless_endpoints/text_generation).

```python
import os
from friendli import Friendli

client = Friendli(api_key=os.environ.get("FRIENDLI_API_KEY"))

chat_completion = client.chat.completions.create(
    model="llama-2-13b-chat",
    messages=[
        {
            "role": "user",
            "content": "Tell me how to make a delicious pancake"
        }
    ],
    stream=False,
)

print(chat_completion.choices[0].message.content)
```
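
Because the SDK now mirrors the OpenAI Python SDK v1, streaming should work by passing `stream=True` and iterating over the returned chunks. The sketch below is an assumption based on that alignment, including the OpenAI-style `choices[0].delta.content` field on each chunk.

```python
import os
from friendli import Friendli

client = Friendli(api_key=os.environ.get("FRIENDLI_API_KEY"))

# Streaming sketch, assuming OpenAI-v1-style chunk objects.
stream = client.chat.completions.create(
    model="llama-2-13b-chat",
    messages=[{"role": "user", "content": "Tell me how to make a delicious pancake"}],
    stream=True,
)
for chunk in stream:
    # Each chunk is assumed to carry an incremental delta, as in the OpenAI SDK.
    print(chunk.choices[0].delta.content or "", end="")
print()
```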


🚀 Upgrade now to unlock a world of possibilities with Friendli Suite! If you encounter any challenges or have questions, don't hesitate to reach out. Happy coding! 🤖💬

0.1.13

- Support checkpoints in the safetensors format.

**Full Changelog**: https://github.com/friendliai/periflow-client/compare/0.1.12...0.1.13
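
For context on the safetensors support above, a HuggingFace checkpoint can be exported in the safetensors format with `save_pretrained(..., safe_serialization=True)` before conversion; the model id below is only an example.

```python
from transformers import AutoModelForCausalLM

# Export a HuggingFace checkpoint as safetensors files.
# The model id is illustrative.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6b")
model.save_pretrained("./gpt-j-6b-safetensors", safe_serialization=True)
```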

0.1.12

- Now you can convert HuggingFace adapter models to a PeriFlow-compliant format. Run `pf convert-adapter --help` for details; a hypothetical invocation is sketched below.
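
The authoritative options come from `pf convert-adapter --help`; the invocation below is a hypothetical sketch, with illustrative flag names and an illustrative adapter id.

```sh
# Hypothetical sketch; check `pf convert-adapter --help` for the real options.
pf convert-adapter \
  --adapter-name-or-path my-org/llama-2-13b-lora-adapter \
  --output-dir ./converted-adapter
```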
