LMQL

Latest version: v0.7.3

0.7.3

Fix: Include the assets required for the LMQL Chat API in the PyPI package.

0.7.2

Fix: Ensure the LMQL Playground ships as part of the PyPI package.

0.7.1

This is a bug-fix release addressing smaller issues in 0.7.

* Fix an issue with the distribution clause and inference tracing
* Make the 'random' model independent of chunk_size
* Optimize automatic chunk_size selection to reduce the number of LLM calls (max_tokens hinting)
* Support direct generation from a list of OpenAI Chat-format dictionaries
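The chat-format input mentioned in the last item follows the standard OpenAI message schema: a list of dictionaries with "role" and "content" keys. A minimal sketch of that format is below; the commented-out `lmql.generate_sync` call is an assumption based on this release note, not a verified signature.

```python
# OpenAI Chat-format input: a list of dicts, each with "role" and "content".
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Name one planet."},
]

# Hypothetical direct-generation call (shape assumed, not verified):
# import lmql
# result = lmql.generate_sync(messages, model="openai/gpt-3.5-turbo")

# Sanity-check the message format itself:
assert all(set(m) == {"role", "content"} for m in messages)
```

The same list-of-dicts structure is what the OpenAI Chat Completions API accepts, so prompts can be shared between the two without conversion.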

0.7

Release Notes: https://lmql.ai/blog/posts/release-0.7.html

0.7b3

This is a pre-release; for the latest stable release, please refer back to 0.0.6.6.

0.7b2

This is a pre-release; for the latest full release, please refer back to 0.0.6.6.

* Fix llama.cpp logit normalisation in the respective LMTP backend
* Allow instruct=False for the built-in action functions `inline_use` and `reAct`
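The logit-normalisation fix above concerns turning raw model scores into a proper log-probability distribution. A generic log-softmax sketch (illustrative only, not llama.cpp's or LMQL's actual code) looks like this:

```python
import math

def normalize_logits(logits):
    # Log-softmax: subtract the log-sum-exp so the scores form a valid
    # log-distribution (their exponentials sum to 1). Subtracting the max
    # first keeps exp() numerically stable.
    m = max(logits)
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - lse for x in logits]

# The exponentiated normalized logits sum to 1 (up to float error):
probs = [math.exp(x) for x in normalize_logits([1.0, 2.0, 3.0])]
```

Backends that skip this step return unnormalized scores, which distorts any downstream use that treats them as log-probabilities, e.g. LMQL's distribution clause.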
