llm-perplexity

Latest version: v0.5

0.5

What's Changed
* Add mixtral-8x22b-instruct, llama-3-8b-instruct, llama-3-70b-instruct by simonw in https://github.com/hex/llm-perplexity/pull/3

New Contributors
* simonw made their first contribution in https://github.com/hex/llm-perplexity/pull/3

**Full Changelog**: https://github.com/hex/llm-perplexity/compare/0.4...0.5

0.4

- Updated system message handling

0.3

- Added default max tokens per model
- Added model options `temperature`, `top_p`, `top_k`, `presence_penalty`, `frequency_penalty`
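The options above can be passed with the LLM CLI's `-o` flag. A minimal sketch (the model name, prompt, and option values are illustrative, and a Perplexity API key must already be configured):

```shell
# Lower the sampling temperature and tighten nucleus sampling
# for a more deterministic answer (values are illustrative).
llm -m sonar-small-chat \
  -o temperature 0.2 \
  -o top_p 0.9 \
  "Explain what perplexity measures in one sentence."
```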

0.2

- No changes from v0.1. Just a version number bump to solve some PyPI publishing issues.

0.1

- Initial release. Added support for:

```
llm -m sonar-small-chat "prompt"
llm -m sonar-small-online "prompt"
llm -m sonar-medium-chat "prompt"
llm -m sonar-medium-online "prompt"
llm -m codellama-70b-instruct "prompt"
llm -m mistral-7b-instruct "prompt"
llm -m mixtral-8x7b-instruct "prompt"
```
