llm-mistral

Latest version: v0.9

0.4

- Documentation for the `llm mistral refresh` command, which can be used to refresh the list of available Mistral API models.
- New default aliases: `llm -m codestral` for the latest release of [Codestral](https://mistral.ai/news/codestral/) and `llm -m codestral-mamba` for the latest release of the new [Codestral Mamba](https://mistral.ai/news/codestral-mamba/). [#8](https://github.com/simonw/llm-mistral/issues/8)
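
The new aliases resolve through LLM's regular model lookup, so they can be used from Python as well as the command line. A minimal sketch, assuming the standard `llm.get_model()` interface (which accepts aliases) and a `model_id` attribute on the returned model:

```python
import llm

# The plugin's aliases resolve just like full model IDs
model = llm.get_model("codestral")
print(model.model_id)  # the underlying Mistral model the alias currently points to
```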

0.3.1

- No longer raises an error if you run `llm models` without first setting a `mistral` API key. [#6](https://github.com/simonw/llm-mistral/issues/6)
- Mixtral 8x22B is now available as `llm -m mistral/open-mixtral-8x22b 'say hello'`. New installations will get this model automatically - if you do not see the model in the `llm models` list, run `llm mistral refresh` to update your local cache of available models. [#7](https://github.com/simonw/llm-mistral/issues/7)

0.3

- Support for the new [Mistral Large](https://mistral.ai/news/mistral-large/) model - `llm -m mistral-large "prompt goes here"`. [#5](https://github.com/simonw/llm-mistral/issues/5)
- All Mistral API models are now supported automatically - LLM fetches a list of models from their API the first time the plugin is installed, and that list can be refreshed at any time using the new `llm mistral refresh` command.
- When using the Python API a model key can now be set using `model.key = '...'` - thanks, [Alexandre Bulté](https://github.com/abulte). [#4](https://github.com/simonw/llm-mistral/pull/4)
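
A minimal sketch of that Python API usage, assuming LLM's standard `llm.get_model()` / `prompt()` interface; setting the key on the model instance is the feature added in this release:

```python
import llm

model = llm.get_model("mistral-large")

# New in 0.3: set the Mistral API key directly on the model instance
model.key = "your-mistral-api-key"

response = model.prompt("prompt goes here")
print(response.text())
```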

0.2

- Mistral LLM models now support options: `-o temperature 0.7`, `-o top_p 0.1`, `-o max_tokens 20`, `-o safe_mode 1`, `-o random_seed 12`. [#2](https://github.com/simonw/llm-mistral/issues/2)
- Support for the Mistral embeddings model, available via `llm embed -m mistral-embed -c 'text goes here'`. [#3](https://github.com/simonw/llm-mistral/issues/3)
- The `--no-stream` option now uses the non-streaming Mistral API.
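
These options and the embeddings model can also be used from Python. A sketch under the assumption that LLM passes model options as keyword arguments to `prompt()` and exposes embedding models via `llm.get_embedding_model()`:

```python
import llm

# Assumes the Mistral API key has already been configured,
# e.g. with `llm keys set mistral`
model = llm.get_model("mistral-small")

# Option names are assumed to mirror the -o flags shown above
response = model.prompt(
    "Suggest a name for a pet heron",
    temperature=0.7,
    top_p=0.1,
    max_tokens=20,
    random_seed=12,
)
print(response.text())

# Embeddings with the mistral-embed model
embedding_model = llm.get_embedding_model("mistral-embed")
vector = embedding_model.embed("text goes here")
print(len(vector))  # dimensionality of the returned embedding
```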

0.1

- Initial release. Provides models `mistral-tiny`, `mistral-small` and `mistral-medium` via the [Mistral API](https://docs.mistral.ai/). [#1](https://github.com/simonw/llm-mistral/issues/1)
