# llm-mlc

Latest version: v0.5



## 0.5

- New `-o max_gen_len 100` option for setting the maximum length of the generated text. [#10](https://github.com/simonw/llm-mlc/issues/10)
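A usage sketch of the new option, following the pattern of the other examples in these notes (the prompt text here is illustrative, not from the release notes):

```bash
# Cap the response at roughly 100 generated tokens
llm -m Llama-2-7b-chat \
  -o max_gen_len 100 \
  'a short poem about ferrets'
```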

## 0.4

- The `llm mlc download-model` command now takes zero or more optional `-a/--alias` options to configure aliases for the model once it has been installed. [#4](https://github.com/simonw/llm-mlc/issues/4):

  ```bash
  llm mlc download-model Llama-2-7b-chat --alias llama2
  ```

- Installation instructions are clearer, and show how to install required dependencies first. [#6](https://github.com/simonw/llm-mlc/issues/6)
- The plugin no longer crashes `llm` if it cannot find the `dist/prebuilt` folder. [#9](https://github.com/simonw/llm-mlc/issues/9)
- New options for `temperature`, `top_p` and `repetition_penalty`: [#7](https://github.com/simonw/llm-mlc/issues/7)

  ```bash
  llm -m Llama-2-7b-chat \
    -o temperature 0.5 \
    -o top_p 0.9 \
    -o repetition_penalty 0.9 \
    'five names for a cute pet ferret'
  ```

## 0.3

- Conversation mode now works, so you can continue a conversation with an MLC model with `llm -c "follow-up prompt"`. [#3](https://github.com/simonw/llm-mlc/issues/3)
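A sketch of what a continued conversation looks like, assuming `Llama-2-7b-chat` has been downloaded (the prompts are illustrative):

```bash
# Start a conversation with the MLC model
llm -m Llama-2-7b-chat 'suggest a name for a pet ferret'
# -c continues the most recent conversation with the same model
llm -c 'now make it more formal'
```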

## 0.2

- Token streaming now works. [#2](https://github.com/simonw/llm-mlc/issues/2)

## 0.1

- Initial release. Tools for installing and running [MLC](https://mlc.ai/mlc-llm/docs/) models using LLM. [#1](https://github.com/simonw/llm-mlc/issues/1)
