llm-rs

Latest version: v0.2.15


0.2.9

`AutoModel`-compatible models now use the official `tokenizers` library, which improves decoding accuracy, especially for non-llama-based models.

To specify a tokenizer manually, set it via the `tokenizer_path_or_repo_id` parameter. To use the default GGML tokenizer instead, Hugging Face tokenizer support can be disabled via the `use_hf_tokenizer` parameter.
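A minimal sketch of both options, assuming llm-rs exposes `AutoModel.from_pretrained` as shown in its README; the parameter names come from this changelog entry, but the tokenizer repo id below is a hypothetical placeholder:

```python
# Hedged sketch: parameter names are from the v0.2.9 notes; the repo id is
# a hypothetical placeholder, not a real Hub artifact.
def load_with_manual_tokenizer(model_path):
    # Import deferred so the sketch can be read without llm-rs installed.
    from llm_rs import AutoModel
    return AutoModel.from_pretrained(
        model_path,
        tokenizer_path_or_repo_id="some-org/base-model",  # hypothetical id
    )

def load_with_ggml_tokenizer(model_path):
    from llm_rs import AutoModel
    # Disable Hugging Face tokenizer support to fall back to the GGML tokenizer.
    return AutoModel.from_pretrained(model_path, use_hf_tokenizer=False)
```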

0.2.8

0.2.7

Added support for the `q5_0`, `q5_1` and `q8_0` quantization formats.

0.2.6

Added the `stream` method to each model, which returns a generator that can be consumed to generate a response.
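A minimal sketch of consuming such a generator; `fake_stream` is a stand-in for the model's `stream` method, which yields tokens as they are generated, so the consumption pattern is the same:

```python
# `fake_stream` stands in for `model.stream(prompt)`; the real method yields
# generated tokens one at a time.
def fake_stream(prompt):
    for token in ["The", " answer", " is", " 42."]:
        yield token

# Consume tokens as they arrive, then keep the full response.
pieces = []
for token in fake_stream("What is the answer?"):
    pieces.append(token)
response = "".join(pieces)
print(response)  # -> The answer is 42.
```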

0.2.5

⚠️ The GGML quantization format was updated again, old models will be incompatible ⚠️

0.2.4

`AutoModel` can now automatically download GGML-converted models and regular Transformers models from the Hugging Face Hub.
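A hedged sketch of the auto-download, assuming the `AutoModel.from_pretrained` signature from the llm-rs README; the repo id and file name below are hypothetical placeholders, not real Hub artifacts:

```python
# Hedged sketch: the Hub repo id and GGML file name are hypothetical.
def download_and_load():
    # Import deferred so the sketch can be read without llm-rs installed.
    from llm_rs import AutoModel
    return AutoModel.from_pretrained(
        "some-org/some-ggml-model",   # hypothetical Hub repo id
        model_file="model-q4_0.bin",  # hypothetical GGML file in that repo
    )
```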
