Mlipy

Latest version: v0.1.57

0.1.27

Added:
- Added `llama.cpp` parameters: `seed`, `threads`, `grammar`, `grammar_file`, `cfg_negative_prompt`, `cfg_scale`, `rope_scaling`, `rope_scale`, `rope_freq_base`, `rope_freq_scale`, `cont_batching`.

Changed:
- `ctx_size` is now `0`: size of the prompt context (`0` = loaded from model; previously defaulted to `512`).
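The new parameters above can be pictured as a single options mapping. A minimal sketch, assuming a dict-shaped request: the parameter names come from this changelog, but the surrounding request shape and default values are illustrative assumptions, not mlipy's actual API.

```python
# Hypothetical options for a llama.cpp-backed generation request.
# Parameter names are from the 0.1.27 changelog; the dict shape and
# the sample values are illustrative assumptions.
llama_cpp_options = {
    "seed": 42,                  # RNG seed for reproducible sampling
    "threads": 4,                # CPU threads used for inference
    "grammar": None,             # inline GBNF grammar constraining output
    "grammar_file": None,        # path to a GBNF grammar file
    "cfg_negative_prompt": "",   # negative prompt for classifier-free guidance
    "cfg_scale": 1.0,            # strength of classifier-free guidance
    "rope_scaling": None,        # RoPE scaling method, e.g. "linear"
    "rope_scale": 1.0,           # RoPE context scaling factor
    "rope_freq_base": 10000.0,   # RoPE base frequency
    "rope_freq_scale": 1.0,      # RoPE frequency scaling factor
    "cont_batching": True,       # enable continuous batching
    "ctx_size": 0,               # 0 = prompt context size loaded from model
}
```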

0.1.26

Added:
- Initial `pyproject.toml` support for `torch` (WIP).

Fixed:
- `examples/sync_demo.py` now uses correct models for `llama.cpp`.

0.1.25

Changed:
- `format_messages` now uses `creator_model_id`.

Fixed:
- The formatter now falls back to `creator_model_id` if `model_id` is missing.
- New "simple" default/fallback formatting when both `model_id` and `creator_model_id` are missing.
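A minimal sketch of what such a "simple" fallback formatter might look like when no model-specific chat template can be resolved. The function name and the exact output layout are illustrative assumptions, not mlipy's implementation.

```python
# Hypothetical "simple" fallback chat formatter: render role-based
# messages as plain `role: content` lines when neither `model_id` nor
# `creator_model_id` yields a chat template. Name and layout are assumed.
def format_messages_simple(messages: list[dict]) -> str:
    """Render chat messages as plain text, one `role: content` line each."""
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    lines.append("assistant:")  # cue the model to reply as the assistant
    return "\n".join(lines)

prompt = format_messages_simple([
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi!"},
])
# prompt is:
# system: You are helpful.
# user: Hi!
# assistant:
```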

0.1.24

Fixed:
- `executable` was not patched in `msg`.

0.1.23

Fixed:
- Restored support for the old `kind` parameter, which can be used in place of `executable`.

0.1.22

Changed:
- params: `kind` is now `executable`.
- The `langchain` package is now optional; client code has moved to `langchain_client.py`.
- The `uvloop` package is now optional.
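Making `uvloop` optional typically means importing it defensively and falling back to the stdlib event loop. A minimal sketch of that pattern, assuming a hypothetical helper name (the real `uvloop.install()` call exists, but this wrapper is an illustration, not mlipy's code):

```python
# Sketch of the optional-dependency pattern implied by the changelog:
# prefer uvloop's faster event loop when installed, otherwise keep the
# stdlib asyncio loop. The helper name is an illustrative assumption.
import asyncio


def install_best_event_loop() -> str:
    """Install uvloop's event loop policy if available; report which backend is active."""
    try:
        import uvloop
        uvloop.install()  # replaces the default asyncio event loop policy
        return "uvloop"
    except ImportError:
        return "asyncio"  # stdlib fallback; no extra dependency required


backend = install_best_event_loop()
```

The same try/except-import shape applies to the optional `langchain` dependency: `langchain_client.py` can be imported only when the package is present.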

Fixed:
- Formatting of chat messages (role-based) using `transformers.AutoTokenizer.apply_chat_template`.
