Llama-cpp-python

Latest version: v0.3.5



0.2.42

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@ea9c8e11436ad50719987fa23a289c74b7b40d40
- fix: sample idx off-by-one error for logit_processors by lapp0 in 1179
- fix: chat formatting bugs in `chatml-function-calling` by abetlen in 4b0e3320bd8c2c209e29978d0b21e2e471cc9ee3 and 68fb71b6a26a1e57331868f959b47ab4b87851e1
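The logit-processor fix above concerns the token index handed to processors during sampling. As a hedged sketch of the interface involved (llama-cpp-python's processors are callables taking `(input_ids, scores)` and returning adjusted scores; the helper name and pure-Python types here are illustrative assumptions, not the library's internals):

```python
def ban_token_processor(banned_token_id):
    """Return a logits processor that masks out one token id.

    Mimics the (input_ids, scores) -> scores callable shape used by
    llama-cpp-python logits processors; this sketch uses plain lists
    instead of the library's NumPy arrays.
    """
    def processor(input_ids, scores):
        scores = list(scores)  # don't mutate the caller's logits
        scores[banned_token_id] = float("-inf")  # never sample this token
        return scores
    return processor

# Toy usage: 5-token vocabulary, ban token 2 (the otherwise-best token).
logits = [0.1, 0.5, 2.0, 0.3, 0.0]
processed = ban_token_processor(2)(input_ids=[1, 4], scores=logits)
best = max(range(len(processed)), key=processed.__getitem__)
```

The off-by-one bug class the fix addresses is exactly the kind of thing such a processor is sensitive to: it must be applied at the position being sampled, not one position behind.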

0.2.41

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@895407f31b358e3d9335e847d13f033491ec8a5b
- fix: Don't change order of json schema object properties in generated grammar unless prop_order is passed by abetlen in d1822fed6b706f38bd1ff0de4dec5baaa3cf84fa
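The ordering rule that fix establishes can be sketched in pure Python (the helper name `order_properties` is hypothetical; the real grammar generation lives inside llama-cpp-python's JSON-schema-to-grammar code):

```python
def order_properties(schema_props, prop_order=None):
    """Return property names in the order the grammar should emit them.

    Without prop_order, preserve the schema's own insertion order (the
    behavior fixed in 0.2.41); with prop_order, listed keys come first
    in the given priority and the rest keep their original order.
    This is a conceptual sketch, not the library's implementation.
    """
    keys = list(schema_props)
    if not prop_order:
        return keys
    priority = {name: i for i, name in enumerate(prop_order)}
    return sorted(keys, key=lambda k: (priority.get(k, len(priority)), keys.index(k)))

props = {"name": {"type": "string"}, "age": {"type": "integer"}, "email": {"type": "string"}}
default_order = order_properties(props)
custom_order = order_properties(props, prop_order=["email"])
```

With no `prop_order`, `name`, `age`, `email` come out exactly as written in the schema; passing `prop_order=["email"]` promotes `email` to the front while the rest stay in schema order.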

0.2.40

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@3bdc4cd0f595a6096cca4a64aa75ffa8a3503465
- feat: Generic chatml function calling using `chat_format="chatml-function-calling"` by abetlen in 957
- fix: Circular dependency preventing early Llama object free by notwa in 1176
- docs: Set the correct command for compiling with SYCL support by akarshanbiswas in 1172
- feat: use gpu backend for clip if available by iamlemec in 1175
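The `chatml-function-calling` format accepts OpenAI-style tool definitions. As a hedged sketch of the request shape (the tool, messages, and city are placeholders; no model is loaded here, this only builds the arguments you would pass to `Llama.create_chat_completion`):

```python
# OpenAI-style arguments for a Llama created with
# chat_format="chatml-function-calling".  The tool definition and
# messages below are illustrative assumptions, not library fixtures.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]
messages = [{"role": "user", "content": "What's the weather in Oslo?"}]
request = {
    "messages": messages,
    "tools": tools,
    # Force the named function rather than letting the model decide.
    "tool_choice": {"type": "function", "function": {"name": "get_weather"}},
}
```

In actual use these keys are passed as keyword arguments, e.g. `llm.create_chat_completion(messages=messages, tools=tools, tool_choice=request["tool_choice"])`.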

0.2.39

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@b08f22c882a1443e6b97081f3ce718a4d1a741f8
- fix: Fix destructor logging bugs by using llama_log_callback to avoid suppress_stdout_stderr by abetlen in 59760c85eddc72dfcc1839f43760ef72c23d6874

0.2.38

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@1cfb5372cf5707c8ec6dde7c874f4a44a6c4c915
- feat: Add speculative decoding by abetlen in 1120
- fix: Pass raise_exception and add_generation_prompt to jinja2 chat template by abetlen in 078cca0361bf5a94d2cf52ed04980d20e32d6f95
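The speculative decoding added in 1120 is based on prompt lookup: draft tokens are proposed by finding the most recent earlier occurrence of the last few tokens and copying what followed. A conceptual pure-Python sketch of that idea (the function name is hypothetical; this is not llama-cpp-python's code):

```python
def prompt_lookup_draft(tokens, ngram_size=2, num_draft=3):
    """Propose draft tokens for speculative decoding via prompt lookup.

    Find the most recent earlier occurrence of the last `ngram_size`
    tokens in the context and return up to `num_draft` tokens that
    followed it.  The main model then verifies the draft in one pass,
    accepting the longest matching prefix.
    """
    if len(tokens) < ngram_size:
        return []
    tail = tokens[-ngram_size:]
    # Search backwards, excluding the tail's own position.
    for start in range(len(tokens) - ngram_size - 1, -1, -1):
        if tokens[start:start + ngram_size] == tail:
            follow = tokens[start + ngram_size:start + ngram_size + num_draft]
            if follow:
                return follow
    return []

# After "... the cat sat on the", lookup of "the" drafts "cat sat on".
tokens = ["the", "cat", "sat", "on", "the"]
draft = prompt_lookup_draft(tokens, ngram_size=1, num_draft=3)
```

Drafts that the main model rejects cost only the verification pass, so repetitive contexts (code, structured output) benefit most.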

0.2.37

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@fea4fd4ba7f6b754ac795387b275e1a014a77bde
- feat: Automatically set chat format from gguf by abetlen in 1110
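GGUF files can embed the tokenizer's Jinja chat template under the metadata key `tokenizer.chat_template`, which is what makes auto-detection possible. The matching heuristics below are a hedged sketch, illustrative assumptions rather than llama-cpp-python's actual logic:

```python
def guess_chat_format(metadata):
    """Guess a chat-format name from GGUF metadata.

    Sketch only: inspects the embedded tokenizer.chat_template string
    for format-specific markers.  The real detection in llama-cpp-python
    may compare whole templates rather than substrings.
    """
    template = metadata.get("tokenizer.chat_template", "")
    if "<|im_start|>" in template:   # ChatML-style role markers
        return "chatml"
    if "[INST]" in template:         # Llama-2-style instruction markers
        return "llama-2"
    return None  # fall back to an explicit chat_format argument

meta = {
    "tokenizer.chat_template": (
        "{% for m in messages %}<|im_start|>{{ m.role }}\n"
        "{{ m.content }}<|im_end|>\n{% endfor %}"
    )
}
fmt = guess_chat_format(meta)
```

When no template is present or recognized, the caller still needs to pass `chat_format` explicitly.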

