llama-cpp-python

Latest version: v0.3.5

0.2.12

Not secure
- Update llama.cpp to ggerganov/llama.cpp commit 50337961a678fce4081554b24e56e86b67660163
- Fix missing `n_seq_id` in `llama_batch` by NickAlgra in 842
- Fix for shared libraries on Windows that start with `lib` prefix by sujeendran in 848
- Fix exception raised in `__del__` when freeing models by cebtenzzre in 846
- Performance improvement for logit bias by zolastro in 851
- Fix suffix check arbitrary code execution bug by mtasic85 in 854
- Fix typo in `function_call` parameter in `llama_types.py` by akatora28 in 849
- Fix streaming not returning `finish_reason` by gmcgoldr in 798
- Fix `n_gpu_layers` check to allow values less than 1 for server by hxy9243 in 826
- Suppress stdout and stderr when freeing model by paschembri in 803
- Fix `llama2` chat format by delock in 808
- Add validation for `tensor_split` size by eric1932 in 820
- Print stack trace on server error by abetlen in d6a130a052db3a50975a719088a9226abfebb266
- Update docs for gguf by johnccshen in 783
- Add `chatml` chat format by abetlen in 305482bd4156c70802fc054044119054806f4126
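The `chatml` format added above follows the ChatML prompt template, which wraps each message in `<|im_start|>` / `<|im_end|>` markers. A minimal, library-independent sketch of that template (the helper name here is illustrative, not llama-cpp-python's own API):

```python
def format_chatml(messages):
    """Render OpenAI-style messages with the ChatML template."""
    prompt = ""
    for message in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        prompt += f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
    # Leave an open assistant turn for the model to complete.
    prompt += "<|im_start|>assistant\n"
    return prompt

print(format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]))
```

In llama-cpp-python the template is selected by name when constructing the model (e.g. `chat_format="chatml"`), so user code keeps passing plain role/content message dicts.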

0.2.11

Not secure
- Fix bug 'llama_model_params' object has no attribute 'logits_all' by abetlen in d696251fbe40015e8616ea7a7d7ad5257fd1b896

0.2.10

Not secure
- Fix bug 'llama_model_params' object has no attribute 'embedding' by abetlen in 42bb721d64d744242f9f980f2b89d5a6e335b5e4

0.2.9

Not secure
- Fix critical bug in pip installation of v0.2.8 due to `.git` directory in ac853e01e1a217a578080a4e1b851d2d08450adf

0.2.8

Not secure
- Update llama.cpp to ggerganov/llama.cpp commit 40e07a60f9ce06e79f3ccd4c903eba300fb31b5e
- Add configurable chat formats by abetlen in 711
- Fix rope scaling bug by Josh-XT in 767
- Fix missing numa parameter in server by abetlen in d9bce17794d0dd6f7962d10aad768fedecf3ab89
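The RoPE scaling fix above concerns linear rotary-embedding scaling, where position indices are divided by a scale factor so the model can attend over a longer context than it was trained on. A minimal sketch of the angle computation (illustrative only, not llama.cpp's actual implementation; `rope_angles` is a hypothetical helper):

```python
def rope_angles(position, head_dim, base=10000.0, scale=1.0):
    """Rotary-embedding angles for one token position.

    Linear RoPE scaling divides the position index by `scale`,
    compressing extended positions back into the trained range.
    """
    scaled_pos = position / scale
    # One angle per rotated dimension pair, with frequencies base^(-2i/d).
    return [scaled_pos * base ** (-2.0 * i / head_dim)
            for i in range(head_dim // 2)]

# With scale=2, position 8 yields the same angles as position 4 unscaled.
print(rope_angles(8, 4, scale=2.0) == rope_angles(4, 4, scale=1.0))  # True
```

The bug class here is easy to hit: if the scale factor is dropped or applied twice anywhere in the pipeline, every position past the trained context window maps to the wrong rotation.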

0.2.7

Not secure
- Update llama.cpp to ggerganov/llama.cpp commit a98b1633d5a94d0aa84c7c16e1f8df5ac21fc850
- Install required runtime DLLs to package directory on Windows by abetlen in 8d75016549e2ff62a511b1119d966ffc0df5c77b
- Add `openai-processing-ms` to server response headers by Tradunsky in 748
- Bump minimum version of scikit-build-core to 0.5.1 to fix MSVC CMake issue by abetlen in 1ed0f3ebe16993a0f961155aa4b2c85f1c68f668
- Update `llama_types.py` to better match the OpenAI API; old names are aliased to new ones by abetlen in dbca136feaaf7f8b1182c4c3c90c32918b1d0bb3
