Llama-cpp-python

Latest version: v0.2.76


0.2.52

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@a33e6a0d2a66104ea9a906bdbf8a94d050189d91
- fix: Llava15ChatHandler (this function takes at least 4 arguments) by abetlen in 8383a9e5620f5df5a88f62da16813eac200dd706
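The Llava15ChatHandler fix above concerns the multimodal chat path. For context, a minimal sketch of how the handler is typically wired into `Llama` (the model and CLIP/mmproj paths below are placeholders, not part of the release notes):

```python
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# Placeholder paths; use a LLaVA 1.5 GGUF model and its matching CLIP/mmproj file.
chat_handler = Llava15ChatHandler(clip_model_path="path/to/mmproj.gguf")
llm = Llama(
    model_path="path/to/llava-v1.5.gguf",
    chat_handler=chat_handler,
    n_ctx=2048,       # extra context to leave room for the image embedding
    logits_all=True,  # some versions require this for LLaVA
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are an assistant that describes images."},
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/image.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        },
    ]
)
print(response["choices"][0]["message"]["content"])
```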

0.2.51

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@c39373398803c669056304090050fe3f44b41bf9
- fix: Restore type hints for low-level api by abetlen in 19234aa0dbd0c3c87656e65dd2b064665371925b

0.2.50

Not secure
- docs: Update Functionary OpenAI Server Readme by jeffrey-fong in #1193
- fix: LlamaHFTokenizer now receives pre_tokens by abetlen in 47bad30dd716443652275099fa3851811168ff4a
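Both entries relate to Functionary-style models that rely on a Hugging Face tokenizer. A minimal sketch of passing an HF tokenizer into `Llama` (repo id and filename are illustrative, and the `transformers` package must be installed):

```python
from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer

# Illustrative repo/filename; the HF tokenizer requires `transformers`.
tokenizer = LlamaHFTokenizer.from_pretrained("meetkai/functionary-small-v2.2-GGUF")

llm = Llama.from_pretrained(
    repo_id="meetkai/functionary-small-v2.2-GGUF",
    filename="functionary-small-v2.2.q4_0.gguf",
    chat_format="functionary-v2",
    tokenizer=tokenizer,  # tokenization/detokenization go through the HF tokenizer
)
```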

0.2.49

Not secure
- fix: module 'llama_cpp.llama_cpp' has no attribute 'c_uint8' in Llama.save_state by abetlen in db776a885cd4c20811f22f8bd1a27ecc71dba927
- feat: Auto detect Mixtral's slightly different format by lukestanley in #1214
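The `Llama.save_state` fix is easiest to see against the state snapshot API; a minimal sketch (the model path is a placeholder):

```python
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.gguf", n_ctx=512)  # placeholder path

# Evaluate a prompt so there is KV-cache state worth saving.
llm("The capital of France is", max_tokens=1)

state = llm.save_state()   # snapshot of the evaluated context
# ... generate more, reset, or move to another point in the program ...
llm.load_state(state)      # restore the snapshot instead of re-evaluating the prompt
```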

0.2.48

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@15499eb94227401bdc8875da6eb85c15d37068f7
- feat: Add Google's Gemma formatting via chat_format="gemma" by alvarobartt in #1210 (see the example below)
- feat: support minItems/maxItems in JSON grammar converter by nopperl in 3921e10770996d95a9eb22c8248bacef39f69365
- fix: Update from_pretrained defaults to match hf_hub_download and pull to local cache folder by abetlen in e6d6260a91b7831733f7d1f73c7af46a3e8185ed
- fix: Raise exceptions when llama model or context fails to load by abetlen in dd22010e85265ae840c76ec835d67a29ed852722
- docs: Update README.md to fix pip install llama cpp server by audip in #1187
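Two of the entries above, chat_format="gemma" and the `from_pretrained` defaults, combine into one short sketch; the repo id and filename glob are illustrative only, and `huggingface-hub` must be installed:

```python
from llama_cpp import Llama

# Illustrative repo id / filename glob; from_pretrained downloads into the local
# Hugging Face cache, mirroring hf_hub_download defaults.
llm = Llama.from_pretrained(
    repo_id="google/gemma-2b-it-GGUF",
    filename="*q4_k_m.gguf",
    chat_format="gemma",   # Gemma chat template added in 0.2.48
    verbose=False,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a haiku about llamas."}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```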

0.2.47

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@973053d8b0d04809836b3339a50f68d9c842de90
