llama-cpp-python

Latest version: v0.3.8

0.2.27

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@b3a7c20b5c035250257d2b62851c379b159c899a
- feat: Add `saiga` chat format by femoiseev in 1050
- feat: Added `chatglm3` chat format by xaviviro in 1059 (usage example below)
- fix: Correct typo in README.md by qeleb in 1058
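
The new chat formats are selected by name through the `chat_format` argument of `llama_cpp.Llama`. A minimal sketch, assuming a local ChatGLM3 GGUF file (the model path is a placeholder):

```python
from llama_cpp import Llama

# Load a model and pick one of the newly added chat handlers by name.
# "chatglm3" is from this release; the model path is a placeholder.
llm = Llama(
    model_path="./models/chatglm3-6b.Q4_K_M.gguf",
    chat_format="chatglm3",
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What does a chat format control?"},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```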

0.2.26

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@f6793491b5af6da75edad34d6f503ef86d31b09f

0.2.25

Not secure
- feat(server): Multi model support by D4ve-R in 931
- feat(server): Support `None` defaulting to infinity for completions by swg in 111
- feat(server): Implement openai api compatible authentication by docmeth2 in 1010 (client-side example below)
- fix: text_offset of multi-token characters by twaka in 1037
- fix: ctypes bindings for kv override by phiharri in 1011
- fix: ctypes definitions of llama_kv_cache_view_update and llama_kv_cache_view_free. by e-c-d in 1028
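
On the client side, the authenticated, multi-model server can be queried with any OpenAI-compatible client. A minimal sketch using the `openai` package; the base URL, API key, and model alias are placeholders and depend on how the server was started:

```python
# Query the OpenAI-compatible llama-cpp-python server from a client.
# Assumes the server is running locally with an API key configured and
# one or more models registered; all values below are illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local server address
    api_key="sk-local-example-key",       # must match the key the server expects
)

completion = client.chat.completions.create(
    model="mistral-7b-instruct",  # with multi-model support, selects a loaded model by alias
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)
```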

0.2.24

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@0e18b2e7d0b5c0a509ea40098def234b8d4a938a
- feat: Add offload_kqv option to llama and server by abetlen in 095c65000642a3cf73055d7428232fb18b73c6f3
- feat: n_ctx=0 now uses the n_ctx_train of the model by DanieleMorotti in 1015 (see the example below)
- feat: logits_to_logprobs supports both 2-D and 3-D logits arrays by kddubey in 1002
- fix: Remove f16_kv, add offload_kqv fields in low level and llama apis by brandonrobertz in 1019
- perf: Don't convert logprobs arrays to lists by kddubey in 1021
- docs: Fix README.md functionary demo typo by evelynmitchell in 996
- examples: Update low_level_api_llama_cpp.py to match current API by jsoma in 1023
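
A minimal sketch of the new options, assuming a local GGUF model (the path is a placeholder): `n_ctx=0` picks up the model's trained context length, and `offload_kqv` controls whether the KV cache is offloaded alongside the model layers.

```python
from llama_cpp import Llama

# n_ctx=0 falls back to the context length the model was trained with
# (n_ctx_train); offload_kqv keeps the K/V cache on the GPU when layers
# are offloaded. The model path is a placeholder.
llm = Llama(
    model_path="./models/llama-2-7b.Q4_K_M.gguf",
    n_ctx=0,            # use the model's trained context size
    offload_kqv=True,   # offload the KV cache together with the layers
    n_gpu_layers=-1,    # offload all layers if a GPU backend is available
)

out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```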

0.2.23

Not secure
- Update llama.cpp to ggerganov/llama.cpp@948ff137ec37f1ec74c02905917fa0afc9b97514
- Add qwen chat format by yhfgyyf in 1005
- Add support for running the server with SSL by rgerganov in 994
- Replace logits_to_logprobs implementation with a numpy equivalent of llama.cpp's by player1537 in 991 (see the sketch after this list)
- Fix UnsupportedOperation: fileno in suppress_stdout_stderr by zocainViken in 961
- Add Pygmalion chat format by chiensen in 986
- README.md multimodal params fix by zocainViken in 967
- Fix minor typo in README by aniketmaurya in 958
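
For reference, "logits to logprobs" is a log-softmax over the vocabulary axis. A standalone numpy sketch of that computation, not the library's exact code:

```python
import numpy as np

def logits_to_logprobs(logits: np.ndarray, axis: int = -1) -> np.ndarray:
    """Numerically stable log-softmax over the given axis.

    Illustrative only; the library's actual implementation may differ
    in signature and details.
    """
    logits_max = np.max(logits, axis=axis, keepdims=True)
    shifted = logits - logits_max
    log_sum_exp = np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
    return shifted - log_sum_exp

# Example: a single 1-D logits vector.
print(logits_to_logprobs(np.array([2.0, 1.0, 0.1])))
```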

0.2.22

Not secure
- Update llama.cpp to ggerganov/llama.cpp@8a7b2fa528f130631a5f43648481596ab320ed5a
- Fix conflict with transformers library by kddubey in 952
