llama-cpp-python

Latest version: v0.3.5

Page 12 of 22

0.2.30

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@57e2a7a52a819883f40dada8a2edc24ecf48186b
- feat(server): Add ability to load chat format from huggingface autotokenizer or tokenizer_config.json files by abetlen in b8fc1c7d83ad4a9207c707ba1d954fe580286a01
- feat: Integration of Jinja2 Templating for chat formats by teleprint-me in #875
- fix: Offload KQV by default by abetlen in 48c3b77e6f558a9899de0e1155c7dc0c7958d8e8
- fix: Support Accept text/event-stream in chat and completion endpoints, resolves #1083 by aniljava in #1088
- fix(cli): allow passing n_ctx=0 to OpenAI API server args to use model n_ctx_train field per #1015 by K-Mistele in #1093
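The `n_ctx=0` server behavior above can be sketched as follows. This is a minimal illustration, not the library's own code: the helper only assembles an argument list for `python -m llama_cpp.server`, the model filename is a placeholder, and the flag names (`--model`, `--n_ctx`, `--chat_format`) are assumed from the server's settings.

```python
from typing import List, Optional

def server_args(model_path: str, n_ctx: int = 0,
                chat_format: Optional[str] = None) -> List[str]:
    # n_ctx=0 asks the server to fall back to the model's own
    # n_ctx_train value (per #1093); chat_format selects a registered
    # chat template by name.
    args = ["--model", model_path, "--n_ctx", str(n_ctx)]
    if chat_format is not None:
        args += ["--chat_format", chat_format]
    return args

argv = server_args("mistral-7b.Q4_K_M.gguf")
# Launching the actual server requires the model file on disk:
# python -m llama_cpp.server --model mistral-7b.Q4_K_M.gguf --n_ctx 0
```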

0.2.29

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@4483396751c79dea540808b9cb9238245d06da2b
- feat: Add split_mode option by abetlen in 84615adbc6855c8384807c42f0130f9a1763f99d
- feat: Implement GGUF metadata KV overrides by phiharri in #1011
- fix: Avoid "LookupError: unknown encoding: ascii" when open() called in a destructor by yieldthought in #1012
- fix: Fix low_level_api_chat_cpp example to match current API by aniljava in #1086
- fix: Fix Pydantic model parsing by DeNeutoy in #1087
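The GGUF metadata KV overrides from #1011 let you replace metadata values at load time. A hedged sketch, assuming `kv_overrides` is a mapping of metadata key to bool/int/float: the helper below only assembles keyword arguments, and the metadata key and model path are illustrative placeholders.

```python
from typing import Dict, Union

def build_llama_kwargs(model_path: str,
                       overrides: Dict[str, Union[bool, int, float]]) -> dict:
    # Assemble kwargs for llama_cpp.Llama; kv_overrides replaces GGUF
    # metadata values (e.g. context length) without editing the file.
    return {"model_path": model_path, "kv_overrides": dict(overrides)}

kwargs = build_llama_kwargs("llama-2-7b.Q4_K_M.gguf",
                            {"llama.context_length": 4096})
# from llama_cpp import Llama
# llm = Llama(**kwargs)  # requires the GGUF file on disk
```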

0.2.28

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@6efb8eb30e7025b168f3fda3ff83b9b386428ad6
- feat: Add ability to pass in penalize_nl param by shankinson in #1068
- fix: print_grammar to stderr by turian in #1052
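The `penalize_nl` parameter from #1068 controls whether newline tokens are subject to the repeat penalty during sampling. A small sketch under that assumption; the default values below are illustrative, not the library's, and the actual completion call is left commented because it needs a loaded model.

```python
def sampling_params(penalize_nl: bool = True,
                    repeat_penalty: float = 1.1) -> dict:
    # Keyword arguments one might pass to a completion call; setting
    # penalize_nl=False exempts newlines from the repeat penalty.
    return {"penalize_nl": penalize_nl, "repeat_penalty": repeat_penalty}

params = sampling_params(penalize_nl=False)
# llm("Write a haiku:", **params)  # requires a llama_cpp.Llama instance
```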

0.2.27

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@b3a7c20b5c035250257d2b62851c379b159c899a
- feat: Add `saiga` chat format by femoiseev in #1050
- feat: Added `chatglm3` chat format by xaviviro in #1059
- fix: Correct typo in README.md by qeleb in #1058
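A chat format like the new `chatglm3` is selected by name when constructing the model. Sketch only: the model filename is a placeholder and the `Llama` call is commented because it requires a local GGUF file; the messages list follows the OpenAI-style schema the chat API expects.

```python
# OpenAI-style message schema used by create_chat_completion.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
# from llama_cpp import Llama
# llm = Llama(model_path="chatglm3-6b.Q4_K_M.gguf", chat_format="chatglm3")
# llm.create_chat_completion(messages=messages)
```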

0.2.26

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@f6793491b5af6da75edad34d6f503ef86d31b09f

0.2.25

Not secure
- feat(server): Multi model support by D4ve-R in #931
- feat(server): Support none defaulting to infinity for completions by swg in #111
- feat(server): Implement OpenAI API compatible authentication by docmeth2 in #1010
- fix: text_offset of multi-token characters by twaka in #1037
- fix: ctypes bindings for kv override by phiharri in #1011
- fix: ctypes definitions of llama_kv_cache_view_update and llama_kv_cache_view_free by e-c-d in #1028
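With multi-model support (#931) and OpenAI-compatible authentication (#1010), a client picks a model via the request's `model` field and authenticates with a bearer token. A hedged client-side sketch: the endpoint URL, API key, and model alias are placeholders, and the HTTP call itself is commented out.

```python
import json
from typing import Tuple

def chat_request(api_key: str, model: str, prompt: str) -> Tuple[dict, str]:
    # Build headers and body for an OpenAI-style chat completion request;
    # "model" selects one of the models the multi-model server has loaded.
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body

headers, body = chat_request("sk-local-test", "llama-2-7b", "Hi")
# requests.post("http://localhost:8000/v1/chat/completions",
#               headers=headers, data=body)  # needs a running server
```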

© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.