llama-cpp-python

Latest version: v0.3.5

0.2.66

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@8843a98c2ba97a25e93319a104f9ddfaf83ce4c4
- feat: Generic Chat Formats, Tool Calling, and Huggingface Pull Support for Multimodal Models (Obsidian, LLaVA1.6, Moondream) by abetlen in #1147
- ci(fix): Workflow actions updates and fix arm64 wheels not included in release by Smartappli in #1392
- ci: Add support for pre-built cuda 12.4.1 wheels by Smartappli in #1388
- feat: Add support for str type kv_overrides (see the sketch after this list) by abetlen in a411612b385cef100d76145da1fbd02a7b7cc894
- fix: Functionary bug fixes by jeffrey-fong in #1385
- examples: fix quantize example by iyubondyrev in #1387
- ci: Update dependabot.yml by Smartappli in #1391
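
A minimal sketch of the string-valued `kv_overrides` mentioned above; the model path and the override keys are illustrative placeholders, not taken from the release notes:

```python
from llama_cpp import Llama

# kv_overrides overrides GGUF metadata keys at load time; string values are
# accepted in addition to bool/int/float as of this release.
# The model path and the keys below are illustrative placeholders.
llm = Llama(
    model_path="models/example.gguf",
    kv_overrides={
        "tokenizer.ggml.add_bos_token": True,  # bool override
        "general.name": "my-custom-name",      # str override (new in 0.2.66)
    },
)
```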

0.2.65

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@46e12c4692a37bdd31a0432fc5153d7d22bc7f72
- feat: Allow for possibly non-pooled embeddings by iamlemec in #1380 (see the sketch below)
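
A minimal sketch of embedding generation under that change; whether `embed` returns a single pooled vector or a list of per-token vectors depends on the model's pooling type, and the model path is a placeholder:

```python
from llama_cpp import Llama

# Load a model in embedding mode; the path is a placeholder.
llm = Llama(model_path="models/embedding-example.gguf", embedding=True)

# For pooled models this is a single vector; for non-pooled models it may be
# a sequence of per-token vectors (the case enabled by the change above).
vectors = llm.embed("llama-cpp-python supports embeddings")
print(type(vectors), len(vectors))
```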

0.2.64

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@4e96a812b3ce7322a29a3008db2ed73d9087b176
- feat: Add `llama-3` chat format (see the sketch after this list) by andreabak in #1371
- feat: Use new llama_token_is_eog in create_completions by abetlen in d40a250ef3cfaa8224d12c83776a2f1de96ae3d1
- feat(server): Provide ability to dynamically allocate all threads if desired using -1 by sean-bailey in #1364
- ci: Build arm64 wheels by gaby in 611781f5319719a3d05fefccbbf0cc321742a026
- fix: Update scikit-build-core build dependency to avoid bug in 0.9.1 by evelkey in #1370
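
A minimal sketch of selecting the new `llama-3` chat format explicitly; the model path is a placeholder, and models whose metadata already carries the Llama 3 template may not need the argument at all:

```python
from llama_cpp import Llama

# Explicitly select the llama-3 chat format; the model path is a placeholder.
llm = Llama(
    model_path="models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf",
    chat_format="llama-3",
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
    max_tokens=32,
)
print(response["choices"][0]["message"]["content"])
```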

0.2.63

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@0e4802b2ecbaab04b4f829fde4a3096ca19c84b5
- feat: Add stopping_criteria to ChatFormatter, allow stopping on arbitrary token ids, fixes llama3 instruct by abetlen in cc81afebf04d26ca1ac3cf72f23f18da6ab58588

0.2.62

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@3b8f1ec4b18770531d0b1d792f3edf08254e4f0c
- feat: update grammar schema converter to match llama.cpp by themrzmaster in #1353
- feat: add disable_ping_events flag by khimaros in #1257
- feat: Make saved state more compact on-disk by tc-wolf in #1296
- feat: Use all available CPUs for batch processing by ddh0 in #1345

0.2.61

Not secure
- feat: Update llama.cpp to ggerganov/llama.cpp@ba5e134e073ec6837078c874aba44a702944a676
- fix: pass correct type to chat handlers for chat completion logprobs by abetlen in bb65b4d76411112c6fb0bf759efd746f99ef3c6b
- feat: Add support for yaml based server configs (see the sketch after this list) by abetlen in 060bfa64d529ade2af9b1f4e207a3937bbc4138f
- feat: Add typechecking for ctypes structure attributes by abetlen in 1347e1d050fc5a9a32ffe0bb3e22858da28003bd
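
A rough sketch of driving the OpenAI-compatible server from a YAML config. The exact schema (top-level server settings plus a `models` list) and the `--config_file` option are assumptions based on the release note, and the model path is a placeholder:

```python
import pathlib
import subprocess
import textwrap

# Hypothetical YAML server config; schema and field names are assumptions,
# and the model path is a placeholder.
pathlib.Path("server_config.yaml").write_text(textwrap.dedent("""\
    host: 127.0.0.1
    port: 8000
    models:
      - model: models/example.gguf
        chat_format: llama-3
"""))

# Assumes the server's --config_file option accepts YAML (previously JSON).
subprocess.run(["python", "-m", "llama_cpp.server", "--config_file", "server_config.yaml"])
```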
