llama-cpp-python

Latest version: v0.3.5


0.1.68

Not secure
- (llama.cpp) Update llama.cpp

0.1.67

Not secure
- Fix performance bug in Llama model by pre-allocating memory for tokens and logits.
- Fix bug in Llama model where the model was not freed after use.
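The pre-allocation fix above targets a common NumPy anti-pattern: growing an array with repeated concatenation copies all prior data on every step, giving quadratic cost. A minimal sketch of the before/after pattern (the sizes and function names here are illustrative, not llama-cpp-python's actual internals):

```python
import numpy as np

N_TOKENS, N_VOCAB = 512, 32000  # hypothetical capacity and vocab size

def eval_concatenate(steps):
    # Slow pattern: each concatenate re-copies every row seen so far.
    logits = np.empty((0, N_VOCAB), dtype=np.float32)
    for _ in range(steps):
        row = np.zeros((1, N_VOCAB), dtype=np.float32)
        logits = np.concatenate([logits, row])  # O(n) copy per step
    return logits

def eval_preallocated(steps):
    # Fixed pattern: allocate the full buffer once, then fill rows in place.
    logits = np.zeros((N_TOKENS, N_VOCAB), dtype=np.float32)
    for i in range(steps):
        logits[i, :] = 0.0  # write the new row directly into the buffer
    return logits[:steps]
```

Both functions produce the same result; the pre-allocated version just avoids the per-step copies.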

0.1.66

Not secure
- (llama.cpp) New model API
- Fix performance issue during eval caused by a looped np.concatenate call
- Fix state pickling issue when saving the cache to disk
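The pickling fix concerns persisting evaluated model state to a disk cache. A minimal sketch of the round-trip involved, using a hypothetical `State` class (this is not llama-cpp-python's actual `LlamaState`, just the shape of the save/load cycle):

```python
import os
import pickle
import tempfile

class State:
    """Hypothetical stand-in for a model's evaluated state."""
    def __init__(self, tokens, n_tokens):
        self.tokens = tokens
        self.n_tokens = n_tokens

def save_state(state, path):
    # Serialize the state object to disk.
    with open(path, "wb") as f:
        pickle.dump(state, f)

def load_state(path):
    # Restore the state object from disk.
    with open(path, "rb") as f:
        return pickle.load(f)
```

A pickling bug in this cycle would surface as a failed or corrupted restore, which is what the fix addresses.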

0.1.65

Not secure
- (llama.cpp) Fix struct misalignment bug

0.1.64

Not secure
- (llama.cpp) Update llama.cpp
- Fix docs for seed: set -1 for a random seed.
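The documented convention is that a seed of -1 requests a fresh random seed rather than a fixed one. A hedged sketch of that convention (the helper name is hypothetical):

```python
import random

def resolve_seed(seed: int) -> int:
    """Return the RNG seed to use; -1 selects a fresh random seed."""
    if seed == -1:
        return random.randint(0, 2**31 - 1)
    return seed
```

Passing any non-negative value keeps runs reproducible; -1 gives a different seed on each call.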

0.1.63

Not secure
- (llama.cpp) Add full GPU utilisation in CUDA
- (llama.cpp) Add get_vocab
- (llama.cpp) Add low_vram parameter
- (server) Add logit_bias parameter
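A `logit_bias` parameter conventionally maps token ids to additive adjustments applied to the raw logits before sampling, so a positive bias makes a token more likely and a large negative one effectively bans it. A minimal sketch of that mechanic in plain Python (these helpers are illustrative, not the server's implementation):

```python
import math

def apply_logit_bias(logits, logit_bias):
    """Add per-token biases (token id -> bias) to a list of logits."""
    out = list(logits)
    for token_id, bias in logit_bias.items():
        out[token_id] += bias
    return out

def softmax(logits):
    # Numerically stable softmax over the biased logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

Biasing token 1 upward shifts probability mass toward it while leaving the other logits untouched.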
