chatglm-cpp

Latest version: v0.3.2


0.3.1

* Support function calling in the OpenAI API server
* Faster repetition penalty sampling
* Support max_new_tokens generation option
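The repetition penalty listed above is, in ggml-based projects, usually the CTRL-style penalty: logits of tokens already generated are divided by the penalty when positive and multiplied when negative. A minimal sketch of that sampling step follows; the function name and values are illustrative, not chatglm-cpp's actual implementation:

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    # CTRL-style repetition penalty: discourage tokens that already
    # appeared by shrinking positive logits and pushing negative
    # logits further down. penalty > 1.0 penalizes repeats.
    out = list(logits)
    for tid in set(generated_ids):
        if out[tid] > 0:
            out[tid] /= penalty
        else:
            out[tid] *= penalty
    return out

# Token 0 (positive logit) is halved, token 1 (negative) is doubled
# downward, token 2 was never generated and is untouched.
penalized = apply_repetition_penalty([2.0, -1.0, 0.5], [0, 1], penalty=2.0)
```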

0.3.0

* Full functionality of ChatGLM3 including system prompt, function call and code interpreter
* Brand new OpenAI-style chat API
* Add token usage information in the OpenAI API server for compatibility with LangChain frontends
* Fix conversion error for chatglm3-6b-32k

0.2.10

* Support ChatGLM3 in conversation mode.
* Coming soon: new prompt format for system message and function call.

0.2.9

* Support InternLM 7B & 20B model architectures

0.2.8

* Metal backend support for all models (ChatGLM & ChatGLM2 & Baichuan-7B & Baichuan-13B)
* Fix GLM generation on CUDA for long context

0.2.7

* Support Baichuan-7B model architecture (works for both Baichuan v1 & v2).
* Minor bug fixes and enhancements.
