QLLM

Latest version: v0.2.2.1

0.2.2.1

What's Changed
* add more examples on Colab by wejoncy in https://github.com/wejoncy/QLLM/pull/153
* fix Colab params by wejoncy in https://github.com/wejoncy/QLLM/pull/154
* urgent fix by wejoncy in https://github.com/wejoncy/QLLM/pull/155
* urgent 0.2.2.post by wejoncy in https://github.com/wejoncy/QLLM/pull/156
* urgent 0.2.2.1 by wejoncy in https://github.com/wejoncy/QLLM/pull/157


**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.2.2...v0.2.2.1

0.2.1

What's Changed
* more AWQ models && ONNX kernel bug when g=-1 by wejoncy in https://github.com/wejoncy/QLLM/pull/138
* feat: support new quantization algorithm 'VPTQ' by wejoncy in https://github.com/wejoncy/QLLM/pull/141 (usage sketch below)
* vptq: polish vptq config by wejoncy in https://github.com/wejoncy/QLLM/pull/142
* bump to 0.2.1 by wejoncy in https://github.com/wejoncy/QLLM/pull/143
* fix package by wejoncy in https://github.com/wejoncy/QLLM/pull/144
* fix ci by wejoncy in https://github.com/wejoncy/QLLM/pull/145
* support auto dtype by wejoncy in https://github.com/wejoncy/QLLM/pull/146
* quick fix by wejoncy in https://github.com/wejoncy/QLLM/pull/147
* fix package name by wejoncy in https://github.com/wejoncy/QLLM/pull/148


**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.2.0...v0.2.1
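
The 0.2.1 notes above add VPTQ as a selectable quantization method and automatic dtype selection. Below is a hedged sketch of driving the `qllm` command-line entry point from Python; the flag names (`--model`, `--method`, `--wbits`, `--groupsize`, `--save`) follow README-style usage, but the VPTQ-specific options and the example model/output paths are assumptions, so check `python -m qllm --help` for the authoritative list.

```python
# Hedged sketch: invoke the qllm CLI from Python to quantize a model.
# Flag names are assumptions based on README-style usage; verify with
# `python -m qllm --help` before relying on them.
import subprocess
import sys

def quantize(model_id: str, method: str, out_dir: str) -> None:
    """Run `python -m qllm` to quantize `model_id` with the given method."""
    cmd = [
        sys.executable, "-m", "qllm",
        "--model", model_id,     # Hugging Face id or local path
        "--method", method,      # e.g. "gptq", "awq", or "vptq" (assumed)
        "--wbits", "4",          # target bit width
        "--groupsize", "128",    # quantization group size
        "--save", out_dir,       # where the quantized checkpoint is written
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Hypothetical example model and output directory.
    quantize("meta-llama/Llama-2-7b-hf", "vptq", "./Llama-2-7b-hf-vptq-4bit")
```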

0.2.0

What's Changed
* fix llama3.1 by wejoncy in https://github.com/wejoncy/QLLM/pull/132
* support transformers-lib loading by wejoncy in https://github.com/wejoncy/QLLM/pull/134 (loading sketch below)
* bump to 0.2.0 by wejoncy in https://github.com/wejoncy/QLLM/pull/135


**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.1.9.1...v0.2.0
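
PR #134 above adds loading through the Hugging Face transformers library. Assuming the directory written by `--save` is a transformers-compatible checkpoint (an assumption based on that PR title, with a hypothetical path reused from the sketch above), loading it looks like any other `from_pretrained` call:

```python
# Hedged sketch: load a QLLM-quantized checkpoint with the transformers library.
# Assumes the directory saved by `python -m qllm ... --save <dir>` is a
# transformers-compatible checkpoint, per the "support transformers-lib loading" PR.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "./Llama-2-7b-hf-vptq-4bit"  # hypothetical output dir

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    device_map="auto",       # place layers on available GPUs/CPU automatically
    trust_remote_code=True,  # quantized kernels may ship custom modeling code
)

inputs = tokenizer("Hello, QLLM!", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```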

0.1.9.1

What's Changed
* add assert message && CI upgrade to torch 2.2.2 by wejoncy in https://github.com/wejoncy/QLLM/pull/124
* Update README.md by wejoncy in https://github.com/wejoncy/QLLM/pull/125
* fix version match errors by wejoncy in https://github.com/wejoncy/QLLM/pull/128
* add macro GENERAL_TORCH to get rid of OptionalCUDAGuard by wejoncy in https://github.com/wejoncy/QLLM/pull/129
* quick fix by wejoncy in https://github.com/wejoncy/QLLM/pull/130
* v0.1.9.1 by wejoncy in https://github.com/wejoncy/QLLM/pull/131


**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.1.9...v0.1.9.1

0.1.9

What's Changed
* Bump to 0.1.8 by wejoncy in https://github.com/wejoncy/QLLM/pull/109
* new autogptq config format && parallel load by wejoncy in https://github.com/wejoncy/QLLM/pull/110
* bugfix by wejoncy in https://github.com/wejoncy/QLLM/pull/111
* fix issue by wejoncy in https://github.com/wejoncy/QLLM/pull/113
* Fix 112 by wejoncy in https://github.com/wejoncy/QLLM/pull/114
* Fix typos by emphasis10 in https://github.com/wejoncy/QLLM/pull/115
* minor fix, attn_implementation by wejoncy in https://github.com/wejoncy/QLLM/pull/120
* Bump to 0.1.9 by wejoncy in https://github.com/wejoncy/QLLM/pull/121
* -allow-unsupported-compiler by wejoncy in https://github.com/wejoncy/QLLM/pull/122

New Contributors
* emphasis10 made their first contribution in https://github.com/wejoncy/QLLM/pull/115

**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.1.8...v0.1.9

0.1.8

What's Changed
* Update README.md by wejoncy in https://github.com/wejoncy/QLLM/pull/102
* bug fix by wejoncy in https://github.com/wejoncy/QLLM/pull/103
* Onnx fix qzeros odd-shape by wejoncy in https://github.com/wejoncy/QLLM/pull/104
* Refactor by wejoncy in https://github.com/wejoncy/QLLM/pull/105
* support `MARLIN` pack_mode by wejoncy in https://github.com/wejoncy/QLLM/pull/106
* support awq sym by wejoncy in https://github.com/wejoncy/QLLM/pull/107
* Refactor by wejoncy in https://github.com/wejoncy/QLLM/pull/108


**Full Changelog**: https://github.com/wejoncy/QLLM/compare/v0.1.7.1...v0.1.8
