Brevitas

Latest version: v0.10.2

0.10.2

What's Changed
* Fix (QuantLayer): make bias for QuantLayer optional by fabianandresgrob in https://github.com/Xilinx/brevitas/pull/846
* Fix (examples/llm): set `group_size` only for groupwise quantization by nickfraser in https://github.com/Xilinx/brevitas/pull/853
* Fix (gpfq): updating input processing and L1-norm constraints for GPFA2Q by i-colbert in https://github.com/Xilinx/brevitas/pull/852
* ImageNet PTQ example fix by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/863
* feat (gen/quantize): Added device flag to `quantize_model` by nickfraser in https://github.com/Xilinx/brevitas/pull/860
* Docs: update README for 0.10.2 release by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/865


**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.10.1...v0.10.2

0.10.1

Highlights
* A2Q+ support [paper](https://arxiv.org/abs/2401.10432) (see the usage sketch after this list)
* A2Q+ examples with CIFAR10 and Super Resolution
* Support for concatenation equalization for weights and activations
* Support for GPFQ + A2Q L1 Norm bound
* Option to explicitly export the Q node for weights in QCDQ export
* Support for float16 and bfloat16 for QCDQ export
* Support for Dynamic Activation Quantization for ONNX QDQ export
* Support for channel-splitting [paper](https://arxiv.org/pdf/1901.09504.pdf)
* (Beta) Better compatibility with Hugging Face accelerate and optimum
* (Beta) Improved support and testing for minifloat quantization
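
As a rough illustration of the accumulator-aware (A2Q/A2Q+) quantizers in use, here is a minimal sketch; the quantizer names and the `accumulator_bit_width` override are assumptions based on the A2Q examples, not guaranteed by these notes:

```python
# Minimal A2Q sketch: constrain a conv layer so int8 weights and activations
# can be accumulated in a 16-bit accumulator without overflow.
# Int8AccumulatorAwareWeightQuant and the accumulator_bit_width override
# are assumed from the A2Q examples.
import torch
from brevitas.nn import QuantConv2d
from brevitas.quant.scaled_int import (
    Int8AccumulatorAwareWeightQuant,
    Int8ActPerTensorFloat)

conv = QuantConv2d(
    in_channels=3,
    out_channels=8,
    kernel_size=3,
    input_quant=Int8ActPerTensorFloat,
    weight_quant=Int8AccumulatorAwareWeightQuant,
    weight_accumulator_bit_width=16)  # forwarded to the weight quantizer

out = conv(torch.randn(1, 3, 32, 32))
```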

What's Changed
* Fix (examples/generative): set weight_bit_width in weight_quant by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/783
* Feat (graph/equalize): improvements for llm equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/784
* [graph] Fix typo in class name by nickfraser in https://github.com/Xilinx/brevitas/pull/765
* Fix (graph/equalize): refactor for act equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/787
* [quant_tensor] Updates `__truediv__` behaviour to match "standard fixed point rules" by nickfraser in https://github.com/Xilinx/brevitas/pull/769
* Feat (export): (b)float16 support for qcdq export by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/776
* Feat (ptq): Adding A2Q Upper Bound clipping to GPFQ by fabianandresgrob in https://github.com/Xilinx/brevitas/pull/734
* Extended equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/778
* Better Bfloat16 support by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/777
* Fix (stats): add return statement in state_dict by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/792
* Fix (equalize): improved cat eq checks by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/793
* Fix (export): add CastMixin by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/794
* Dynamic Act Quant support by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/796
* Fix (examples/quantizers): correct dynamic zero point handling by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/806
* Feat (a2q+): improving accumulator-aware weight quantization by i-colbert in https://github.com/Xilinx/brevitas/pull/797
* Feat (a2q+): adding new super resolution models to brevitas_examples by i-colbert in https://github.com/Xilinx/brevitas/pull/811
* Feat (Channel-Splitting): sets up first skeleton for channel-splitting by fabianandresgrob in https://github.com/Xilinx/brevitas/pull/772
* Feat: support for optimum by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/826
* Fix (tests): adding tests for FloatQuant by fabianandresgrob in https://github.com/Xilinx/brevitas/pull/815
* Fix (export): correct q node export by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/829
* Fix (examples/llm): correct groupwise export by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/832
* Fix (examples/super_res): updating README by i-colbert in https://github.com/Xilinx/brevitas/pull/828
* Fix (examples/export): improved export by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/838
* Fix (graph/equalize): cleanup and device management by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/840
* Feat (examples/a2q): adding CIFAR10 example by i-colbert in https://github.com/Xilinx/brevitas/pull/813
* Fix (export): check for Per Group quantization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/848

**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.10.0...v0.10.1

a2q_cifar10_r1
This release contains training code and pre-trained weights to demonstrate accumulator-aware quantization (A2Q) on an image classification task. Code is also provided to demonstrate Euclidean projection-based weight initialization (EP-init) as proposed in our paper ["A2Q+: Improving Accumulator-Aware Weight Quantization"](https://arxiv.org/abs/2401.10432).

Find the associated docs at [https://github.com/Xilinx/brevitas/tree/a2q_cifar10_r1/src/brevitas_examples/imagenet_classification/a2q](https://github.com/Xilinx/brevitas/tree/a2q_cifar10_r1/src/brevitas_examples/imagenet_classification/a2q).

super_res_r2
A2Q+ Super Resolution Experiments with Brevitas

This release contains training code and pre-trained weights to demonstrate accumulator-aware quantization (A2Q+) as proposed in our paper "[A2Q+: Improving Accumulator-Aware Weight Quantization](https://arxiv.org/abs/2401.10432)" on a super resolution task.

Find the associated docs at [https://github.com/Xilinx/brevitas/tree/super_res_r2/src/brevitas_examples/super_resolution](https://github.com/Xilinx/brevitas/tree/super_res_r2/src/brevitas_examples/super_resolution).

0.10.0

What's Changed
* Fix (docs): README.md for pre-commit by volcacius in https://github.com/Xilinx/brevitas/pull/781

New Contributors
* fabianandresgrob made their first contribution in https://github.com/Xilinx/brevitas/pull/717
* saadulkh made their first contribution in https://github.com/Xilinx/brevitas/pull/741

**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.9.1...v0.10.0

super_res_r1
Integer-Quantized Super Resolution Experiments with Brevitas

This release contains scripts demonstrating how to train integer-quantized super resolution models using Brevitas.
Code is also provided to demonstrate accumulator-aware quantization (A2Q) as proposed in our ICCV 2023 paper "[A2Q: Accumulator-Aware Quantization with Guaranteed Overflow Avoidance](https://arxiv.org/abs/2308.13504)".

Find the associated docs at [https://github.com/Xilinx/brevitas/tree/super_res_r1/src/brevitas_examples/super_resolution](https://github.com/Xilinx/brevitas/tree/super_res_r1/src/brevitas_examples/super_resolution).

0.9.1

What's Changed
* Setup: add requirements-dev with pre-commit by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/581
* CI update by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/570
* Fix (brevitas_examples/bnn_pynq): missing 4b resnet18 link and hash fn by volcacius in https://github.com/Xilinx/brevitas/pull/583
* Docs: update READMEs by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/584


**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.9.0...v0.9.1

0.9.0

Highlights
* Initial support for graph quantization to programmatically generate a quantized model from a floating-point one (a minimal usage sketch follows this list). ImageNet examples with PTQ can be found at https://github.com/Xilinx/brevitas/tree/master/src/brevitas_examples/imagenet_classification/ptq.
* Initial support for QuantMultiheadAttention, which is leveraged e.g. for the ViT support in the PTQ examples above.
* Various improvements to graph equalization, which are leveraged in the PTQ examples above.
* New accumulator-aware quantizers to train for low-precision accumulation, based on our A2Q paper: https://arxiv.org/abs/2301.13376.
* Experimental support for the BatchQuant quantizer, based on https://arxiv.org/abs/2105.08952; currently still untested.
* Initial support for learned rounding.
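
A minimal sketch of the graph quantization flow, assuming the `preprocess_for_quantize` and `quantize` entry points used by the PTQ example scripts:

```python
# Sketch: trace and preprocess a floating-point torchvision model,
# then programmatically swap its layers for quantized counterparts.
import torch
from torchvision.models import resnet18
from brevitas.graph.quantize import preprocess_for_quantize, quantize

model = resnet18().eval()
model = preprocess_for_quantize(model)  # FX trace + canonicalization passes
quant_model = quantize(model)           # replace float layers with quant layers

out = quant_model(torch.randn(1, 3, 224, 224))
```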

Overview of changes

Graph quantization

* Initial graph quantization support by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/549 https://github.com/Xilinx/brevitas/pull/574 https://github.com/Xilinx/brevitas/pull/532 https://github.com/Xilinx/brevitas/pull/579

Quantized layers

* Initial support for QuantMultiheadAttention (sketch after this list) https://github.com/Xilinx/brevitas/pull/568
* Breaking change: rename Quant(Adaptive)AvgPool to Trunc(Adaptive)AvgPool by volcacius in https://github.com/Xilinx/brevitas/pull/562
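
A rough usage sketch for QuantMultiheadAttention, assuming it mirrors the signature of `torch.nn.MultiheadAttention`:

```python
# Sketch: quantized multi-head attention as a drop-in replacement
# (constructor and return values assumed to match the upstream module).
import torch
from brevitas.nn import QuantMultiheadAttention

mha = QuantMultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
x = torch.randn(2, 10, 64)             # (batch, sequence, embedding)
attn_out, attn_weights = mha(x, x, x)  # quantized self-attention
```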

Quantizers

* Weight normalization-based integer quantizers by i-colbert in https://github.com/Xilinx/brevitas/pull/559
* Accumulator-aware weight quantization by i-colbert in https://github.com/Xilinx/brevitas/pull/567
* BatchQuant quantizers support by volcacius in https://github.com/Xilinx/brevitas/pull/563

QuantTensor

* Support to move QuantTensor across devices by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/528
* Initial support for interpolate and pixel_shuffle by volcacius in https://github.com/Xilinx/brevitas/pull/578

PTQ

* Batch Norm support in graph equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/531
* Mul support in graph equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/530
* Learned round support by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/573
* MultiheadAttention and LayerNorm support in graph equalization by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/555
* Fix calibration over large number of batches by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/523
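
The PTQ features above are typically driven through context managers; a hedged sketch, assuming the `calibration_mode` and `bias_correction_mode` interfaces from `brevitas.graph.calibrate`:

```python
# Sketch: collect activation statistics, then apply bias correction,
# by running calibration batches through the quantized model.
import torch
from torch import nn
from brevitas.graph.calibrate import bias_correction_mode, calibration_mode
from brevitas.nn import QuantConv2d, QuantReLU

quant_model = nn.Sequential(QuantConv2d(3, 8, 3), QuantReLU()).eval()
calib_batches = [torch.randn(8, 3, 32, 32) for _ in range(4)]  # stand-in data

with torch.no_grad():
    with calibration_mode(quant_model):      # collect activation statistics
        for x in calib_batches:
            quant_model(x)
    with bias_correction_mode(quant_model):  # apply bias correction
        for x in calib_batches:
            quant_model(x)
```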

Export

* Itemize scalar quantize args only in TorchScript QCDQ by volcacius in https://github.com/Xilinx/brevitas/pull/561
* Round avgpool export fixes by volcacius in https://github.com/Xilinx/brevitas/pull/562

CI, linting

* Linter isort by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/505
* CI: bump isort from 5.10.1 to 5.11.5 by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/540
* Test: enable parallelism with pytest-xdist by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/513
* GHA workflow improvement by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/507
* Add support for yapf by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/511

FX

* Disable FX backport on 1.8.1+ by volcacius in https://github.com/Xilinx/brevitas/pull/504

Examples
* Pretrained Resnet18 example on CIFAR10 targeting FINN by volcacius in https://github.com/Xilinx/brevitas/pull/577
* Graph quantization + PTQ examples and benchmarking scripts by Giuseppe5 in https://github.com/Xilinx/brevitas/pull/547 https://github.com/Xilinx/brevitas/pull/575 https://github.com/Xilinx/brevitas/pull/576

**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.8.0...v0.9.0

bnn_pynq-r2
Model definition and a pretrained 4b variant of ResNet18 for FINN deployment, available under the bnn_pynq examples:

```python
from brevitas_examples.bnn_pynq.models import resnet18_4w4a

quant_model = resnet18_4w4a(pretrained=True)
```
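
To take the model towards FINN, a hedged follow-up sketch, assuming the `export_qonnx` entry point available in recent Brevitas releases:

```python
# Sketch: export the pretrained CIFAR10 model to QONNX for the FINN flow
# (export_qonnx and its signature are assumptions, not confirmed by this release).
import torch
from brevitas.export import export_qonnx

export_qonnx(
    quant_model,
    torch.randn(1, 3, 32, 32),          # CIFAR10-shaped dummy input
    export_path='resnet18_4w4a.onnx')
```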

0.8.0

What's Changed
* Add support for PyTorch 1.11-1.13.1. Brevitas 0.8 supports PyTorch 1.5.1 to 1.13.1, with 1.10+ suggested.
* Deprecate support for Python 3.6; Python 3.7+ is now required.
* Add support for export to ONNX QCDQ for <= int8 quantization, for out-of-the-box execution with onnxruntime or similar backends (see the export sketch after this list).
* Extend support for export to ONNX QOps to <= int8 quantization, for out-of-the-box execution with onnxruntime or similar backends.
* Add experimental support for export to torch QCDQ for <= int32 quantization, as an entry point for future MLIR integration with torch-mlir.
* Add support for QuantRNN and QuantLSTM, with support for CIFG, bidirectional layers, shared input-hidden gates, shared quantizers, training-time JIT compilation, and partial export support to ONNX (QONNX and QCDQ).
* Improve support for zero-point for both weights and activations quantization.
* New default asymmetric activation quantizer based on percentile rather than min/max.
* Add more built-in quantizers (symmetric per-channel, asymmetric per-channel, symmetric decoupled per-channel).
* Simplify interface for activation calibration.
* Simplify interface for bias correction.
* Initial support for QuantEmbedding.
* Deprecate support for XIR and PyXIR export flows.
* Many bug fixes and minor improvements.
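
A minimal sketch of the new QCDQ export path (entry point name from `brevitas.export`; the exact signature is assumed):

```python
# Sketch: export an int8-quantized model to ONNX QCDQ; the resulting
# model is meant to run out of the box with onnxruntime.
import torch
from torch import nn
from brevitas.export import export_onnx_qcdq
from brevitas.nn import QuantConv2d, QuantReLU

model = nn.Sequential(QuantConv2d(3, 8, 3), QuantReLU()).eval()
export_onnx_qcdq(
    model,
    torch.randn(1, 3, 32, 32),        # dummy input for tracing
    export_path='quant_model.onnx')
```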

New Contributors
* fd0r made their first contribution in https://github.com/Xilinx/brevitas/pull/434
* omarperacha made their first contribution in https://github.com/Xilinx/brevitas/pull/483
* andrei-stoian-zama made their first contribution in https://github.com/Xilinx/brevitas/pull/470

**Full Changelog**: https://github.com/Xilinx/brevitas/compare/v0.7.1...v0.8.0
