### Added

- new models:
  - [MeliusNet](bitorch/models/meliusnet.py)
  - [BinaryDenseNet](bitorch/models/densenet.py)
  - [QuickNet](bitorch/models/quicknet.py)
- a simple example script for MNIST
- support for integrating bitorch's inference engine for the following layers:
  - QLinear
  - QConv
- a quantized DLRM version, derived from [this](https://github.com/facebookresearch/dlrm) implementation
- example code for training the quantized DLRM model
- a new quantization function: [Progressive Sign](bitorch/quantizations/progressive_sign.py)
- new features in the PyTorch Lightning example:
  - training with knowledge distillation
  - improved logging
  - a callback to update the Progressive Sign module
  - the option to integrate custom models, datasets, and quantization functions
- a quantization scheduler, which lets you change quantization methods during training
- a padding layer
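The Progressive Sign function gradually hardens a smooth sign approximation over the course of training (see [Progressive Sign](bitorch/quantizations/progressive_sign.py) for the actual implementation). Below is a minimal sketch of the idea, assuming a tanh-based soft sign whose steepness grows with training progress; the function name, signature, and formula here are illustrative and not bitorch's exact API:

```python
import math

def progressive_sign(x: float, progress: float, max_scale: float = 100.0) -> float:
    """Soft sign that sharpens as training progresses.

    progress in [0, 1]: at 0 the function is a gentle tanh,
    at 1 it is close to a hard -1/+1 sign.
    Illustrative sketch only -- not bitorch's exact formula.
    """
    scale = 1.0 + progress * (max_scale - 1.0)
    return math.tanh(scale * x)
```

Early in training the gradient-friendly soft shape dominates; as `progress` approaches 1 the output saturates toward the binary values used at inference time.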
### Changed

- requirements changed:
  - the code now depends on torch 1.12.x and torchvision 0.13.x
  - requirements for the examples are now stored in their respective folders
  - the optional requirements now install everything needed to run all examples
- the code is now formatted with the black code formatter
- using PyTorch's implementation of RAdam
- renamed the `bitwidth` attribute of quantization functions to `bit_width`
- moved the image datasets out of the bitorch core package into the image classification example
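Downstream code that read the old `bitwidth` attribute needs a one-word update after the rename. A hypothetical before/after sketch (the `Sign` class below is a stand-in for illustration, not bitorch's real class):

```python
class Sign:
    """Stand-in for a quantization function after the rename."""
    bit_width = 1  # was: bitwidth

quantization = Sign()
# old code: quantization.bitwidth  -> raises AttributeError after updating
value = quantization.bit_width
```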
### Fixed

- an error caused by the updated protobuf package