This release adds quantization support for all of the Bayesian convolutional layers listed below, in addition to the previously supported Conv2dReparameterization and Conv2dFlipout:
* Conv1dReparameterization
* Conv3dReparameterization
* ConvTranspose1dReparameterization
* ConvTranspose2dReparameterization
* ConvTranspose3dReparameterization
* Conv1dFlipout
* Conv3dFlipout
* ConvTranspose1dFlipout
* ConvTranspose2dFlipout
* ConvTranspose3dFlipout
This release also includes fixes for the following issues:
* https://github.com/IntelLabs/bayesian-torch/issues/27
* https://github.com/IntelLabs/bayesian-torch/issues/21
* https://github.com/IntelLabs/bayesian-torch/issues/24
* https://github.com/IntelLabs/bayesian-torch/issues/34
## What's Changed
* Add quant prepare functions 342ca39b61814d702a6a6bef15981ca2e139dd8f
* Fix a bug in post-training quantization evaluation caused by JIT tracing f5c7126cb80ed7dc86b3c6dd55bc5c006d64e25a
* Add a quantization example for ImageNet/ResNet-50 3e749142f9ff9eaf2e93701348884d88cc2b6375
* Correct the order of the groups and dilation parameters in Conv transpose layers 97ba16ad044022035ba22b17aa279f2d389129eb
**Full Changelog**: https://github.com/IntelLabs/bayesian-torch/compare/v0.4.0...v0.5.0