## What's Changed
* Added issue templates by andreped in https://github.com/andreped/GradientAccumulator/pull/59
* Fixed bug in AccumBatchNormalizer - identical results to Keras BN by andreped in https://github.com/andreped/GradientAccumulator/pull/61
* Docs: Added AccumBN example + docs README + minor fixes by andreped in https://github.com/andreped/GradientAccumulator/pull/62
* bump v0.4.1 by andreped in https://github.com/andreped/GradientAccumulator/pull/63
## New API
You can now use gradient accumulation with the `AccumBatchNormalization` layer:
```python
from gradient_accumulator import GradientAccumulateModel, AccumBatchNormalization
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# define model and add accum BN layer
# (input_shape here is illustrative; it must be set so that model.input is defined)
model = Sequential()
model.add(Dense(32, activation="relu", input_shape=(16,)))
model.add(AccumBatchNormalization(accum_steps=8))
model.add(Dense(10))

# add gradient accumulation to the rest of the model
model = GradientAccumulateModel(accum_steps=8, inputs=model.input, outputs=model.output)
```
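The wrapped model trains like any other Keras model. Below is a minimal training sketch; the optimizer, loss, and random dummy data are illustrative and assume the 16-feature input and 10 output classes from the snippet above:

```python
import numpy as np
import tensorflow as tf

# compile and train as with any Keras model; gradients are accumulated
# over accum_steps=8 mini-batches before each weight update
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1e-2),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

# random dummy data, purely for illustration
x = np.random.rand(128, 16).astype("float32")
y = np.random.randint(0, 10, size=(128,))
model.fit(x, y, batch_size=4, epochs=1)
```

With `batch_size=4` and `accum_steps=8`, the effective batch size per weight update is 32.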
More remarks and usage details can be found at [gradientaccumulator.readthedocs.io](https://gradientaccumulator.readthedocs.io/en/latest/examples/batch_normalization.html).
**Full Changelog**: https://github.com/andreped/GradientAccumulator/compare/v0.4.0...v0.4.1