- Improvements to the Björck initializers.
- Stride handling in convolutional layers.
Bug fixes
----------

- Fixed a bug in `ScaledL2NormPooling` that caused `nan` values to appear after the first training step.
1.0.0
-----
Controlling the Lipschitz constant of a layer or a whole neural network has many applications ranging from adversarial robustness to Wasserstein distance estimation.
This library provides an implementation of k-Lipschitz layers for Keras.
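As a minimal sketch of how such layers can be used, the example below builds a small classifier with a global Lipschitz bound. It assumes that a `SpectralDense` layer, a Lipschitz-aware `Sequential` wrapper in `deel.lip.model`, and a `k_coef_lip` argument are exposed as shown; adapt the imports to the actual package layout.

```python
# Minimal sketch (assumed API: `SpectralDense`, `deel.lip.model.Sequential`,
# and the `k_coef_lip` argument); adjust to the library's real entry points.
import tensorflow as tf
from deel.lip.layers import SpectralDense
from deel.lip.model import Sequential

# A small fully connected classifier whose overall Lipschitz constant is
# bounded by `k_coef_lip`: each spectral layer is constrained so that the
# product of per-layer Lipschitz constants respects the global bound.
model = Sequential(
    [
        tf.keras.layers.Input(shape=(28 * 28,)),
        SpectralDense(64, activation="relu"),  # ReLU is itself 1-Lipschitz
        SpectralDense(10),
    ],
    k_coef_lip=1.0,  # target Lipschitz constant of the whole network
)

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.summary()
```

The constrained layers are drop-in replacements for their standard Keras counterparts, so the usual `fit`/`evaluate` workflow applies unchanged.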