Implemented various layers for convenience.
In addition to `direct_norm`, we now have native Lipschitz and monotonic layers:
The `LipschitzLinear` class is a linear layer with a Lipschitz constraint on its weights.
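As a rough illustration of how a weight-based Lipschitz constraint can work, here is a minimal NumPy sketch that rescales a weight matrix so the induced norm stays below a target constant. The infinity-norm scheme and the function name `lipschitz_normalize` are illustrative assumptions, not the library's actual implementation (which supports several normalization kinds):

```python
import numpy as np

def lipschitz_normalize(W, lipschitz_const=1.0):
    """Rescale W so that the linear map x -> W @ x has an operator norm
    (here: the l-inf induced norm, i.e. max absolute row sum) of at most
    lipschitz_const. Weights already inside the bound are left unchanged.
    Illustrative sketch only; normalization scheme is an assumption."""
    norm = np.abs(W).sum(axis=1).max()  # ||W||_inf
    return W / max(1.0, norm / lipschitz_const)

W = np.random.randn(4, 3)
W_hat = lipschitz_normalize(W, lipschitz_const=1.0)
# every row of W_hat now has absolute sum <= 1
```

Because the scaling factor is `max(1, ...)`, the constraint only shrinks weights, never inflates them.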
The `MonotonicLayer` class is a linear layer with a Lipschitz constraint on its weights and monotonicity constraints that can be specified for each input dimension, or for each input-output pair.
The `MonotonicWrapper` class wraps a module with a known Lipschitz constant. It adds a term to the module's output that enforces the monotonicity constraints given by `monotone_constraints`, yielding a module that is monotonic and Lipschitz with constant `lipschitz_const`.
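The added term can be sketched as follows: if each partial derivative of the wrapped function is bounded by the Lipschitz constant, adding `lipschitz_const * c_i * x_i` for each constrained input `i` dominates that bound and forces the required monotone direction. This NumPy sketch is a conceptual illustration under that assumption, not the library's implementation:

```python
import numpy as np

def monotonic_wrapper(f, monotone_constraints, lipschitz_const=1.0):
    """Turn a function f whose per-coordinate slopes are bounded by
    lipschitz_const into a monotone one. monotone_constraints holds one
    entry per input: +1 (increasing), -1 (decreasing), or 0 (free).
    Illustrative sketch; names and mechanism are assumptions."""
    c = np.asarray(monotone_constraints, dtype=float)

    def g(x):
        x = np.asarray(x, dtype=float)
        # residual term: slope lipschitz_const * c_i dominates f's slope
        return f(x) + lipschitz_const * (x * c).sum(axis=-1)

    return g

# toy function with per-coordinate slopes bounded by 1
f = lambda x: np.sin(x).mean(axis=-1)
g = monotonic_wrapper(f, monotone_constraints=[1, -1], lipschitz_const=1.0)
```

Here `g` is increasing in its first input and decreasing in its second, regardless of how `f` wiggles within its Lipschitz bound.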
The `SigmaNet` class is deprecated; it is equivalent to `MonotonicWrapper`.
The `RMSNorm` class implements the RMSNorm normalization layer. It can help when training a model with many Lipschitz/monotonic layers.
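For reference, the standard RMSNorm computation divides each feature vector by its root mean square, optionally scaled by a learned gain. This minimal NumPy sketch shows the computation; the library's class may differ in details such as parameter handling:

```python
import numpy as np

def rms_norm(x, gamma=None, eps=1e-8):
    """RMSNorm: rescale x by its root mean square over the last axis.
    gamma is an optional per-feature learned gain. Minimal sketch."""
    rms = np.sqrt((x ** 2).mean(axis=-1, keepdims=True) + eps)
    y = x / rms
    return y if gamma is None else y * gamma

x = np.array([[3.0, 4.0]])
y = rms_norm(x)
# after normalization, the mean of y**2 along the feature axis is ~1
```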
Updated the main README.md with more details and examples.
Updated plots and added code for the toy figures in the paper.
*Breaking Changes!*
`monotonenorm` has been renamed to `monotonicnetworks`.
The PyPI package follows the same renaming. The old package name will still exist but will be marked as deprecated.