- Gradient checking added to verify the correctness of the gradients computed by autograd (a sketch of the technique appears after this list)
- Optimization algorithms added, including Momentum, RMSProp, and Adam (the Adam update rule is sketched below)
- Convolutional Neural Networks added, along with MaxPooling (a naive pooling sketch appears after this list)
- Save models and weights to disk and load them whenever required
- Add checkpoints while training to prevent loss of trained weights in the event of catastrophes like running out of battery (a save/checkpoint sketch appears after this list)
- Tests added for layers, loss functions, and activations
- Documentation available at neograd.readthedocs.io/
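
Gradient checking compares autograd's analytical gradients against numerical gradients from central finite differences. Below is a minimal, framework-independent sketch of the technique in NumPy; the function name `gradient_check` and its signature are illustrative here, not neograd's actual API.

```python
import numpy as np

def gradient_check(f, x, analytical_grad, eps=1e-7):
    """Compare the analytical gradient of a scalar-valued f against
    central finite differences, returning the relative error."""
    numerical_grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps
        f_plus = f(x)                 # f with this coordinate nudged up
        x[idx] = orig - eps
        f_minus = f(x)                # f with this coordinate nudged down
        x[idx] = orig                 # restore the original value
        numerical_grad[idx] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    num = np.linalg.norm(analytical_grad - numerical_grad)
    den = np.linalg.norm(analytical_grad) + np.linalg.norm(numerical_grad)
    return num / (den + 1e-12)        # small (< ~1e-6) means they agree

# Example: f(x) = sum(x**2) has exact gradient 2*x
x = np.random.randn(3, 4)
print(gradient_check(lambda t: (t ** 2).sum(), x, 2 * x))
```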
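
Adam combines Momentum's running mean of gradients with RMSProp's running mean of squared gradients, plus a bias correction for the early steps. A minimal NumPy sketch of the update rule follows; the names are illustrative and this is not neograd's optimizer code.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are running moments, t is the 1-based step."""
    m = beta1 * m + (1 - beta1) * grad        # Momentum: running mean of grads
    v = beta2 * v + (1 - beta2) * grad ** 2   # RMSProp: running mean of grad^2
    m_hat = m / (1 - beta1 ** t)              # bias-correct the warm-up steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(w) = w^2 (gradient 2w) starting from w = 5
w = np.array(5.0)
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
print(w)  # driven toward 0
```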
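
MaxPooling downsamples a feature map by taking the maximum over each window. A naive NumPy sketch (explicit loops, no striding tricks), assuming an NCHW layout; this illustrates the operation, not neograd's implementation.

```python
import numpy as np

def maxpool2d(x, kernel=2, stride=2):
    """Naive max pooling over a (N, C, H, W) array."""
    n, c, h, w = x.shape
    out_h = (h - kernel) // stride + 1
    out_w = (w - kernel) // stride + 1
    out = np.empty((n, c, out_h, out_w), dtype=x.dtype)
    for i in range(out_h):
        for j in range(out_w):
            hs, ws = i * stride, j * stride
            window = x[:, :, hs:hs + kernel, ws:ws + kernel]
            out[:, :, i, j] = window.max(axis=(2, 3))  # max over each window
    return out

x = np.arange(16, dtype=float).reshape(1, 1, 4, 4)
print(maxpool2d(x))  # [[[[ 5.  7.] [13. 15.]]]]
```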
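
Saving weights and periodic checkpointing follow the same pattern: serialize the parameters (and any bookkeeping such as the epoch count) to disk at intervals so training can resume after an interruption. A generic sketch using pickle; the function names and file layout are assumptions for illustration, not neograd's actual save/load API.

```python
import pickle

def save_checkpoint(path, params, epoch):
    """Serialize parameters and the current epoch so training can resume."""
    with open(path, 'wb') as f:
        pickle.dump({'params': params, 'epoch': epoch}, f)

def load_checkpoint(path):
    """Restore a previously saved checkpoint from disk."""
    with open(path, 'rb') as f:
        return pickle.load(f)

# Toy usage: checkpoint every epoch so an interruption loses little work
params = {'w': [0.1, 0.2], 'b': [0.0]}
for epoch in range(3):
    # ... a real loop would update params here ...
    save_checkpoint('ckpt.pkl', params, epoch)

state = load_checkpoint('ckpt.pkl')
print(state['epoch'], state['params'])
```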