---------------------------------------
New features:
* Downbeat tracking based on a Recurrent Neural Network (RNN) and a Dynamic
  Bayesian Network (DBN) (130)
* Convolutional Neural Networks (CNNs) and CNN-based onset detection (133)
* Linear-Chain Conditional Random Field (CRF) implementation (144)
* Deep Neural Network (DNN) based chroma vector extraction (148)
* CRF chord recognition using DNN chroma vectors (148)
* CNN chord recognition using CRF decoding (152)
* Initial Windows support (Python 2.7 only, no pip packages yet) (157)
* Gated Recurrent Unit (GRU) network layer (167)
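Several of the features above (the DBN downbeat tracker and the new CRF modules) decode the most likely state sequence with Viterbi-style dynamic programming. As a generic, library-independent sketch of that decoding step (all names below are illustrative and not part of madmom's API):

```python
def viterbi(obs_scores, trans):
    """Best state path for a linear-chain model (toy sketch).

    obs_scores: per-frame lists of scores, one score per state;
    trans[i][j]: score for moving from state i to state j.
    """
    n_states = len(obs_scores[0])
    score = list(obs_scores[0])
    back = []
    for frame in obs_scores[1:]:
        ptr, new = [], []
        for j in range(n_states):
            # pick the best predecessor state for state j
            best_i = max(range(n_states), key=lambda i: score[i] + trans[i][j])
            ptr.append(best_i)
            new.append(score[best_i] + trans[best_i][j] + frame[j])
        back.append(ptr)
        score = new
    # backtrack from the best final state
    state = max(range(n_states), key=lambda j: score[j])
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]

# three frames, two states, slightly "sticky" transitions
best_path = viterbi([[0, 1], [1, 0], [0, 1]],
                    [[0.5, 0], [0, 0.5]])  # -> [1, 0, 1]
```

The real implementations differ in detail (log-probabilities, pruning, model-specific state spaces), but the decoding principle is the same.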
Bug fixes:
* Fix downbeat output bug (128)
* Fix MIDI file creation bug (166)
API relevant changes:
* Refactored `ml.rnn` into `ml.nn` and converted the models to pickles (110)
* Reordered the dimensions of `comb_filters` to time, freq, tau (135)
* `write_notes` uses `delimiter` instead of `sep` to separate columns (155)
* `LSTMLayer` takes `Gate` objects as arguments; all layers are callable (161)
* Replaced the `online` parameter of `FramedSignalProcessor` with `origin` (169)
Other changes:
* Added classes for onset/note/beat detection with RNNs to `features.*` (118)
* Added examples to the docstrings of classes (119)
* Converted `madmom.modules` into a Python package (125)
* `match_files` can handle inexact matches (137)
* Updated beat tracking models to the MIREX 2015 versions (146)
* Tempo and time signature can be set for created MIDI files (166)