- GpRegressor now supports multi-start gradient-based hyper-parameter optimisation using the [L-BFGS-B algorithm](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.fmin_l_bfgs_b.html), in addition to the previously available [differential evolution](https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.differential_evolution.html#scipy.optimize.differential_evolution). The approach used is selected via the new "optimizer" keyword argument, with L-BFGS-B being the default.
- GpRegressor now supports distributed hyper-parameter optimisation using sub-process-based parallelism. The number of sub-processes over which the optimisation is distributed is set by the new "n_processes" keyword argument. Currently only the multi-start L-BFGS-B optimiser can take advantage of this, so the keyword is ignored when the differential evolution optimiser is used.
- Fixed a bug in GaussianKDE which caused a crash when fewer than 10 samples were given as input.
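The multi-start L-BFGS-B approach described above can be sketched directly with scipy. This is an illustrative sketch of the general technique, not GpRegressor's actual implementation: the objective function, bounds, and number of starts below are all made-up examples, and in the library itself the starts are what get distributed across sub-processes via "n_processes".

```python
# Sketch of multi-start L-BFGS-B optimisation using scipy directly.
# NOTE: illustrative only -- not GpRegressor's internal code. The objective,
# bounds and n_starts are hypothetical stand-ins for a marginal-likelihood
# surface over hyper-parameters.
import numpy as np
from scipy.optimize import minimize


def multistart_lbfgsb(objective, bounds, n_starts=8, seed=0):
    """Run L-BFGS-B from several random start points and keep the best result."""
    rng = np.random.default_rng(seed)
    lower = np.array([b[0] for b in bounds])
    upper = np.array([b[1] for b in bounds])
    best = None
    for _ in range(n_starts):
        x0 = rng.uniform(lower, upper)  # random start point inside the bounds
        result = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
        if best is None or result.fun < best.fun:
            best = result  # keep the lowest minimum found so far
    return best


# Example: a multi-modal objective where a single gradient-based run
# can get trapped in a poor local minimum, but multiple starts recover.
def objective(x):
    return np.sin(3.0 * x[0]) + (x[0] - 0.5) ** 2


best = multistart_lbfgsb(objective, bounds=[(-3.0, 3.0)])
```

Because each L-BFGS-B run is independent, this loop is trivially parallelisable, which is what makes the sub-process distribution of the multi-start optimiser straightforward in a way that a single differential-evolution population is not.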