There are two main features in this release:
## Hyperparameter Optimization
Hyperparameters can now be optimized, either on the best pipeline found via the `skplumber.SKPlumber.crank(..., tune=True)` API or on any single pipeline using the `skplumber.tuners.ga.ga_tune` method. This is accomplished via the `flexga` package and the hyperparameter annotations that have been added to all of the machine learning primitives. A sketch of both entry points is shown below.
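A minimal sketch of how this might look. The `tune=True` flag and the `ga_tune` import path come from this release; the other argument names (`problem`, the data-loading helpers) and the return values of `crank` are illustrative assumptions and may differ from the actual API:

```python
import pandas as pd
from sklearn.datasets import load_diabetes
from skplumber import SKPlumber
from skplumber.tuners.ga import ga_tune

# Load a small regression dataset as pandas structures.
X, y = load_diabetes(return_X_y=True)
X, y = pd.DataFrame(X), pd.Series(y)

# Search for a pipeline and tune the best one's hyperparameters.
# Arguments other than `tune=True`, and the returned tuple, are assumptions.
plumber = SKPlumber()
best_pipeline, best_score = plumber.crank(X, y, problem="regression", tune=True)

# Or tune a single, already-constructed pipeline directly.
# The keyword arguments here are assumptions as well.
ga_tune(best_pipeline, X, y, problem="regression")
```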
## Custom Evaluation
Previously, `skplumber.SKPlumber.crank` could only evaluate pipelines with k-fold cross validation. Now, any other pipeline evaluation method can be used by passing in a custom evaluator, e.g. `skplumber.SKPlumber.crank(..., evaluator=my_evaluator)` (see the sketch below). `skplumber` provides built-in evaluators for k-fold cross validation, simple train/test splitting, and down-sampled train/test splitting.
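A rough sketch of a custom evaluator. The `evaluator=` keyword is from this release; the evaluator's `(pipeline, X, y, metric)` signature, the metric's `score` interface, and the `problem` argument are assumptions for illustration only:

```python
import pandas as pd
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from skplumber import SKPlumber

X, y = load_diabetes(return_X_y=True)
X, y = pd.DataFrame(X), pd.Series(y)

def my_evaluator(pipeline, X, y, metric):
    """Hypothetical evaluator: a single 80/20 train/test split.

    The (pipeline, X, y, metric) signature is an assumption about what
    `crank` passes to a custom evaluator.
    """
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    pipeline.fit(X_train, y_train)
    predictions = pipeline.predict(X_test)
    return metric.score(y_test, predictions)  # assumed metric interface

plumber = SKPlumber()
plumber.crank(X, y, problem="regression", evaluator=my_evaluator)
```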