What’s Changed
* Allow basic operations (e.g., addition and multiplication) in evaluation parameter expressions, whether or not they reference other stored evaluation parameters.
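  For example, an expectation argument could combine a stored evaluation parameter with a constant. A minimal sketch, assuming `batch` is an existing Great Expectations dataset; the parameter name "upstream_row_count" and the values are illustrative, not taken from this release:

  ```python
  # Store an evaluation parameter on the batch (illustrative name and value).
  batch.set_evaluation_parameter("upstream_row_count", 10000)

  # Use arithmetic expressions over that parameter as expectation arguments.
  batch.expect_table_row_count_to_be_between(
      min_value={"$PARAMETER": "upstream_row_count * 0.9"},
      max_value={"$PARAMETER": "upstream_row_count * 1.1"},
  )
  ```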
* When unexpected exceptions occur (e.g., during data docs rendering), the user now sees a detailed error message that identifies the specific issue and includes the stack trace.
* Remove the "project new" option from the command line (since it is not implemented; users can only run "init" to create a new project).
* Update type detection for BigQuery based on driver changes in pybigquery 0.4.14, and add a warning for users running an older pybigquery driver.
* Add execution tests to the NotebookRenderer to mitigate codegen risks.
* Add option "persist", true by default, for SparkDFDataset to persist the DataFrame it is passed. This addresses 1133 in a deeper way (thanks tejsvirai for the robust debugging support and reproduction on Spark).
  - Disabling this option should *only* be done if the user has *already* externally persisted the DataFrame, or if the dataset is too large to persist but *computations are guaranteed to be stable across jobs* (see the sketch below).
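  A minimal opt-out sketch, assuming `df` is a Spark DataFrame that has already been persisted externally; the expectation and column name are illustrative:

  ```python
  from great_expectations.dataset import SparkDFDataset

  # persist=True is the default; pass persist=False only when the DataFrame is
  # already persisted externally or recomputation is stable across jobs.
  dataset = SparkDFDataset(df, persist=False)
  dataset.expect_column_values_to_not_be_null("id")  # illustrative column name
  ```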
* Enable passing dataset kwargs through a datasource via the "dataset_options" batch_kwarg.
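  A minimal sketch, assuming an existing DataContext named `context`; the datasource name, path, and suite name are placeholders:

  ```python
  # Forward dataset kwargs (here, "persist") through the datasource when requesting a batch.
  batch_kwargs = {
      "datasource": "my_spark_datasource",    # placeholder datasource name
      "path": "/data/events.parquet",         # placeholder file path
      "dataset_options": {"persist": False},  # kwargs forwarded to the dataset
  }
  batch = context.get_batch(batch_kwargs, expectation_suite_name="my_suite")
  ```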
* Data Docs: fix bug that was causing erratic scrolling behavior when table of contents contains many columns
* Make expectation suite JSON deterministically human-parseable (1247) mzjp2
* Change the verb of the "project" CLI command in docs (1237) Bussmeyer
* Fix AttributeError when validating expectations from a JSON file (1238) eorbe
* Fix link to typical_workflow (1227) jvanbuel
* Remove Python 2 tags from setup.py (1229) mzjp2