------------------
* Added ``rel_tol`` and ``abs_tol`` parameters to ``testing.assertions`` for
  consistency with PEP 485 and to handle real-world testing situations that
  require an absolute tolerance.
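PEP 485 defines the closeness test these parameters follow; the stdlib
``math.isclose`` implements the same semantics and illustrates why a relative
tolerance alone is not enough near zero (a sketch of the semantics, not
pygeoprocessing's code):

```python
import math

# PEP 485 closeness: |a - b| <= max(rel_tol * max(|a|, |b|), abs_tol).
# A purely relative tolerance scales with magnitude, so comparisons
# against 0.0 always fail unless an absolute floor is supplied.
print(math.isclose(0.0, 1e-12))                    # False: rel_tol * 0 is 0
print(math.isclose(0.0, 1e-12, abs_tol=1e-9))      # True: absolute floor catches it
print(math.isclose(1000.0, 1000.1, rel_tol=1e-3))  # True: within 0.1% relative
```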
* Removed calls to ``logging.basicConfig`` throughout pygeoprocessing. Client
applications may need to adjust their logging if pygeoprocessing's log
messages are desired.
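A client application that still wants pygeoprocessing's log output can attach
its own handler. A minimal stdlib-only sketch, assuming the package logs under
the ``pygeoprocessing`` logger namespace (the conventional per-package name):

```python
import io
import logging

# Capture log output in a buffer; a real client would use a
# StreamHandler on stderr or a FileHandler instead.
log_stream = io.StringIO()
handler = logging.StreamHandler(log_stream)
handler.setFormatter(logging.Formatter('%(name)s %(levelname)s %(message)s'))

# Configure the package-level logger; child loggers propagate up to it.
logger = logging.getLogger('pygeoprocessing')
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info('example message')
print(log_stream.getvalue().strip())
```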
* Added a flag to ``aggregate_raster_values_uri`` to indicate that the
  incoming polygons do not overlap, or that the user does not care about
  overlap. This avoids a computational and memory bottleneck in calculating
  the polygon disjoint sets when it is known a priori that such a check is
  unnecessary.
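The avoided bottleneck is essentially a pairwise overlap test over all
polygons. A hypothetical greedy sketch over bounding boxes (not
pygeoprocessing's actual algorithm) shows why the cost grows quadratically
with the number of features:

```python
def disjoint_sets(bboxes):
    """Greedily group (xmin, ymin, xmax, ymax) boxes so no two boxes in
    the same group overlap; worst case is O(n^2) overlap tests, which is
    the work the flag lets callers skip."""
    def overlaps(a, b):
        # Boxes that merely touch at an edge are treated as disjoint.
        return not (a[2] <= b[0] or b[2] <= a[0] or
                    a[3] <= b[1] or b[3] <= a[1])

    groups = []
    for box in bboxes:
        for group in groups:
            if not any(overlaps(box, other) for other in group):
                group.append(box)
                break
        else:
            groups.append([box])
    return groups

boxes = [(0, 0, 2, 2), (1, 1, 3, 3), (5, 5, 6, 6)]
print(len(disjoint_sets(boxes)))  # 2: the overlapping pair must be split up
```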
* Fixed an issue where in some cases different nodata values for 'signal' and
'kernel' would cause incorrect convolution results in ``convolve_2d_uri``.
* Added functionality to ``pygeoprocessing.iterblocks`` to iterate over the
  largest memory-aligned block that fits within the number of elements given
  by the parameter. With the default parameters, this caps memory use at
  roughly 16MB per band.
* Added functionality to ``pygeoprocessing.iterblocks`` to return only the
  offset dictionary. This is useful when memory-aligned writes are desired
  without first reading arrays from the band.
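The shape of the offset dictionaries such iteration yields can be sketched in
pure Python; ``iter_block_offsets`` below is a hypothetical helper, not
pygeoprocessing's API, though the GDAL-style offset keys are conventional:

```python
def iter_block_offsets(raster_x_size, raster_y_size,
                       block_x_size, block_y_size):
    """Yield GDAL-style offset dicts covering a raster in aligned blocks,
    clipping the final row/column of blocks at the raster edge."""
    for yoff in range(0, raster_y_size, block_y_size):
        win_ysize = min(block_y_size, raster_y_size - yoff)
        for xoff in range(0, raster_x_size, block_x_size):
            win_xsize = min(block_x_size, raster_x_size - xoff)
            yield {'xoff': xoff, 'yoff': yoff,
                   'win_xsize': win_xsize, 'win_ysize': win_ysize}

# A 10x7 raster in 4x4 blocks: 3 blocks across, 2 down, edges clipped.
offsets = list(iter_block_offsets(10, 7, 4, 4))
print(len(offsets))   # 6
print(offsets[-1])    # {'xoff': 8, 'yoff': 4, 'win_xsize': 2, 'win_ysize': 3}
```

Each dict can be passed directly to a ``band.ReadAsArray``/``WriteArray``-style
call, which is what makes offset-only iteration useful for aligned writes.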
* Refactored ``pygeoprocessing.convolve_2d_uri`` to use ``iterblocks`` to
  take advantage of large block sizes for the FFT summing-window method.
* Refactored the source tree to move source files from [REPO]/pygeoprocessing
  to [REPO]/src/pygeoprocessing.
* Added a pavement script with routines to fetch SVN test data, build a
  virtual environment, and clean the environment on a Windows-based operating
  system.
* Added ``transform_bounding_box`` to calculate the largest projected
  bounding box given the four corners in a local coordinate system.
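The approach can be illustrated by transforming the four corners and taking
their axis-aligned envelope. The sketch below substitutes a simple rotation
for a real OSR coordinate transform, and is not pygeoprocessing's
implementation:

```python
import math

def transformed_bbox(bbox, transform_fn):
    """Return the axis-aligned envelope of a bounding box's four corners
    after applying transform_fn (a stand-in for reprojection)."""
    xmin, ymin, xmax, ymax = bbox
    corners = [(xmin, ymin), (xmin, ymax), (xmax, ymin), (xmax, ymax)]
    points = [transform_fn(x, y) for x, y in corners]
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def rotate45(x, y):
    # Rotating a unit square by 45 degrees widens its envelope,
    # mimicking how a projected bounding box can grow.
    c = s = math.sqrt(2) / 2
    return (x * c - y * s, x * s + y * c)

print(transformed_bbox((0, 0, 1, 1), rotate45))
```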
* Removed GDAL and Shapely from the hard requirements in setup.py. This
  allows pygeoprocessing to be built by package managers like pip without
  these two packages being installed. GDAL and Shapely must still be
  installed for pygeoprocessing to run as expected.
* Fixed a defect in ``pygeoprocessing.testing.assert_checksums_equal``
preventing BSD-style checksum files from being analyzed correctly.
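BSD-style checksum lines take the form ``ALGORITHM (filename) = digest``
rather than GNU's ``digest  filename``. A hypothetical parser sketch (not the
library's actual code) shows the two layouts being handled:

```python
import re

# BSD style: "MD5 (file.txt) = d41d8cd98f00b204e9800998ecf8427e"
# GNU style: "d41d8cd98f00b204e9800998ecf8427e  file.txt"
BSD_LINE = re.compile(r'^(\w+) \((.+)\) = ([0-9a-fA-F]+)$')

def parse_checksum_line(line):
    """Return (filename, digest) from a BSD- or GNU-style checksum line."""
    match = BSD_LINE.match(line.strip())
    if match:
        return match.group(2), match.group(3)
    digest, filename = line.split(None, 1)
    return filename.strip(), digest

print(parse_checksum_line('MD5 (raster.tif) = d41d8cd98f00b204e9800998ecf8427e'))
print(parse_checksum_line('d41d8cd98f00b204e9800998ecf8427e  raster.tif'))
```

Both lines parse to the same ``(filename, digest)`` pair, which is the
property an assertion over checksum files depends on.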
* Fixed an issue in ``reclassify_dataset_uri`` that would raise an exception
  if the incoming raster did not have a nodata value defined.
* Fixed a defect in ``pygeoprocessing.geoprocessing.get_lookup_from_csv``
  where the dialect could not be detected when analyzing a CSV larger than 1K
  in size. This fix enables correct detection of comma- or semicolon-delimited
  CSV files, so long as the header row by itself is not larger than 1K.
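The header-row limitation follows from sniffing the dialect on a bounded
sample. The stdlib ``csv.Sniffer`` shows the approach: sniff only the header
line so the sample stays small regardless of file size (a sketch of the idea,
not the actual implementation):

```python
import csv

def detect_dialect(csv_text):
    """Sniff the delimiter from just the header row, so the sample size
    is bounded by the header no matter how large the file body is."""
    header = csv_text.splitlines()[0]
    return csv.Sniffer().sniff(header, delimiters=',;')

comma_file = 'id,name,value\n1,foo,2.5\n'
semicolon_file = 'id;name;value\n1;foo;2.5\n'
print(detect_dialect(comma_file).delimiter)      # ,
print(detect_dialect(semicolon_file).delimiter)  # ;
```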
* Intra-package imports are now relative. This addresses an import issue for
  users with multiple copies of pygeoprocessing installed across multiple
  Python installations.
* Exposed Cython routing functions so they may be imported from C modules.
* ``get_lookup_from_csv`` attempts to determine the dialect of the CSV instead
of assuming comma delimited.
* Added relative numerical tolerance parameters to the PyGeoprocessing raster
  and CSV tests in the same API style as ``numpy.testing.assert_allclose``.
* Fixed an incompatibility with the GDAL 1.11.3 bindings, which expect a
  boolean type in ``band.ComputeStatistics``. Before this fix,
  PyGeoprocessing would crash with a ``TypeError`` on many operations.
* Fixed a defect in ``pygeoprocessing.routing.calculate_transport`` where the
  nodata values were cast to int even though the base type of the routing
  rasters was float. In extreme cases this could cause a crash on a value
  that could not be converted to an int, like ``inf``, and in subtle cases it
  would cause nodata values in the raster to be ignored during routing.
* Added functions to construct raster and vectors on disk from reasonable
datatypes (numpy matrices for rasters, lists of Shapely geometries for
vectors).
* Fixed an issue where ``reproject_datasource_uri`` would add geometry that
  could not be projected directly into the output datasource. The function
  now adds only geometries that transform without error and reports whether
  any features failed to transform.
* Added file flushing and SWIG dataset deletion in
  ``reproject_datasource_uri`` to handle a race condition that might have
  been occurring.
* Fixed an issue where passing ``None`` on new raster creation would attempt
  to set ``None`` directly as the nodata value in the raster.
* Added basic filetype-specific assertions for many geospatial filetypes, and
tests for these assertions. These assertions are exposed in
``pygeoprocessing.testing``.
* Pygeoprocessing package tests can be run by invoking
``python setup.py nosetests``. A subset of tests may also be run from an
installed pygeoprocessing distribution by calling
``pygeoprocessing.test()``.
* Fixed an issue with ``reclassify_dataset_uri`` that occurred on small
  rasters whose first memory block extended beyond the bounds of the raster,
  passing "0" values in the out-of-bounds area. Reclassify identified these
  as valid pixels even though ``vectorize_datasets`` would mask them out
  later. Now ``vectorize_datasets`` passes only memory blocks that contain
  valid pixel data to its kernel op.
* Added support for very small AOIs that result in rasters less than a pixel
  wide. Additionally, an ``all_touched`` flag was added to pass the
  ``ALL_TOUCHED=TRUE`` option to ``RasterizeLayer`` in the AOI mask
  calculation.
* Added a watershed delineation routine,
  ``pygeoprocessing.routing.delineate_watershed``. It operates on a DEM and a
  point shapefile, optionally snaps outlet points to the nearest stream as
  defined by a thresholded flow accumulation raster, and copies the outlet
  point fields into the constructed watershed shapefile.
* Fixed a memory leak in block caches that held on to dataset, band, and
  block references even after the object was destroyed.
* Added an option to ``route_flux`` that controls whether the current
  pixel's source is included in the flux. The previous version always
  included the source.
* Now using natcap.versioner for versioning instead of local versioning logic.