Deepcell

Latest version: v0.12.10

0.10.2

Not secure
🐛 Bug Fixes

<details>
<summary>Optimize hole filling algorithm ngreenwald (114)</summary>

This PR closes 113. It replaces the previous hole-filling algorithm with an optimized, regionprops-based version. The remaining bottleneck in the post-processing is now peak finding, so the large-image warning is now raised when `h_maxima` is selected instead of `peak_local_max`.
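
As a rough illustration of the regionprops-based idea (a minimal sketch, not the PR's actual code; the function name and single-image assumption are illustrative):

```python
import numpy as np
from scipy.ndimage import binary_dilation
from skimage.measure import label, regionprops

def fill_small_holes(y, fill_holes_threshold):
    """Fill background holes smaller than `fill_holes_threshold` pixels
    with the label of the surrounding object (single 2D label image)."""
    filled = y.copy()
    holes = label(y == 0)  # connected components of the background
    for prop in regionprops(holes):
        if prop.area > fill_holes_threshold:
            continue
        mask = holes == prop.label
        ring = binary_dilation(mask) & ~mask  # pixels bordering the hole
        neighbors = np.unique(filled[ring])
        neighbors = neighbors[neighbors != 0]
        if len(neighbors) == 1:  # only fill holes enclosed by one object
            filled[mask] = neighbors[0]
    return filled
```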
</details>


🧰 Maintenance

<details>
<summary>Bump version to `0.10.2`. willgraf (115)</summary>


</details>

<details>
<summary>Update name from Deepcell to DeepCell to match other projects. willgraf (112)</summary>


</details>

<details>
<summary>Cache the entire Python environment to speed up build times. willgraf (111)</summary>

What
* Cache the entire Python environment in the testing GitHub Action workflow.

Why
* [Drastically speed up build times](https://medium.com/ai2-blog/python-caching-in-github-actions-e9452698e98d).

</details>

0.10.1

Not secure
🧰 Maintenance

<details>
<summary>Support Python 3.9 willgraf (110)</summary>
</details>

0.10.0

Not secure
🚀 Features

<details>
<summary>Combine `deep_watershed` and `deep_watershed_mibi` willgraf (108)</summary>

This PR combines `deep_watershed` and `deep_watershed_mibi` into a single function, `deep_watershed`.

The following arguments have been deprecated in favor of their more readable `deep_watershed_mibi` alternatives:

- `min_distance` deprecated in favor of `radius`
- `distance_threshold` deprecated in favor of `interior_threshold`
- `detection_threshold` deprecated in favor of `maxima_threshold`

Additionally, new arguments have been moved over from `deep_watershed_mibi`:

- `maxima_smooth` and `interior_smooth` to smooth the inputs with a gaussian filter.
- `fill_holes_threshold` to fill holes smaller than this many pixels.
- `pixel_expansion` to expand the `interior` array.
- `maxima_algorithm` to switch the peak finder from `h_maxima` to `peak_local_max`. By default, `deep_watershed` now uses `h_maxima` to find the peaks in an image; to use the previous algorithm, pass `peak_local_max` instead, which is faster but less accurate and may be suitable for images with unambiguous peaks.

The input to the function is still a list of two numpy arrays, though more arrays can be passed as long as the indices of the maxima and interior arrays are given with `maxima_index` and `interior_index`.

Finally, `label_erosion` was added as a parameter to enable eroding labels (as performed by `deep_watershed` by default).
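
For reference, a minimal call using the new argument names might look like the sketch below. The input shapes and threshold values are illustrative, and the module path is assumed to match this package's layout:

```python
import numpy as np
from deepcell_toolbox.deep_watershed import deep_watershed

# Toy model outputs: one 128x128 image with a single channel each.
maxima = np.random.rand(1, 128, 128, 1)    # maxima transform
interior = np.random.rand(1, 128, 128, 1)  # interior transform

labels = deep_watershed(
    [maxima, interior],
    maxima_index=0,               # which input holds the maxima transform
    interior_index=1,             # which input holds the interior transform
    radius=10,                    # replaces the deprecated `min_distance`
    maxima_threshold=0.1,         # replaces the deprecated `detection_threshold`
    interior_threshold=0.01,      # replaces the deprecated `distance_threshold`
    maxima_smooth=0,              # gaussian smoothing of the maxima input
    interior_smooth=1,            # gaussian smoothing of the interior input
    fill_holes_threshold=15,      # fill holes smaller than this many pixels
    maxima_algorithm='h_maxima',  # or 'peak_local_max' for the previous behavior
)
```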

---

This PR leaves `deep_watershed_3D` and `deep_watershed_mibi` as importable functions, but they will just use `deep_watershed` internally. These will be removed in a future version, along with the deprecated arguments.

---

Fixes 51
</details>

<details>
<summary>Refactor `deepcell_toolbox.metrics` for better reproducibility and performance. willgraf (106)</summary>

This PR significantly refactors `deepcell_toolbox.metrics.Metrics`. A `Detection` object is now created for each combination of overlapping objects in each `y_true` and `y_pred` image, and these `Detection` objects make it easier to calculate all metrics and error types on the fly. These metrics and error types are now tracked as properties on `ObjectMetrics`, rather than each having an individual attribute that is mutated with each function call. `Metrics` also gains a `summarize_object_metrics_df` method to help convert the data frame of every metric into summary statistics. This greatly cleans up the existing attributes and their instantiation; the pattern of adding attributes when helper functions are called has been removed entirely.

Extra attention was also given to cases where division by zero or other NaN issues can arise, which should prevent the spammy warnings seen in previous versions.

Performance has also improved greatly, from 60 batches/s to 300 batches/s (from 45 s to 9 s for my 2880 test batches). This was primarily accomplished through:

- Creating a single `pd.DataFrame` using `from_records`, rather than iteratively creating and appending data frames for each `ObjectMetrics` calculated.
- Using `count_nonzero` instead of `sum` where possible, which prevents coercing the datatype from boolean to float.
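
The `from_records` change follows the usual pandas pattern of building the frame once from a list of plain records rather than growing it row by row. The column names below are illustrative, not the actual metric columns:

```python
import pandas as pd

# Slow: grow a DataFrame one row at a time (copies the frame on every iteration).
# df = pd.DataFrame()
# for record in per_object_records:
#     df = pd.concat([df, pd.DataFrame([record])])

# Fast: collect plain dicts, then build the DataFrame once.
per_object_records = [
    {'n_true': 12, 'n_pred': 11, 'correct_detections': 10},
    {'n_true': 8, 'n_pred': 9, 'correct_detections': 7},
]
df = pd.DataFrame.from_records(per_object_records)
```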

⚠️ Breaking Changes ⚠️

This PR has several breaking changes to `deepcell_toolbox.metrics`:

- `seg` has been deprecated, but not removed (yet). The SEG score is always calculated.
- `stats` and `output` are no longer attributes.
- `ObjectAccuracy` is now `ObjectMetrics`.
- `stats_pixelbased` has been replaced with `PixelMetrics`.
- `all_pixel_stats` has been replaced with `calc_pixel_stats` to be consistent with `calc_object_stats`.
- `calc_pixel_confusion_matrix` has been replaced with `PixelMetrics.get_confusion_matrix`.
- `pixel_df_to_dict` has been replaced with `df_to_dict`.
- `save_error_ids`, `assign_plot_values`, and `plot_errors` have all been removed in favor of `ObjectMetrics.plot_errors`.
- `to_precision` has been removed as it is seemingly redundant with the `round` built-in.

I hope these breaking changes do not disrupt many downstream processes; the only downstream use I found was `Metrics.calc_object_stats`, which should be unaffected.
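
For downstream users, that unaffected entry point still looks roughly like the sketch below. The toy label images and the `(batch, x, y)` shape convention are assumptions for illustration:

```python
import numpy as np
from deepcell_toolbox.metrics import Metrics

# Two small label images with two objects each.
y_true = np.zeros((1, 32, 32), dtype=int)
y_true[0, 2:10, 2:10] = 1
y_true[0, 15:25, 15:25] = 2

y_pred = np.zeros((1, 32, 32), dtype=int)
y_pred[0, 3:11, 3:11] = 1
y_pred[0, 16:25, 15:24] = 2

m = Metrics('my_model')
m.calc_object_stats(y_true, y_pred)
```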

---

Fixes 26 (Object stats and `force_event_links` should likely resolve this. If not, we can re-open it or make a new issue)
Fixes 68 (Performance has improved by a factor of ~5)
</details>


🐛 Bug Fixes

<details>
<summary>Combine `deep_watershed` and `deep_watershed_3D` & deprecate `deep_watershed_3D`. willgraf (107)</summary>

The only difference in these two functions was the dimensionality of the `coords` found in `peak_local_max`. Switching to a comprehension allows us to easily use 2D or 3D data with the same function.

`deep_watershed_3D` is now deprecated (and raises a `DeprecationWarning`), but is kept for now for backward compatibility.
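
The dimension-agnostic idea can be sketched like this (an illustration of the pattern, not the PR's code; `peak_local_max` returns an `(N, ndim)` array of coordinates for both 2D and 3D inputs):

```python
import numpy as np
from skimage.feature import peak_local_max

def seed_markers(image, min_distance=10):
    """Place a unique marker at each peak; works for 2D or 3D images."""
    coords = peak_local_max(image, min_distance=min_distance)
    markers = np.zeros(image.shape, dtype=int)
    for i, coord in enumerate(coords):
        markers[tuple(int(c) for c in coord)] = i + 1  # ndim-agnostic indexing
    return markers
```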
</details>

<details>
<summary>Import `watershed` from `skimage.segmentation`. willgraf (105)</summary>

`skimage.morphology.watershed` is deprecated and removed in `skimage` v0.19. This PR imports `skimage.segmentation.watershed` instead, which has been available since at least v0.14.

This resolves an annoying deprecation warning we get every time we use the post-processing function.
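
The change is just the import path:

```python
# Deprecated (removed in skimage 0.19):
# from skimage.morphology import watershed

# Supported location:
from skimage.segmentation import watershed
```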
</details>


🧰 Maintenance

<details>
<summary>Bump version to 0.10.0 willgraf (109)</summary>


</details>

0.9.2

Not secure
Bugfixes

* Upgrade `tensorflow` to [v2.4.3](https://github.com/tensorflow/tensorflow/releases/tag/v2.4.3) to fix several CVEs.

0.9.1

Not secure
<details>
<summary>Add release-drafter to automate release drafts and changelogs. willgraf (100)</summary>

Adding a template and GitHub workflow for the [release-drafter](https://github.com/release-drafter/release-drafter). This should automatically create and update a draft for the next release of the project.
</details>


🚀 Features

<details>
<summary>Cache calculated windows to avoid unnecessary computation. willgraf (99)</summary>
</details>

<details>
<summary>Check for divide by zero ngreenwald (102)</summary>

We currently don't check that there are valid predictions when computing precision, which can lead to divide-by-zero errors.
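
A minimal sketch of the kind of guard this adds (illustrative only; the actual fix lives in the metrics code):

```python
def safe_precision(true_positives, predicted_positives):
    """Precision with a guard for the no-predictions case (avoids 0/0)."""
    if predicted_positives == 0:
        return 0.0
    return true_positives / predicted_positives
```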
</details>

0.9.0

Not secure
Features

* Add F1, precision, and recall to the metrics package. (95)

Bugfixes

* Add a warning to `Metrics` when relabeling. (70)
* Cast inputs to `relabel_sequential` as ints for `scikit-image` 0.17+ compatibility. (92)

Breaking Changes

* Remove `RetinaNet` and `RetinaMask` utility functions. (94)
* Remove `multiplex_utils.py`. (97)
* Drop support for Python 2.7. (95)
