openTSNE

Latest version: v1.0.2


1.0.2

General maintenance to keep openTSNE up to date with Python versions and dependencies.

Changes

- build wheels for Python 3.12 (255)
- update minimum Python version to 3.9 (4e86511b1a2c041d122cb2869480b0c96af79d63)
- add numpy 2.x support (aa3d76c2d86055caae0601cec10dd53db7769b8e)

1.0.1

Changes

- setup.py maintenance (249)
- drop Python 3.6 support (249)
- correctly implement dof parameter in exact BH implementation (246)
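For reference, `dof` controls the degrees of freedom of the heavy-tailed output kernel. A minimal usage sketch (the parameter values below are illustrative, not taken from the fix itself):

```python
import numpy as np
from openTSNE import TSNE

x = np.random.randn(500, 20)  # toy data

# dof < 1 gives the output kernel heavier tails than standard t-SNE
# (dof=1); the fix above makes the parameter take effect in the exact
# BH code path as well.
embedding = TSNE(dof=0.6, negative_gradient_method="bh").fit(x)
```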

1.0.0

Given the longtime stability of openTSNE, it is only fitting that we release a v1.0.0.

Changes

- Various documentation fixes involving initialization, momentum, and learning rate (243)
- Include Python 3.11 in the test and build matrix
- Uniform affinity kernel now supports `mean` and `max` mode (242)
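A minimal sketch of the uniform kernel in the low-level API. Note that the `symmetrize="max"` keyword below is a guess at how the mode is selected, based only on this changelog entry; check the openTSNE documentation for the actual parameter name:

```python
import numpy as np
from openTSNE import TSNEEmbedding, affinity, initialization

x = np.random.randn(1000, 50)  # toy data

# Uniform kNN affinities; "mean" vs "max" selects how the directed kNN
# graph is symmetrized (the keyword name here is an assumption).
affinities = affinity.Uniform(x, k_neighbors=30, symmetrize="max")

embedding = TSNEEmbedding(initialization.pca(x), affinities)
embedding.optimize(n_iter=250, exaggeration=12, inplace=True)
embedding.optimize(n_iter=500, inplace=True)
```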

0.7.1

Bug Fixes

- (urgent) Fix memory error on data with duplicated rows (236)

0.7.0

Changes

- By default, we now add jitter to non-random initialization schemes. This has almost no effect on the resulting visualizations, but it helps avoid potential problems when points are initialized at identical positions (225); see the sketch after this list.
- By default, the learning rate is now calculated as `N/exaggeration`. This speeds up convergence of the resulting embedding. Note that the learning rate during the early exaggeration (EE) phase will therefore differ from the learning rate during the standard phase. Additionally, we now set `momentum=0.8` in both phases, where previously it was 0.5 during EE and 0.8 during the standard phase. This, again, speeds up convergence. (220)
- Add `PrecomputedAffinities` to wrap square affinity matrices (217)
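A plain-numpy sketch of what the jitter and learning-rate defaults above amount to; the jitter scale is an illustrative assumption, not the exact value openTSNE uses:

```python
import numpy as np

def default_learning_rate(n_samples, exaggeration):
    # New default: lr = N / exaggeration, so the learning rate differs
    # between the early exaggeration (EE) phase and the standard phase.
    return n_samples / exaggeration

n = 10_000
ee_lr = default_learning_rate(n, exaggeration=12)       # EE phase (default exaggeration)
standard_lr = default_learning_rate(n, exaggeration=1)  # standard phase

# Jitter for non-random initializations: a small Gaussian perturbation
# ensures no two points start at identical positions. The 1e-4 scale is
# an assumption for illustration only.
init = np.random.randn(n, 2)  # stand-in for e.g. a PCA initialization
init += np.random.normal(0, 1e-4, size=init.shape)
```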
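And a minimal sketch of `PrecomputedAffinities` with the low-level API; the toy affinity matrix and optimization schedule are only illustrative:

```python
import numpy as np
from openTSNE import TSNEEmbedding, initialization
from openTSNE.affinity import PrecomputedAffinities

n = 1000
# Any square, symmetric affinity matrix, e.g. from a custom kernel.
P = np.abs(np.random.randn(n, n))
P = (P + P.T) / 2

affinities = PrecomputedAffinities(P)
embedding = TSNEEmbedding(
    initialization.random(n, random_state=0),  # PCA init would need raw data
    affinities,
)
embedding.optimize(n_iter=250, exaggeration=12, inplace=True)
embedding.optimize(n_iter=500, inplace=True)
```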

Build changes

- Build `universal2` macOS wheels, enabling ARM support (226)

Bug Fixes

- Fix BH collapse for smaller data sets (235)
- Fix `updates` in optimizer not being stored correctly between optimization calls (229)
- Fix `inplace=True` optimization changing the initializations themselves in some rare use-cases (225)

As usual, a special thanks to dkobak for helping with practically all of these bugs/changes.

0.6.2

Changes

- By default, we now use the `MultiscaleMixture` affinity model, enabling us to pass in a list of perplexities instead of a single perplexity value. This is fully backwards compatible; see the first sketch after this list.
- Previously, perplexity values would be silently adjusted to the dataset size: e.g. passing `perplexity=100` with N=150 would set `TSNE.perplexity` to 50. Instead, we now keep this value as is and add an `effective_perplexity_` attribute (following the convention from scikit-learn) that holds the corrected perplexity value.
- Fix a bug where the interpolation grid was prepared even when using BH optimization during `transform`.
- Enable calling `.transform` with precomputed distances. In this case, the data matrix is assumed to be a distance matrix; see the second sketch after this list.
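A minimal sketch of the list-of-perplexities change in the high-level API:

```python
import numpy as np
from openTSNE import TSNE

x = np.random.randn(1000, 50)  # toy data

# A single perplexity behaves exactly as before (backwards compatible);
# a list of values combines several scales via MultiscaleMixture.
embedding_single = TSNE(perplexity=30).fit(x)
embedding_multi = TSNE(perplexity=[30, 300]).fit(x)
```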
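And a sketch of `.transform` with precomputed distances; here the model is assumed to have been fit with `metric="precomputed"`, and a non-PCA initialization is used because PCA initialization needs the raw data:

```python
import numpy as np
from scipy.spatial.distance import cdist
from openTSNE import TSNE

x_train = np.random.randn(500, 10)
x_new = np.random.randn(20, 10)

# Fit on a square (n_train, n_train) distance matrix.
d_train = cdist(x_train, x_train)
embedding = TSNE(metric="precomputed", initialization="random").fit(d_train)

# New points are embedded from their distances to the training points,
# shape (n_new, n_train); the matrix is treated as distances, not data.
d_new = cdist(x_new, x_train)
new_points = embedding.transform(d_new)
```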

Build changes

- Build with `oldest-supported-numpy`
- Build linux wheels on `manylinux2014` instead of `manylinux2010`, following numpy's example
- Build macOS wheels on the `macos-10.15` Azure VM instead of `macos-10.14`
- Fix a potential problem with clang-13, which, under the `-ffast-math` flag, actually optimizes code that relies on infinities
