- Added a `pynnd=True` option to use PyNNDescent for the coarse-grained affinity matrix computation (caveat: it currently hits an odd pickling error on Colab: https://github.com/lmcinnes/pynndescent/issues/133)
- Noticed that storing the agkm embeddings as `[(agkm_string_representation, value), ...]` took up a lot of space, possibly because representing the agkms as strings is memory-intensive. The embeddings are now converted to `[(agkm_idx, value), ...]` before being stored, which appears to bring down memory consumption.
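The string-to-index conversion above can be sketched as follows. This is a hypothetical illustration, not the repo's actual code: the function name `compact_embeddings` and the input layout (one list of `(agkm_string, value)` pairs per seqlet) are assumptions.

```python
def compact_embeddings(embeddings):
    """Replace agkm string keys with small integer indices.

    Returns the index-based embeddings plus the vocabulary list
    needed to recover the strings later (vocab[idx] == string).
    """
    vocab = {}  # agkm string -> integer index
    compacted = []
    for per_seqlet in embeddings:
        row = []
        for agkm_string, value in per_seqlet:
            # assign the next free index the first time a string is seen
            idx = vocab.setdefault(agkm_string, len(vocab))
            row.append((idx, value))
        compacted.append(row)
    # vocab as a list: position i holds the string for index i
    return compacted, sorted(vocab, key=vocab.get)
```

Since each distinct agkm string is stored once in the vocabulary rather than repeated in every seqlet's embedding, memory scales with the number of distinct agkms instead of the total number of (string, value) pairs.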
- Other minor changes pertaining to reporting internal hit-scoring metrics: `exclude_self` excludes the query itself when benchmarking how well the `fann_perclass` (fine-grained-affinity nearest-neighbors) method recovers the true class for motif hits, since the fine-grained affinity of a point to itself is always 1. Also added a benchmark of how well simply using the aggregate similarity works.
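To illustrate why `exclude_self` matters, here is a minimal sketch of a per-class nearest-neighbor benchmark over a square affinity matrix. The function name and signature are assumptions for illustration, not the repo's API; the point is that leaving the diagonal (self-affinity = 1) in place makes every query trivially recover its own class.

```python
import numpy as np

def perclass_nn_accuracy(affinities, labels, n_neighbors=1, exclude_self=True):
    """Fraction of points whose class is recovered by a majority
    vote among their highest-affinity neighbors (hypothetical sketch)."""
    aff = np.array(affinities, dtype=float)
    labels = np.array(labels)
    if exclude_self:
        # mask the diagonal so a point can never be its own neighbor
        np.fill_diagonal(aff, -np.inf)
    # indices of the n_neighbors highest-affinity entries per row
    nn_idx = np.argsort(-aff, axis=1)[:, :n_neighbors]
    correct = 0
    for i, neigh in enumerate(nn_idx):
        votes = labels[neigh]
        vals, counts = np.unique(votes, return_counts=True)
        if vals[np.argmax(counts)] == labels[i]:
            correct += 1
    return correct / len(labels)
```

With `exclude_self=False`, every point's top neighbor is itself (affinity 1), so the reported accuracy is inflated to 100% regardless of how informative the affinities actually are.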
- Reorganized the example notebooks that I mainly use for testing things out; moved some of the more experimental notebooks under "examples/simulated_TAL_GATA_deeplearning/other"
- Updated the leidenalg version requirement to avoid the segfault bug (https://github.com/vtraag/leidenalg/issues/68)