Nearest Neighbour Contrastive Learning of Representations (NNCLR)
New NNCLR model
NNCLR [0] is essentially SimCLR, but it replaces each positive sample by its nearest neighbour from a support set of past embeddings as an additional "augmentation" step.
As part of this, a nearest neighbour memory bank module (`NNMemoryBankModule`) was implemented, which can also be used with other models.
[0] [With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations](https://arxiv.org/abs/2104.14548v1)
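To illustrate the mechanism, here is a minimal, framework-agnostic sketch of a nearest neighbour memory bank (the class name `ToyNNMemoryBank` is hypothetical and the real module differs in detail): it keeps a FIFO queue of past embeddings and replaces each query by its most similar stored entry under cosine similarity.

```python
import numpy as np


class ToyNNMemoryBank:
    """Toy FIFO memory bank with cosine-similarity nearest-neighbour lookup."""

    def __init__(self, size, dim):
        self.bank = np.zeros((size, dim), dtype=np.float32)
        self.size = size
        self.ptr = 0      # next slot to overwrite (FIFO)
        self.filled = 0   # number of valid entries in the bank

    def __call__(self, z, update=False):
        # Normalise queries so the dot product equals cosine similarity.
        zn = z / np.linalg.norm(z, axis=1, keepdims=True)
        out = z.copy()  # while the bank is empty, queries pass through
        if self.filled > 0:
            bank = self.bank[: self.filled]
            bn = bank / np.linalg.norm(bank, axis=1, keepdims=True)
            sim = zn @ bn.T                 # (batch, filled) similarities
            out = bank[sim.argmax(axis=1)]  # replace by nearest stored entry
        if update:
            # FIFO insert of the new embeddings.
            for row in z:
                self.bank[self.ptr] = row
                self.ptr = (self.ptr + 1) % self.size
                self.filled = min(self.filled + 1, self.size)
        return out
```

The first call with `update=True` seeds the bank; subsequent calls return the closest stored embedding instead of the query itself, which is exactly the "nearest neighbour as augmentation" idea.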
```python
import torchvision
from torch import nn

# import paths assuming the lightly package layout
from lightly.loss import NTXentLoss, SymNegCosineSimilarityLoss
from lightly.models import BYOL, NNCLR, SimSiam
from lightly.models.modules import NNMemoryBankModule

resnet = torchvision.models.resnet18()
backbone = nn.Sequential(
    *list(resnet.children())[:-1],
    nn.AdaptiveAvgPool2d(1),
)

# NNCLR
model = NNCLR(backbone)
criterion = NTXentLoss()

# prefer SimSiam with nearest neighbour?
# model = SimSiam(backbone)
# criterion = SymNegCosineSimilarityLoss()

# prefer BYOL with nearest neighbour?
# model = BYOL(backbone)
# criterion = SymNegCosineSimilarityLoss()

nn_replacer = NNMemoryBankModule(size=2 ** 16)

# forward pass (x0, x1 are two augmented views of the same batch)
(z0, p0), (z1, p1) = model(x0, x1)
z0 = nn_replacer(z0.detach(), update=False)
z1 = nn_replacer(z1.detach(), update=True)

loss = 0.5 * (criterion(z0, p1) + criterion(z1, p0))
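The `NTXentLoss` used above is the normalized temperature-scaled cross-entropy loss from SimCLR. A simplified numpy sketch of one direction (it treats only cross-view pairs as candidates and omits the in-view negatives of the full loss) shows the core idea: each embedding should be most similar to its own pair across the batch.

```python
import numpy as np


def simplified_ntxent(z, p, temperature=0.5):
    """Simplified one-directional NT-Xent: cross-entropy over cosine similarities."""
    # Normalise both sets of embeddings.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    p = p / np.linalg.norm(p, axis=1, keepdims=True)
    logits = z @ p.T / temperature  # (batch, batch) similarity matrix
    # Log-softmax per row; the matching pair sits on the diagonal.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Aligned pairs yield a lower loss than mismatched ones, which is what drives the representations of a sample and its (nearest neighbour) positive together.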
Models
- [Bootstrap your own latent: A new approach to self-supervised Learning, 2020](https://arxiv.org/abs/2006.07733)
- [Barlow Twins: Self-Supervised Learning via Redundancy Reduction, 2021](https://arxiv.org/abs/2103.03230)
- [SimSiam: Exploring Simple Siamese Representation Learning, 2020](https://arxiv.org/abs/2011.10566)
- [MoCo: Momentum Contrast for Unsupervised Visual Representation Learning, 2019](https://arxiv.org/abs/1911.05722)
- [SimCLR: A Simple Framework for Contrastive Learning of Visual Representations, 2020](https://arxiv.org/abs/2002.05709)
- [NNCLR: Nearest-Neighbor Contrastive Learning of Visual Representations, 2021](https://arxiv.org/abs/2104.14548)