Bump the version to v0.2.6 with new features, as tracked in issue [#20](https://github.com/Westlake-AI/openmixup/issues/20). Update the new features and documents of `OpenMixup` v0.2.6 as described in issue [#24](https://github.com/Westlake-AI/openmixup/issues/24), and fix the relevant issues [#25](https://github.com/Westlake-AI/openmixup/issues/25), [#26](https://github.com/Westlake-AI/openmixup/issues/26), [#27](https://github.com/Westlake-AI/openmixup/issues/27), [#31](https://github.com/Westlake-AI/openmixup/issues/31), and [#33](https://github.com/Westlake-AI/openmixup/issues/33).
New Features
- Support new backbone architectures ([EdgeNeXt](https://arxiv.org/abs/2206.10589), [EfficientFormer](https://arxiv.org/abs/2206.01191), [HorNet](https://arxiv.org/abs/2207.14284), [MogaNet](https://arxiv.org/abs/2211.03295), [MViT.V2](https://arxiv.org/abs/2112.01526), [ShuffleNet.V1](https://arxiv.org/abs/1707.01083), and [DeiT-3](https://arxiv.org/abs/2204.07118)), and provide the relevant network modules in `models/utils/layers`. Config files and README.md are updated.
- Support the new self-supervised method [BEiT](https://arxiv.org/abs/2106.08254) with ViT-Base on ImageNet-1K, and fix bugs of [CAE](https://arxiv.org/abs/2202.03026), [MaskFeat](https://arxiv.org/abs/2112.09133), and [SimMIM](https://arxiv.org/abs/2111.09886) in `Dataset`, `Model`, and `Head`. Note that we added the `HOG` feature implementation borrowed from the [original repo](https://github.com/facebookresearch/SlowFast) for [MaskFeat](https://arxiv.org/abs/2112.09133) (a simplified HOG sketch follows this list). Update pre-training and fine-tuning config files and documents for the relevant masked image modeling (MIM) methods ([BEiT](https://arxiv.org/abs/2106.08254), [MaskFeat](https://arxiv.org/abs/2112.09133), [CAE](https://arxiv.org/abs/2202.03026), and [A2MIM](https://arxiv.org/abs/2205.13943)). Support more fine-tuning settings on ImageNet for MIM pre-training based on various backbones (e.g., ViTs, ResNets, ConvNeXts).
- Update [VAN](https://arxiv.org/pdf/2202.09741v2.pdf) to match its arXiv v2 version by adding the new architecture configurations.
- Support the [ArcFace](https://arxiv.org/abs/1801.07698) loss for metric learning with the relevant `NormLinearClsHead`, and support the [Seesaw](https://arxiv.org/abs/2008.10032) loss for long-tailed classification tasks (a minimal ArcFace-style sketch follows this list).
- Update the issue template with more relevant links and emojis.
- Support the Grad-CAM visualization tool [vis_cam.py](tools/visualizations/vis_cam.py) for the supported architectures (a minimal Grad-CAM sketch follows this list).
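For reference, the snippet below is a minimal, simplified sketch of HOG-style target extraction in PyTorch (hard orientation binning summed over non-overlapping cells). It only illustrates the idea; it is not the exact implementation borrowed from [SlowFast](https://github.com/facebookresearch/SlowFast), and the function name and defaults are illustrative assumptions.

```python
import math

import torch
import torch.nn.functional as F


def simple_hog(img, nbins=9, cell=8):
    """Simplified HOG-style targets: per-cell orientation histograms.

    img: (B, 1, H, W) grayscale tensor; H and W are assumed divisible by `cell`.
    Returns a (B, nbins, H // cell, W // cell) tensor of L2-normalized histograms.
    """
    # 1-D gradient filters along x and y.
    kx = torch.tensor([[-1., 0., 1.]]).view(1, 1, 1, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(img, kx, padding=(0, 1))
    gy = F.conv2d(img, ky, padding=(1, 0))
    mag = torch.sqrt(gx ** 2 + gy ** 2)      # gradient magnitude
    ori = torch.atan2(gy, gx) % math.pi      # unsigned orientation in [0, pi)

    # Hard-assign each pixel's magnitude to one of `nbins` orientation bins.
    bin_idx = torch.clamp((ori / math.pi * nbins).long(), max=nbins - 1)
    one_hot = F.one_hot(bin_idx.squeeze(1), nbins).permute(0, 3, 1, 2).float()
    weighted = one_hot * mag                 # (B, nbins, H, W)

    # Sum histograms over non-overlapping `cell` x `cell` regions, then L2-normalize.
    hist = F.avg_pool2d(weighted, cell) * (cell * cell)
    return hist / (hist.norm(dim=1, keepdim=True) + 1e-6)
```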
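The ArcFace margin idea can be summarized by the following minimal sketch (a hypothetical `ArcFaceSketch` module for illustration, not the exact `NormLinearClsHead` or loss implementation in OpenMixup): logits are cosine similarities between L2-normalized features and class weights, with an additive angular margin applied to the target class before scaling.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ArcFaceSketch(nn.Module):
    """Minimal ArcFace-style margin head (illustrative sketch only)."""

    def __init__(self, in_channels, num_classes, s=30.0, m=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, in_channels))
        self.s, self.m = s, m  # scale and additive angular margin

    def forward(self, feats, labels):
        # Cosine similarity between normalized features and normalized class weights.
        cosine = F.linear(F.normalize(feats), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
        # Add the angular margin only to the ground-truth class.
        target = F.one_hot(labels, cosine.size(1)).bool()
        logits = torch.where(target, torch.cos(theta + self.m), cosine)
        return F.cross_entropy(self.s * logits, labels)
```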
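As a rough illustration of what the Grad-CAM tool computes, here is a self-contained sketch using plain PyTorch hooks; a torchvision ResNet-50 is used purely as an example model, and the actual `vis_cam.py` script additionally handles configs, checkpoints, and image I/O.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet50


def grad_cam(model, x, target_layer, class_idx=None):
    """Compute a Grad-CAM heatmap for `x` w.r.t. `target_layer` (sketch)."""
    feats, grads = {}, {}
    h1 = target_layer.register_forward_hook(lambda m, i, o: feats.update(v=o))
    h2 = target_layer.register_full_backward_hook(lambda m, gi, go: grads.update(v=go[0]))
    logits = model(x)
    idx = logits.argmax(1) if class_idx is None else torch.tensor([class_idx])
    # Backpropagate the selected class score to collect gradients at the target layer.
    logits.gather(1, idx.view(-1, 1)).sum().backward()
    h1.remove()
    h2.remove()
    # Channel weights are the spatially averaged gradients (global average pooling).
    weights = grads['v'].mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * feats['v']).sum(1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[2:], mode='bilinear', align_corners=False)
    return cam / (cam.amax(dim=(2, 3), keepdim=True) + 1e-6)


model = resnet50().eval()
heatmap = grad_cam(model, torch.randn(1, 3, 224, 224), model.layer4[-1])
```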
Update Documents
- Update our `OpenMixup` tech report on [arXiv](https://arxiv.org/abs/2209.04851), which provides more technical details and benchmark results.
- Update the self-supervised learning [Model_Zoo_selfsup.md](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/model_zoos/Model_Zoo_selfsup.md), and update documents for the new backbones and self-supervised methods.
- Update the supervised learning [Model_Zoo_sup.md](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/model_zoos/Model_Zoo_sup.md) with the benchmarks provided in [AutoMix](https://arxiv.org/abs/2103.13027), and add more mixup benchmark results.
- Update the template and add the latest paper lists of mixup and MIM methods in [Awesome Mixups](docs/en/awesome_mixups) and [Awesome MIM](docs/en/awesome_selfsup/MIM.md). We provide teaser figures of most papers as illustrations.
- Update [documents](docs/en/tools) of `tools`.
Bug Fixes
- Fix the error notification for `torch.fft` on *PyTorch 1.6* or lower versions in backbones and heads (see the guarded-import sketch after this list).
- Fix `README.md` (add new icons and fix typos) and support pytest in `tests`.
- Fix the classification heads and update implementations and config files of [AlexNet](https://dl.acm.org/doi/10.1145/3065386) and [InceptionV3](https://arxiv.org/abs/1512.00567).
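For context, a common way to handle the `torch.fft` issue (a hedged sketch, not necessarily the exact fix applied in OpenMixup; the module name is illustrative) is to guard the import and raise a clear error only when an FFT-based module is actually instantiated, since `torch.fft` only became a module namespace in PyTorch 1.7:

```python
import torch

try:
    # `torch.fft` is a module (fft2 / ifft2, ...) from PyTorch 1.7 onward;
    # on 1.6 and earlier this import fails because `torch.fft` is only a function.
    import torch.fft as torch_fft
except ImportError:
    torch_fft = None


class FourierUnitSketch(torch.nn.Module):
    """Hypothetical FFT-based unit that fails with a clear message on old PyTorch."""

    def __init__(self):
        super().__init__()
        if torch_fft is None:
            raise RuntimeError(
                'torch.fft is unavailable: FFT-based modules require '
                'PyTorch 1.7 or higher, please upgrade your PyTorch.')

    def forward(self, x):
        # 2-D FFT over the spatial dimensions, then back to the real domain.
        freq = torch_fft.fft2(x, norm='ortho')
        return torch_fft.ifft2(freq, norm='ortho').real
```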