OpenMixup

Latest version: v0.2.9


0.2.9

Bump version to V0.2.9 with new mixup augmentations and various optimizers.

New Features

- Support new mixup augmentation methods, including [AdAutoMix](https://arxiv.org/abs/2312.11954) and [SnapMix](https://arxiv.org/abs/2012.04846). Config files are provided; models and logs are being updated. A minimal sketch of the shared interpolation core follows this list.
- Support more PyTorch optimizers, including Adam variants (e.g., AdaBelief, AdaFactor) and SGD variants (e.g., SGDP).
- Support evaluation tools for mixup augmentations, including robustness testing (corruption and adversarial attack robustness) and calibration evaluation.
- Provide more config files for self-supervised learning methods on small-scale datasets (CIFAR-100 and STL-10).
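
The supported mixup variants share a common core: convex interpolation of a pair of inputs and their labels, with the variants differing in how the mixing mask or ratio is produced. Below is a minimal, hedged sketch of that core in plain PyTorch; the function name and hyper-parameters are illustrative, not OpenMixup's API.

```python
import torch
import torch.nn.functional as F

def mixup_batch(x, y, alpha=1.0, num_classes=100):
    """Mix a batch of inputs and one-hot labels with lam ~ Beta(alpha, alpha)."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    index = torch.randperm(x.size(0))
    mixed_x = lam * x + (1.0 - lam) * x[index]
    y_onehot = F.one_hot(y, num_classes).float()
    mixed_y = lam * y_onehot + (1.0 - lam) * y_onehot[index]
    return mixed_x, mixed_y
```

SnapMix, for instance, replaces the uniform mixing ratio with class-activation-based patch weights, while AdAutoMix learns the mixing policy end to end.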

0.2.8

Bump version to V0.2.8 with new features ported from [MMPreTrain](https://github.com/open-mmlab/mmpretrain).

New Features

- Support more backbone architectures, including [MobileNetV3](https://arxiv.org/abs/1905.02244), [EfficientNetV2](https://arxiv.org/abs/2104.00298), [HRNet](https://arxiv.org/abs/1908.07919), [CSPNet](https://arxiv.org/abs/1911.11929), [LeViT](https://arxiv.org/abs/2104.01136), [MobileViT](http://arxiv.org/abs/2110.02178), [DaViT](https://arxiv.org/abs/2204.03645), and [MobileOne](http://arxiv.org/abs/2206.04040).
- Support CIFAR-100 benchmarks of Metaformer architectures and Mixup variants with Transformers, detailed in [cifar100/advanced](https://github.com/Westlake-AI/openmixup/blob/main/configs/classification/cifar100/advanced) and [cifar100/mixups](https://github.com/Westlake-AI/openmixup/blob/main/configs/classification/cifar100/mixups). Models and logs of the various CIFAR-100 mixup benchmarks are being updated.
- Support regression tasks with relevant datasets, metrics, and [configs](https://github.com/Westlake-AI/openmixup/blob/main/configs/regression). Datasets include [AgeDB](https://ieeexplore.ieee.org/document/8014984), [IMDB-WIKI](https://link.springer.com/article/10.1007/s11263-016-0940-3), and [RCFMNIST](https://arxiv.org/abs/2210.05775).
- Support Switch EMA in image classification, contrastive learning (BYOL, MoCo variants), and regression tasks (a sketch of the idea follows this list).
- Support optimizers implemented in timm, including AdaBelief, AdaFactor, Lion, etc.
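
Switch EMA keeps an exponential moving average of the weights and periodically copies it back into the live model. The sketch below illustrates the idea only; the momentum value and switch schedule are illustrative, not the exact OpenMixup hook.

```python
import torch

@torch.no_grad()
def ema_update(ema_model, model, momentum=0.999):
    # Standard EMA step: ema = momentum * ema + (1 - momentum) * current.
    for e, p in zip(ema_model.parameters(), model.parameters()):
        e.mul_(momentum).add_(p, alpha=1.0 - momentum)

@torch.no_grad()
def switch_ema(model, ema_model):
    # The "switch": overwrite the live weights with the EMA weights,
    # e.g., at the end of every epoch.
    for p, e in zip(model.parameters(), ema_model.parameters()):
        p.copy_(e)
```

Calling `ema_update` every iteration and `switch_ema` every epoch reproduces the scheme at a high level.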

Update Documents

- Update the formats of the awesome lists in [Awesome Mixups](docs/en/awesome_selfsup/MIM.md) and [Awesome MIM](docs/en/awesome_selfsup/MIM.md) and add the latest methods (updated 30/09/2023).

Bug Fixes

- Fix the `by_epoch` setting in `CustomSchedulerHook` and update `DecoupleMix` in `soft_mix_cross_entropy` to support label smoothing settings (a sketch of the intended loss follows this list).
- Fix bugs of Vision Transformers in [cls_mixup_head](https://github.com/Westlake-AI/openmixup/blob/main/openmixup/models/heads/cls_mixup_head.py) and [reg_head](https://github.com/Westlake-AI/openmixup/blob/main/openmixup/models/heads/reg_head.py).
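
For reference, the behavior this fix enables, soft-target cross-entropy with label smoothing applied to (possibly mixed) soft labels, can be sketched as follows. This is an illustration, not the exact `soft_mix_cross_entropy` implementation.

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, soft_targets, smoothing=0.1):
    # Smooth the soft targets toward the uniform distribution, then take
    # the cross-entropy against the predicted log-probabilities.
    num_classes = logits.size(-1)
    targets = soft_targets * (1.0 - smoothing) + smoothing / num_classes
    return -(targets * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
```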

0.2.7

Bump version to V0.2.7 with new features as listed in issue [#35](https://github.com/Westlake-AI/openmixup/issues/35); the new features of `OpenMixup` v0.2.7 are summarized in issue [#36](https://github.com/Westlake-AI/openmixup/issues/36).

Code Refactoring

- Refactor `openmixup.core` (replacing `openmixup.hooks`) and `openmixup.models.augments` (containing the mixup augmentation methods originally implemented in `openmixup.models.utils`). After this refactoring, the macro design of `OpenMixup` is similar to that of most MMLab projects.
- Support deployment to `ONNX` and `TorchScript` in `openmixup.core.export` and `tools/deployment`. The abstract class `BaseModel` (implemented in `openmixup/models/classifiers/base_model.py`) was refactored to support `forward_inference` (for custom inference and visualization), and `openmixup.models.heads` and `openmixup.models.losses` were refactored accordingly. Classification models in `OpenMixup` can be deployed following the [deployment tutorials](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/tools); a minimal export sketch follows this list.
- Support testing API methods in `openmixup/apis/test.py` for evaluation and deployment of classification models.
- Refactor `openmixup.core.optimizers` to separate optimizers and builders and support the latest [Adan](https://arxiv.org/abs/2208.06677) optimizer.
- Refactor [`mixup_classification.py`](https://github.com/Westlake-AI/openmixup/blob/main/openmixup/models/classifiers/mixup_classification.py) to support label mixup methods, add `return_mask` for mixup methods in [`augments`](https://github.com/Westlake-AI/openmixup/tree/main/openmixup/models/augments), and add `return_attn` to the ViT backbone.
- Refactor `ValidateHook` to support new features as `EvalHook` in mmcv, e.g., `save_best="auto"` during training.
- Refactor `ClsHead` with `BaseClsHead` to support MLP classification head variants in modern network architectures.
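
As a rough guide to the two export paths, here is a minimal sketch using plain PyTorch APIs; the stand-in model, file names, and input shape are illustrative (see the deployment tutorials for the actual tooling).

```python
import torch
import torch.nn as nn

# Stand-in classifier; in practice this is a model built from an OpenMixup config.
model = nn.Sequential(nn.Conv2d(3, 8, 3, stride=2), nn.AdaptiveAvgPool2d(1),
                      nn.Flatten(), nn.Linear(8, 10)).eval()
dummy = torch.randn(1, 3, 224, 224)

# TorchScript via tracing.
torch.jit.trace(model, dummy).save("model.pt")

# ONNX export of the same graph.
torch.onnx.export(model, dummy, "model.onnx", opset_version=11,
                  input_names=["input"], output_names=["logits"])
```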

New Features

- Support detailed usage instructions in the READMEs of config files for image classification methods in `configs/classification`, e.g., [mixups on ImageNet](https://github.com/Westlake-AI/openmixup/tree/main/configs/classification/imagenet/mixups/README.md). READMEs of other methods in `configs/selfsup` and `configs/semisup` will also be updated.
- Refine the organization of README files according to the [README-Template](https://github.com/othneildrew/Best-README-Template).
- Support the new mixup augmentation method ([AlignMix](https://arxiv.org/abs/2103.15375)) and provide the relevant config files in various datasets.
- Refine the setup for local installation and the PyPI release in `setup.py` and `setup.cfg`. See the PyPI project [OpenMixup](https://pypi.org/project/openmixup).
- Support a new mixup method [TransMix](https://arxiv.org/abs/2111.09833) and provide config files in [mixups/deit](https://github.com/Westlake-AI/openmixup/tree/main/configs/classification/imagenet/mixups/deit).
- Update config files: provide full configs of mixup methods with ViT-T/S/B on ImageNet and update [RSB A3](https://arxiv.org/abs/2110.00476) configs for popular backbones (a config-loading sketch follows this list).
- Update `target_generators` to support the latest MIM pre-training methods (fixed requirements).
- Update config files and scripts for SSL downstream tasks benchmarks (classification, detection, and segmentation).
- Update and fix bugs in visualization tools ([vis_loss_landscape](https://github.com/Westlake-AI/openmixup/tree/main/tools/visualizations/vis_loss_landscape.py)). Fix [model converters](https://github.com/Westlake-AI/openmixup/tree/main/tools/model_converters) tools.
- Support [Semantic-Softmax](https://arxiv.org/abs/2104.10972) loss and [ImageNet-21K-P (Winter)](https://openreview.net/forum?id=Zkj_VcZ6ol&noteId=1oUacUMpIbg) pre-training.
- Support more backbone architectures, including [BEiT](https://arxiv.org/abs/2106.08254), [MetaFormer](https://arxiv.org/abs/2210.13452), [ConvNeXtV2](http://arxiv.org/abs/2301.00808), [VanillaNet](https://arxiv.org/abs/2305.12972), and [CoC](https://arxiv.org/abs/2303.01494).
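
Since OpenMixup configs follow the mmcv convention, the provided config files can be loaded and overridden programmatically. A hedged sketch, in which the file path and field names are hypothetical:

```python
from mmcv import Config

# Load one of the provided mixup configs (path is hypothetical).
cfg = Config.fromfile("configs/classification/imagenet/mixups/example_cfg.py")

# Override fields before building the model/runner (names are illustrative).
cfg.optimizer.lr = 1e-3
print(cfg.pretty_text)
```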

Update Documents

- Update documents of mixup benchmarks on ImageNet in [Model_Zoo_sup.md](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/model_zoos/Model_Zoo_sup.md). Update config files for supported mixup methods.
- Update the formats (figures, introductions, and content tables) of the awesome lists in [Awesome Mixups](docs/en/awesome_selfsup/MIM.md) and [Awesome MIM](docs/en/awesome_selfsup/MIM.md) and add the latest methods (updated 18/03/2023).
- Update the `api` documentation describing the overall code structure in `docs/en/api` for the Read the Docs page.
- Reorganize and update tutorials for SSL downstream tasks benchmarks (classification, detection, and segmentation).

0.2.6

Bump version to V0.2.6 with new features as listed in issue [#20](https://github.com/Westlake-AI/openmixup/issues/20). New features and documents of `OpenMixup` v0.2.6 are summarized in issue [#24](https://github.com/Westlake-AI/openmixup/issues/24); related fixes are tracked in issues [#25](https://github.com/Westlake-AI/openmixup/issues/25), [#26](https://github.com/Westlake-AI/openmixup/issues/26), [#27](https://github.com/Westlake-AI/openmixup/issues/27), [#31](https://github.com/Westlake-AI/openmixup/issues/31), and [#33](https://github.com/Westlake-AI/openmixup/issues/33).

New Features

- Support new backbone architectures ([EdgeNeXt](https://arxiv.org/abs/2206.10589), [EfficientFormer](https://arxiv.org/abs/2206.01191), [HorNet](https://arxiv.org/abs/2207.14284), [MogaNet](https://arxiv.org/abs/2211.03295), [MViT.V2](https://arxiv.org/abs/2112.01526), [ShuffleNet.V1](https://arxiv.org/abs/1707.01083), and [DeiT-3](https://arxiv.org/abs/2204.07118)), and provide the relevant network modules in `models/utils/layers`. Config files and README.md are updated.
- Support the new self-supervised method [BEiT](https://arxiv.org/abs/2106.08254) with ViT-Base on ImageNet-1K, and fix bugs of [CAE](https://arxiv.org/abs/2202.03026), [MaskFeat](https://arxiv.org/abs/2112.09133), and [SimMIM](https://arxiv.org/abs/2111.09886) in `Dataset`, `Model`, and `Head`. Note that the `HOG` feature implementation for [MaskFeat](https://arxiv.org/abs/2112.09133) is borrowed from the [original repo](https://github.com/facebookresearch/SlowFast). Update pre-training and fine-tuning config files and documents for the relevant masked image modeling (MIM) methods ([BEiT](https://arxiv.org/abs/2106.08254), [MaskFeat](https://arxiv.org/abs/2112.09133), [CAE](https://arxiv.org/abs/2202.03026), and [A2MIM](https://arxiv.org/abs/2205.13943)). Support more fine-tuning settings on ImageNet for MIM pre-training with various backbones (e.g., ViTs, ResNets, ConvNeXts).
- Update [VAN](https://arxiv.org/pdf/2202.09741v2.pdf) to its arXiv v2 version by adding the new architecture configurations.
- Support the [ArcFace](https://arxiv.org/abs/1801.07698) loss for metric learning with the relevant `NormLinearClsHead`, and the [SeeSaw](https://arxiv.org/abs/2008.10032) loss for long-tailed classification (an ArcFace sketch follows this list).
- Update the issue template with more relevant links and emojis.
- Support the Grad-CAM visualization tool [vis_cam.py](tools/visualizations/vis_cam.py) for supported architectures.
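
ArcFace adds an angular margin to the target-class logit computed by a normalized linear head. A minimal sketch of the logit computation; this is illustrative, see `NormLinearClsHead` and the loss implementation for the real code.

```python
import torch
import torch.nn.functional as F

def arcface_logits(features, weight, labels, s=30.0, m=0.5):
    # Cosine similarity between L2-normalized features and class weights;
    # weight has shape (num_classes, feat_dim).
    cos = F.linear(F.normalize(features), F.normalize(weight))
    theta = torch.acos(cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
    # Add the angular margin m only to the target-class angle.
    target = F.one_hot(labels, weight.size(0)).bool()
    logits = torch.where(target, torch.cos(theta + m), cos)
    return s * logits  # feed into standard cross-entropy
```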

Update Documents

- Update our `OpenMixup` tech report on [arXiv](https://arxiv.org/abs/2209.04851), which provides more technical details and benchmark results.
- Update the self-supervised learning [Model_Zoo_selfsup.md](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/model_zoos/Model_Zoo_selfsup.md) and the documents of the new backbone and self-supervised methods.
- Update the supervised learning [Model_Zoo_sup.md](https://github.com/Westlake-AI/openmixup/tree/main/docs/en/model_zoos/Model_Zoo_sup.md) following [AutoMix](https://arxiv.org/abs/2103.13027) and support more mixup benchmark results.
- Update the template and add the latest paper lists of mixup and MIM methods in [Awesome Mixups](docs/en/awesome_selfsup/MIM.md) and [Awesome MIM](docs/en/awesome_selfsup/MIM.md). We provide teaser figures of most papers as illustrations.
- Update [documents](docs/en/tools) of `tools`.

Bug Fixes

- Fix the error raised by `torch.fft` on *PyTorch 1.6* and lower versions in backbones and heads (a version-guard sketch follows this list).
- Fix `README.md` (new icons, fixing typos) and support pytest in `tests`.
- Fix the classification heads and update implementations and config files of [AlexNet](https://dl.acm.org/doi/10.1145/3065386) and [InceptionV3](https://arxiv.org/abs/1512.00567).
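
The `torch.fft` fix boils down to a version guard: the `torch.fft` module only exists on PyTorch >= 1.7, so older versions should fail with a clear message. A sketch of the pattern, not the exact OpenMixup code:

```python
import torch

try:
    import torch.fft  # the module form exists only on PyTorch >= 1.7
    HAS_TORCH_FFT = True
except ImportError:
    HAS_TORCH_FFT = False

def rfft2d(x):
    if not HAS_TORCH_FFT:
        raise RuntimeError("torch.fft requires PyTorch >= 1.7; got "
                           f"{torch.__version__}. Please upgrade PyTorch.")
    # Real-valued 2-D FFT over the last two (spatial) dimensions.
    return torch.fft.rfftn(x, dim=(-2, -1), norm="ortho")
```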

0.2.5

Bump version to V0.2.5 with new features and document updates as listed in issue [#10](https://github.com/Westlake-AI/openmixup/issues/10). Feature updates and bug fixes in V0.2.5 are tracked in issue [#17](https://github.com/Westlake-AI/openmixup/issues/17); document updates are tracked in issues [#18](https://github.com/Westlake-AI/openmixup/issues/18) and [#19](https://github.com/Westlake-AI/openmixup/issues/19).

New Features

- Support new attention mechanisms in backbone architectures ([Anti-Oversmoothing](https://arxiv.org/abs/2203.05962), `FlowAttention` in [FlowFormer](https://arxiv.org/abs/2202.06258), and `PoolAttention` in [MViTv2](https://arxiv.org/abs/2112.01526)); a rough pooling-attention sketch follows this list.
- Update code integration testing in [tests](https://github.com/Westlake-AI/openmixup/tree/main/tests).
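
The pooling-attention idea from MViTv2 is to pool keys and values over the 2-D token grid before attention, shrinking the sequence they attend over. A rough sketch (the real module also pools queries and uses convolutional pooling with a residual connection; `dim` must be divisible by `num_heads`):

```python
import torch
import torch.nn as nn

class PoolAttention(nn.Module):
    """Pool keys/values over the 2-D token grid before attention (sketch)."""

    def __init__(self, dim, num_heads=8, pool_stride=2):
        super().__init__()
        self.num_heads, self.head_dim = num_heads, dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)
        self.pool = nn.AvgPool2d(pool_stride, pool_stride)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x, hw):
        B, N, C = x.shape
        H, W = hw  # N == H * W

        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def pool_tokens(t):  # (B, N, C) -> (B, N', C), N' = N / stride**2
            t = t.transpose(1, 2).reshape(B, C, H, W)
            return self.pool(t).flatten(2).transpose(1, 2)

        def heads(t):  # (B, M, C) -> (B, num_heads, M, head_dim)
            return t.reshape(B, -1, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = heads(q), heads(pool_tokens(k)), heads(pool_tokens(v))
        attn = (q @ k.transpose(-2, -1) * self.scale).softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(out)
```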

Update Documents

- Reorganize `README` files for various methods.
- Update [Awesome Mixups](docs/en/awesome_selfsup/MIM.md) and [Awesome MIM](docs/en/awesome_selfsup/MIM.md) with the latest methods.
- Update [get_started.md](docs/en/get_started.md) and [Tutorials](docs/en/tutorials) for better usage of `OpenMixup`.
- Update mixup benchmarks in [model_zoos](docs/en/model_zoos/Model_Zoo_sup.md): providing configs, weights, and more details.
- Update `README.md` and fix `auto_train_mixups.py` for various datasets.

Bug Fixes

- Fix visualization of the reconstruction results in `MAE`.
- Fix the normalization bug in config files and `plot_torch.py` as mentioned in issue [#16](https://github.com/Westlake-AI/openmixup/issues/16).
- Fix the random seeds in `tools/train.py` as mentioned in issue [#14](https://github.com/Westlake-AI/openmixup/issues/14).
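
The seed fix concerns seeding all RNG sources together. A sketch of the usual pattern, close to what mmcv-style codebases do but illustrative here:

```python
import random
import numpy as np
import torch

def set_random_seed(seed, deterministic=False):
    # Seed Python, NumPy, and PyTorch (CPU and all GPUs) consistently.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    if deterministic:
        # Trade speed for reproducible cuDNN kernels.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
```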

0.2.4

Bump version to V0.2.4 with new features and bug fixes as listed in issue [#7](https://github.com/Westlake-AI/openmixup/issues/7).

New Features

- Support new backbone architectures ([LITv2](https://arxiv.org/abs/2205.13213)).
- Refactor weight initialization in various network modules (using `BaseModule` from `mmcv`); a sketch follows this list.
- Refactor code structures of `openmixup.models.utils.layers` to support more network structures.
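
For reference, `BaseModule`-style initialization declares an `init_cfg` that mmcv applies via `init_weights()`. A minimal sketch with illustrative layer sizes:

```python
import torch.nn as nn
from mmcv.runner import BaseModule

class ConvBlock(BaseModule):
    def __init__(self, in_ch=3, out_ch=16,
                 init_cfg=dict(type="Kaiming", layer="Conv2d")):
        super().__init__(init_cfg=init_cfg)
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)

    def forward(self, x):
        return self.conv(x)

block = ConvBlock()
block.init_weights()  # applies the Kaiming init declared in init_cfg
```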

Bug Fixes

- Fix bugs that caused degraded performance of pure Transformer backbones (DeiT and Swin) in `OpenMixup`. The likely cause was the old `auto_fp16` and `DistOptimizerHook` implementations, since PyTorch >= 1.6.0 supports fp16 training natively and better than `mmcv` (a native AMP sketch follows this list).
- Fix ViT fine-tuning for MIM methods (e.g., MAE, SimMIM): the original `MIMVisionTransformer` in `openmixup.models.mim_vit` froze all backbone parameters during fine-tuning.
- Fix the weight initialization of Transformer-based architectures (e.g., ViT, Swin) to reproduce train-from-scratch performance, and update weight initialization, parameter-wise weight decay, and fp16 settings in the relevant config files.
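
The fp16 point above refers to PyTorch's native AMP (`torch.cuda.amp`), available since 1.6. A hedged sketch of the corresponding training loop; the tiny model, optimizer, and loader are stand-ins so the snippet runs, while real training uses the objects built from an OpenMixup config.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-ins so the loop below is runnable on a CUDA machine.
model = nn.Linear(8, 4).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loader = [(torch.randn(2, 8).cuda(), torch.randint(0, 4, (2,)).cuda())]

scaler = torch.cuda.amp.GradScaler()
for images, labels in loader:
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():   # run the forward pass in mixed precision
        loss = F.cross_entropy(model(images), labels)
    scaler.scale(loss).backward()     # scale the loss to avoid fp16 underflow
    scaler.step(optimizer)            # unscale gradients, then optimizer step
    scaler.update()
```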
