torchdistill

Latest version: v1.1.2

0.1.6

Example updates
- Add an example showing how to import models via PyTorch Hub (PR 83); see the sketch after this list
- Add an option to set a random seed for reproducibility (PR 85)
- Add an example of segmentation model training (PR 86)
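
To illustrate the PyTorch Hub import and the seed option, here is a minimal sketch using only standard PyTorch and NumPy calls; it is not torchdistill's actual example code, and the seed and model choice are illustrative:

```python
import random

import numpy as np
import torch

# Fix random seeds up front for reproducibility (the idea behind the new seed option).
seed = 42
random.seed(seed)
np.random.seed(seed)
torch.manual_seed(seed)

# Import a pretrained model via PyTorch Hub (the idea behind the new example).
model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)
model.eval()

# Sanity check: one forward pass over a dummy ImageNet-sized input.
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # torch.Size([1, 1000])
```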

Restructuring
- Refactor function util (PR 84)

Typo fixes
- Fix typos in dataset util and examples (PR 88)

0.1.5

Minor updates
- Make IoU type selection model-free (PR 74)
- Update loss string (PR 74)
- Disable DDP when no parameters are updatable (PR 77); see the sketch after this list
- Update README (PR 78)
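
The DDP change can be pictured with a hypothetical helper (`wrap_if_trainable` is illustrative, not torchdistill's API): DistributedDataParallel exists to synchronize gradients across processes, so wrapping a model with no updatable parameters, such as a frozen teacher, is wasteful and raises an error in PyTorch.

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel

def wrap_if_trainable(model: nn.Module, device_ids=None):
    # Wrap in DDP only when at least one parameter will receive gradients;
    # return the bare model otherwise (e.g., a frozen teacher in distillation).
    if any(p.requires_grad for p in model.parameters()):
        return DistributedDataParallel(model, device_ids=device_ids)
    return model
```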

Bug/Typo fixes
- Fix typos in example commands (PR 76)
- Fix typos in sample configs (PR 79)
- Fix bugs for clip grad norm (PR 80)

0.1.4

Minor updates
- Update functions for object detection models (PR 59)
- Update README (PRs 61, 62)

Minor bug fixes
- Rename (PR 60)
- Bug fixes (PR 73)

0.1.3

Updated official README and configs
- More detailed instructions (PRs 55, 56)
- Restructured official configs (PR 55)
- Updated FT config for ImageNet (PR 55)

Support detailed training configurations (see the sketch after this list)
- Step-wise parameter updates in addition to epoch-wise updates (PR 58)
- Gradient accumulation (PR 58)
- Max gradient norm clipping (PR 58)
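
Conceptually, the three options combine in a training loop like the sketch below; the helper name and default values are illustrative, not torchdistill's actual interface:

```python
import torch

def train_one_epoch(model, criterion, data_loader, optimizer, scheduler,
                    accum_steps=4, max_grad_norm=10.0):
    model.train()
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(data_loader):
        loss = criterion(model(inputs), targets)
        # Scale the loss so gradients accumulated over accum_steps batches
        # average out to a single large-batch gradient.
        (loss / accum_steps).backward()
        if (step + 1) % accum_steps == 0:
            # Clip the total gradient norm before the optimizer step.
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)
            optimizer.step()
            optimizer.zero_grad()
            # Step-wise (per-iteration) scheduler update rather than per-epoch.
            scheduler.step()
```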

Bug/Typo fixes
- Bug fixes (PRs 54, 57)
- Typo fixes (PRs 53, 58)

0.1.2

New examples
- Added sample configs for CIFAR-10 and CIFAR-100 datasets
1. Training without teacher (i.e., using `TrainingBox`) for [CIFAR-10](https://github.com/yoshitomo-matsubara/torchdistill/tree/master/configs/sample/cifar10/ce) and [CIFAR-100](https://github.com/yoshitomo-matsubara/torchdistill/tree/master/configs/sample/cifar100/ce) (PR 48)
2. Knowledge distillation for [CIFAR-10](https://github.com/yoshitomo-matsubara/torchdistill/tree/master/configs/sample/cifar10/kd) and [CIFAR-100](https://github.com/yoshitomo-matsubara/torchdistill/tree/master/configs/sample/cifar100/kd) (PR 50); see the KD-loss sketch after this list
- Added Google Colab examples (PR 51)
1. [Training without teacher for CIFAR-10 and CIFAR-100](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/cifar_training.ipynb)
2. [Knowledge distillation for CIFAR-10 and CIFAR-100](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/cifar_kd.ipynb)
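
For reference, the vanilla knowledge-distillation loss such configs train with follows Hinton et al.'s formulation; below is a generic sketch (the hyperparameter values are illustrative, and this is not torchdistill's code):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.5):
    # Soft term: KL divergence between temperature-softened distributions,
    # rescaled by T^2 so gradient magnitudes stay comparable across T.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction='batchmean',
    ) * (temperature ** 2)
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```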

Bug fixes
- Fixed a bug in the initialization of DenseNet-BC (PR 48)
- Resolved checkpoint name conflicts (PR 49)

0.1.1

New features
- Added TrainingBox to train models without teachers (PR 39)
- Supported PyTorch Hub in the registry (PR 40)
- Supported random splits, e.g., splitting a training dataset into training and validation sets (PR 41); see the sketch after this list
- Added reimplemented models for CIFAR-10 and CIFAR-100 datasets (PR 41)
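
The random-split feature corresponds to PyTorch's `torch.utils.data.random_split`; here is a minimal sketch with a toy dataset (the sizes and seed are illustrative):

```python
import torch
from torch.utils.data import TensorDataset, random_split

# Toy stand-in for a real training set (1,000 CIFAR-shaped samples).
dataset = TensorDataset(torch.randn(1000, 3, 32, 32), torch.randint(0, 10, (1000,)))

# Hold out 10% for validation, with a fixed generator for reproducibility.
val_size = len(dataset) // 10
generator = torch.Generator().manual_seed(42)
train_set, val_set = random_split(dataset, [len(dataset) - val_size, val_size],
                                  generator=generator)
print(len(train_set), len(val_set))  # 900 100
```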

Pretrained models
The training methods follow those used in the repositories below.
- ResNet: https://github.com/facebookarchive/fb.resnet.torch
- WRN (Wide ResNet): https://github.com/szagoruyko/wide-residual-networks
- DenseNet-BC: https://github.com/liuzhuang13/DenseNet

Note that there are some gaps between the test accuracies (%) below and those reported in the original studies.

| Model                         | CIFAR-10 | CIFAR-100 |
|-------------------------------|---------:|----------:|
| ResNet-20 | 91.92 | N/A |
| ResNet-32 | 93.03 | N/A |
| ResNet-44 | 93.20 | N/A |
| ResNet-56 | 93.57 | N/A |
| ResNet-110 | 93.50 | N/A |
| WRN-40-4 | 95.24 | 79.44 |
| WRN-28-10 | 95.53 | 81.27 |
| WRN-16-8 | 94.76 | 79.26 |
| DenseNet-BC (k=12, depth=100) | 95.53 | 77.14 |
