torchdistill

Latest version: v1.1.1


1.1.1

New method
- Add KD with logits standardization (PR 460)
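Logit standardization replaces raw teacher/student logits with their z-scores before the temperature-softened softmax in the KD loss, making the distillation target invariant to the scale and shift of the logits. A minimal sketch of the idea in plain Python — not torchdistill's actual implementation, and all function names here are illustrative:

```python
import math

def standardize(logits, eps=1e-8):
    """Z-score normalize a list of logits (zero mean, unit std)."""
    n = len(logits)
    mean = sum(logits) / n
    var = sum((z - mean) ** 2 for z in logits) / n
    return [(z - mean) / (math.sqrt(var) + eps) for z in logits]

def softmax(logits, temperature=1.0):
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_kl_div(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on standardized, temperature-scaled logits."""
    p = softmax(standardize(teacher_logits), temperature)
    q = softmax(standardize(student_logits), temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

Because both sets of logits are standardized first, a student whose logits match the teacher's only up to an affine transform still incurs zero loss.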

YAML configs
- Fix the official config for SRD (Issue 471, PR 473)
- Fix SRD config (Issue 471, PR 472)
- Add os.path YAML constructors (PR 454)
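Custom YAML constructors like the os.path ones added here let a config build filesystem paths instead of hardcoding them as strings. A sketch of the general mechanism with PyYAML, assuming a hypothetical `!join` tag — the actual tag names torchdistill registers may differ:

```python
import os.path
import yaml  # PyYAML

def _join_constructor(loader, node):
    """Build a path with os.path.join from a YAML sequence node."""
    parts = loader.construct_sequence(node)
    return os.path.join(*parts)

# Register the tag on the safe loader so yaml.safe_load understands it
yaml.add_constructor('!join', _join_constructor, Loader=yaml.SafeLoader)

config = yaml.safe_load("""
root: /data
ckpt: !join [/data, resnet18, best.pt]
""")
```

After loading, `config['ckpt']` is a platform-appropriate joined path rather than a literal string.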

Logs
- Disable an auto-configuration for def_logger (Issue 465, PR 469)
- Use warning (PR 468)

Documentation
- Add a new benchmark (PR 464)
- Update Projects page (PRs 456, 475)

Misc
- Update README (PRs 461, 470)
- Update a URL (PR 459)
- Update GitHub Actions versions (PRs 457, 458)
- Update CITATION (PR 455)
- Add a new DOI badge (PR 453)
- Update version (PRs 452, 479)

1.1.0

New methods
- Add SRD method (PRs 436, 444, 446)
- Add Knowledge Distillation from A Stronger Teacher method (PR 433)
- Add Inter-Channel Correlation for Knowledge Distillation method (PR 432)

YAML constructor
- Update functions in yaml_util (PR 447)
- Fix docstrings and add import_call_method & yaml constructor (PR 442)

Distillation/Training boxes
- Enable auxiliary model wrapper builder to redesign input model (PR 437)

Registries
- Add low-level registry and get functions (PR 426)

Documentation
- Update benchmarks (PR 435)
- Fix a typo (PR 424)

Examples
- Replace dst with src (Issue https://github.com/roymiles/Simple-Recipe-Distillation/issues/1, PR 445)
- Add Amazon SageMaker Studio Lab badges (PR 422)

Tests
- Add a test case for import_call_method (PR 443)
- Add import test (PR 441)

Misc
- Update citation info (PRs 438, 439, 440)
- Update publication links (PR 430)
- Update version (PRs 425, 449, 451)
- Update README (PRs 423, 434, 450)
- Update image url (PR 421)

1.0.0

This major release supports PyTorch 2.0 and introduces many new features, documentation support, and breaking changes.

PyYAML configurations and executable scripts with torchdistill <= v0.3.3 should be considered "legacy" and are no longer supported by torchdistill >= v1.0.0. New PyYAML configurations and executable scripts are provided for the major release.

This release adds support for Python 3.10 and 3.11, and Python 3.7 is no longer supported.

Documentation
- Update documents (PRs 400, 408)
- Add docstrings (PRs 392, 393, 394, 395, 396, 397)
- Add torchdistill logos (PRs 401, 402, 403)

Dependencies & Instantiation
- Add getattr constructor (PR 325)
- Make package arg optional (PR 322)
- Enable dynamic module import/get/call (PR 319)
- Add a function to import dependencies e.g., to register modules (PR 265)
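Dynamic import/get/call lets a YAML config name any module or callable by its dotted path instead of requiring it to be pre-registered. The core mechanism is stdlib `importlib`; this is a sketch with illustrative names, not torchdistill's actual API:

```python
import importlib

def import_get(dotted_path):
    """Import 'pkg.module.attr' and return the attribute."""
    module_path, _, attr_name = dotted_path.rpartition('.')
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)

def import_call(dotted_path, *args, **kwargs):
    """Import a callable by its dotted path and call it."""
    return import_get(dotted_path)(*args, **kwargs)
```

For example, `import_call('math.sqrt', 16)` imports `math` and returns `4.0`.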

Module registry
- Add *args (PR 345)
- Fix default value-related issues (PR 327)
- No longer use lowered keys (PRs 326, 332)
- Disable lowering by default (PR 323)
- Rename type/name key (PR 312)
- Rename registry dicts and arguments for registry key (PR 269)
- Raise errors when requested module keys are not registered (PR 263)
- Enable naming modules to be registered (PR 262)
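The registry changes above — explicit naming at registration time and errors for unknown keys — follow the common decorator-based registry pattern. A minimal sketch (names are illustrative, not torchdistill's actual functions):

```python
_registry = {}

def register(name=None):
    """Decorator: register an object under an explicit or default key."""
    def decorator(obj):
        key = name if name is not None else obj.__name__
        _registry[key] = obj
        return obj
    return decorator

def get_registered(key):
    """Look up a registered module; raise instead of silently returning None."""
    if key not in _registry:
        raise KeyError(f'{key!r} is not registered')
    return _registry[key]

@register(name='my_loss')
class MyLoss:
    pass
```

Raising `KeyError` for unregistered keys surfaces config typos immediately rather than failing later with an opaque `None` error.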

Distillation/Training boxes
- Remove default forward_proc for transparency (PR 417)
- Rename a forward_proc function (PR 414)
- Simplify (D)DP wrapper init (PR 410)
- Change the timing to print model setup info (PR 335)
- Add an option to specify find_unused_parameters for DDP (PR 334)
- Do not touch teacher model by default (PR 333)
- Training box does not have to inherit nn.Module class (PR 317)
- Add interfaces package to core (PR 310)
- Update forward interfaces (PRs 307, 308)
- Rename post_process to post_epoch_process for consistency (PR 306)
- Consider CosineAnnealingWarmRestarts in default post-epoch process functions (PR 305)
- Make some common procedures in training box registrable/replaceable (PR 304)
- Introduce {pre,post}-{epoch,forward} processes and registries (PR 274)
- Rename post_forward functions (PR 272)
- Make loss a kwarg (PR 273)
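The pre/post-epoch and pre/post-forward processes above make each phase of the training box selectable by name from a config. A sketch of one plausible shape for such a per-phase process registry — hypothetical names, not torchdistill's actual implementation:

```python
PHASES = ('pre_epoch', 'post_epoch', 'pre_forward', 'post_forward')
_process_registry = {phase: {} for phase in PHASES}

def register_process(phase, name):
    """Register a function under a training-box phase so configs can pick it by name."""
    def decorator(func):
        _process_registry[phase][name] = func
        return func
    return decorator

def get_process(phase, name):
    return _process_registry[phase][name]

@register_process('post_epoch', 'step_scheduler')
def step_scheduler(scheduler):
    # e.g. advance the LR scheduler once per epoch
    scheduler.step()
```

A config can then say something like `post_epoch: step_scheduler` and the training box resolves it with `get_process`.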

Forward hooks
- Fix initialization issues in IO dict for SELF_MODULE_PATH (PR 328)

Dataset modules
- Redesign split_dataset and remove unused functions (PR 360)
- Update CRD dataset wrapper (PR 352)
- Fix a bug (PR 351)
- Add default args and kwargs (PR 347)
- Add get_dataset (PR 324)
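A dataset splitter like `split_dataset` typically shuffles indices with a fixed seed and slices them by ratio so splits are disjoint and reproducible. A stdlib-only sketch of that idea — not torchdistill's actual signature:

```python
import random

def split_dataset(dataset, ratios, seed=42):
    """Randomly split anything with len() into disjoint index lists by ratio."""
    assert abs(sum(ratios) - 1.0) < 1e-9, 'ratios must sum to 1'
    indices = list(range(len(dataset)))
    random.Random(seed).shuffle(indices)  # seeded for reproducibility
    splits, start = [], 0
    for ratio in ratios[:-1]:
        stop = start + int(ratio * len(indices))
        splits.append(indices[start:stop])
        start = stop
    splits.append(indices[start:])  # last split absorbs rounding remainder
    return splits
```

For example, `train_idx, val_idx = split_dataset(range(100), [0.8, 0.2])` yields an 80/20 split that is identical across runs with the same seed.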

Loss modules
- Fix a typo (PRs 413, 415)
- Add doc artifacts and an option to pass pre-instantiated loss module (PR 399)
- Add DictLossWrapper (PR 337)
- Rename an old function name (PR 309)
- Rename single loss to middle-level loss (PR 300)
- Explicitly define criterion wrapper (PR 298)
- Change concepts of OrgLoss and org_term (PR 296)
- Rename loss-related classes and functions (PR 294)
- Add default forward process function and KDLoss back as a single loss (PR 275)
- Remove org loss module and introduce self-module path (PR 271)

Model modules
- Support parameter operations (Discussion 387, PR 388)
- Replace pretrained with weights (PR 354)

Auxiliary model wrapper modules
- Add find_unused_parameters arg (PR 340)
- Rename special in configs to auxiliary_model_wrapper (PR 291)
- Rename special module for clarity (PR 276)

Optimizer/Scheduler modules
- Fix bugs around optimizer/scheduler (PR 358)
- epoch arg is deprecated for some LR schedulers (PR 338)

Examples
- Revert legacy file paths to non-legacy ones (PR 419)
- Update kwargs and scripts (PR 382)
- Update yaml util and sample configs (CIFAR-10, CIFAR-100) for the next major release (PR 361)
- Update sample script and configs (GLUE) for the next major release (PR 259)
- --log was replaced with --run_log (PR 350)
- dst_ckpt should be used when using --test_only (PR 349)
- Simplify the semantic segmentation script (PR 339)
- Move hardcoded-torchvision-specific code to local custom package (PR 331)
- Update world_size, cudnn configs, and checkpoint message (PR 330)
- Rename log argument due to a conflict with torchrun (PR 329)
- Restructure examples and export some example-specific packages (PR 320)
- Add an option to disable torch.backends.cudnn.benchmark (PR 316)
- Support stage-wise loading/saving checkpoints (PR 315)
- Support src_ckpt and dst_ckpt for initialization and saving checkpoints respectively (PR 314)
- Use legacy configs and scripts tentatively (PRs 292, 295)
- Add legacy examples and configs (PR 289)

Configs
- Declare forward_proc explicitly (PR 416)
- Add configs used in NLP-OSS 2023 paper (PR 407)
- Fix value based on log (PR 284)
- Update sample configs (ILSVRC 2012, COCO 2017, and PASCAL VOC 2012) for the next major release (PR 357)
- Update official configs for the next major release (PR 355)
- Merge single_/multi_stage directories (PR 346)
- Rename variables (PR 344)
- Rename "factor" to "weight" (PR 302)
- Restructure criterion (PR 301)
- Consistently use "params" to indicate learnable parameters, not hyperparameters (PR 297)

Misc.
- Add Google Analytics ID (PR 406)
- Add sitemap.xml (PR 405)
- Update timm repo (PR 375)
- Add acknowledgments (PR 369)
- Update file paths (PR 356)
- Fix a typo and replace pretrained with weights (PR 353)
- Remove the dict option as it is not intuitive for building transform(s) (PR 303)
- Temporarily remove registry test (PR 293)
- Add an important notice (PR 286)
- Add read permission for content, following the new template (PR 284)
- Refactor (PRs 268, 270, 283, 343)
- Update README (PRs 252, 290, 299, 341, 342, 348, 364, 400, 409, 418)
- Update versions (PRs 251, 391, 420)

Workflows
- Add a GitHub Action for deploying Sphinx documentation (PR 404)

0.3.3

Updates in APIs/scripts
- Add square-sized random crop option (PR 224)
- Replace torch.no_grad() with torch.inference_mode() (PR 245)
- Terminate apex support due to its maintenance mode (PRs 248, 249)

Bug fixes
- Add a default value (Discussion 229, PR 230)
- Fix a bug raised in torchvision (PR 231)
- Fix a default parameter (PR 235)

Misc.
- Fix a typo (PR 232)
- Update Travis (PR 236)
- Update README (PRs 228, 238, 240)
- Update versions (PRs 223, 250)

0.3.2

Bug fix
- Fix a potential bug in split_dataset (Issue 209, PR 210)

Misc.
- Update GitHub workflow (PR 217)
- Add local epoch for LambdaLR (PR 219)
- Update versions (PRs 208, 220)

0.3.1

Minor updates
- Freeze module before rebuild if applicable (PR 205)
- Refactor and improve result summary message (PR 206)
- Update version (PRs 204, 207)
