Opacus

Latest version: v1.5.3

1.5.3

New features

Improvements to ghost clipping
* The interface for ghost clipping is now similar to that of PyTorch and vanilla DP-SGD (668); see the sketch after this list
* Updated the tutorial on training language models with DP-SGD to include ghost clipping (667) and LoRA (698)
* Added adaptive clipping support for ghost clipping (711)
* Added ghost clipping support for embedding layers (694)
* Added support for generative NLP tasks with ghost clipping (722)
* Added functionality to access per-sample gradients with ghost clipping (724)
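
As a rough illustration of the updated interface, the sketch below follows the pattern from the Opacus fast gradient clipping tutorial. It is a minimal sketch, not a definitive reference: the `grad_sample_mode="ghost"` argument and the wrapped criterion returned by `make_private` are assumptions about the current API, and the hyperparameter values are placeholders.

```python
import torch
import torch.nn as nn
from opacus import PrivacyEngine

# Toy model and data stand in for a real training setup.
model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
criterion = nn.CrossEntropyLoss()
dataset = torch.utils.data.TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
train_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

privacy_engine = PrivacyEngine()
# Assumption: ghost clipping is requested via grad_sample_mode="ghost", and
# make_private also returns a wrapped criterion that drives the extra backward pass.
model, optimizer, criterion, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    criterion=criterion,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    grad_sample_mode="ghost",
)

# The loop itself then reads like vanilla DP-SGD training.
for x, y in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```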

Enabling more external contributions
* Added a research folder for external contributions of promising new methods for PPML (700)
* DP-SGD optimizers with Kalman filters are now available in the research folder (706)
* Made it easier to define custom extensions of PrivacyEngine (703, 704, 710)

Bug fixes
* Fix the clipping operation for ghost clipping when using the PrivacyEngine interface (664)
* Fix issue with ghost clipping and BatchMemoryManager (see the sketch after this list)
* Add `strict` and `force_functorch` parameters in the initialization of `GradSampleModuleFastGradientClipping` (675)
* Fix failing tests (e.g., 726, 713, 727, 674)
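
For context, `BatchMemoryManager` splits large logical batches into smaller physical batches while keeping privacy accounting at the logical batch size. The sketch below shows its standard usage; it assumes `model`, `optimizer`, `criterion`, and `train_loader` have already been returned by `PrivacyEngine.make_private` (with or without ghost clipping), and the physical batch size is an illustrative value.

```python
from opacus.utils.batch_memory_manager import BatchMemoryManager

# Assumes model, optimizer, criterion, and train_loader come from
# PrivacyEngine.make_private, as in the sketch above.
with BatchMemoryManager(
    data_loader=train_loader,
    max_physical_batch_size=64,  # illustrative hardware-friendly chunk size
    optimizer=optimizer,
) as memory_safe_loader:
    for x, y in memory_safe_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        # The optimizer performs a real step only at logical batch boundaries;
        # intermediate calls accumulate clipped per-sample gradients.
        optimizer.step()
```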

Miscellaneous
* Switched CI from CircleCI to GitHub Actions (701)
* Website and GitHub improvements (723, 721, 677, 712)
* Added a multi-GPU test for ghost clipping (665)

1.5.2

New features
* Added a `double_backward` function that simplifies the training loop (661); see the sketch below
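
A minimal sketch of how this helper might be used, assuming the import path `opacus.utils.fast_gradient_clipping_utils` and a `double_backward(module, optimizer, loss_per_sample)` signature; both are assumptions rather than a definitive reference, and `model`, `optimizer`, and `train_loader` are presumed to be already set up for fast gradient clipping.

```python
import torch.nn as nn
# Assumption: this is where double_backward lives and how it is called.
from opacus.utils.fast_gradient_clipping_utils import double_backward

criterion = nn.CrossEntropyLoss(reduction="none")  # keep per-sample losses

for x, y in train_loader:
    optimizer.zero_grad()
    loss_per_sample = criterion(model(x), y)
    # double_backward bundles the two backward passes (norm computation, then
    # rescaled loss) that fast gradient clipping otherwise requires by hand.
    double_backward(model, optimizer, loss_per_sample)
    optimizer.step()
```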

Bug fixes
* Fix issue with setting of param_group for the DPOptimizer wrapper (issue 649) (660)
* Fix issue with the DDP optimizer for fast gradient clipping (FGC): the step function incorrectly called `original_optimizer.original_optimizer` (662)
* Replace `opt_einsum.contract` with `torch.einsum` (663)

1.5.1

Bug fixes
* Make the import of `opt_einsum.contract` in `linear.py` explicit (658)

1.5

New features
* Fast Gradient Clipping and Ghost Clipping (656)

Bug fixes
* Fix gradient shape error for DPMultiheadAttention (issue 650) (651)
* Pass kwargs from make_private to _prepare_optimizer (648)
* Fix BatchMemoryManager length (641)
* Fix GPU-CPU device mismatch error in the `filter_dilated_rows` utility (633)
* Fix Opacus's runtime error with an empty batch (issue 612) (631)

1.4.1

Bug fixes
* Fix DP MultiheadAttention (598)
* Fix: make the PRV accountant robust to larger epsilons (606); see the sketch after this list
* Fix the corner case when the optimizer has no trainable parameters (619)
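
For reference, the epsilon affected by these accountant fixes is read back through `PrivacyEngine`. The sketch below selects the PRV accountant explicitly and queries the budget after a short toy loop; the model, data, and hyperparameter values are illustrative placeholders.

```python
import torch
import torch.nn as nn
from opacus import PrivacyEngine

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
criterion = nn.CrossEntropyLoss()
dataset = torch.utils.data.TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
train_loader = torch.utils.data.DataLoader(dataset, batch_size=8)

# "prv" selects the PRV accountant; "rdp" and "gdp" are the other built-in choices.
privacy_engine = PrivacyEngine(accountant="prv")
model, optimizer, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
)

for x, y in train_loader:  # one toy epoch so the accountant has steps to account for
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()

epsilon = privacy_engine.get_epsilon(delta=1e-5)
print(f"epsilon = {epsilon:.2f} at delta = 1e-5")
```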

1.4

Highlight: Upgraded to PyTorch 1.13+ as a required dependency

New features
* Added clipping schedulers (556)
* Added a utility to check per-sample gradients (532); see the sketch after this list
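
The per-sample gradients that such a utility verifies are the `grad_sample` tensors Opacus attaches to parameters. The sketch below inspects them directly after a backward pass through a `GradSampleModule`; it does not assume the exact name or signature of the checking utility itself.

```python
import torch
import torch.nn as nn
from opacus.grad_sample import GradSampleModule

model = GradSampleModule(nn.Linear(16, 2))  # hooks capture per-sample gradients
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 16)          # batch of 8 samples
y = torch.randint(0, 2, (8,))

criterion(model(x), y).backward()

# Each trainable parameter now carries a grad_sample tensor whose leading
# dimension is the batch size: one gradient per sample.
for name, p in model.named_parameters():
    grad_sample = getattr(p, "grad_sample", None)
    if grad_sample is not None:
        print(name, tuple(grad_sample.shape))
```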

Bug fixes
* Align DataLoader interface with vanilla PyTorch (543)
* Fix GDP accountant epsilon retrieval changing internal state (541)
* Add option to specify number of steps in UniformSampler (550)
* Fix privacy computation script (565)
