Only the PyTorch API is updated in this version.
PyTorch API
This release brings three main improvements to the PyTorch AutoClip implementation:
- New Optimizer Wrapping Pattern
- Parameter Value and Type Checking
- Testing with pytest
Optimizer Wrapping
There were a couple of cases where the previous AutoClip API could not be used, most notably when the training code was imported from an outside library. To remove this friction, this release adds an optimizer wrapping pattern along with the new `OptimizerWithClipper` class. This class is a true wrapper around the optimizer in question, meaning it behaves almost identically, even if the optimizer has special fields and methods. This lets us shim clipping behavior in anywhere an optimizer is created. A marked improvement!
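To illustrate the idea, here is a minimal sketch of the wrapping pattern itself, not the library's actual implementation: a hypothetical `ClippedOptimizer` class that delegates everything to the inner optimizer and clips gradients just before each step.

```python
import torch


class ClippedOptimizer:
    """Illustrative sketch of the optimizer-wrapping pattern (not the real API)."""

    def __init__(self, optimizer: torch.optim.Optimizer, max_norm: float = 1.0):
        self._optimizer = optimizer
        self._max_norm = max_norm

    def step(self, closure=None):
        # Clip every parameter the wrapped optimizer manages, then defer to it.
        params = [p for group in self._optimizer.param_groups for p in group["params"]]
        torch.nn.utils.clip_grad_norm_(params, self._max_norm)
        return self._optimizer.step(closure)

    def __getattr__(self, name):
        # Anything we don't define (zero_grad, state_dict, custom fields, ...)
        # falls through to the wrapped optimizer, so existing training code
        # keeps working unchanged.
        return getattr(self._optimizer, name)


# Wrap an optimizer created by training code we don't control.
model = torch.nn.Linear(4, 2)
optimizer = ClippedOptimizer(torch.optim.AdamW(model.parameters(), lr=1e-3))

loss = model(torch.randn(8, 4)).sum()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Delegation via `__getattr__` is what makes this a "true wrapper": any method or attribute the wrapper does not override is served by the wrapped optimizer itself.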
Parameter Value and Type Checking
Clipper classes must now implement the `verify_parameter_settings` function. It is called during `__init__` to check the clipper defaults, and again during `add_param_group`. This should give users much more accurate and useful error messages when they pass invalid parameter values or types.
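As a rough sketch of the kind of checks this enables (the clipper name, its parameters, and the exact signature below are assumptions for illustration, not the library's real API):

```python
class ExampleClipper:
    """Hypothetical clipper used only to illustrate parameter validation."""

    def __init__(self, quantile: float = 0.1, history_length: int = 1000):
        defaults = {"quantile": quantile, "history_length": history_length}
        self.verify_parameter_settings(defaults)  # validate the defaults up front
        self.defaults = defaults
        self.param_groups = []

    @staticmethod
    def verify_parameter_settings(settings: dict) -> None:
        quantile = settings["quantile"]
        if not isinstance(quantile, (int, float)) or isinstance(quantile, bool):
            raise TypeError(f"quantile must be a number, got {type(quantile).__name__}")
        if not 0.0 <= quantile <= 1.0:
            raise ValueError(f"quantile must be in [0.0, 1.0], got {quantile}")

        history_length = settings["history_length"]
        if not isinstance(history_length, int):
            raise TypeError(f"history_length must be an int, got {type(history_length).__name__}")
        if history_length <= 0:
            raise ValueError(f"history_length must be positive, got {history_length}")

    def add_param_group(self, param_group: dict) -> None:
        # New groups get the same checks before they are accepted.
        settings = {**self.defaults, **{k: v for k, v in param_group.items() if k != "params"}}
        self.verify_parameter_settings(settings)
        self.param_groups.append(param_group)
```

Failing fast here, rather than deep inside a training step, is what makes the resulting error messages useful.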
Testing
We now have tests! Hate to write them, love to have them. Not much more to say here, other than the reliability they should bring going forward.
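To give a flavor of the style of behavioral test this enables, here is a small pytest-style example written against plain PyTorch; the library's own tests presumably exercise its clipper and wrapper classes directly.

```python
import torch


def test_clipped_gradient_norm_respects_cap():
    model = torch.nn.Linear(16, 4)
    loss = (model(torch.randn(32, 16)) ** 2).sum() * 1e3  # deliberately large loss
    loss.backward()

    max_norm = 0.5
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm)

    # Total 2-norm across all parameters should not exceed the cap.
    total_norm = torch.norm(
        torch.stack([p.grad.norm() for p in model.parameters() if p.grad is not None])
    )
    assert total_norm.item() <= max_norm + 1e-6
```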