Major Revamp:
1. Replaced spyx.axn.Axon with spyx.axn.custom.
2. All pre-implemented surrogate gradients are now standalone activation functions, so there is no need to wrap them in another function (see the surrogate-gradient sketch after this list).
3. spyx.axn.ActivityRegularization was moved to spyx.nn because it is a hk.Module and therefore a layer.
4. Loss and accuracy functions were converted to higher-order functions that return the version used in training loops, and a time-axis argument was added (see the loss/accuracy sketch after this list).
5. Notebooks in the docs were updated to reflect syntax changes.
6. Time constants are now constrained via jnp.clip, which is cleaner and more efficient.
7. Fixed a bug in data shuffling and optimized it by removing an unnecessary permutation call (see the shuffling sketch after this list).
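A minimal sketch of what a standalone surrogate-gradient activation looks like (items 1 and 2), written directly against jax.custom_vjp. It illustrates the pattern only; the name arctan_surrogate, its k parameter, and the exact spyx.axn.custom signature are assumptions, not the library's verbatim API.

```python
# Illustration only: a Heaviside spike with an arctan-shaped surrogate
# derivative, packaged as a single callable that layers can use directly.
import jax
import jax.numpy as jnp

def arctan_surrogate(k=2.0):
    """Return a spike activation whose backward pass uses an arctan surrogate."""

    @jax.custom_vjp
    def spike(x):
        return (x > 0).astype(x.dtype)  # hard threshold on the forward pass

    def spike_fwd(x):
        return spike(x), x  # keep the membrane potential for the backward pass

    def spike_bwd(x, g):
        # Surrogate derivative: 1 / (1 + (k * x)^2), scaled by the upstream grad.
        return (g / (1.0 + (k * x) ** 2),)

    spike.defvjp(spike_fwd, spike_bwd)
    return spike

# The returned function is used as-is, with no extra wrapper around it.
spike_fn = arctan_surrogate(k=2.0)
out = spike_fn(jnp.array([-0.5, 0.1, 1.2]))
```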
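The higher-order pattern for the loss and accuracy functions (item 4) can be sketched as follows; the factory names integral_crossentropy and integral_accuracy and the time_axis keyword are assumptions standing in for the actual spyx.fn signatures.

```python
# Sketch of the higher-order pattern: each factory returns the function that
# the training loop actually jits and calls.
import jax
import jax.numpy as jnp

def integral_crossentropy(time_axis=1):
    """Return a loss that integrates spikes over `time_axis`, then applies
    softmax cross-entropy against one-hot targets."""
    def loss_fn(spike_traces, one_hot_targets):
        logits = jnp.sum(spike_traces, axis=time_axis)      # rate-code readout
        log_probs = jax.nn.log_softmax(logits, axis=-1)
        return -jnp.mean(jnp.sum(one_hot_targets * log_probs, axis=-1))
    return loss_fn

def integral_accuracy(time_axis=1):
    """Return an accuracy metric using the same spike-count readout."""
    def acc_fn(spike_traces, one_hot_targets):
        preds = jnp.argmax(jnp.sum(spike_traces, axis=time_axis), axis=-1)
        return jnp.mean(preds == jnp.argmax(one_hot_targets, axis=-1))
    return acc_fn

loss_fn = integral_crossentropy(time_axis=1)  # closure used inside the loop
acc_fn = integral_accuracy(time_axis=1)
```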
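The shuffling optimization (item 7) amounts to drawing the shuffled index order with a single jax.random.permutation call instead of permuting a separately built index array; a hypothetical helper illustrating this:

```python
# Illustrative only: one permutation call per epoch yields shuffled batch indices.
import jax

def shuffle_batches(rng_key, num_samples, batch_size):
    """Return an array of shape (num_batches, batch_size) of shuffled indices."""
    order = jax.random.permutation(rng_key, num_samples)  # single call
    num_batches = num_samples // batch_size
    return order[: num_batches * batch_size].reshape(num_batches, batch_size)

batches = shuffle_batches(jax.random.PRNGKey(0), num_samples=1024, batch_size=32)
```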
v.0.1.18
Changed the nn definitions to use jnp.clip (hopefully faster) and made user-specified beta values single learnable constants for the layer. This might change again in the future to allow beta to be specified flexibly as either a scalar or a vector.
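A minimal sketch of that idea, assuming a Haiku leaky-integrator layer (the class name, the dynamics, and the [0, 1] bounds here are illustrative, not the actual Spyx definitions):

```python
# Single learnable scalar decay constant per layer, constrained with jnp.clip.
import haiku as hk
import jax.numpy as jnp

class LeakyIntegrator(hk.Module):
    def __init__(self, init_beta=0.8, name=None):
        super().__init__(name=name)
        self.init_beta = init_beta

    def __call__(self, x, v):
        # One learnable constant shared by the whole layer.
        beta = hk.get_parameter(
            "beta", shape=(), init=hk.initializers.Constant(self.init_beta)
        )
        beta = jnp.clip(beta, 0.0, 1.0)  # keep the decay in a valid range
        return beta * v + x              # leaky membrane update
```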
camera-ready
stable
An updated release of Spyx.
beta
Stable API. Please provide feedback in the Issues or Discussions sections!
alpha
Initial code base. More to follow.