# Bio Transformations v0.1.0 Release Notes
We're excited to announce the release of Bio Transformations v0.1.0, featuring substantial improvements to the core components of the library. This release introduces several new biologically inspired mechanisms and enhances existing ones to provide a more powerful and flexible toolkit for neural network modifications.
## Major New Features

### New Distribution Strategies for Learning Rates
- **Multiple Distribution Types**: Added support for 10 different probability distributions for fuzzy learning rates (see the sampling sketch after this list):
  - `BASELINE`: No variability (all parameters = 1.0)
  - `UNIFORM`: Uniform distribution around 1.0
  - `NORMAL`: Normal distribution centered at 1.0
  - `LOGNORMAL`: Log-normal with mean 1.0 (skewed, all positive values)
  - `GAMMA`: Gamma distribution (positive, skewed)
  - `BETA`: Beta distribution scaled to [1-nu, 1+nu]
  - `LAYER_ADAPTIVE`: Layer-dependent variability (decreases with depth)
  - `WEIGHT_ADAPTIVE`: Weight-dependent scaling (smaller weights get more variability)
  - `TEMPORAL`: Evolves over time
  - `ACTIVITY`: Based on neuron activation patterns
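To make these strategies concrete, here is a minimal, library-independent sketch of how per-parameter learning-rate multipliers centered at 1.0 might be sampled for a few of the distributions above. The function and parameter names (`sample_fuzzy_multipliers`, `nu`) are illustrative, not the library's API:

```python
import torch

def sample_fuzzy_multipliers(shape, distribution="NORMAL", nu=0.1):
    """Sample per-parameter learning-rate multipliers centered at 1.0.

    `nu` is the variability scale. Names mirror the strategies above,
    but this is an illustrative sketch, not the library's implementation.
    """
    if distribution == "BASELINE":
        return torch.ones(shape)                       # no variability
    if distribution == "UNIFORM":
        return 1.0 + nu * (2 * torch.rand(shape) - 1)  # U[1-nu, 1+nu]
    if distribution == "NORMAL":
        return 1.0 + nu * torch.randn(shape)           # N(1, nu^2)
    if distribution == "LOGNORMAL":
        # choose mu so the mean is exactly 1.0: E[exp(X)] = exp(mu + sigma^2/2)
        sigma = nu
        mu = -0.5 * sigma ** 2
        return torch.exp(mu + sigma * torch.randn(shape))
    if distribution == "BETA":
        # Beta(2, 2) is symmetric on [0, 1]; rescale to [1-nu, 1+nu]
        b = torch.distributions.Beta(2.0, 2.0).sample(shape)
        return (1.0 - nu) + 2 * nu * b
    raise ValueError(f"unknown distribution: {distribution}")

multipliers = sample_fuzzy_multipliers((64, 128), "LOGNORMAL", nu=0.1)
```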
### Activity-Dependent Learning
- Added support for activity-dependent learning rates that adjust based on neuron activation patterns
- Neurons that are more active become more stable (less variable learning rates)
- Implemented activation tracking for both Linear and Conv2d layers
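As an illustration of the mechanism, the following PyTorch sketch tracks per-neuron activity with forward hooks on `Linear` and `Conv2d` layers and shrinks the variability scale for more active neurons. All names here (`activity`, `activity_scaled_nu`) are hypothetical, not the library's internals:

```python
import torch
import torch.nn as nn

# Track mean absolute activation per output unit with a forward hook,
# then shrink learning-rate variability for the most active units.
activity = {}

def make_hook(name):
    def hook(module, inputs, output):
        # exponential moving average of per-neuron activity
        act = output.detach().abs().mean(dim=0)  # average over the batch
        if act.dim() > 1:                        # Conv2d: also average spatially
            act = act.mean(dim=(-2, -1))
        prev = activity.get(name, torch.zeros_like(act))
        activity[name] = 0.9 * prev + 0.1 * act
    return hook

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
for name, m in model.named_modules():
    if isinstance(m, (nn.Linear, nn.Conv2d)):
        m.register_forward_hook(make_hook(name))

def activity_scaled_nu(name, base_nu=0.1):
    """More active neurons get a smaller variability scale (more stable)."""
    act = activity[name]
    return base_nu / (1.0 + act / (act.mean() + 1e-8))
```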
### Dynamic Learning Rate Evolution
- Added `update_fuzzy_learning_rates` method to allow learning rates to evolve during training
- Temporal distribution gradually adapts learning rates throughout training
- Weight-adaptive distribution scales variability based on weight magnitudes
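A hedged sketch of the temporal idea: multipliers drift back toward the baseline of 1.0 every `update_freq` steps and stay inside the configured bounds. The helper below is illustrative; only the parameter names `fuzzy_lr_update_freq`, `fuzzy_lr_decay`, `fuzzy_lr_min`, and `fuzzy_lr_max` come from these notes:

```python
import torch

def evolve_multipliers(multipliers, step, update_freq=100, decay=0.95,
                       lr_min=0.5, lr_max=1.5):
    """Pull multipliers toward the 1.0 baseline on a fixed schedule."""
    if step % update_freq != 0:
        return multipliers
    # shrink each multiplier's deviation from 1.0 by the decay factor, then clamp
    multipliers = 1.0 + decay * (multipliers - 1.0)
    return multipliers.clamp(lr_min, lr_max)

m = 1.0 + 0.2 * torch.randn(64, 128)
for step in range(1, 1001):
    m = evolve_multipliers(m, step)
```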
## Improvements to Existing Features

### Enhanced Weight Rejuvenation
- Improved numerical stability and edge case handling in weight rejuvenation
- Better handling of extreme values and NaN weights
- Preserved original implementation as `rejuvenate_weights_old` for backward compatibility
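For intuition, here is a minimal sketch of what rejuvenation with NaN and edge-case handling can look like: invalid or near-zero weights are redrawn from the layer's healthy weight statistics. This illustrates the concept only; it is not the library's implementation:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def rejuvenate_weights(linear, threshold=1e-3):
    """Re-initialize stagnant or invalid weights (illustrative sketch only)."""
    w = linear.weight
    dead = ~torch.isfinite(w) | (w.abs() < threshold)
    healthy = w[torch.isfinite(w) & (w.abs() >= threshold)]
    # fall back to a small std if nothing healthy remains (edge case)
    std = healthy.std() if healthy.numel() > 1 else torch.tensor(0.01)
    w[dead] = torch.randn(int(dead.sum()), device=w.device) * std

layer = nn.Linear(128, 64)
layer.weight.data[0, :5] = float("nan")  # simulate corrupted weights
rejuvenate_weights(layer)
```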
### Refined Weight Splitting
- Better handling of output layers with automatic marking of the last layer
- Renamed tokens for clarity (`last_module_token` instead of `weight_splitting_skip`)
- Improved error messages for invalid configurations
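A sketch of the last-layer marking idea, assuming the token is attached as a module attribute so that weight splitting can skip the output layer. The traversal and attachment mechanism shown here are assumptions, not the library's code:

```python
import torch.nn as nn

def mark_last_module(model):
    """Tag the final weighted module so weight splitting skips the output layer."""
    weighted = [m for m in model.modules()
                if isinstance(m, (nn.Linear, nn.Conv2d))]
    for m in weighted:
        m.last_module_token = False
    if weighted:
        weighted[-1].last_module_token = True  # output layer: skip splitting

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
mark_last_module(model)
assert model[2].last_module_token
```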
### Optimized Dale's Principle Implementation
- Enhanced error handling for Dale's principle enforcement
- Better compatibility with output layers
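Dale's principle states that a presynaptic neuron's outgoing synapses are all excitatory or all inhibitory. The sketch below enforces a per-input-neuron sign constraint on a `Linear` layer after an optimizer step; the library's actual enforcement mechanism may differ:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def enforce_dale(linear, excitatory_mask):
    """excitatory_mask: bool tensor of shape (in_features,); True = excitatory."""
    w = linear.weight                  # shape (out_features, in_features)
    sign = torch.where(excitatory_mask, 1.0, -1.0)
    # clamp each input column to its presynaptic neuron's allowed sign
    linear.weight.copy_(w.abs() * sign)

layer = nn.Linear(8, 4)
mask = torch.rand(8) > 0.2             # roughly 80% excitatory, as in cortex
enforce_dale(layer, mask)
```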
## Code Quality and Documentation
- Extensively documented code with detailed explanations of biological motivations
- Added comprehensive docstrings for all methods and classes
- Improved parameter validation and error messaging throughout
- Enhanced code organization and naming conventions
## Configuration Enhancements
- Added bounds for fuzzy learning rates (`fuzzy_lr_min` and `fuzzy_lr_max`)
- Added parameters for controlling temporal evolution (`fuzzy_lr_update_freq`, `fuzzy_lr_decay`)
- New parameters for activity-dependent learning
- Added layer indexing for layer-adaptive distributions
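A hypothetical configuration dictionary pulling these options together. Key names follow the parameters mentioned in these notes, but treat the exact schema as an assumption, not the library's spec:

```python
config = {
    "fuzzy_lr_distribution": "TEMPORAL",  # one of the strategies listed above
    "fuzzy_lr_nu": 0.1,                   # variability scale (assumed name)
    "fuzzy_lr_min": 0.5,                  # lower bound on multipliers
    "fuzzy_lr_max": 1.5,                  # upper bound on multipliers
    "fuzzy_lr_update_freq": 100,          # steps between temporal updates
    "fuzzy_lr_decay": 0.95,               # decay toward baseline per update
}
```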
## API Changes

- Added new exposed functions:
  - `update_fuzzy_learning_rates`: Updates learning rates during training
  - `rejuvenate_weights_old`: Legacy implementation preserved for backward compatibility
- Enhanced `BioConverter.from_dict` for easier configuration from dictionaries
- Improved `update_config` method for updating configuration parameters
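Putting the pieces together, a hedged end-to-end sketch: `BioConverter.from_dict` and `update_fuzzy_learning_rates` are named in these notes, while the import path, the `convert` method, and the call placement are assumptions:

```python
import torch
import torch.nn as nn
from bio_transformations import BioConverter  # import path is an assumption

config = {"fuzzy_lr_distribution": "TEMPORAL", "fuzzy_lr_update_freq": 100}

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
converter = BioConverter.from_dict(config)  # from_dict is named in these notes
bio_model = converter.convert(model)        # `convert` is an assumed method name

optimizer = torch.optim.SGD(bio_model.parameters(), lr=0.01)
for step in range(1, 1001):
    x, y = torch.randn(16, 10), torch.randint(0, 2, (16,))
    loss = nn.functional.cross_entropy(bio_model(x), y)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    if step % config["fuzzy_lr_update_freq"] == 0:
        bio_model.update_fuzzy_learning_rates()  # exposed per these notes; placement assumed
```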
## Documentation Improvements
- Added advanced usage guide with comprehensive examples
- Detailed tutorials for each distribution strategy
- Performance optimization tips
- Troubleshooting common issues
---
This release represents a significant advancement in bio-inspired neural network modifications, bringing more biological realism and flexibility to artificial neural networks. We encourage users to explore the new distribution strategies and dynamic learning rate capabilities.
For detailed usage instructions, please refer to our updated documentation, including the tutorials and advanced usage guides.