Aplr

Latest version: v10.7.3



Page 8 of 20

9.0.0

Increased interpretability in APLRRegressor:
- Added methods calculate_feature_importance and calculate_term_importance. These estimate feature and term importance respectively on new data.
- Added method get_term_importance which returns estimated term importance in the training data for each term in the model.
- Terms in the model are now sorted by estimated term importance in the training data.
- Slightly changed feature importance calculation methodology.
- Renamed method calculate_local_feature_contribution_for_terms to calculate_local_term_contribution.
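The release notes don't state the exact importance formula. As a minimal sketch of the idea behind these methods, one common definition is the mean absolute contribution of each term across a dataset; the function and variable names below are hypothetical, and APLR's actual methodology may differ.

```python
# Illustrative only: term importance as the mean absolute contribution of
# each term over a dataset. APLR's exact calculation may differ.

def term_importance(term_contributions):
    # term_contributions: one row per observation, one column per term,
    # holding each term's contribution to that observation's prediction.
    n_obs = len(term_contributions)
    n_terms = len(term_contributions[0])
    return [sum(abs(row[j]) for row in term_contributions) / n_obs
            for j in range(n_terms)]

contribs = [
    [0.5, -1.0, 0.1],
    [-0.5, 2.0, 0.0],
    [1.0, -1.0, -0.1],
]
importances = term_importance(contribs)
# Ordering terms by descending importance mirrors the new term sorting.
ranking = sorted(range(len(importances)), key=lambda j: -importances[j])
print(importances, ranking)
```

Under this definition, the second term (largest average absolute contribution) would be listed first in the sorted model.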

8.0.0

Significantly increased predictiveness by doing cross-validation on the training data. A model is fitted for each fold combination, and these models are then merged into a final model. This increases predictiveness by reducing model variance (an effect similar to bagging). The drawback is that training takes longer with the default number of randomly selected cv folds (5). However, the user can specify the number of randomly selected cv folds, or directly specify the folds and how particular training observations will be used in each fold. The latter can be used to emulate the train/validation split used in prior versions of APLR, should the user choose to do so.
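The fold-merging scheme above can be sketched in miniature: fit one model per fold's training split, then merge by averaging predictions. This is a toy illustration of the variance-reduction effect, not APLR's actual fitting or merging procedure; the function names are hypothetical.

```python
import random

def fit_linear(xs, ys):
    # Ordinary least squares for a single feature: y ~ a + b*x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

def cv_merged_fit(xs, ys, n_folds=5, seed=0):
    # Split indices into random folds; fit one model on each fold's
    # complement; merge the fold models by averaging their predictions.
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[k::n_folds] for k in range(n_folds)]
    models = []
    for fold in folds:
        held_out = set(fold)
        train = [i for i in idx if i not in held_out]
        models.append(fit_linear([xs[i] for i in train],
                                 [ys[i] for i in train]))
    return lambda x: sum(m(x) for m in models) / len(models)

xs = list(range(10))
ys = [2 * x + 1 for x in xs]  # noiseless, so every fold model agrees
merged = cv_merged_fit(xs, ys)
print(merged(10.0))
```

With noisy data, the averaged fold models would typically predict with lower variance than any single fold model, which is the bagging-like effect the release notes describe.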

Other changes:
- Feature importance calculation methodology should now be more realistic.

Deprecated:
- Pruning of terms.
- "rankability" validation_tuning_metric.
- validation_ratio and validation_indexes fields as well as related methods.

7.8.1

Fixed a minor and rare bug related to sample_weight.

7.8.0

Improved the implementation of loss_function "group_mse_cycle".

7.7.0

Added the loss_function "group_mse_cycle" and the validation_tuning_metric "group_mse_by_prediction".
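The release notes don't define the group-wise metric. As a hedged sketch, one common construction averages the MSE computed within each group; the function below is illustrative only, and APLR's actual "group_mse" definition may differ.

```python
# Illustrative only: a group-wise MSE that averages each group's own MSE.
# APLR's exact "group_mse" definition may differ from this sketch.

def group_mse(y_true, y_pred, groups):
    per_group = {}
    for t, p, g in zip(y_true, y_pred, groups):
        per_group.setdefault(g, []).append((t - p) ** 2)
    # Mean over groups of each group's mean squared error.
    return sum(sum(sq) / len(sq) for sq in per_group.values()) / len(per_group)

# Group 0 is predicted perfectly (MSE 0); group 1 has MSE 2.5.
print(group_mse([1, 2, 3, 4], [1, 2, 4, 6], [0, 0, 1, 1]))  # → 1.25
```

Weighting each group equally (rather than each observation) means errors concentrated in a small group are not diluted by a large, well-fitted group.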

7.6.2

Bugfix related to the "group_mse" loss_function and validation_tuning_metric.

