XGBoostLSS

Latest version: v0.4.0

0.4.0

We are excited to announce the release of XGBoostLSS v0.4.0! This release brings a new feature, package updates, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
**Mixture Distributions**
XGBoostLSS now supports mixture distributions for modeling univariate targets! Mixture densities, or mixture distributions, extend traditional univariate distributions by interpreting the observed data as a combination of multiple underlying processes. Because of this flexibility, mixture densities can represent a wide range of shapes, making them adaptable to many datasets. With mixture densities, XGBoostLSS users gain a better understanding of the conditional distribution of the response variable and a more precise representation of the data-generating process.
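
Since XGBoostLSS builds on PyTorch distributions (see the v0.2.0 notes below), the following minimal, PyTorch-only sketch shows what a two-component Gaussian mixture density looks like. It is not XGBoostLSS's own API; the library's distribution classes and their arguments are documented in the examples.

```python
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

# Two-component Gaussian mixture: 30% of the mass around -2, 70% around 1.5.
weights = Categorical(probs=torch.tensor([0.3, 0.7]))
components = Normal(loc=torch.tensor([-2.0, 1.5]), scale=torch.tensor([0.5, 1.0]))
mixture = MixtureSameFamily(weights, components)

y = mixture.sample((1_000,))         # draws from a bimodal density
log_lik = mixture.log_prob(y).sum()  # the likelihood used as a fitting objective
```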

Stability Improvements
**Model Estimation**
We have improved the stability of the model estimation process. This results in more consistent and accurate estimation of parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of XGBoostLSS.

Package Dependency Updates
We have updated some of the package dependencies to their latest versions.

General
We appreciate the valuable feedback and contributions from our users, which have helped us make XGBoostLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements. To get started, check out the documentation and examples.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.3.0

We are excited to announce the release of XGBoostLSS v0.3.0! This release brings a new feature, package updates, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
**Normalizing Flows**
XGBoostLSS now supports normalizing flows for modeling univariate target variables! This powerful new feature lets users harness normalizing flows for distributional regression, opening up new ways to model complex and multi-modal distributions more flexibly than standard parametric families.
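
As a toy illustration of the change-of-variables idea behind normalizing flows, the sketch below pushes a Gaussian base density through invertible transforms using PyTorch's TransformedDistribution. Treat it purely as a conceptual sketch; XGBoostLSS's own flow classes and their arguments are documented in the examples.

```python
import torch
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import AffineTransform, ExpTransform

# A flow is a base density pushed through invertible maps; log_prob applies
# the change-of-variables formula automatically.
base = Normal(loc=torch.tensor(0.0), scale=torch.tensor(1.0))
flow = TransformedDistribution(base, [AffineTransform(loc=0.5, scale=0.8), ExpTransform()])

y = flow.sample((1_000,))         # strictly positive, right-skewed samples
log_lik = flow.log_prob(y).sum()  # trainable flows learn the transform parameters
```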

Stability Improvements
**Model Estimation**
We have improved the stability of the model estimation process. This results in more consistent and accurate estimation of parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of XGBoostLSS.

Package Dependency Updates
We have updated some of the package dependencies to their latest versions.

General
We appreciate the valuable feedback and contributions from our users, which have helped us make XGBoostLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements. To get started, check out the documentation and examples.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.2

We are excited to announce the release of XGBoostLSS v0.2.2! This release brings several new features, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
- **Multivariate Distributions**: XGBoostLSS now supports modeling of multivariate response distributions, allowing you to capture complex dependencies among multiple target variables. We believe that the introduction of multivariate distributions in XGBoostLSS opens up new opportunities for modeling in various domains, and we encourage you to explore these capabilities and share your feedback with us.
  - Benefits for Use Cases: With this feature, you can model and predict joint distributions, gaining deeper insights into your data. By considering dependencies between different risk factors, such as the occurrence of multiple events or the joint severity of claims, you can improve the accuracy of risk assessments.
- **Zero-Adjusted and Zero-Inflated Distributions**: These new distributions expand the modeling capabilities and offer greater flexibility for various use cases; a minimal sketch of the zero-inflated case follows this list.
  - A zero-adjusted distribution assumes that zeros arise from a process separate from the one generating the non-zero values. It models the probability of a zero and the distribution of the non-zero values separately.
  - A zero-inflated distribution assumes a single process generating the data, with an additional component that accounts for excess zeros. Zero-inflated distributions combine a standard distribution, such as Poisson or Negative Binomial, with a component that models the probability of excess zeros.
  - Benefits for Use Cases: Zero-adjusted and zero-inflated distributions let users handle a wide range of scenarios effectively. They are particularly valuable in areas such as insurance claims modeling, disease count prediction, anomaly detection, and other domains with skewed or zero-inflated data. By accurately capturing the underlying data patterns, XGBoostLSS enables more precise predictions and better decision-making.

- **CRPS Score for Training (Experimental)**: This release introduces an experimental implementation of the Continuous Ranked Probability Score (CRPS) for training univariate distributions. CRPS is a popular probabilistic scoring rule that measures the accuracy of predicted probability distributions. Note that the CRPS training score is still experimental: while we believe it can improve model performance, further testing and evaluation are required to validate its effectiveness across different use cases.
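
The sketch below illustrates the zero-inflated idea with a self-contained zero-inflated Poisson log-likelihood built from torch.distributions.Poisson. The gate/rate parameter names are illustrative and are not XGBoostLSS's own parameterization.

```python
import torch
from torch.distributions import Poisson

def zip_log_prob(y, rate, gate):
    """Zero-inflated Poisson log-likelihood.

    gate: probability of an excess (structural) zero; rate: mean of the Poisson part.
    Parameter names are illustrative, not XGBoostLSS's own.
    """
    pois = Poisson(rate)
    # P(Y = 0) = gate + (1 - gate) * exp(-rate)
    log_p_zero = torch.log(gate + (1.0 - gate) * torch.exp(-rate))
    # P(Y = k > 0) = (1 - gate) * Poisson(k; rate)
    log_p_pos = torch.log1p(-gate) + pois.log_prob(y)
    return torch.where(y == 0, log_p_zero, log_p_pos)

y = torch.tensor([0.0, 0.0, 3.0, 1.0])
print(zip_log_prob(y, rate=torch.tensor(2.0), gate=torch.tensor(0.4)))
```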

Stability Improvements
- **Model Estimation**: We have also improved the stability of the model estimation process. This results in more consistent and accurate estimation of model parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of XGBoostLSS.

General
We appreciate the valuable feedback and contributions from our users, which have helped us in making XGBoostLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.1

We are excited to announce the release of XGBoostLSS v0.2.1! This release brings several new features, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
- Flexible Distribution Selection: We have introduced a new function that lets users choose from a variety of candidate distributions for modeling, so you can select the distribution best suited to your data and obtain more accurate, customized predictions; a library-agnostic sketch of the idea follows this list.

- Expectile Penalty: We have enhanced the expectile functionality by introducing a penalty that discourages crossing of expectiles during training. This improves the coherence of the expectile predictions, leading to more reliable models.
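
As a library-agnostic sketch of the idea behind candidate-distribution selection (not the new XGBoostLSS helper itself, whose name and signature are documented in the examples), the snippet below fits a few PyTorch distributions to a toy target with crude moment-style estimates and compares their negative log-likelihoods.

```python
import torch
from torch.distributions import Laplace, Normal, StudentT

y = 1.0 + 2.0 * torch.randn(500)  # toy target variable

# Crude moment-style fits of each candidate, compared by negative log-likelihood.
candidates = {
    "Normal":   Normal(y.mean(), y.std()),
    "Laplace":  Laplace(y.median(), (y - y.median()).abs().mean()),
    "StudentT": StudentT(df=torch.tensor(10.0), loc=y.mean(), scale=y.std()),
}
nll = {name: -dist.log_prob(y).sum().item() for name, dist in candidates.items()}
best = min(nll, key=nll.get)
print(best, nll)
```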

Stability Improvements
- Parameter Initialization: We have made stability improvements in the parameter initialization process. This ensures that the model starts from a more robust and reliable state, reducing the chances of convergence issues and enhancing the overall performance.

- Model Estimation: We have also improved the stability of the model estimation process. This results in more consistent and accurate estimation of model parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of XGBoostLSS.

General
We appreciate the valuable feedback and contributions from our users, which have helped us make XGBoostLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.0

Enhanced Distributional Modeling with PyTorch

- XGBoostLSS now fully relies on PyTorch distributions for distributional modeling.
- The integration with PyTorch distributions provides a more comprehensive and flexible framework for probabilistic modeling and uncertainty estimation.
- Users can leverage the rich set of distributional families and associated functions offered by PyTorch, allowing for a wider range of modeling options.

Automatic Differentiation

- XGBoostLSS now fully leverages PyTorch's automatic differentiation capabilities.
- Automatic differentiation enables efficient and accurate computation of gradients and Hessians, resulting in enhanced model performance and flexibility (see the sketch after this list).
- Users can take advantage of automatic differentiation to easily incorporate custom loss functions into their XGBoostLSS workflows.
- This enhancement allows for faster experimentation and easier customization.
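
As a minimal sketch of what this looks like in practice, the snippet below uses torch.autograd to obtain the gradient and the Hessian diagonal of a Gaussian negative log-likelihood with respect to predicted distribution parameters, the quantities an XGBoost custom objective consumes. The [loc, log_scale] parameter layout is purely illustrative.

```python
import torch
from torch.distributions import Normal

y = torch.tensor([0.3, -1.2, 0.8])                     # observed targets
params = torch.tensor([0.0, 0.1], requires_grad=True)  # illustrative: [loc, log_scale]

loc, scale = params[0], params[1].exp()
nll = -Normal(loc, scale).log_prob(y).sum()

# First derivatives; create_graph=True lets us differentiate a second time.
grad = torch.autograd.grad(nll, params, create_graph=True)[0]
# Hessian diagonal via one extra backward pass per parameter.
hess_diag = torch.stack([
    torch.autograd.grad(grad[i], params, retain_graph=True)[0][i]
    for i in range(params.numel())
])
print(grad.detach(), hess_diag.detach())
```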

Hyper-Parameter Optimization

- XGBoostLSS now enables the optimization of all XGBoost hyper-parameters for enhanced modeling flexibility and performance.
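
As a generic illustration of this kind of search (not XGBoostLSS's built-in tuning routine, and with arbitrary parameter ranges chosen only for the example), the sketch below tunes a few XGBoost hyper-parameters with Optuna and xgboost.cv on toy data.

```python
import numpy as np
import optuna
import xgboost as xgb

# Toy data; in practice this would be your training DMatrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=500)
dtrain = xgb.DMatrix(X, label=y)

def objective(trial):
    # Search space is illustrative only.
    params = {
        "eta": trial.suggest_float("eta", 0.01, 0.3, log=True),
        "max_depth": trial.suggest_int("max_depth", 2, 8),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "objective": "reg:squarederror",
    }
    cv = xgb.cv(params, dtrain, num_boost_round=200, nfold=5,
                early_stopping_rounds=20, seed=0)
    return cv["test-rmse-mean"].min()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=25)
print(study.best_params)
```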

What's Changed
- The syntax of XGBoostLSS has been updated in this release. We have made improvements to certain aspects of the syntax to provide better clarity and consistency.
- To familiarize yourself with the updated syntax, we kindly refer you to the [example sections](https://github.com/StatMixedML/XGBoostLSS/tree/master/examples). The examples will demonstrate the revised syntax and help you adapt your code accordingly.

Bug Fixes

- Several minor fixes and improvements have been implemented in this release.
