LightGBMLSS

Latest version: v0.4.0

0.4.0

We are excited to announce the release of LightGBMLSS v0.4.0! This release brings a new feature, package updates, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
**Mixture Distributions**
LightGBMLSS now supports mixture distributions for modeling univariate targets! Mixture distributions, or mixture densities, extend traditional univariate distributions by interpreting the observed data as a combination of several underlying processes. In essence, a mixture distribution is a weighted combination of several component distributions, where each component contributes to the overall density. Thanks to this flexibility, mixture distributions can represent a diverse range of shapes, making them adaptable to many datasets. By introducing mixture distributions, LightGBMLSS gives users a better understanding of the conditional distribution of the response variable and a more precise representation of the data-generating process.
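To make the idea concrete, here is a minimal sketch of a three-component Gaussian mixture built directly with `torch.distributions` (the weights, means, and scales are illustrative, and this is not the LightGBMLSS interface itself):

```python
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

# Illustrative mixture weights and component parameters for K = 3 Gaussians.
weights = torch.tensor([0.3, 0.5, 0.2])
means = torch.tensor([-2.0, 0.0, 3.0])
scales = torch.tensor([0.5, 1.0, 0.8])

# The mixture density is the weighted combination of the component densities.
mixture = MixtureSameFamily(
    mixture_distribution=Categorical(probs=weights),
    component_distribution=Normal(loc=means, scale=scales),
)

samples = mixture.sample((1_000,))   # draws can be multi-modal
log_lik = mixture.log_prob(samples)  # log of the weighted component densities
print(samples.mean().item(), log_lik.sum().item())
```

In LightGBMLSS, the boosted trees learn such parameters (component means, scales, and mixture weights) conditional on the features; the snippet above only illustrates the distributional object they parameterize.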

Stability Improvements
**Model Estimation**
We have improved the stability of the model estimation process. This results in more consistent and accurate estimation of parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of LightGBMLSS.

Package Dependency Updates
We have updated some of the package dependencies to the latest versions.

General
We appreciate the valuable feedback and contributions from our users, which have helped us in making LightGBMLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements. To get started, check out the documentation and examples.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.3.0

We are excited to announce the release of LightGBMLSS v0.3.0! This release brings a new feature, package updates, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
**Normalizing Flows**
LightGBMLSS now supports normalizing flows for modeling univariate target variables! This powerful new feature allows users to harness the capabilities of normalizing flows for distributional regression, opening up new ways to model complex and multi-modal distributions more effectively than with parametric distributions.
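As a minimal illustration of the change-of-variables idea behind normalizing flows (using `torch.distributions` directly, not the LightGBMLSS flow interface), a base density can be pushed through invertible transforms while keeping an exact log-density:

```python
import torch
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import AffineTransform, SigmoidTransform

# Start from a standard Normal and chain invertible transforms:
# the sigmoid maps to (0, 1), the affine map rescales to (0, 10).
base = Normal(0.0, 1.0)
flow = TransformedDistribution(
    base, [SigmoidTransform(), AffineTransform(loc=0.0, scale=10.0)]
)

x = flow.sample((5,))
# log_prob applies the change-of-variables formula:
# base log-density minus the log |det Jacobian| of the transforms.
print(x)
print(flow.log_prob(x))
```

Normalizing flows generalize this by making the transforms themselves learnable, which is what allows complex and multi-modal target distributions to be captured.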

Stability Improvements
**Model Estimation**
We have improved the stability of the model estimation process. This results in more consistent and accurate estimation of parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of LightGBMLSS.

Package Dependency Updates
We have updated some of the package dependencies to the latest versions.

General
We appreciate the valuable feedback and contributions from our users, which have helped us in making LightGBMLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements. To get started, check out the documentation and examples.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.2

We are excited to announce the release of LightGBMLSS v0.2.2! This release brings several new features, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
- **Zero-Adjusted and Zero-Inflated Distributions**: These new distributions expand the modeling capabilities and offer enhanced flexibility for various use cases (a minimal sketch follows this list).
  - A zero-adjusted distribution assumes that the zeros occur due to a process separate from the one generating the non-zero values. It models the probability of a value being zero and the distribution of the non-zero values separately.
  - A zero-inflated distribution assumes that a single process generates the data, with an additional component that accounts for the excess zeros. Zero-inflated distributions combine a standard count distribution, such as the Poisson or Negative Binomial, with a component that models the probability of excess zeros.
  - Benefits for Use Cases: Zero-adjusted and zero-inflated distributions empower users to handle a wide range of scenarios effectively. They are particularly valuable in areas such as insurance claims modeling, disease count prediction, anomaly detection, and many other domains with skewed or zero-inflated data. By accurately capturing the underlying data patterns, LightGBMLSS enables more precise predictions and improved decision-making.

- **CRPS Score for Training (Experimental)**: This release introduces an experimental implementation of the Continuous Ranked Probability Score (CRPS) as a training loss for univariate distributions. CRPS is a popular probabilistic scoring metric that measures the accuracy of predicted probability distributions. Note that the CRPS training loss is still experimental: while we believe it has the potential to improve model performance, further testing and evaluation are required to validate its effectiveness across different use cases.
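Picking up the distinction above, here is a minimal sketch of the two zero-handling schemes for a Poisson base distribution (the parameters `pi` and `lam` are illustrative):

```python
import numpy as np
from scipy.stats import poisson

pi, lam = 0.3, 2.5   # illustrative zero probability and Poisson rate
k = np.arange(10)

# Zero-inflated Poisson: one process, plus an extra component for excess zeros.
zip_pmf = np.where(
    k == 0,
    pi + (1 - pi) * poisson.pmf(0, lam),
    (1 - pi) * poisson.pmf(k, lam),
)

# Zero-adjusted (hurdle) Poisson: zeros come only from a separate process,
# and the Poisson part is truncated at zero.
zap_pmf = np.where(
    k == 0,
    pi,
    (1 - pi) * poisson.pmf(k, lam) / (1 - poisson.pmf(0, lam)),
)

print(zip_pmf.round(3))
print(zap_pmf.round(3))
```

For the experimental CRPS objective, a common sample-based estimate is CRPS(F, y) ≈ E|X − y| − ½·E|X − X′|, with X and X′ independent draws from the predictive distribution F. A quick Monte Carlo check, using an illustrative Normal predictive distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
draws = rng.normal(loc=1.0, scale=1.0, size=2_000)  # draws from the predictive distribution
y_obs = 0.7                                         # observed value

crps = np.mean(np.abs(draws - y_obs)) - 0.5 * np.mean(
    np.abs(draws[:, None] - draws[None, :])
)
print(round(crps, 4))
```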

Stability Improvements
- **Model Estimation**: We have also improved the stability of the model estimation process. This results in more consistent and accurate estimation of model parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of LightGBMLSS.

General
We appreciate the valuable feedback and contributions from our users, which have helped us in making LightGBMLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.1

We are excited to announce the release of LightGBMLSS v0.2.1! This release brings several new features, stability improvements, and bug fixes. Here are the key highlights of this release:

New Features
- **Flexible Distribution Selection**: We have introduced a new function that allows users to choose from a variety of candidate distributions for modeling. You can now select the most suitable distribution for your data, enabling more accurate and customized predictions (see the sketch after this list).

- **Expectile Penalty**: We have enhanced the expectile functionality by introducing a penalty that discourages crossing of expectiles during training. This helps to improve the coherence of the expectile predictions, leading to more reliable models.
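A hypothetical sketch of what such a comparison can look like, fitting a few `scipy.stats` candidates by unconditional maximum likelihood and ranking them by in-sample negative log-likelihood (the candidate set and criterion are illustrative, not the LightGBMLSS function's API):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y = rng.gamma(shape=2.0, scale=1.5, size=1_000)  # hypothetical positive, skewed target

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "weibull_min": stats.weibull_min,
}

nll = {}
for name, dist in candidates.items():
    params = dist.fit(y, floc=0)                # maximum-likelihood fit, location fixed at 0
    nll[name] = -dist.logpdf(y, *params).sum()  # lower is better

for name, score in sorted(nll.items(), key=lambda kv: kv[1]):
    print(f"{name:12s} NLL = {score:.1f}")
```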

Stability Improvements
- **Parameter Initialization**: We have made stability improvements in the parameter initialization process. This ensures that the model starts from a more robust and reliable state, reducing the chances of convergence issues and enhancing the overall performance.

- **Model Estimation**: We have also improved the stability of the model estimation process. This results in more consistent and accurate estimation of model parameters, leading to better predictions and increased model reliability.

Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of LightGBMLSS.

General
We appreciate the valuable feedback and contributions from our users, which have helped us in making LightGBMLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements.

Thank you for your continued support, and we look forward to your feedback.

Happy modeling!

0.2.0

Enhanced Distributional Modeling with PyTorch

- LightGBMLSS now fully relies on PyTorch distributions for distributional modeling.
- The integration with PyTorch distributions provides a more comprehensive and flexible framework for probabilistic modeling and uncertainty estimation.
- Users can leverage the rich set of distributional families and associated functions offered by PyTorch, allowing for a wider range of modeling options.
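For example, several families from `torch.distributions` expose a common interface for sampling and density evaluation (parameter values below are illustrative):

```python
import torch
from torch.distributions import Gamma, Normal, StudentT

dists = {
    "Normal": Normal(loc=0.0, scale=1.0),
    "StudentT": StudentT(df=3.0, loc=0.0, scale=1.0),
    "Gamma": Gamma(concentration=2.0, rate=1.0),
}

y = torch.tensor(1.3)
for name, d in dists.items():
    # Each family exposes the same core methods, e.g. log_prob and sample.
    print(name, d.log_prob(y).item(), d.sample((3,)))
```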

Automatic Differentiation

- LightGBMLSS now fully leverages PyTorch's automatic differentiation capabilities.
- Automatic differentiation enables efficient and accurate computation of gradients and hessians, resulting in enhanced model performance and flexibility.
- Users can take advantage of automatic differentiation to easily incorporate custom loss functions into their LightGBMLSS workflows.
- This enhancement allows for faster experimentation and easier customization.
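A minimal sketch of that idea, computing the gradient and second derivative of a Gaussian negative log-likelihood with respect to a distributional parameter via `torch.autograd` (the parameterization below is illustrative, not LightGBMLSS internals):

```python
import torch
from torch.distributions import Normal

y = torch.randn(100) * 2.0 + 1.0                   # illustrative response values
loc = torch.tensor(0.0, requires_grad=True)        # location parameter
log_scale = torch.tensor(0.0, requires_grad=True)  # scale on the log scale for positivity

# Negative log-likelihood of the data under the current parameters.
nll = -Normal(loc, log_scale.exp()).log_prob(y).sum()

# First derivative w.r.t. loc; create_graph=True keeps the graph so we can
# differentiate once more to obtain the second derivative (Hessian entry).
(grad_loc,) = torch.autograd.grad(nll, loc, create_graph=True)
(hess_loc,) = torch.autograd.grad(grad_loc, loc)

print(nll.item(), grad_loc.item(), hess_loc.item())
```

These per-parameter gradients and Hessians are exactly the quantities a gradient-boosting objective consumes, which is why autograd removes the need for hand-derived formulas.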

Hyper-Parameter Optimization

- LightGBMLSS now enables the optimization of all LightGBM hyper-parameters for enhanced modeling flexibility and performance.
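For illustration, here is a generic Optuna loop tuning a few LightGBM hyper-parameters on synthetic data; it sketches the tuning pattern only and is not the built-in LightGBMLSS optimization interface:

```python
import lightgbm as lgb
import numpy as np
import optuna
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Illustrative synthetic regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + rng.normal(scale=0.5, size=500)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

def objective(trial):
    params = {
        "objective": "regression",
        "verbosity": -1,
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "num_leaves": trial.suggest_int("num_leaves", 8, 128),
        "min_data_in_leaf": trial.suggest_int("min_data_in_leaf", 5, 50),
        "feature_fraction": trial.suggest_float("feature_fraction", 0.5, 1.0),
    }
    booster = lgb.train(params, lgb.Dataset(X_tr, label=y_tr), num_boost_round=200)
    return mean_squared_error(y_va, booster.predict(X_va))

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)
print(study.best_params)
```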

What's Changed
- The syntax of LightGBMLSS has been updated in this release to provide better clarity and consistency.
- To familiarize yourself with the updated syntax, we kindly refer you to the [example sections](https://github.com/StatMixedML/LightGBMLSS/tree/master/examples). The examples will demonstrate the revised syntax and help you adapt your code accordingly.

Bug Fixes
- Several minor fixes and improvements have been implemented in this release.
