LibAUC

Latest version: v1.3.3

1.3.0

---

We are thrilled to release **LibAUC 1.3.0**! In this version, we have made improvements and added new features to our library. We have released a new documentation website at [https://docs.libauc.org/](https://docs.libauc.org/), where you can browse our code and documentation. We are also happy to announce that our [LibAUC paper](https://arxiv.org/abs/2306.03065) has been accepted by **KDD 2023**!

**Major Improvements**

- Improved the implementations for `DualSampler` and `TriSampler` for better efficiency.
- Merged `DataSampler` for `NDCGLoss` with `TriSampler` and added a new string argument `mode` to switch between **classification** mode for multi-label classification and **ranking** mode for movie recommendations.
- Improved `AUCMLoss` and included a new version **v2** (which requires `DualSampler`) that removes the class prior **p** needed by the previous version **v1**. To switch between versions, set `version='v1'` or `version='v2'` in `AUCMLoss`.
- Improved `CompositionalAUCLoss`, which now allows multiple updates for optimizing the inner loss by setting `k` in the loss. Similar to `AUCMLoss`, we introduced a **v2** version of this loss that does not use the class prior `p`. By default, `k` is **1** and the version is **v1**.
- Improved code quality for `APLoss` and `pAUCLoss`, including `pAUC_CVaR_Loss`, `pAUC_DRO_Loss`, and `tpAUC_KL_Loss`, for better efficiency and readability.
- API change for all optimizers: pass `model.parameters()` to the optimizer instead of `model`, e.g., `PESG(model.parameters())` (see the sketch after this list).
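
A minimal, hypothetical sketch of the two API points above: the `version='v2'` argument and the `model.parameters()` call come from these notes, while the stand-in model, the `loss_fn` keyword, and the remaining hyperparameters are illustrative assumptions rather than the confirmed signature.

```python
import torch
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

model = torch.nn.Linear(10, 1)        # stand-in model for illustration

# v2 removes the class prior p and is meant to be paired with DualSampler;
# v1 keeps the prior-based formulation.
loss_fn = AUCMLoss(version='v2')

# 1.3.0 API change: pass model.parameters() instead of the model itself.
# The loss_fn keyword and hyperparameter values are assumptions.
optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1, margin=1.0)
```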

**New Features**

- Launched an official documentation website at [https://docs.libauc.org/](https://docs.libauc.org/) with access to source code and parameter information.
- Introduced a new library [logo](https://docs.libauc.org/) for X-Risk, designed by [Zhuoning Yuan](https://zhuoning.cc) and [Tianbao Yang](http://people.tamu.edu/~tianbao-yang/index.html).
- Introduced [**MIDAM**](https://arxiv.org/abs/2305.08040) for multi-instance learning. It supports two pooling functions, `MIDAMLoss('softmax')` for using softmax pooling and `MIDAMLoss('attention')` for attention-based pooling.
- Introduced a new `GCLoss` wrapper for contrastive self-supervised learning, which can be optimized by two algorithms in the backend: [**SogCLR**](https://arxiv.org/abs/2202.12387) and [**iSogCLR**](https://arxiv.org/abs/2305.11965).
- Introduced [**iSogCLR**](https://arxiv.org/abs/2305.11965) for automatic temperature individualization in self-supervised contrastive learning. To use `iSogCLR`, set `GCLoss('unimodal', enable_isogclr=True)` or `GCLoss('bimodal', enable_isogclr=True)` (see the sketch after this list).
- Introduced three new multi-label losses: `mAPLoss` for optimizing mean AP, `MultiLabelAUCMLoss` for optimizing multi-label AUC loss, and `MultiLabelpAUCLoss` for multi-label partial AUC loss.
- Introduced `PairwiseAUCLoss` to support optimization of traditional pairwise AUC losses.
- Added more evaluation metrics: `ndcg_at_k`, `map_at_k`, `precision_at_k`, and `recall_at_k`.
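
As a rough illustration of the calls named above (the constructor arguments are taken verbatim from these notes; the `libauc.losses` import path is an assumption):

```python
from libauc.losses import GCLoss, MIDAMLoss   # import path is an assumption

# Contrastive wrapper: 'unimodal' or 'bimodal' mode, optimized by SogCLR/iSogCLR
# in the backend; enable_isogclr=True turns on automatic temperature individualization.
contrastive_loss = GCLoss('bimodal', enable_isogclr=True)

# Multi-instance learning with MIDAM: softmax pooling or attention-based pooling.
mil_loss = MIDAMLoss('softmax')    # or MIDAMLoss('attention')
```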

**Acknowledgment**
Team: Zhuoning Yuan, Dixian Zhu, Zi-Hao Qiu, Gang Li, Tianbao Yang (Advisor)

**Feedback**
We value your thoughts and feedback! Please fill out [this brief survey](https://forms.gle/oWNtjN9kLT51CMdf9) to guide our future developments. Thank you for your time! For other questions, please contact [Zhuoning Yuan](https://zhuoning.cc/) [[yzhuoning@gmail.com](mailto:yzhuoning@gmail.com)] or [Tianbao Yang](http://people.tamu.edu/~tianbao-yang/) [[tianbao-yang@tamu.edu](mailto:tianbao-yang@tamu.edu)].

1.2.0

What's New
---
We continuously update our library by making improvements and adding new features. If you use or like our library, please star ⭐ this repo. Thank you!

**Major Improvements**
- In this version, `AUCMLoss` automatically computes `imratio`, so users no longer need to provide it.
- Renamed `gamma` to `epoch_decay` for the `PESG` and `PDSCA` optimizers, where `epoch_decay` = `1/gamma` (see the sketch after this list).
- Reimplemented `ImbalancedDataGenerator` for constructing imbalanced datasets for benchmarking. A tutorial is available [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/01_Creating_Imbalanced_Benchmark_Datasets.ipynb).
- Improved the implementation of `APLoss` by removing some redundant computations.
- Merged the `SOAP_ADAM` and `SOAP_SGD` optimizers into a single `SOAP` optimizer. A tutorial is provided [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/03_Optimizing_AUPRC_Loss_on_Imbalanced_dataset.ipynb).
- Removed the dependency on `TensorFlow`; LibAUC now only requires `PyTorch`.
- Updated existing tutorials to match the new version of LibAUC. Tutorials are available [here](https://github.com/Optimization-AI/LibAUC/tree/main/examples).
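
A hypothetical sketch of the renamed decay argument; apart from `epoch_decay` itself, the keyword names below are assumptions modeled on typical LibAUC tutorials, not the confirmed 1.2.0 signature.

```python
import torch
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

model = torch.nn.Linear(10, 1)   # stand-in model for illustration
loss_fn = AUCMLoss()             # imratio is now computed automatically

# epoch_decay replaces gamma, with epoch_decay = 1/gamma;
# e.g., a former gamma of 500 corresponds to epoch_decay = 1/500 = 0.002.
optimizer = PESG(model, loss_fn=loss_fn, lr=0.1, margin=1.0,
                 epoch_decay=0.002, weight_decay=1e-4)
```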

**New Features**
- Introduced `DualSampler` and `TriSampler` for sampling data that best fits X-risk optimization, balancing inner and outer estimation errors.
- Introduced `CompositionalAUCLoss` and the `PDSCA` optimizer. A tutorial is provided [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/09_Optimizing_CompositionalAUC_Loss_with_ResNet20_on_CIFAR10.ipynb).
- Introduced `SogCLR` with `Dynamic Contrastive Loss` for training **Self-Supervised Learning** models using **small batch size**. Tutorial and code are provided [here](https://github.com/Optimization-AI/SogCLR).
- Introduced `NDCG_Loss` and `SONG` optimizer for optimizing **NDCG**. Tutorials are provided [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/10_Optimizing_NDCG_Loss_on_MovieLens20M.ipynb).
- Introduced `pAUCLoss` with three optimizers: `SOPA`, `SOPAs`, `SOTAs` for optimizing **Partial AUROC**. Tutorials are provided [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/11_Optimizing_pAUC_Loss_on_Imbalanced_data_wrapper.ipynb).
- Added three evaluation functions: `auc_roc_score` (binary/multi-task), `auc_prc_score` (binary/multi-task), and `pauc_roc_score` (binary); a short usage sketch follows this list.
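
A small, hedged example of the new metric functions (the `libauc.metrics` import path and the `max_fpr` keyword are assumptions):

```python
import numpy as np
from libauc.metrics import auc_roc_score, auc_prc_score, pauc_roc_score  # import path is an assumption

y_true = np.array([0, 0, 1, 1, 0, 1])
y_pred = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])

print(auc_roc_score(y_true, y_pred))   # AUROC (binary here; multi-task inputs also supported)
print(auc_prc_score(y_true, y_pred))   # AUPRC
print(pauc_roc_score(y_true, y_pred, max_fpr=0.3))   # partial AUROC; max_fpr kwarg is an assumption
```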

**Feedback**
- If you have any feedback or suggestions, please contact [Zhuoning Yuan](https://zhuoning.cc/) [[yzhuoning@gmail.com](mailto:yzhuoning@gmail.com)] or [Tianbao Yang](https://homepage.cs.uiowa.edu/~tyng/) [[tianbao-yang@uiowa.edu](mailto:tianbao-yang@uiowa.edu)].

1.1.8

What's New
---
- Fixed some bugs and improved training stability.

1.1.6

What's New
---
- Added support for multi-label training. A tutorial for training on CheXpert is available [here](https://github.com/Optimization-AI/LibAUC/blob/main/examples/scripts/07_optimizing_multi_label_auroc_loss_with_densenet121_on_chexpert.py)!
- Fixed some bugs and improved training stability.
