We are thrilled to release **LibAUC 1.3.0**! This version brings improvements and new features across the library. We have also launched a new documentation website at [https://docs.libauc.org/](https://docs.libauc.org/), where you can browse the source code and API documentation. We are happy to announce that our [LibAUC paper](https://arxiv.org/abs/2306.03065) has been accepted by **KDD 2023**!
## Major Improvements
- Improved the implementations of `DualSampler` and `TriSampler` for better efficiency.
- Merged the `DataSampler` used by `NDCGLoss` into `TriSampler` and added a new string argument `mode` to switch between **classification** mode (multi-label classification) and **ranking** mode (movie recommendation).
- Improved `AUCMLoss` and added a new version, **v2** (which requires **DualSampler**), that removes the class prior **p** needed by the previous **v1**. To switch versions, set `version='v1'` or `version='v2'` when constructing `AUCMLoss`; see the sketch after this list.
- Improved `CompositionalAUCLoss`, which now allows multiple updates for optimizing the inner loss via the `k` argument. As with `AUCMLoss`, we introduced a **v2** version of this loss that does not require the class prior **p**. By default, `k=1` and `version='v1'`.
- Improved the code quality of `APLoss` and `pAUCLoss` (including `pAUC_CVaR_Loss`, `pAUC_DRO_Loss`, and `tpAUC_KL_Loss`) for better efficiency and readability.
- API change for all optimizers: pass `model.parameters()` to the optimizer instead of `model`, e.g., `PESG(model.parameters())`, as shown in the sketch below.
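Below is a minimal sketch of the **v2** loss together with the new optimizer API. The toy model, random data, and the `loss_fn`/`lr` keyword names are illustrative assumptions rather than a canonical recipe; in a real pipeline the v2 loss would be paired with `DualSampler` batches.

```python
import torch
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

model = torch.nn.Linear(128, 1)  # toy scoring model (illustrative only)

# v2 removes the class prior p; pair it with DualSampler in a real data pipeline
loss_fn = AUCMLoss(version='v2')

# new 1.3.0 API: pass model.parameters() to the optimizer instead of model
optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1)

# one illustrative training step on random data
x = torch.randn(32, 128)
y = torch.randint(0, 2, (32, 1)).float()
loss = loss_fn(torch.sigmoid(model(x)), y)  # AUCM losses expect scores in [0, 1]
optimizer.zero_grad()
loss.backward()
optimizer.step()
```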
## New Features
- Launched an official documentation website at [https://docs.libauc.org/](https://docs.libauc.org/) for accessing source code and parameter information.
- Introduced a new library [logo](https://docs.libauc.org/) for X-Risk, designed by [Zhuoning Yuan](https://zhuoning.cc) and [Tianbao Yang](http://people.tamu.edu/~tianbao-yang/index.html).
- Introduced [**MIDAM**](https://arxiv.org/abs/2305.08040) for multi-instance learning. It supports two pooling functions: `MIDAMLoss('softmax')` for softmax pooling and `MIDAMLoss('attention')` for attention-based pooling (see the example after this list).
- Introduced a new `GCLoss` wrapper for contrastive self-supervised learning, which can be optimized by two algorithms in the backend: [**SogCLR**](https://arxiv.org/abs/2202.12387) and [**iSogCLR**](https://arxiv.org/abs/2305.11965).
- Introduced [**iSogCLR**](https://arxiv.org/abs/2305.11965) for automatic temperature individualization in self-supervised contrastive learning. To use **iSogCLR**, set `enable_isogclr=True`, e.g., `GCLoss('unimodal', enable_isogclr=True)` or `GCLoss('bimodal', enable_isogclr=True)`; see the sketch after this list.
- Introduced three new multi-label losses: `mAPLoss` for optimizing mean AP, `MultiLabelAUCMLoss` for optimizing multi-label AUC loss, and `MultiLabelpAUCLoss` for multi-label partial AUC loss.
- Introduced `PairwiseAUCLoss` to support optimization of traditional pairwise AUC losses.
- Added more evaluation metrics: `ndcg_at_k`, `map_at_k`, `precision_at_k`, and `recall_at_k`.
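For reference, the pooling function for **MIDAM** is chosen at construction time. The constructor calls below are taken directly from the notes above; any further arguments (e.g., dataset size or margin) are omitted here and would follow the documentation:

```python
from libauc.losses import MIDAMLoss

# smoothed-max (softmax) pooling over instance scores within a bag
loss_softmax = MIDAMLoss('softmax')

# attention-based pooling over instance scores within a bag
loss_attention = MIDAMLoss('attention')
```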
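Likewise, a short sketch of the `GCLoss` wrapper under its two backends, assuming it is exported from `libauc.losses` as in the 1.3.0 documentation; `'unimodal'` targets image-image contrastive learning and `'bimodal'` targets image-text:

```python
from libauc.losses import GCLoss

# SogCLR backend: global contrastive loss with a fixed temperature
loss_sogclr = GCLoss('unimodal')

# iSogCLR backend: individualized, automatically tuned temperatures
loss_isogclr_uni = GCLoss('unimodal', enable_isogclr=True)  # image-image SSL
loss_isogclr_bi = GCLoss('bimodal', enable_isogclr=True)    # image-text SSL
```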
## Acknowledgment
Team: Zhuoning Yuan, Dixian Zhu, Zi-Hao Qiu, Gang Li, Tianbao Yang (Advisor)
## Feedback
We value your thoughts and feedback! Please fill out [this brief survey](https://forms.gle/oWNtjN9kLT51CMdf9) to guide our future development. Thank you for your time! For other questions, please contact [Zhuoning Yuan](https://zhuoning.cc/) ([yzhuoning@gmail.com](mailto:yzhuoning@gmail.com)) or [Tianbao Yang](http://people.tamu.edu/~tianbao-yang/) ([tianbao-yang@tamu.edu](mailto:tianbao-yang@tamu.edu)).