PyMAB

Latest version: v0.1.0 (alpha)




PyMAB (Python Multi-Armed Bandit) is an experimental framework for comparing multiple Multi-Armed Bandit algorithms and configurations. This alpha release provides researchers and practitioners with a flexible foundation for bandit-based experimentation and analysis. New algorithms and improvements coming soon.

Algorithms:
- Greedy Policy
- Epsilon-Greedy Policy
- UCB (Upper Confidence Bound)
- Bayesian UCB with Gaussian/Bernoulli distributions
- Thompson Sampling (Gaussian/Bernoulli variants)
- Contextual Bandits
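
To make the algorithm list concrete, here is a minimal, self-contained sketch of the epsilon-greedy policy listed above. This is illustrative pseudocode-made-runnable, not PyMAB's actual API: the function name, parameters, and Gaussian reward model are assumptions for the example.

```python
import random

def epsilon_greedy(true_means, epsilon=0.1, steps=1000, seed=0):
    """Illustrative epsilon-greedy bandit loop (not PyMAB's API).

    With probability epsilon we explore a random arm; otherwise we
    exploit the arm with the highest estimated mean reward.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms       # pulls per arm
    estimates = [0.0] * n_arms  # running mean reward per arm
    total_reward = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        # Assumed reward model: Gaussian with unit variance around the true mean.
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        # Incremental mean update avoids storing the full reward history.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward
    return estimates, total_reward

est, total = epsilon_greedy([0.2, 0.5, 0.9])
```

The other listed policies (UCB, Thompson Sampling) differ only in how the next arm is selected; the pull-observe-update loop stays the same.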

Environment Types:
- Stationary
- Non-Stationary:
  - Gradual distribution changes
  - Abrupt distribution changes
  - Random arm swapping
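
A non-stationary environment with abrupt changes can be sketched as follows. The class name, constructor parameters, and change schedule are hypothetical, chosen only to illustrate the "abrupt distribution changes / random arm swapping" idea; they do not reflect PyMAB's environment interface.

```python
import random

class AbruptChangeBandit:
    """Illustrative non-stationary bandit (not PyMAB's API).

    Arm means are randomly reshuffled every `change_every` pulls,
    modeling an abrupt distribution change via arm swapping.
    """
    def __init__(self, means, change_every=500, seed=0):
        self.rng = random.Random(seed)
        self.means = list(means)
        self.change_every = change_every
        self.t = 0

    def pull(self, arm):
        self.t += 1
        if self.t % self.change_every == 0:
            self.rng.shuffle(self.means)  # abrupt change: swap arms at random
        # Assumed reward model: Gaussian with unit variance.
        return self.rng.gauss(self.means[arm], 1.0)

env = AbruptChangeBandit([0.1, 0.9], change_every=3)
rewards = [env.pull(0) for _ in range(6)]
```

Because the best arm can change identity mid-run, stationary policies that have converged will keep exploiting a now-suboptimal arm; this is the scenario the non-stationary environment types are meant to stress-test.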

Known limitations:
- Some policies (Softmax, Gradient) pending implementation
- API may undergo changes based on feedback
- Lack of parallelisation and other optimisations

