Flash-attention-softmax-n

Latest version: v0.3.2


0.1.2

The attention bias in MosaicBERT has `attn_bias.ndim == 4`, so I generalized `flash_attention_n` to accommodate this.
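To illustrate what a 4-D bias means shape-wise, here is a minimal plain (non-flash) attention sketch in NumPy. The function name `attention_with_bias` is hypothetical and is not the package's `flash_attention_n` kernel; it only shows how a `(batch, heads, seq_q, seq_k)` bias adds onto the score matrix, while lower-rank biases broadcast:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_bias(q, k, v, attn_bias=None):
    # q, k, v: (batch, heads, seq, head_dim)
    scale = q.shape[-1] ** -0.5
    scores = q @ np.swapaxes(k, -2, -1) * scale  # (batch, heads, seq_q, seq_k)
    if attn_bias is not None:
        # a 4-D bias matches the score shape and adds elementwise;
        # a 2-D or 3-D bias would broadcast over the leading dims
        scores = scores + attn_bias
    return softmax(scores) @ v

q = k = v = np.random.randn(2, 4, 8, 16)
bias = np.zeros((2, 4, 8, 8))  # ndim == 4, as in the MosaicBERT case
out = attention_with_bias(q, k, v, bias)
```

With a zero bias the output matches unbiased attention, which makes the broadcasting behavior easy to sanity-check.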

0.1.1

Corrected the default argument of `dim`.

0.1.0rc6

0.1.0rc5

use token instead of password
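This entry presumably refers to switching package uploads to PyPI's API-token authentication (username/password uploads have been deprecated). A minimal `~/.pypirc` sketch, with a placeholder token; the actual change may equally have been in CI secrets rather than this file:

```ini
# ~/.pypirc -- hypothetical example
[pypi]
username = __token__
password = pypi-<your-api-token>
```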

0.1.0rc4

install triton-nightly separately

0.1.0rc3

- Use a valid email address in setup.py
- Exclude all test directories

