xFormers

Latest version: v0.0.26.post1

0.0.7

Fixed
- Dropout setting not properly passed to many of the attention mechanisms [facebookresearch/xformers#123]

0.0.6

Fixed
- Fixed the self-attention optimization not being triggered and a broken residual path [facebookresearch/xformers#119]
- Improved speed by not forcing tensors to be contiguous when not needed [facebookresearch/xformers#119]

Added
- Attention mask wrapper (a short conceptual sketch follows below) [facebookresearch/xformers#113]
- ViT comparison benchmark [facebookresearch/xformers#117]
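
The mask wrapper unifies how boolean and additive masks are handed to the attention mechanisms. As a rough illustration of the underlying idea only (plain PyTorch, not the xformers API), a boolean "keep" mask can be converted into an additive one that is simply summed onto the attention logits:

```python
import torch

def bool_to_additive(keep: torch.Tensor, dtype=torch.float32) -> torch.Tensor:
    """Convert a boolean mask (True = attend) into an additive mask
    (0 where attention is allowed, -inf where it is blocked)."""
    additive = torch.zeros(keep.shape, dtype=dtype)
    return additive.masked_fill(~keep, float("-inf"))

keep = torch.tensor([[True, True, False]])
print(bool_to_additive(keep))  # tensor([[0., 0., -inf]])
```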

0.0.4

Fixed
- Homogenized the masks, additive or boolean (see the sketch after this list) [facebookresearch/xformers#79][facebookresearch/xformers#85][facebookresearch/xformers#86]
- Fixed the causality flag not being respected [facebookresearch/xformers#103]
- Enabled FusedLayerNorm by default in the factory if Triton is available
- Fixed FAVOR with fp16
- Fixed FAVOR trainability
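
The two mask-related fixes above concern how an additive user mask and the causal flag are combined into the attention logits. A minimal sketch of that logic in plain PyTorch, assuming an additive mask convention (this is not the xformers implementation):

```python
import torch

def attention_logits(q, k, attn_mask=None, causal=False):
    """Scaled dot-product logits with an optional additive mask and an
    optional lower-triangular causal mask applied before the softmax."""
    logits = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    if attn_mask is not None:  # additive mask: 0 keeps, -inf blocks
        logits = logits + attn_mask
    if causal:  # each query may only attend to keys at or before its position
        n = logits.shape[-1]
        logits = logits + torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
    return logits

q, k = torch.randn(1, 4, 8), torch.randn(1, 4, 8)
print(attention_logits(q, k, causal=True).shape)  # torch.Size([1, 4, 4])
```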

Added
- Fused dropout/bias/activation layer (an unfused reference sketch follows after this list) [facebookresearch/xformers#58]
- Fused layernorm used by default in the factory [facebookresearch/xformers#92]
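
The fused layer collapses a bias add, an activation, and dropout into a single Triton kernel. In terms of what it computes, an unfused PyTorch reference is shown below (a sketch of the math only, not the fused kernel itself):

```python
import torch
import torch.nn.functional as F

def bias_activation_dropout(x, bias, p=0.1, activation=F.gelu, training=True):
    """Unfused reference: add the bias, apply the activation, then dropout.
    A fused kernel computes the same result in a single pass over memory."""
    return F.dropout(activation(x + bias), p=p, training=training)

x, bias = torch.randn(2, 16), torch.zeros(16)
print(bias_activation_dropout(x, bias).shape)  # torch.Size([2, 16])
```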

0.0.3

Fixed
- Nystrom causal attention [facebookresearch/xformers#75]

0.0.2

Fixed
- More robust blocksparse [facebookresearch/xformers#24]

Added
- Rotary embeddings (see the sketch after this list) [facebookresearch/xformers#32]
- More flexible layernorm [facebookresearch/xformers#50]
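
Rotary embeddings encode positions by rotating pairs of query/key channels through a position-dependent angle, so relative offsets show up directly in the attention dot products. A compact sketch of the idea using the split-half convention (plain PyTorch, not the xformers implementation):

```python
import torch

def apply_rotary(x: torch.Tensor) -> torch.Tensor:
    """Rotary position embedding for x of shape (seq_len, dim), dim even:
    each (x1, x2) channel pair is rotated by an angle that grows with the
    token position, one frequency per pair (as in the RoFormer paper)."""
    seq_len, dim = x.shape
    half = dim // 2
    freqs = 1.0 / (10000 ** (torch.arange(half, dtype=torch.float32) / half))
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[:, :half], x[:, half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)

q = torch.randn(10, 64)
print(apply_rotary(q).shape)  # torch.Size([10, 64])
```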
