Xformers

Latest version: v0.0.28.post1

0.0.10

Fixed
- Expose bias flag for feedforwards, same default as Timm [facebookresearch/xformers#220]
- Update eps value for layernorm, same default as torch [facebookresearch/xformers#221]
- PreNorm bugfix, only one input was normalized (see the sketch after this list) [facebookresearch/xformers#233]
- Fix bug where embedding dimensions that did not match the model dim would lead to a crash [facebookresearch/xformers#244]
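
A minimal sketch (plain PyTorch, not the xformers code) of the pre-norm issue referenced above: when a wrapped sub-layer takes several inputs, e.g. query/key/value for cross-attention, every input needs to be normalized, not just the first one.

```python
import torch
import torch.nn as nn

class PreNorm(nn.Module):
    """Pre-normalization wrapper around an arbitrary sub-layer."""

    def __init__(self, dim: int, sublayer: nn.Module):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.sublayer = sublayer

    def forward(self, *inputs: torch.Tensor) -> torch.Tensor:
        # Buggy variant: normalized = (self.norm(inputs[0]), *inputs[1:])
        normalized = tuple(self.norm(x) for x in inputs)  # normalize every input
        return self.sublayer(*normalized)
```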

Added
- Add DeepNet (DeepNorm) residual path and init [facebookresearch/xformers#227]
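
The DeepNorm residual path referenced above follows the DeepNet formula x = LayerNorm(alpha * x + sublayer(x)). The sketch below is a plain-PyTorch illustration of that formula, not the xformers implementation, and assumes the encoder-only scaling alpha = (2N)^0.25 from the DeepNet paper.

```python
import torch
import torch.nn as nn

class DeepNormResidual(nn.Module):
    def __init__(self, dim: int, sublayer: nn.Module, num_layers: int):
        super().__init__()
        # Encoder-only scaling from the DeepNet paper: alpha = (2N) ** 0.25
        self.alpha = (2 * num_layers) ** 0.25
        self.norm = nn.LayerNorm(dim)
        self.sublayer = sublayer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Post-norm residual with an up-weighted skip connection
        return self.norm(self.alpha * x + self.sublayer(x))
```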

0.0.9

Added
- Compositional Attention [facebookresearch/xformers#41]
- Experimental ragged attention [facebookresearch/xformers#189]
- Mixture of Experts (see the sketch after this list) [facebookresearch/xformers#181]
- BlockSparseTensor [facebookresearch/xformers#202]
- Nd-tensor support for Triton softmax [facebookresearch/xformers#210]
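
As a rough illustration of the Mixture of Experts entry above, here is a minimal top-1 routing layer in plain PyTorch; the actual xformers MoE block and its configuration are not shown here and may differ substantially.

```python
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int = 4):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Route every token to its highest-scoring expert.
        scores = self.gate(x).softmax(dim=-1)   # (tokens, num_experts)
        weight, idx = scores.max(dim=-1)        # top-1 gate weight and expert index per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():
                out[mask] = weight[mask, None] * expert(x[mask])
        return out
```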

Fixed
- Bugfix for Favor with a single feature map [facebookresearch/xformers#183]
- Sanity check blocksparse settings [facebookresearch/xformers#207]
- Fixed some picklability issues [facebookresearch/xformers#204]

0.0.8

Fixed
- Much faster fused dropout [facebookresearch/xformers#164]
- Fused dropout repeatability [facebookresearch/xformers#173]

Added
- Embedding weight tying option [facebookresearch/xformers#172]
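
Weight tying itself is a one-liner in plain PyTorch, sketched below; the exact xformers factory option that enables it is not reproduced here.

```python
import torch.nn as nn

vocab, dim = 32000, 512
embedding = nn.Embedding(vocab, dim)
lm_head = nn.Linear(dim, vocab, bias=False)
lm_head.weight = embedding.weight  # tie: both modules now share a single parameter
```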

0.0.7

Fixed
- Dropout setting not properly passed in many attention mechanisms [facebookresearch/xformers#123]

0.0.6

Fixed
- Fix self-attention optimization not being triggered and a broken residual path [facebookresearch/xformers#119]
- Improve speed by not using contiguous tensors when not needed [facebookresearch/xformers#119]

Added
- Attention mask wrapper [facebookresearch/xformers#113]
- ViT comparison benchmark [facebookresearch/xformers#117]

0.0.4

Fixed
- Homogenizing the masks (additive or bool); see the sketch after this list [facebookresearch/xformers#79][facebookresearch/xformers#85][facebookresearch/xformers#86]
- Fix causality flag not being respected [facebookresearch/xformers#103]
- Enabling FusedLayerNorm by default in the factory if Triton is available
- Fixing Favor with fp16
- Fixing Favor trainability
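
A short plain-PyTorch sketch of what "additive or bool" means for attention masks: a boolean keep-mask can be turned into an additive float mask that is simply added to the attention logits before the softmax. This only illustrates the idea; the xformers mask-handling code is not shown here.

```python
import torch

bool_mask = torch.tensor([[True, True, False]])          # True = keep, False = drop
additive_mask = torch.zeros_like(bool_mask, dtype=torch.float)
additive_mask.masked_fill_(~bool_mask, float("-inf"))    # dropped positions -> -inf

logits = torch.randn(1, 3)
attn = torch.softmax(logits + additive_mask, dim=-1)     # masked column gets ~0 weight
```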

Added
- Fused dropout/bias/activation layer (see the sketch after this list) [facebookresearch/xformers#58]
- Fused layernorm used by default in the factory [facebookresearch/xformers#92]
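
The fused layer above collapses a bias add, an activation, and dropout into a single kernel; the unfused plain-PyTorch equivalent below is only meant to show the computation, with GELU assumed as the activation and no claim about the actual xformers API.

```python
import torch
import torch.nn.functional as F

def dropout_bias_gelu(x: torch.Tensor, bias: torch.Tensor, p: float = 0.1) -> torch.Tensor:
    # Add bias, apply the activation, then dropout: three memory-bound ops
    # that a fused kernel can run in one pass over the data.
    return F.dropout(F.gelu(x + bias), p=p, training=True)
```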
