Fixed
- Fix self-attention optimization not being triggered and a broken residual path [facebookresearch/xformers#119]
- Improve speed by not using contiguous tensors when not needed [facebookresearch/xformers#119]
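
The contiguity change above reflects a general PyTorch pattern: most operations accept strided (non-contiguous) views, so calling `.contiguous()`, which copies the whole tensor, is only worthwhile before operations that require a dense layout, such as `view`. A minimal sketch of the idea in plain PyTorch (illustrative only, not the xformers code):

```python
import torch

x = torch.randn(4, 8, 16)

# transpose returns a strided view of x; no data is copied
y = x.transpose(1, 2)
assert not y.is_contiguous()

# pointwise ops (and most others) accept non-contiguous input directly,
# so no .contiguous() call, and no copy, is needed here
out = y * 2.0

# .view does require a contiguous layout; only pay for the copy when needed
flat = y.contiguous().view(4, -1)
assert flat.shape == (4, 128)
```

Skipping the unneeded copies saves both memory bandwidth and allocator traffic, which is where the speedup in the fix comes from.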
Added
- Attention mask wrapper [facebookresearch/xformers#113]
- ViT comparison benchmark [facebookresearch/xformers#117]