Invariant-attention

Latest version: v0.1.0


0.1.0

This is the initial release of Invariant Attention, which implements Invariant Point Attention as subclassed TensorFlow models.

Classes

- `InvariantPointAttention`: Invariant Point Attention, as used for coordinate refinement in the structure module of AlphaFold2 (from the paper "Highly accurate protein structure prediction with AlphaFold"). It is a form of attention that acts on a set of frames and is invariant under global Euclidean transformations of those frames.
- `IPABlock`: Invariant Point Attention block, consisting of an IPA layer followed by a feed-forward network, with normalization layers.
- `IPATransformer`: IPA-based transformer built as a stack of `IPABlock` and feed-forward layers (see the usage sketch after this list).
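
For orientation, here is a minimal usage sketch of `InvariantPointAttention`. The constructor arguments (`dim`, `heads`, scalar/point key and value dimensions), the import path, and the call signature follow the PyTorch reference implementation this package ports; treat them as illustrative assumptions rather than the confirmed API of this release.

```python
import tensorflow as tf

from invariant_attention import InvariantPointAttention  # import path is an assumption

# Hypothetical hyperparameters, mirroring the reference implementation's defaults.
attn = InvariantPointAttention(
    dim=64,              # single (and pairwise) representation dimension
    heads=8,             # number of attention heads
    scalar_key_dim=16,   # scalar query/key dimension per head
    scalar_value_dim=16,
    point_key_dim=4,     # point query/key dimension per head
    point_value_dim=4,
)

batch, seq = 1, 256
single_repr = tf.random.normal((batch, seq, 64))         # per-residue features
pairwise_repr = tf.random.normal((batch, seq, seq, 64))  # pair features
mask = tf.ones((batch, seq), dtype=tf.bool)

# One rigid frame (rotation + translation) per residue; identity frames here.
rotations = tf.tile(tf.eye(3)[None, None], (batch, seq, 1, 1))  # (batch, seq, 3, 3)
translations = tf.zeros((batch, seq, 3))

# The output is invariant to any global rotation/translation applied to the frames.
out = attn(
    single_repr,
    pairwise_repr,
    rotations=rotations,
    translations=translations,
    mask=mask,
)  # (batch, seq, 64)
```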

Utility Methods

- `quaternion_raw_multiply`: Multiply two quaternions.
- `standardize_quaternion`: Convert a unit quaternion to standard form, i.e. one whose real part is non-negative.
- `quaternion_multiply`: Multiply two quaternions representing rotations and return the quaternion representing their composition, i.e. the versor with a non-negative real part.
- `quaternion_to_matrix`: Convert rotations given as quaternions to rotation matrices.
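
These helpers follow the common real-first (w, x, y, z) quaternion convention. Below is a minimal, self-contained sketch of how such utilities are typically defined in TensorFlow; it illustrates the math rather than reproducing the package's exact source.

```python
import math

import tensorflow as tf


def quaternion_raw_multiply(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = tf.unstack(a, axis=-1)
    bw, bx, by, bz = tf.unstack(b, axis=-1)
    return tf.stack([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ], axis=-1)


def standardize_quaternion(q):
    """Flip the sign so the real (w) part is non-negative."""
    return tf.where(q[..., :1] < 0, -q, q)


def quaternion_multiply(a, b):
    """Compose two rotations and return the standardized product quaternion."""
    return standardize_quaternion(quaternion_raw_multiply(a, b))


def quaternion_to_matrix(q):
    """Convert (possibly unnormalized) quaternions to 3x3 rotation matrices."""
    w, x, y, z = tf.unstack(q, axis=-1)
    two_s = 2.0 / tf.reduce_sum(q * q, axis=-1)
    m = tf.stack([
        1 - two_s * (y * y + z * z), two_s * (x * y - z * w), two_s * (x * z + y * w),
        two_s * (x * y + z * w), 1 - two_s * (x * x + z * z), two_s * (y * z - x * w),
        two_s * (x * z - y * w), two_s * (y * z + x * w), 1 - two_s * (x * x + y * y),
    ], axis=-1)
    return tf.reshape(m, tf.concat([tf.shape(q)[:-1], [3, 3]], axis=0))


# Example: composing two 90-degree rotations about the z-axis gives a 180-degree rotation.
s = math.sqrt(0.5)
q_z90 = tf.constant([[s, 0.0, 0.0, s]], dtype=tf.float32)
q_z180 = quaternion_multiply(q_z90, q_z90)
print(quaternion_to_matrix(q_z180))  # approximately diag(-1, -1, 1)
```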
