Fast-Transformer

Latest version: v0.2.0

0.2.0

✅ Bug Fixes / Improvements

- Added unit tests for output rank and shape
- Loosened dependency requirements (now supports all TensorFlow versions >= 2.5.0)

0.1.0

This is the initial release of Fast Transformer. It implements the Fast Transformer architecture as a subclassed TensorFlow model.

Classes

- [FastAttention](https://github.com/Rishit-dagli/Fast-Transformer/blob/d47d4e74e1c84907d4136ef07f7c57c441eaf603/fast_transformer/fast_attention.py#L6): Implements additive attention as a TensorFlow Keras layer, with optional support for relative positional encodings.
- [PreNorm](https://github.com/Rishit-dagli/Fast-Transformer/blob/d47d4e74e1c84907d4136ef07f7c57c441eaf603/fast_transformer/fast_transformer.py#L8): Normalizes the activations of the previous layer independently for each example in the batch, then applies a given function to the result. Implemented as a TensorFlow Keras layer.
- [FeedForward](https://github.com/Rishit-dagli/Fast-Transformer/blob/d47d4e74e1c84907d4136ef07f7c57c441eaf603/fast_transformer/fast_transformer.py#L19): Implements a feed-forward network built from two `Dense` layers with GELU activation, as a TensorFlow Keras layer.
- [FastTransformer](https://github.com/Rishit-dagli/Fast-Transformer/blob/d47d4e74e1c84907d4136ef07f7c57c441eaf603/fast_transformer/fast_transformer.py#L37): Composes the classes above into the full FastTransformer model; supports rotary embeddings, optional weight-tied projections, and projects the output to logits. Implemented as a TensorFlow Keras model.
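To make the additive-attention idea behind `FastAttention` concrete, here is a minimal NumPy sketch of the mechanism from the Fastformer paper ("Fastformer: Additive Attention Can Be All You Need"). This is an illustrative simplification, not this library's implementation: the learned vectors `w_q` and `w_k`, the element-wise interactions, and the final residual follow the paper's description, and the output projection is omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(q, k, v, w_q, w_k):
    """Fastformer-style additive attention for a single head.

    q, k, v: (seq_len, dim) query/key/value matrices.
    w_q, w_k: (dim,) learned attention vectors (assumed names).
    Runs in O(seq_len * dim) rather than O(seq_len^2 * dim).
    """
    d = q.shape[-1]
    # Summarize all queries into one global query vector.
    alpha = softmax(q @ w_q / np.sqrt(d))          # (seq_len,)
    global_q = (alpha[:, None] * q).sum(axis=0)    # (dim,)
    # Mix the global query into each key element-wise.
    p = global_q * k                               # (seq_len, dim)
    # Summarize the mixed keys into one global key vector.
    beta = softmax(p @ w_k / np.sqrt(d))           # (seq_len,)
    global_k = (beta[:, None] * p).sum(axis=0)     # (dim,)
    # Mix the global key into each value, then add the query residual.
    # (The paper also applies a learned output transform here, omitted.)
    return global_k * v + q                        # (seq_len, dim)
```

Because each token interacts only with the two pooled global vectors, the cost is linear in sequence length, which is the source of the "Fast" in Fast Transformer.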
