mlx-transformers

Latest version: v0.1.4

0.1.3

What's Changed
* Add Phi 2.0 and 3.0 by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/8
* Add llama by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/3
* Add Bert Classification Model by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/2
* Implemented `RobertaForSequenceClassification` by Seun-Ajayi in https://github.com/ToluClassics/mlx-transformers/pull/1
* Implemented Bert sub-tasks by Seun-Ajayi in https://github.com/ToluClassics/mlx-transformers/pull/4
* Add a mixin for loading models directly from Hugging Face by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/6 (see the sketch after this list)
* Code Refactor and NLLB Example by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/7
* Implemented `XLMRoberta` sub-tasks by Seun-Ajayi in https://github.com/ToluClassics/mlx-transformers/pull/5
* Version 0.1.3 by ToluClassics in https://github.com/ToluClassics/mlx-transformers/pull/9
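
As a rough illustration of the items above, the sketch below shows how the `RobertaForSequenceClassification` port and the Hugging Face loading mixin might be used together. The `mlx_transformers.models` import path, the `load_model` helper, the output layout, and the example checkpoint are assumptions based on the PR titles, not verbatim API from this release.

```python
# Minimal usage sketch (assumed API) for the RoBERTa sequence-classification port.
# Tokenizer and config come from HuggingFace Transformers; the MLX model is built
# from the config and then filled with weights pulled from the Hub.
import mlx.core as mx
from transformers import RobertaConfig, RobertaTokenizer

# Assumed import path; the released package may organize modules differently.
from mlx_transformers.models import RobertaForSequenceClassification

checkpoint = "cardiffnlp/twitter-roberta-base-sentiment-latest"  # example checkpoint

tokenizer = RobertaTokenizer.from_pretrained(checkpoint)
config = RobertaConfig.from_pretrained(checkpoint)

model = RobertaForSequenceClassification(config)
model.load_model(checkpoint)  # assumed mixin helper that downloads and converts HF weights

inputs = tokenizer("MLX runs this on the Apple Silicon GPU.", return_tensors="np")
outputs = model(**inputs)

# Assumed output layout: per-class logits returned as MLX arrays.
predicted_class = mx.argmax(outputs.logits, axis=-1)
print(predicted_class)
```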

New Contributors
* ToluClassics made their first contribution in https://github.com/ToluClassics/mlx-transformers/pull/2
* Seun-Ajayi made their first contribution in https://github.com/ToluClassics/mlx-transformers/pull/1

**Full Changelog**: https://github.com/ToluClassics/mlx-transformers/compare/V0.0.1-pre-release...v0.1.3

V0.0.1-pre-release
MLX Transformers is a library that provides model implementations in MLX. It uses a model interface similar to HuggingFace Transformers and makes it possible to load and run models on Apple Silicon devices; this pre-release ships with a small initial set of model implementations.
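
As a rough illustration of that interface, the following sketch mirrors the HuggingFace-style workflow described above. The `mlx_transformers.models` import path and the `load_model` helper are assumptions, not confirmed API from this pre-release.

```python
# Minimal sketch (assumed API): build an MLX BERT model from a transformers
# config, load converted checkpoint weights, and run a forward pass on-device.
from transformers import BertConfig, BertTokenizer

# Assumed import path for the MLX port of BertModel.
from mlx_transformers.models import BertModel

checkpoint = "bert-base-uncased"

tokenizer = BertTokenizer.from_pretrained(checkpoint)
config = BertConfig.from_pretrained(checkpoint)

model = BertModel(config)
model.load_model(checkpoint)  # assumed helper that fetches and converts the HF weights

# Tokenize to numpy tensors so the arrays can be handed to MLX.
inputs = tokenizer("Hello from Apple Silicon!", return_tensors="np")
outputs = model(**inputs)  # forward pass runs through MLX on the Apple Silicon GPU
```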

**Full Changelog**: https://github.com/ToluClassics/mlx-transformers/commits/V0.0.1-pre-release
