MLX

Latest version: v0.24.1


0.0.9

Highlights:

- Initial (and experimental) GGUF support
- Support for the Python buffer protocol (easy interoperability with NumPy, JAX, TensorFlow, PyTorch, etc.)
- `at[]` syntax for scatter-style operations: `x.at[idx].add(y)` (also `min`, `max`, `prod`, etc.); see the sketch after this list
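
As a quick illustration of the last two highlights, a minimal sketch (assuming NumPy is installed; the printed value is indicative):

```python
import mlx.core as mx
import numpy as np

x = mx.arange(5, dtype=mx.float32)

# Buffer protocol: NumPy (and JAX, PyTorch, ...) can consume mx.arrays directly.
x_np = np.array(x)

# Scatter-style update: add 10 at indices 0 and 2 without a Python loop.
idx = mx.array([0, 2])
y = x.at[idx].add(10.0)
print(y)  # expected: array([10, 1, 12, 3, 4], dtype=float32)
```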

Core

- Array creation from other `mx.array`s (`mx.array([x, y])`)
- Complete support for Python buffer protocol
- `mx.inner`, `mx.outer`
- `mx.logical_and`, `mx.logical_or`, and the corresponding operator overloads
- `at[]` syntax for scatter ops (see the sketch after this list)
- Better support for in-place operations (`+=`, `*=`, `-=`, ...)
- VJP for scatter and scatter add
- Constants (`mx.pi`, `mx.inf`, `mx.newaxis`, …)
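
A minimal sketch exercising several of the core additions above (shapes in comments are illustrative):

```python
import mlx.core as mx

x = mx.array([1.0, 2.0])
y = mx.array([3.0, 4.0])

# Build a new array out of existing mx.arrays.
z = mx.array([x, y])             # shape (2, 2)

# Inner and outer products.
print(mx.inner(x, y))            # 1*3 + 2*4 = 11
print(mx.outer(x, y))            # 2x2 matrix

# Logical ops on boolean arrays.
mask = mx.logical_and(x > 1, y > 3)

# In-place operators and the new constants.
x += mx.pi
col = y[:, mx.newaxis]           # shape (2, 1)
```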

NN

- GLU activation
- `cosine_similarity` loss (example after this list)
- Cache for `RoPE` and `ALiBi`
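
A hedged sketch of the new activation and loss; argument names follow current `mlx.nn` conventions and may differ slightly across versions:

```python
import mlx.core as mx
import mlx.nn as nn

# GLU splits the given axis in half and gates one half with the other.
glu = nn.GLU(axis=-1)
x = mx.random.normal((2, 8))
print(glu(x).shape)  # (2, 4)

# Cosine similarity loss between two batches of embeddings.
a = mx.random.normal((4, 16))
b = mx.random.normal((4, 16))
loss = nn.losses.cosine_similarity_loss(a, b)
```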

Bugfixes / Misc

- Fix data type with `tri`
- Fix saving non-contiguous arrays
- Fix graph retention for in-place state, and remove `retain_graph`
- Multi-output primitives
- Better support for loading devices

0.0.7

Core

- Support for loading and saving Hugging Face's safetensors format
- Transposed quantization matmul kernels
- `mlx.core.linalg` sub-package with `mx.linalg.norm` (Frobenius, infinity, and p-norms); see the sketch after this list
- `tensordot` and `repeat`
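
A minimal sketch of the safetensors I/O, the linalg sub-package, and `tensordot`/`repeat`; function names follow the current `mlx.core` API, and the file name and axis choices are illustrative:

```python
import mlx.core as mx

a = mx.random.normal((3, 4))

# Save and load in the safetensors format.
mx.save_safetensors("weights.safetensors", {"a": a})
loaded = mx.load("weights.safetensors")["a"]

# Norms: Frobenius by default for matrices, p-norms via `ord`.
print(mx.linalg.norm(a))            # Frobenius norm
print(mx.linalg.norm(a[0], ord=1))  # vector 1-norm

# tensordot (contract last axis of a with first axis of b) and repeat.
b = mx.random.normal((4, 5))
c = mx.tensordot(a, b, 1)           # shape (3, 5)
r = mx.repeat(a, 2, axis=0)         # shape (6, 4)
```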

NN
- Layers
- `Bilinear`, `Identity`, `InstanceNorm`
- `Dropout2D`, `Dropout3D`
- More customizable `Transformer` (pre/post norm, dropout)
- More activations: `SoftSign`, `Softmax`, `HardSwish`, `LogSoftmax`
- Configurable scale in `RoPE` positional encodings
- Losses: `hinge`, `huber`, `log_cosh` (example after this list)
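
A hedged sketch of a few of the new layers and losses; constructor arguments and class spellings follow the current `mlx.nn` API and may differ slightly from this release:

```python
import mlx.core as mx
import mlx.nn as nn

bilinear = nn.Bilinear(8, 8, 4)      # two 8-d inputs -> a 4-d output
inorm = nn.InstanceNorm(dims=16)
drop = nn.Dropout2d(0.25)
rope = nn.RoPE(dims=32, scale=0.5)   # the newly configurable scale

x = mx.random.normal((2, 16))
y = mx.random.normal((2, 16))
print(nn.losses.huber_loss(x, y, reduction="mean"))
print(nn.losses.log_cosh_loss(x, y, reduction="mean"))
```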

Misc
- Faster GPU reductions for certain cases
- Change to memory allocation to allow swapping

0.0.6

Core

- `quantize`, `dequantize`, `quantized_matmul` (see the sketch after this list)
- `moveaxis`, `swapaxes`, `flatten`
- `stack`
- `floor`, `ceil`, `clip`
- `tril`, `triu`, `tri`
- `linspace`
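
A minimal sketch of the quantization ops and a few of the other new array ops; the `group_size`/`bits` values are the common defaults and are an assumption here:

```python
import mlx.core as mx

w = mx.random.normal((64, 64))

# Quantize to 4 bits in groups of 64, then matmul against the packed weights.
w_q, scales, biases = mx.quantize(w, group_size=64, bits=4)
x = mx.random.normal((2, 64))
y = mx.quantized_matmul(x, w_q, scales, biases, group_size=64, bits=4)

# Round-trip back to floating point (lossy).
w_hat = mx.dequantize(w_q, scales, biases, group_size=64, bits=4)

# A few of the other new ops.
s = mx.stack([x, x])                 # shape (2, 2, 64)
t = mx.tril(mx.ones((4, 4)))         # lower-triangular mask
grid = mx.linspace(0, 1, num=5)
```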

Optimizers
- `RMSProp`, `Adamax`, `Adadelta`, `Lion` (usage sketch below)
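
A minimal training-step sketch using one of the new optimizers; class spellings follow the current `mlx.optimizers` module, and the model and learning rate are illustrative:

```python
import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

model = nn.Linear(4, 2)
opt = optim.Lion(learning_rate=1e-4)  # RMSprop, Adamax, AdaDelta work the same way

def loss_fn(m, x, y):
    return nn.losses.mse_loss(m(x), y)

x, y = mx.random.normal((8, 4)), mx.random.normal((8, 2))
loss, grads = nn.value_and_grad(model, loss_fn)(model, x, y)
opt.update(model, grads)
mx.eval(model.parameters(), opt.state)
```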

NN

- Layers: `QuantizedLinear`, `ALiBi` positional encodings
- Losses: label smoothing (in cross entropy), smooth L1 loss, triplet loss (example below)
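
A hedged sketch of `QuantizedLinear` and the new losses; `from_linear` and the loss signatures follow the current `mlx.nn` API and are assumptions for this release:

```python
import mlx.core as mx
import mlx.nn as nn

# Swap a float Linear layer for its quantized counterpart.
qlinear = nn.QuantizedLinear.from_linear(nn.Linear(64, 64))

# Label smoothing is a parameter of cross entropy.
logits = mx.random.normal((4, 10))
targets = mx.array([1, 3, 0, 7])
ce = nn.losses.cross_entropy(logits, targets, label_smoothing=0.1)

# Triplet loss over anchor/positive/negative embeddings.
anchor, pos, neg = (mx.random.normal((4, 16)) for _ in range(3))
tl = nn.losses.triplet_loss(anchor, pos, neg)
```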

Misc
- Bug fixes

0.0.5

- Core ops `remainder`, `eye`, `identity`
- Additional functionality in `mlx.nn`
- Losses: binary cross entropy, KL divergence, MSE, L1 (examples after this list)
- Activations: PReLU, Mish, and several others
- More optimizers: AdamW, Nesterov momentum, Adagrad
- Bug fixes
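
A hedged sketch touching a few of these additions; names follow the current `mlx.core`/`mlx.nn`/`mlx.optimizers` spellings and may differ slightly from this release:

```python
import mlx.core as mx
import mlx.nn as nn
import mlx.optimizers as optim

# New core ops.
print(mx.remainder(mx.array([5, 7]), 3))  # [2, 1]
print(mx.eye(3))                          # 3x3 identity matrix

# New losses and activations.
logits = mx.random.normal((4,))
targets = mx.array([0.0, 1.0, 1.0, 0.0])
bce = nn.losses.binary_cross_entropy(logits, targets)
act = nn.Mish()

# New optimizers (Nesterov momentum lives on SGD).
adamw = optim.AdamW(learning_rate=1e-3)
sgd = optim.SGD(learning_rate=1e-2, momentum=0.9, nesterov=True)
```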
