sae-lens

Latest version: v5.6.1


4.1.1

Chore

* chore: Update training_a_sparse_autoencoder.ipynb (#358)

Changed "She lived in a big, happy little girl." to "She lived in a big, happy little town." ([`b8703fe`](https://github.com/jbloomAus/SAELens/commit/b8703fe8332b6eb6c49df778f6550c59d2276458))

Fix

* fix: load the same config from_pretrained and get_sae_config (#361) (see the sketch below)

* fix: load the same config from_pretrained and get_sae_config

* merge neuronpedia_id into get_sae_config

* fixing test ([`8e09458`](https://github.com/jbloomAus/SAELens/commit/8e094581c4772e33ec4577349ed0d02c6c90ed27))
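
For context, the config in question is the dict returned alongside the SAE by `SAE.from_pretrained`; this fix makes the standalone `get_sae_config` lookup return the same config, with `neuronpedia_id` merged in. A small usage sketch (the release and SAE id are examples from the pretrained directory, and the `neuronpedia_id` access is shown defensively since not every SAE has one):

```python
from sae_lens import SAE

# Example release/id from the pretrained SAE directory.
sae, cfg_dict, _ = SAE.from_pretrained(
    release="gpt2-small-res-jb",
    sae_id="blocks.8.hook_resid_pre",
)

# After this fix, the standalone config lookup (get_sae_config) and the config
# loaded by from_pretrained agree, including metadata like neuronpedia_id.
print(cfg_dict["d_in"], cfg_dict.get("neuronpedia_id"))
```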

4.1.0

Feature

* feat: Support training JumpReLU SAEs (#352) (see the sketch after this list)

* adds JumpReLU logic to TrainingSAE

* adds unit tests for JumpReLU

* changes classes to match tutorial

* replaces bandwidth constant with param

* re-adds JumpReLU logic to TrainingSAE

* adds TrainingSAE.save_model()

* changes threshold to match paper

* add tests for TrainingSAE when architecture is jumprelu

* adds test for SAE.load_from_pretrained() for JumpReLU

* removes code causing test to fail

* renames initial_threshold to threshold

* removes setattr()

* adds test for TrainingSAE.save_model()

* renames threshold to jumprelu_init_threshold

* adds jumprelu_bandwidth

* removes default value for jumprelu_init_threshold downstream

* replaces zero tensor with None in Step.backward()

* adds jumprelu to architecture type ([`0b56d03`](https://github.com/jbloomAus/SAELens/commit/0b56d035ce0fa12722d62cc1bc559bd4fd35e9f3))
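
The JumpReLU training support added here centers on a thresholded activation with a straight-through estimator, so the learned threshold still receives gradients. Below is a minimal sketch of that idea in PyTorch, following the rectangle-kernel pseudo-derivative from the JumpReLU SAE paper; the names (`threshold`, `bandwidth`) mirror the config options listed above, but this is an illustration, not the SAELens `TrainingSAE` implementation.

```python
import torch


class JumpReLUFunction(torch.autograd.Function):
    """Minimal JumpReLU sketch: y = x * H(x - threshold), with a
    straight-through rectangle-kernel gradient for the threshold."""

    @staticmethod
    def forward(ctx, pre_acts, threshold, bandwidth):
        ctx.save_for_backward(pre_acts, threshold)
        ctx.bandwidth = bandwidth
        # Keep activations only where they exceed the learned per-feature threshold.
        return pre_acts * (pre_acts > threshold)

    @staticmethod
    def backward(ctx, grad_output):
        pre_acts, threshold = ctx.saved_tensors
        bandwidth = ctx.bandwidth
        # Gradient w.r.t. the activations: treat the step function as constant.
        grad_pre_acts = (pre_acts > threshold) * grad_output
        # Straight-through gradient w.r.t. the threshold: nonzero only for
        # activations within `bandwidth` of the threshold (rectangle kernel).
        in_window = (pre_acts - threshold).abs() < bandwidth / 2
        grad_threshold = (-(threshold / bandwidth) * in_window * grad_output).sum(dim=0)
        return grad_pre_acts, grad_threshold, None


# Usage sketch: learnable per-feature threshold; shapes (batch, d_sae) and (d_sae,).
d_sae, bandwidth = 16, 1e-3
threshold = torch.nn.Parameter(torch.full((d_sae,), 1e-3))
pre_acts = torch.randn(8, d_sae, requires_grad=True)
feature_acts = JumpReLUFunction.apply(pre_acts, threshold, bandwidth)
feature_acts.sum().backward()  # populates pre_acts.grad and threshold.grad
```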

4.0.10

Fix

* fix: normalize decoder bias in fold_norm_scaling_factor (#355) (see the sketch below)

* WIP: fix fold_norm_scaling

* fixing test ([`6951e74`](https://github.com/jbloomAus/SAELens/commit/6951e7437f0bf9a33727c2929982917d9f51e7d2))
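
For context on this fix: when an SAE is trained on activations that were multiplied by a norm scaling factor, folding that factor into the weights lets the SAE run on raw activations, and the decoder bias has to be rescaled along with the decoder weights. A rough sketch of the arithmetic, assuming the SAE subtracts `b_dec` from its input and that the scaling factor `alpha` multiplied the training activations; this is an illustration, not the SAELens source.

```python
import torch


@torch.no_grad()
def fold_norm_scaling_factor(W_enc, b_dec, W_dec, alpha: float):
    """Fold the activation-norm scaling factor alpha into the SAE weights so that
    sae_folded(x) == sae(alpha * x) / alpha, for an SAE that subtracts b_dec from
    its input and adds b_dec back after decoding (b_enc is unchanged by the fold)."""
    W_enc_folded = W_enc * alpha   # encoder now expects raw (unscaled) activations
    W_dec_folded = W_dec / alpha   # decoder output comes back in the raw scale
    b_dec_folded = b_dec / alpha   # the decoder bias must be rescaled too (this fix)
    return W_enc_folded, b_dec_folded, W_dec_folded


# Example: fold alpha = 2.0 into randomly initialised weights.
W_enc, W_dec, b_dec = torch.randn(768, 16), torch.randn(16, 768), torch.randn(768)
W_enc_f, b_dec_f, W_dec_f = fold_norm_scaling_factor(W_enc, b_dec, W_dec, alpha=2.0)
```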

4.0.9

Fix

* fix: typo in layer 12 YAML ([`d634c8b`](https://github.com/jbloomAus/SAELens/commit/d634c8b2e8665bc3156c46fc8b1b439e26c289c9))

Unknown

* Merge pull request #349 from jbloomAus/np_id_fix_2

fix: use the correct layer for new gemma scope SAE sparsities ([`4c32de0`](https://github.com/jbloomAus/SAELens/commit/4c32de0de3efe9f35007df00c6b5aad102552150))

4.0.8

Fix

* fix: use the correct layer for new gemma scope SAE sparsities ([`a78b93e`](https://github.com/jbloomAus/SAELens/commit/a78b93e33ecfee5ff5e5b08cdf9076cdeabec573))

Unknown

* Merge pull request #348 from jbloomAus/np_id_fix

fix: use the correct layer for new gemma scope SAE sparsities ([`1f6823a`](https://github.com/jbloomAus/SAELens/commit/1f6823a4881a26df18b7c23e2f3a29a8cc93bcf6))

4.0.7

Fix

* fix: Test JumpReLU/Gated SAE and fix sae forward with error term (#328) (see the example below)

* chore: adding tests and a slight refactoring for SAE forward methods

* refactoring forward methods using a helper to avoid firing hooks

* rewording intermediate var

* use process_sae_in helper in training sae encode

* testing that sae.forward() with error term works with hooks

* cleaning up more unneeded device=cpu in tests ([`ae345b6`](https://github.com/jbloomAus/SAELens/commit/ae345b642ceeeb87851af1ffa180979cc3670c9b))
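
For reference, the error-term behaviour being tested here is what makes it safe to splice an SAE into a model's forward pass: with the error term enabled, the SAE's output matches its input while its hooks still see the reconstruction. A hedged usage sketch; the release and SAE id are just examples from the pretrained directory, and the call downloads weights from the hub.

```python
import torch
from sae_lens import SAE

# Load a pretrained SAE (example release/id; any entry in the pretrained
# directory should work the same way).
sae, cfg_dict, _ = SAE.from_pretrained(
    release="gpt2-small-res-jb",
    sae_id="blocks.8.hook_resid_pre",
)
sae.use_error_term = True  # forward() adds back (input - clean reconstruction)

x = torch.randn(2, 4, sae.cfg.d_in)
with torch.no_grad():
    out = sae(x)

# With the error term enabled, the output matches the input, so splicing the SAE
# into a model does not perturb downstream computation while SAE hooks still fire.
assert torch.allclose(out, x, atol=1e-5)
```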
