TransformerLens

Latest version: v2.9.0


2.9.0

Lots of accuracy improvements! A number of models now behave closer to how they behave in Transformers, and a new internal configuration option has been added for ease of use!

What's Changed
* Fix the bug that attention_mask and past_kv_cache cannot work together by yzhhr in https://github.com/TransformerLensOrg/TransformerLens/pull/772
* Set prepend_bos to false by default for Bloom model family by degenfabian in https://github.com/TransformerLensOrg/TransformerLens/pull/775
* Fix Bloom-family models producing incorrect outputs when use_past_kv_cache is set to True by degenfabian in https://github.com/TransformerLensOrg/TransformerLens/pull/777
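
The attention-mask/KV-cache fix above is about bookkeeping: when generating with a cache, the mask passed at each step must cover the cached positions as well as the newly fed tokens, or padding in the cached prefix gets attended to. A minimal pure-Python sketch of that idea (not TransformerLens code; names are illustrative):

```python
def step_with_cache(cache_len, new_tokens, pad_mask_past):
    """Return the attention mask for one decoding step.

    cache_len: number of positions already stored in the past KV cache.
    new_tokens: number of tokens being fed in this step (usually 1).
    pad_mask_past: list of 0/1 flags for the cached positions
                   (0 = padding that must stay masked out).
    """
    assert len(pad_mask_past) == cache_len
    # The mask must span past + new positions: cached padding stays
    # masked, every freshly generated token is attendable.
    return pad_mask_past + [1] * new_tokens

# Left-padded prompt of length 4 (first position is padding),
# then two single-token generation steps.
mask = [0, 1, 1, 1]
mask = step_with_cache(4, 1, mask)   # -> [0, 1, 1, 1, 1]
mask = step_with_cache(5, 1, mask)   # -> [0, 1, 1, 1, 1, 1]
```

The bug class the PR addresses is exactly the case where the mask is left at its original prompt length while the cache keeps growing.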

New Contributors
* yzhhr made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/772
* degenfabian made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/775

**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.8.1...v2.9.0

2.8.1

New notebook for comparing models, and a bug fix for dealing with newer Llama models!

What's Changed
* Logit comparator tool by curt-tigges in https://github.com/TransformerLensOrg/TransformerLens/pull/765
* Add support for NTK-by-Part Rotary Embedding & set correct rotary base for Llama-3.1 series by Hzfinfdu in https://github.com/TransformerLensOrg/TransformerLens/pull/764
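
The rotary-base fix matters because RoPE wavelengths are derived directly from the base: the inverse frequencies are base^(-2i/d) across the head dimension, so an incorrect base rotates positions at the wrong speeds. A quick self-contained sketch (values for illustration only; the Llama 3.x series uses a much larger rotary base, 500000, than Llama 2's 10000):

```python
def rope_inv_freqs(head_dim, base):
    # Inverse frequencies for rotary position embeddings: base^(-2i/d)
    # for each rotating pair of dimensions.
    return [base ** (-2 * i / head_dim) for i in range(head_dim // 2)]

low = rope_inv_freqs(8, 10_000)     # smaller base -> faster rotation
high = rope_inv_freqs(8, 500_000)   # larger base -> longer wavelengths

# A larger base shrinks every non-trivial inverse frequency, i.e.
# positions rotate more slowly, which supports longer contexts.
assert low[0] == high[0] == 1.0
assert all(h < l for h, l in zip(high[1:], low[1:]))
```

NTK-by-parts scaling (the other half of the PR) further rescales these frequencies band by band depending on wavelength, rather than uniformly.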

New Contributors
* Hzfinfdu made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/764

**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.8.0...v2.8.1

2.8.0

What's Changed
* add transformer diagram by akozlo in https://github.com/TransformerLensOrg/TransformerLens/pull/749
* Demo colab compatibility by bryce13950 in https://github.com/TransformerLensOrg/TransformerLens/pull/752
* Add support for `Mistral-Nemo-Base-2407` model by ryanhoangt in https://github.com/TransformerLensOrg/TransformerLens/pull/751
* Fix the bug that the tokenize_and_concatenate function was not working for small datasets by xy-z-code in https://github.com/TransformerLensOrg/TransformerLens/pull/725
* added new block for recent diagram, and colab compatibility notebook by bryce13950 in https://github.com/TransformerLensOrg/TransformerLens/pull/758
* Add warning and halt execution for incorrect T5 model usage by vatsalrathod16 in https://github.com/TransformerLensOrg/TransformerLens/pull/757
* New issue template for reporting model compatibility by bryce13950 in https://github.com/TransformerLensOrg/TransformerLens/pull/759
* Add configurations for Llama 3.1 models(Llama-3.1-8B and Llama-3.1-70B) by vatsalrathod16 in https://github.com/TransformerLensOrg/TransformerLens/pull/761

New Contributors
* akozlo made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/749
* ryanhoangt made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/751
* xy-z-code made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/725
* vatsalrathod16 made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/757

**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.7.1...v2.8.0

2.7.1

What's Changed
* Updated broken Slack link by neelnanda-io in https://github.com/TransformerLensOrg/TransformerLens/pull/742
* `from_pretrained` has correct return type (i.e. `HookedSAETransformer.from_pretrained` returns `HookedSAETransformer`) by callummcdougall in https://github.com/TransformerLensOrg/TransformerLens/pull/743
* Avoid warning in `utils.download_file_from_hf` by albertsgarde in https://github.com/TransformerLensOrg/TransformerLens/pull/739


New Contributors
* albertsgarde made their first contribution in https://github.com/TransformerLensOrg/TransformerLens/pull/739

**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.7.0...v2.7.1

2.7.0

Llama 3.2 support! There is also new compatibility added to the function `utils.test_prompt` to allow for multiple prompts, as well as a minor typo fix.

What's Changed
* Typo hooked encoder by bryce13950 in https://github.com/TransformerLensOrg/TransformerLens/pull/732
* `utils.test_prompt` compares multiple prompts by callummcdougall in https://github.com/TransformerLensOrg/TransformerLens/pull/733
* Model llama 3.2 by bryce13950 in https://github.com/TransformerLensOrg/TransformerLens/pull/734
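
`utils.test_prompt` checks where a model ranks the expected answer token after a prompt; the extension in the PR above lets you run that comparison across several prompts side by side. A toy sketch of the underlying idea with a stand-in for model logits (this is not the real TransformerLens API, just the comparison logic):

```python
def rank_of_answer(logits, answer_id):
    # Rank 0 means the answer token is the model's top prediction.
    order = sorted(range(len(logits)), key=lambda i: -logits[i])
    return order.index(answer_id)

def compare_prompts(prompt_logits, answer_id):
    # Compare the answer's rank across multiple prompts at once.
    return {p: rank_of_answer(l, answer_id) for p, l in prompt_logits.items()}

fake_logits = {
    "The Eiffel Tower is in": [0.1, 2.0, 0.5],   # token 1 is top
    "France's capital is":    [1.5, 0.9, 0.2],   # token 0 is top
}
ranks = compare_prompts(fake_logits, answer_id=1)
# -> {"The Eiffel Tower is in": 0, "France's capital is": 1}
```

With the real function you would pass the prompts and answer strings to `utils.test_prompt` along with a loaded model.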


**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.6.0...v2.7.0

2.6.0

Another nice little feature update! You now have the ability to ungroup the grouped query attention head component through a new config parameter `ungroup_grouped_query_attention`!

What's Changed
* Ungrouping GQA by hannamw & FlyingPumba in https://github.com/TransformerLensOrg/TransformerLens/pull/713
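
In grouped-query attention, several query heads share one key/value head; `ungroup_grouped_query_attention` exposes key/value activations per query head instead. The underlying transformation is just repetition, roughly as in this pure-Python sketch (illustrative, not the library's implementation):

```python
def ungroup_kv_heads(kv_heads, n_query_heads):
    """Expand grouped key/value heads so each query head gets its own copy.

    kv_heads: one entry per KV head (labels stand in for tensors here).
    n_query_heads must be a multiple of len(kv_heads).
    """
    n_kv = len(kv_heads)
    assert n_query_heads % n_kv == 0
    repeat = n_query_heads // n_kv
    # Each KV head is shared by `repeat` consecutive query heads.
    return [h for h in kv_heads for _ in range(repeat)]

# 8 query heads grouped over 2 KV heads -> each KV head repeated 4 times.
expanded = ungroup_kv_heads(["kv0", "kv1"], 8)
# -> ['kv0', 'kv0', 'kv0', 'kv0', 'kv1', 'kv1', 'kv1', 'kv1']
```

Ungrouping trades memory for convenience: the expanded activations match the per-query-head shape that standard multi-head attention hooks expect.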


**Full Changelog**: https://github.com/TransformerLensOrg/TransformerLens/compare/v2.5.0...v2.6.0
