Thanks to obalcells and andyrdt, Llama-2 models should now have logit errors of around `1e-4` atol rather than `1e0`!
We also now pin PyTorch to >= 2.1.1, thanks to a PyTorch issue on MPS that jettjaniak pointed out. Thanks all!
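For readers unfamiliar with tolerance-based comparisons, the `1e-4` atol figure above means each logit may differ from the reference implementation by at most `1e-4` in absolute value. A minimal sketch of what that check looks like, using stdlib `math.isclose` on illustrative stand-in values (not actual Llama-2 logits):

```python
import math

# Illustrative stand-in values, not actual Llama-2 outputs: a "reference"
# logit and two hypothetical TransformerLens logits with different errors.
reference_logit = 3.14159
tl_logit_fixed = reference_logit + 5e-5   # post-fix deviation (~1e-4 scale)
tl_logit_broken = reference_logit + 0.7   # pre-fix deviation (~1e0 scale)

# atol (absolute tolerance) bounds the difference |a - b|; rel_tol is
# zeroed out so only the absolute tolerance matters.
within_atol = math.isclose(tl_logit_fixed, reference_logit,
                           abs_tol=1e-4, rel_tol=0.0)
broken_within_atol = math.isclose(tl_logit_broken, reference_logit,
                                  abs_tol=1e-4, rel_tol=0.0)
print(within_atol, broken_within_atol)  # → True False
```

The same idea applies element-wise when comparing full logit tensors, e.g. with `torch.allclose(a, b, atol=1e-4)`.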
## What's Changed
* Fix Grokking Notebook by ArthurConmy in https://github.com/neelnanda-io/TransformerLens/pull/450
* Fixed current CI issues with accuracy failing for Pythia model by bryce13950 in https://github.com/neelnanda-io/TransformerLens/pull/451
* Fixing Llama2 numerical errors by obalcells in https://github.com/neelnanda-io/TransformerLens/pull/456
* Pin PyTorch2 to be at least 2.1.1 by ArthurConmy in https://github.com/neelnanda-io/TransformerLens/pull/457
## New Contributors
* obalcells made their first contribution in https://github.com/neelnanda-io/TransformerLens/pull/456
**Full Changelog**: https://github.com/neelnanda-io/TransformerLens/compare/v1.10.0...v1.11.0