**Open Interpreter** lets LLMs run code locally. You can chat with Open Interpreter through a ChatGPT-like interface in your terminal by running `$ interpreter` after installing.
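A minimal getting-started sketch, assuming installation from PyPI under the package name `open-interpreter`:

```shell
# Install the package (assumes the PyPI name open-interpreter)
pip install open-interpreter

# Launch the ChatGPT-like chat interface in your terminal
interpreter
```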
* **CodeLlama** supported with `--local` (see the sketch after this list), more models coming soon
* Interpreters loaded for Python, JavaScript, and Shell
* Streaming chat in your terminal (thanks to [Textualize/Rich](https://github.com/Textualize/rich)!)
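Switching between a hosted model and local CodeLlama is a single flag; a minimal sketch of both modes:

```shell
# Default: chat using a hosted model
interpreter

# Run fully locally with CodeLlama
interpreter --local
```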
**New Contributors**
* TanmayDoesAI made their first contribution in https://github.com/KillianLucas/open-interpreter/pull/25
**Full Changelog**: https://github.com/KillianLucas/open-interpreter/compare/v0.0.297...v0.1.0