- Connect to a local or remote Ollama server (preferably on the same network)
- Chat with LLMs served locally & privately
- Streaming support
- Useful information shown in the sidebar, such as the active server, the current model, and whether streaming is enabled
- Ability to reset chat
- Available on PyPI
```shell
pip install -U localchatgpt
```
Then, to use it:
```
usage: localchatgpt [-h] {start,stop}

positional arguments:
  {start,stop}  Specify 'start' or 'stop'.

options:
  -h, --help    show this help message and exit
```
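The CLI surface above can be reproduced with a minimal `argparse` sketch. This is an illustrative reconstruction of the interface shown in the usage text, not the package's actual implementation:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical parser mirroring the localchatgpt usage text above."""
    parser = argparse.ArgumentParser(prog="localchatgpt")
    # Single positional argument restricted to the two documented commands.
    parser.add_argument(
        "command",
        choices=["start", "stop"],
        help="Specify 'start' or 'stop'.",
    )
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args.command)
```

Passing anything other than `start` or `stop` makes `argparse` exit with an error and print the usage line, matching the behavior implied by the help output.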