Changelog
- Support for local language models through a custom inference function (see the configuration sketch after this list)
- Support for local transformers models
- INIT_PARAMS: dict config parameter introduced to customize local models / API clients
- CHAT_MODE: bool config parameter introduced to force a switch to text-completion / chat mode
- Config instances can now be cast to dict (convenient for reconfiguring / exporting; see the sketch after this list)
- More tests added
- API tests fixed
- get_vram_usage(), show_vram_usage() & is_google_colab() utility functions implemented
- .as_message property added to LLM response strings, plus .as_user, .as_system, .as_assistant and .as_model properties added to prompt and LLM response strings (utils.ConvertableToMessage class; usage sketch after this list)
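
A minimal configuration sketch tying the local-model items above together. Only INIT_PARAMS and CHAT_MODE come from this changelog; the configure() / llm() entry points, the MODEL parameter, the INFERENCE_FUNC parameter name and the example model id are assumptions made for illustration.

```python
import microcore as mc

# Local transformers model configured through the new parameters
# (names other than INIT_PARAMS and CHAT_MODE are assumed for illustration).
mc.configure(
    MODEL="gpt2",            # any local / Hugging Face transformers model id
    INIT_PARAMS={            # forwarded to the local model / API client constructor
        "device_map": "auto",
    },
    CHAT_MODE=False,         # force plain text-completion instead of chat mode
)

# Alternatively, plug in a fully custom inference function
# (the INFERENCE_FUNC parameter name is a guess, not confirmed by this changelog).
def my_inference(prompt, **kwargs) -> str:
    return "stub answer for: " + str(prompt)

mc.configure(INFERENCE_FUNC=my_inference)

print(mc.llm("Hello, local model!"))
```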
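
A sketch of the new dict cast, assuming the Config class is importable from the package root and that configure() accepts the same uppercase keyword arguments; both of those details are assumptions.

```python
from microcore import Config, configure  # assumed import locations

cfg = Config(MODEL="gpt-3.5-turbo")      # hypothetical construction, for illustration only
settings = dict(cfg)                     # new: a Config instance can be cast to a plain dict

settings["CHAT_MODE"] = True             # tweak a value...
configure(**settings)                    # ...then reconfigure, or export it, e.g. json.dumps(settings)
```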
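
A usage sketch for the message-conversion properties and the new utility functions. The return types of the .as_* properties, passing a list of messages to llm(), and the microcore.utils import path are assumptions.

```python
import microcore as mc
from microcore.utils import get_vram_usage, show_vram_usage, is_google_colab  # assumed path

reply = mc.llm("Name one prime number.")
print(reply.as_message)     # LLM response string converted to a message (utils.ConvertableToMessage)

# Reuse the previous reply as an assistant message in a follow-up call
# (accepting a list of messages here is an assumption).
conversation = [
    reply.as_assistant,
    "Now name another one.",
]
print(mc.llm(conversation))

# Environment / VRAM helpers
print(is_google_colab())    # True when running inside Google Colab
show_vram_usage()           # prints current GPU memory usage
print(get_vram_usage())     # raw usage data (exact return format not specified here)
```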