llm-gemini

Latest version: v0.4.1


0.5a0

- Track token usage. [#25](https://github.com/simonw/llm-gemini/issues/25)

0.4.1

- Depends on LLM 0.18 or higher. [#23](https://github.com/simonw/llm-gemini/issues/23)

0.4

- Handle attachments that are sent without a prompt. [#20](https://github.com/simonw/llm-gemini/issues/20)
- Support for the new `gemini-exp-1114` model. Thanks, Dominik Hayon. [#21](https://github.com/simonw/llm-gemini/issues/21)
- Support for JSON output mode using `-o json_object 1` (see the example after this list). [#22](https://github.com/simonw/llm-gemini/issues/22)
- Now provides async versions of the Gemini models, compatible with [LLM 0.18](https://llm.datasette.io/en/stable/changelog.html#v0-18). [#23](https://github.com/simonw/llm-gemini/issues/23)
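
A minimal sketch of the JSON output mode (the prompt text here is illustrative; the model name matches the examples further down):

```bash
# -o json_object 1 asks the model to return valid JSON
llm -m gemini-1.5-pro-latest \
  'Generate a JSON object describing three pelican species' \
  -o json_object 1
```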

0.3

- Multi-modal model support with LLM 0.17 attachments. Gemini 1.5 models can now accept images, audio and video. [#17](https://github.com/simonw/llm-gemini/issues/17)
```bash
llm -m gemini-1.5-flash-8b-latest 'describe image' \
  -a https://static.simonwillison.net/static/2024/pelicans.jpg
```

- Support for code execution mode. [#18](https://github.com/simonw/llm-gemini/issues/18)
```bash
llm -m gemini-1.5-pro-latest 'write and execute python to calculate factorial of 13' -o code_execution 1
```

- Support for options: `temperature`, `max_output_tokens`, `top_p`, `top_k`. Pass these as e.g. `-o temperature 0.5`. [#3](https://github.com/simonw/llm-gemini/issues/3)
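
A sketch combining several of these options in one call (the prompt and option values are arbitrary):

```bash
# Each option is passed with its own -o flag; values here are illustrative
llm -m gemini-1.5-flash-8b-latest 'write a haiku about pelicans' \
  -o temperature 0.5 -o max_output_tokens 100 -o top_p 0.9 -o top_k 40
```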

0.3a0

- Support for multi-modal models using attachments in [LLM 0.17a0](https://llm.datasette.io/en/latest/changelog.html#a0-2024-10-28).

0.2

- Added support for the inexpensive [new Gemini 1.5 Flash-8B](https://developers.googleblog.com/en/gemini-15-flash-8b-is-now-generally-available-for-use/): `llm -m gemini-1.5-flash-8b-latest "say hi"` [#14](https://github.com/simonw/llm-gemini/issues/14)
- First non-alpha release.
