Windows Preview
Ollama is now available on Windows in preview. Download it [here](https://github.com/ollama/ollama/releases/download/v0.1.25/OllamaSetup.exe). Ollama on Windows makes it possible to pull, run, and create large language models in a new native Windows experience. It includes built-in GPU acceleration, access to the full [model library](https://ollama.com/library), and the Ollama API, including [OpenAI compatibility](https://ollama.com/blog/openai-compatibility).
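Because the API is OpenAI-compatible, existing OpenAI client code can simply be pointed at a local Ollama server. The sketch below is illustrative only: it assumes the default local endpoint (`http://localhost:11434`), the official `openai` Python package, and a model named `llama2` that has already been pulled; substitute your own model name.

```python
# Minimal sketch: calling a local Ollama server through its OpenAI-compatible
# endpoint. Assumes Ollama is running on the default port 11434 and that a
# model such as "llama2" has already been pulled (`ollama pull llama2`).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",                      # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama2",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```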
What's Changed
* Ollama on Windows is now available in preview
* Fixed an issue where requests would hang after being repeated several times
* Ollama now returns an error when provided an unsupported image format
* Fixed an issue where `ollama serve` wouldn't immediately quit when receiving a termination signal
* Fixed issues with prompt templating for the `/api/chat` endpoint, such as Ollama omitting the second system prompt in a series of messages
* Fixed an issue where providing an empty list of messages would return a non-empty response instead of just loading the model (see the sketch after this list)
* Setting a negative `keep_alive` value (e.g. `-1`) will now correctly keep the model loaded indefinitely
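To illustrate the last two fixes, here is a minimal sketch against the `/api/chat` endpoint. It assumes the default local endpoint (`http://localhost:11434`), the `requests` Python package, and an already-pulled model named `llama2`; both the port and the model name are assumptions, so adjust them for your setup.

```python
# Minimal sketch of the two /api/chat behaviours described above.
import requests

OLLAMA = "http://localhost:11434"

# An empty messages list now just loads the model without generating a reply.
requests.post(f"{OLLAMA}/api/chat", json={"model": "llama2", "messages": []})

# A negative keep_alive (e.g. -1) keeps the model loaded in memory indefinitely.
r = requests.post(
    f"{OLLAMA}/api/chat",
    json={
        "model": "llama2",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,
        "keep_alive": -1,
    },
)
print(r.json()["message"]["content"])
```

With `keep_alive` set to `-1`, the model stays resident in memory after the request completes instead of being unloaded after the default idle timeout.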
New Contributors
* @lebrunel made their first contribution in https://github.com/ollama/ollama/pull/2477
* @bnorick made their first contribution in https://github.com/ollama/ollama/pull/2480
**Full Changelog**: https://github.com/ollama/ollama/compare/v0.1.24...v0.1.25