## What's Changed
* Enable GCC Toolset 12 to support AVX VNNI by nzwulfin in https://github.com/containers/ramalama/pull/473
* Fail over to OCI when push fails with the default push mechanism by rhatdan in https://github.com/containers/ramalama/pull/476
* Fall back to huggingface-cli when pulling via URL fails by rhatdan in https://github.com/containers/ramalama/pull/475
* Revert "Switch to llama-simple-chat" by rhatdan in https://github.com/containers/ramalama/pull/477
* Add support for http, https, and file pulls by rhatdan in https://github.com/containers/ramalama/pull/463 (see the sketch after this list)
* Bump to v0.1.3 by rhatdan in https://github.com/containers/ramalama/pull/479
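The URL pull support from #463 can be exercised straight from the command line. Below is a minimal sketch, assuming `ramalama pull` accepts http, https, and file sources as the PR title describes; the model URLs and paths shown are hypothetical placeholders.

```python
# Minimal sketch of exercising the URL pull support described in #463.
# The sources below are hypothetical placeholders; the exact URLs and
# paths you pull from will differ.
import subprocess

sources = [
    "https://example.com/models/model.gguf",  # https pull (hypothetical URL)
    "file:///var/lib/models/model.gguf",      # local file pull (hypothetical path)
]

for source in sources:
    # Equivalent to running `ramalama pull <source>` from a shell.
    subprocess.run(["ramalama", "pull", source], check=True)
```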
## New Contributors
* nzwulfin made their first contribution in https://github.com/containers/ramalama/pull/473
**Full Changelog**: https://github.com/containers/ramalama/compare/v0.1.2...v0.1.3