Eval-mm

Latest version: v0.4.0

0.1.1

**Full Changelog**: https://github.com/llm-jp/llm-jp-eval-mm/compare/v0.1.0...v0.1.1

Fix dependencies to publish the package.

0.1.0

What's Changed
* Add Generation Config by speed1313 in https://github.com/llm-jp/llm-jp-eval-mm/pull/77
* Add custom metrics features and refactoring by speed1313 in https://github.com/llm-jp/llm-jp-eval-mm/pull/78
* Add benchmark result record table by Silviase in https://github.com/llm-jp/llm-jp-eval-mm/pull/80
* Add JDocQA Task by speed1313 in https://github.com/llm-jp/llm-jp-eval-mm/pull/79


**Full Changelog**: https://github.com/llm-jp/llm-jp-eval-mm/compare/v0.0.7...v0.1.0

0.0.7

**Full Changelog**: https://github.com/llm-jp/llm-jp-eval-mm/compare/v0.0.6...v0.0.7

0.0.6

**Full Changelog**: https://github.com/llm-jp/llm-jp-eval-mm/compare/v0.0.5...v0.0.6

0.0.5

0.0.4

What's Changed
* Add record metrics by speed1313 in https://github.com/llm-jp/llm-jp-eval-mm/pull/63
* Add Pangea_7B_hf model by speed1313 in https://github.com/llm-jp/llm-jp-eval-mm/pull/65


**Full Changelog**: https://github.com/llm-jp/llm-jp-eval-mm/compare/v0.0.3...v0.0.4
