fms-hf-tuning

Latest version: v2.5.0


2.5.0

In v2.5.0, the `fms-hf-tuning` library is built with Python 3.12; see the support update below for details.
Other noteworthy updates in this release:

**New tracker:**
- New tracker using [HFResourceScanner](https://github.com/foundation-model-stack/hf-resource-scanner/tree/main/HFResourceScanner) to enable lightweight tracking of memory usage and train time during training.
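The idea behind such a tracker — recording train time and memory with minimal overhead — can be illustrated with a self-contained sketch. This is not the HFResourceScanner API; the class and attribute names below are hypothetical:

```python
import time
import tracemalloc

class ResourceTracker:
    """Minimal sketch of a lightweight resource tracker: records wall-clock
    train time and peak Python heap usage around a training run.
    Illustrative only -- HFResourceScanner hooks into the HF Trainer and
    measures GPU memory as well."""

    def __enter__(self):
        tracemalloc.start()
        self.start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        self.train_time = time.perf_counter() - self.start
        _, self.peak_bytes = tracemalloc.get_traced_memory()
        tracemalloc.stop()

with ResourceTracker() as tracker:
    data = [i * i for i in range(100_000)]  # stand-in for a training step

print(f"train time: {tracker.train_time:.3f}s, peak memory: {tracker.peak_bytes} bytes")
```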

**Support update:**
- We have tested and extended support to Python 3.12; `fms-hf-tuning` now runs on Python 3.9, 3.10, 3.11, and 3.12.
- The `Dockerfile` has been updated to use Python 3.12 as the default.

What's Changed
* docs: EOS token support by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/443
* feat: add scanner tracker by aluu317 in https://github.com/foundation-model-stack/fms-hf-tuning/pull/422
* docs: add note to note that file extension is required in training data path by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/447
* feat: updates documentation with chat template guide flowchart by YashasviChaurasia in https://github.com/foundation-model-stack/fms-hf-tuning/pull/445
* chore: bump python version by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/449

New Contributors
* YashasviChaurasia made their first contribution in https://github.com/foundation-model-stack/fms-hf-tuning/pull/445

**Full Changelog**: https://github.com/foundation-model-stack/fms-hf-tuning/compare/v2.4.0...v2.5.0

2.4.0

Summary of Changes
Acceleration Updates:
- Dataclass args added for accelerated MoE tuning, activated via the new integer flag `fast_moe`, which sets the expert-parallel sharding degree.
- Update function name from `requires_agumentation` to `requires_augmentation`.
- Note: the lower limit of the `fms-acceleration` library has been increased to 0.6.0.
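To illustrate what an expert-parallel sharding degree means, here is a minimal sketch of assigning MoE experts to ranks, where the degree plays the role of the `fast_moe` integer. The function name and round-robin-by-block scheme are hypothetical, not fms-acceleration's actual implementation:

```python
def shard_experts(num_experts: int, ep_degree: int) -> dict[int, list[int]]:
    """Sketch of expert-parallel sharding: split the experts of an MoE layer
    evenly across `ep_degree` ranks, so each rank holds only its own slice
    of expert weights. Illustrative only."""
    assert num_experts % ep_degree == 0, "experts must divide evenly across ranks"
    per_rank = num_experts // ep_degree
    return {
        rank: list(range(rank * per_rank, (rank + 1) * per_rank))
        for rank in range(ep_degree)
    }

# e.g. 8 experts sharded across a degree-2 expert-parallel group
print(shard_experts(8, 2))  # {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}
```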
Data Preprocessor Updates:
- Allows the padding-free plugin to be used without a response template.
- Allows HF dataset IDs to be passed via the `training_data_path` flag.
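A loader that accepts either a local file or an HF dataset ID through one flag might dispatch roughly as follows. This is a hypothetical sketch (function name and heuristics included); the real fms-hf-tuning logic may differ:

```python
import os

def resolve_training_data(training_data_path: str) -> str:
    """Hypothetical sketch of distinguishing a local dataset file from a
    Hugging Face dataset ID passed through the same `training_data_path`
    value. Illustrative only."""
    if os.path.isfile(training_data_path):
        return "local_file"
    # HF dataset IDs look like "org/name" and do not exist on disk
    if "/" in training_data_path and not os.path.exists(training_data_path):
        return "hf_dataset_id"
    raise ValueError(f"cannot resolve dataset source: {training_data_path!r}")
```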
Additional Changes:
- Add `pad_token` to `special_tokens_dict` when `pad_token == eos_token`, which improves Granite 3.0 and 3.1 quality on the tuning stack.
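The motivation for that fix: when the pad token is the same token as EOS, padding positions become indistinguishable from genuine end-of-sequence tokens during loss masking, which can stop the model from learning to emit EOS. A minimal sketch of the guard (names hypothetical; the real change lives in fms-hf-tuning's tokenizer setup):

```python
def build_special_tokens_dict(pad_token: str, eos_token: str) -> dict:
    """Sketch of the fix: if the tokenizer's pad token collides with its EOS
    token, register a distinct <pad> token so padding is no longer confused
    with end-of-sequence. Illustrative only."""
    special_tokens_dict = {}
    if pad_token == eos_token:
        special_tokens_dict["pad_token"] = "<pad>"
    return special_tokens_dict

# a tokenizer that reuses </s> for padding gets a dedicated pad token
print(build_special_tokens_dict("</s>", "</s>"))  # {'pad_token': '<pad>'}
```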
For full details of changes, see the [release notes](https://github.com/foundation-model-stack/fms-hf-tuning/releases/tag/v2.4.0).


Full List of Changes
* fix: broken README.md link by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/429
* feat: Allow hf dataset id to be passed via training_data_path by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/431
* feat: dataclass args for accelerated MoE tuning by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/390
* feat: allow for padding free plugin to be used without response template by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/430
* fix: function name from `requires_agumentation` to `requires_augmentation` by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/434
* fix: Add pad_token to special_tokens_dict when pad_token == eos_token by Abhishek-TAMU in https://github.com/foundation-model-stack/fms-hf-tuning/pull/436
* chore(deps): upgrade fms-acceleration to >= 0.6 by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/440
* docs: update granite3 model support by anhuong in https://github.com/foundation-model-stack/fms-hf-tuning/pull/441


**Full Changelog**: https://github.com/foundation-model-stack/fms-hf-tuning/compare/v2.3.1...v2.4.0

2.4.0-rc.2

What's Changed
* fix: broken README.md link by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/429
* feat: Allow hf dataset id to be passed via training_data_path by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/431
* feat: dataclass args for accelerated MoE tuning by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/390
* feat: allow for padding free plugin to be used without response template by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/430
* fix: function name from `requires_agumentation` to `requires_augmentation` by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/434
* fix: Add pad_token to special_tokens_dict when pad_token == eos_token by Abhishek-TAMU in https://github.com/foundation-model-stack/fms-hf-tuning/pull/436
* chore(deps): upgrade fms-acceleration to >= 0.6 by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/440
* docs: update granite3 model support by anhuong in https://github.com/foundation-model-stack/fms-hf-tuning/pull/441


**Full Changelog**: https://github.com/foundation-model-stack/fms-hf-tuning/compare/v2.3.0...v2.4.0-rc.2

2.4.0-rc.1

2.3.1

New feature updates around data handling and preprocessing:

- Enable loading of Parquet and Arrow Dataset files.
- Dataset mixing via sampling probabilities in data config.
- New `additional_data_handlers` arg in the `train` function, registered with the data preprocessor.
- Support multiple files, directories, pattern-based paths, HF Dataset IDs, and their combinations via `data_config`.
- New support for both multi-turn and single-turn chat interactions.
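Dataset mixing via sampling probabilities can be pictured with a small sketch: each output example is drawn from dataset *i* with probability *pᵢ*. The function below is illustrative only (fms-hf-tuning does this inside its data preprocessor, driven by the data config, and works on streaming datasets rather than in-memory lists):

```python
import random

def mix_datasets(datasets, probabilities, num_samples, seed=0):
    """Sketch of dataset mixing via sampling probabilities: for each of
    `num_samples` draws, pick a source dataset according to `probabilities`,
    then pick an example from it. Illustrative only."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    rng = random.Random(seed)
    mixed = []
    for _ in range(num_samples):
        (chosen,) = rng.choices(range(len(datasets)), weights=probabilities)
        mixed.append(rng.choice(datasets[chosen]))
    return mixed
```

With `probabilities=[0.7, 0.3]`, roughly 70% of the mixed examples come from the first dataset regardless of the datasets' relative sizes — which is the point of probability-based mixing over simple concatenation.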

New tracker:
- New MLflow tracker.

Additional Changes
- Refactor test artifacts into `tests/artifacts`, adding new data types, datasets, and predefined data configs for new unit tests.
- Resolve issues with deprecated training arguments.

Full list of Changes
* feat: Add support to handle Parquet Dataset files via data config by Abhishek-TAMU in https://github.com/foundation-model-stack/fms-hf-tuning/pull/401
* test: add arrow datasets and arrow unit tests by willmj in https://github.com/foundation-model-stack/fms-hf-tuning/pull/403
* feat: Perform dataset mixing via sampling probabilities in data config by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/408
* feat: Expose additional data handlers as an argument in train by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/409
* fix: Move deprecated positional arguments from SFTTrainer to SFTConfig by Luka-D in https://github.com/foundation-model-stack/fms-hf-tuning/pull/399
* fix: update dataclass objects directly instead of creating new variables by kmehant in https://github.com/foundation-model-stack/fms-hf-tuning/pull/418
* test: Add unit tests to test multiple files in single dataset by Abhishek-TAMU in https://github.com/foundation-model-stack/fms-hf-tuning/pull/412
* feat: Add multi and single turn chat support by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/415
* feat: Integrate MLflow tracker by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/425
* feat: Handle passing of multiple files, multiple folders, path with patterns, HF Dataset and combination by Abhishek-TAMU in https://github.com/foundation-model-stack/fms-hf-tuning/pull/424
* docs: Add documentation for data preprocessor release by dushyantbehl in https://github.com/foundation-model-stack/fms-hf-tuning/pull/423

New Contributors
* Luka-D made their first contribution in https://github.com/foundation-model-stack/fms-hf-tuning/pull/399

**Full Changelog**: https://github.com/foundation-model-stack/fms-hf-tuning/compare/v2.2.0...v2.3.1

2.3.0

This release is missing the correct README docs; please see [v2.3.1](https://github.com/foundation-model-stack/fms-hf-tuning/releases/tag/v2.3.1) for the complete release changelog.
