Composer

Latest version: v0.27.0


0.24.1

Bug Fixes

**1. Disallow passing `device_mesh` to `FSDPConfig` ([#3580](https://github.com/mosaicml/composer/pull/3580))**

Explicitly errors if `device_mesh` is passed to `FSDPConfig`. This completes the deprecation from v0.24.0 and also addresses cases where a user specified a device mesh but it was ignored, leading to training with the incorrect parallelism style (e.g., using FSDP instead of HSDP).
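The effect of the new check can be sketched as follows (a minimal illustration with hypothetical names, not Composer's actual `FSDPConfig` implementation):

```python
# Minimal sketch of the new validation behavior (illustrative only):
# constructing a config with `device_mesh` now errors instead of
# silently ignoring the value.
class FSDPConfigSketch:
    def __init__(self, **kwargs):
        if 'device_mesh' in kwargs:
            raise ValueError(
                'device_mesh is no longer accepted by FSDPConfig; '
                'specify the mesh via the parallelism config instead.'
            )
        self.options = kwargs

# Passing device_mesh now fails loudly rather than training with the
# wrong parallelism style (e.g., FSDP instead of HSDP).
try:
    FSDPConfigSketch(device_mesh=[8, 4])
except ValueError as e:
    print(f'rejected: {e}')
```

Failing loudly at config-construction time is preferable here because the previous behavior could silently change the parallelism strategy of an entire run.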

What's Changed
* Bump main version to 0.25.0.dev0 by snarayan21 in https://github.com/mosaicml/composer/pull/3573
* update daily by KevDevSha in https://github.com/mosaicml/composer/pull/3572
* Bump pandoc from 2.3 to 2.4 by dependabot in https://github.com/mosaicml/composer/pull/3575
* Update transformers requirement from !=4.34.0,<4.44,>=4.11 to >=4.11,!=4.34.0,<4.45 by dependabot in https://github.com/mosaicml/composer/pull/3574
* Checkpoint backwards compatibility tests for v0.24.0 by snarayan21 in https://github.com/mosaicml/composer/pull/3579
* Error if device mesh specified in fsdp config by snarayan21 in https://github.com/mosaicml/composer/pull/3580
* Bump version to 0.24.1. by snarayan21 in https://github.com/mosaicml/composer/pull/3581

**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.24.0...v0.24.1

0.24.0

What's New
1. Torch 2.4 Compatibility ([#3542](https://github.com/mosaicml/composer/pull/3542), [#3549](https://github.com/mosaicml/composer/pull/3549), [#3553](https://github.com/mosaicml/composer/pull/3553), [#3552](https://github.com/mosaicml/composer/pull/3552), [#3565](https://github.com/mosaicml/composer/pull/3565))
Composer now supports Torch 2.4! We are tracking a few checkpointing-related issues in the latest PyTorch that we have raised with the PyTorch team:
- \[[PyTorch Issue](https://github.com/pytorch/pytorch/issues/133415)\] Distributed checkpointing using PyTorch DCP has issues with stateless optimizers, e.g. SGD. We recommend using `composer.optim.DecoupledSGDW` as a workaround.
- \[[PyTorch Issue](https://github.com/pytorch/pytorch/issues/133923)\] Distributed checkpointing using PyTorch DCP broke backwards compatibility. We have patched this using the following [planner](https://github.com/mosaicml/composer/pull/3565), but this may break custom planner loading.
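As a rough illustration of why `composer.optim.DecoupledSGDW` differs from plain SGD, the sketch below contrasts coupled (L2-style) and decoupled weight decay under momentum. This shows only the conceptual update rule, with illustrative names; it is not Composer's implementation:

```python
# Coupled (classic SGD + L2): the decay term enters the momentum
# buffer, so it compounds with past updates.
def coupled_step(w, v, grad, lr=0.1, wd=0.01, momentum=0.9):
    v = momentum * v + (grad + wd * w)
    return w - lr * v, v

# Decoupled (SGDW-style): decay is applied directly to the weights,
# separately from the gradient/momentum update.
def decoupled_step(w, v, grad, lr=0.1, wd=0.01, momentum=0.9):
    v = momentum * v + grad
    return w - lr * v - lr * wd * w, v
```

With momentum, the coupled variant folds decay into the momentum buffer, so the two rules diverge after the first step.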

2. New checkpointing APIs ([#3447](https://github.com/mosaicml/composer/pull/3447), [#3474](https://github.com/mosaicml/composer/pull/3474), [#3488](https://github.com/mosaicml/composer/pull/3488), [#3452](https://github.com/mosaicml/composer/pull/3452))
We've added new checkpointing APIs to download, upload, save, and load checkpoints, making checkpointing usable outside of a `Trainer` object. We will fully migrate to these new APIs in the next minor release.

3. Improved Auto-microbatching ([#3510](https://github.com/mosaicml/composer/pull/3510), [#3522](https://github.com/mosaicml/composer/pull/3522))
We've fixed deadlocks in auto-microbatching with FSDP, bringing throughput in line with manually setting the microbatch size. This is achieved by enabling sync hooks wherever a training run might OOM in order to find the correct microbatch size, and disabling these hooks for the rest of training.
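A schematic of the microbatch search (assuming a simple halve-on-OOM strategy; Composer's actual search logic, triggered by `device_train_microbatch_size="auto"`, may differ):

```python
# Schematic of automatic microbatching (not Composer's implementation):
# start at the full batch size and halve the microbatch size on OOM.
def find_microbatch_size(batch_size, fits):
    """`fits(n)` returns True if a microbatch of n samples trains
    without OOM; assumed monotone (smaller always fits if larger does)."""
    n = batch_size
    while n > 1 and not fits(n):
        n //= 2
    return n

# Example: a hypothetical device that OOMs above 16 samples.
print(find_microbatch_size(128, lambda n: n <= 16))  # -> 16
```

The deadlock fix matters because under FSDP, a rank that OOMs mid-search must stay in sync with ranks that did not, which is what the temporary sync hooks provide.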


Bug Fixes
1. Fix checkpoint symlink uploads ([#3376](https://github.com/mosaicml/composer/pull/3376))
Ensures that checkpoint files are uploaded before the symlink file, fixing errors with missing or incomplete checkpoints.
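The fix can be sketched as an ordering constraint (illustrative only, not Composer's code):

```python
# Sketch of the fix: the symlink that points at the latest checkpoint
# is only written after every checkpoint file upload has completed, so
# readers never follow it to a partial checkpoint.
def upload_checkpoint(files, upload, write_symlink):
    for f in files:
        upload(f)             # all checkpoint files first...
    write_symlink(files[-1])  # ...then publish the pointer

order = []
upload_checkpoint(
    ['shard0.pt', 'shard1.pt'],
    upload=lambda f: order.append(('upload', f)),
    write_symlink=lambda f: order.append(('symlink', f)),
)
```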

2. Optimizer tracks same parameters after FSDP wrapping ([3502](https://github.com/mosaicml/composer/pull/3502))
FSDP wrapping no longer interferes when the optimizer should track only a subset of the model's parameters.

What's Changed
* Bump ipykernel from 6.29.2 to 6.29.5 by dependabot in https://github.com/mosaicml/composer/pull/3459
* Update torchmetrics requirement from <1.3.3,>=0.10.0 to >=1.4.0.post0,<1.4.1 by dependabot in https://github.com/mosaicml/composer/pull/3460
* [Checkpoint] Fix symlink issue where symlink file uploaded before checkpoint files upload by bigning in https://github.com/mosaicml/composer/pull/3376
* Bump databricks-sdk from 0.28.0 to 0.29.0 by dependabot in https://github.com/mosaicml/composer/pull/3456
* Remove Log Exception by jjanezhang in https://github.com/mosaicml/composer/pull/3464
* Corrected docs for MFU in SpeedMonitor by JackZ-db in https://github.com/mosaicml/composer/pull/3469
* [checkpoint v2] Download api by bigning in https://github.com/mosaicml/composer/pull/3447
* Upload api by bigning in https://github.com/mosaicml/composer/pull/3474
* [Checkpoint V2] Upload API by bigning in https://github.com/mosaicml/composer/pull/3488
* Load api by eracah in https://github.com/mosaicml/composer/pull/3452
* Add helpful comment explaining HSDP initialization seeding by mvpatel2000 in https://github.com/mosaicml/composer/pull/3470
* Add fit start to mosaicmllogger by ethanma-db in https://github.com/mosaicml/composer/pull/3467
* Remove OOM-Driven FSDP Deadlocks and Increase Throughput of Automicrobatching by JackZ-db in https://github.com/mosaicml/composer/pull/3510
* Move hooks and fsdp modules onto state rather than trainer by JackZ-db in https://github.com/mosaicml/composer/pull/3522
* Bump coverage[toml] from 7.5.4 to 7.6.0 by dependabot in https://github.com/mosaicml/composer/pull/3471
* revert a wip PR by bigning in https://github.com/mosaicml/composer/pull/3475
* Change FP8 Eval to default to activation dtype by j316chuck in https://github.com/mosaicml/composer/pull/3454
* Get a shared file system safe signal file name by dakinggg in https://github.com/mosaicml/composer/pull/3485
* Bumping flash attention version to v2.6.2 by ShashankMosaicML in https://github.com/mosaicml/composer/pull/3489
* Bump to Pytorch 2.4 by mvpatel2000 in https://github.com/mosaicml/composer/pull/3542
* Add Torch 2.4 Tests by mvpatel2000 in https://github.com/mosaicml/composer/pull/3549
* Fix torch 2.4 images for tests by snarayan21 in https://github.com/mosaicml/composer/pull/3553
* Fix torch 2.4 tests by mvpatel2000 in https://github.com/mosaicml/composer/pull/3552
* Fix bug when subset of model parameters is passed into optimizer with FSDP by sashaDoubov in https://github.com/mosaicml/composer/pull/3502
* Correctly process `parallelism_config['tp']` when it's a dict by snarayan21 in https://github.com/mosaicml/composer/pull/3434
* [torch2.4] Fix sharded checkpointing backward compatibility issue by bigning in https://github.com/mosaicml/composer/pull/3565
* [fix-daily] Use composer get_model_state_dict instead of torch's by eracah in https://github.com/mosaicml/composer/pull/3492
* Load Microbatches instead of Entire Batches to GPU by JackZ-db in https://github.com/mosaicml/composer/pull/3487
* Make Pytest log in color in Github Action by eitanturok in https://github.com/mosaicml/composer/pull/3505
* Revert "Load Microbatches instead of Entire Batches to GPU " by JackZ-db in https://github.com/mosaicml/composer/pull/3508
* Bump transformers version by dakinggg in https://github.com/mosaicml/composer/pull/3511
* Fix FSDP Config Validation by mvpatel2000 in https://github.com/mosaicml/composer/pull/3530
* Add FSDP input validation for use_orig_params and activation_cpu_offload flag by j316chuck in https://github.com/mosaicml/composer/pull/3515
* Fix checkpoint events by b-chu in https://github.com/mosaicml/composer/pull/3468
* Patch conf.py for readthedocs sphinx injection deprecation. by mvpatel2000 in https://github.com/mosaicml/composer/pull/3491
* save load path in state and pass to mosaicmllogger by ethanma-db in https://github.com/mosaicml/composer/pull/3506
* Disable gcs azure daily test by bigning in https://github.com/mosaicml/composer/pull/3514
* Update huggingface-hub requirement from <0.24,>=0.21.2 to >=0.21.2,<0.25 by dependabot in https://github.com/mosaicml/composer/pull/3481
* restore version on dev by XiaohanZhangCMU in https://github.com/mosaicml/composer/pull/3451
* Deprecate deepspeed by dakinggg in https://github.com/mosaicml/composer/pull/3512
* Update importlib-metadata requirement from <7,>=5.0.0 to >=5.0.0,<9 by dependabot in https://github.com/mosaicml/composer/pull/3519
* Update peft requirement from <0.12,>=0.10.0 to >=0.10.0,<0.13 by dependabot in https://github.com/mosaicml/composer/pull/3518
* Use gloo as part of DeviceGPU's process group backend by snarayan21 in https://github.com/mosaicml/composer/pull/3509
* Add a monitor of mlflow logger so that it sets run status as failed if main thread exits unexpectedly by chenmoneygithub in https://github.com/mosaicml/composer/pull/3449
* Revert "Use gloo as part of DeviceGPU's process group backend (3509)" by snarayan21 in https://github.com/mosaicml/composer/pull/3523
* Fix autoresume docstring (save_overwrite) by eracah in https://github.com/mosaicml/composer/pull/3526
* Unpin pip by dakinggg in https://github.com/mosaicml/composer/pull/3524
* hasattr check for Wandb 0.17.6 by mvpatel2000 in https://github.com/mosaicml/composer/pull/3531
* Remove dev on github workflows by mvpatel2000 in https://github.com/mosaicml/composer/pull/3536
* Remove dev branch in GPU workflows by mvpatel2000 in https://github.com/mosaicml/composer/pull/3539
* restore google cloud object store test by bigning in https://github.com/mosaicml/composer/pull/3538
* Update moto[s3] requirement from <5,>=4.0.1 to >=4.0.1,<6 by dependabot in https://github.com/mosaicml/composer/pull/3516
* use s3 boto3 Adaptive retry as default retry mode by bigning in https://github.com/mosaicml/composer/pull/3543
* Use python 3.11 in GAs by eitanturok in https://github.com/mosaicml/composer/pull/3529
* Implement ruff rules enforcing pep 585 by snarayan21 in https://github.com/mosaicml/composer/pull/3551
* Update numpy requirement from <2.1.0,>=1.21.5 to >=1.21.5,<2.2.0 by dependabot in https://github.com/mosaicml/composer/pull/3556
* Bump databricks-sdk from 0.29.0 to 0.30.0 by dependabot in https://github.com/mosaicml/composer/pull/3559
* Update Optim to DecoupledSGD in Notebooks by mvpatel2000 in https://github.com/mosaicml/composer/pull/3554
* Remove lambda code eval testing by mvpatel2000 in https://github.com/mosaicml/composer/pull/3560
* Restore Azure Tests by mvpatel2000 in https://github.com/mosaicml/composer/pull/3561
* Remove tokens for `to_next_epoch` by mvpatel2000 in https://github.com/mosaicml/composer/pull/3562
* Change iteration timestamp for old checkpoints by b-chu in https://github.com/mosaicml/composer/pull/3563
* Fix typo in `composer_collect_env` by dakinggg in https://github.com/mosaicml/composer/pull/3566
* Add default value to get_device() by coryMosaicML in https://github.com/mosaicml/composer/pull/3568
* add ghcr and update build matrix generator by KevDevSha in https://github.com/mosaicml/composer/pull/3465
* Bump aws_ofi_nccl to 1.11.0 by willgleich in https://github.com/mosaicml/composer/pull/3569
* allow listed runners by KevDevSha in https://github.com/mosaicml/composer/pull/3486
* fix runner linux-ubuntu > ubuntu-latest by KevDevSha in https://github.com/mosaicml/composer/pull/3571
* Bump version to v0.24.0 + deprecations by snarayan21 in https://github.com/mosaicml/composer/pull/3570

New Contributors
* ethanma-db made their first contribution in https://github.com/mosaicml/composer/pull/3467
* KevDevSha made their first contribution in https://github.com/mosaicml/composer/pull/3465

**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.23.5...v0.24.0

0.23.5

What's New
1. Variable length dataloaders (3416)
Adds support for dataloaders with rank-dependent lengths. The solution terminates iteration for dataloaders on all ranks when the first dataloader finishes.
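The termination scheme can be illustrated with a single-process sketch (in real distributed code, the per-rank "still has data" flag would be combined across ranks, e.g. with an all-reduce; names here are illustrative):

```python
# Sketch of rank-synchronized termination (not Composer's code).
# Each step, every rank reports whether it still has data; iteration
# stops for everyone as soon as any rank is exhausted.
def joint_iteration(per_rank_batches):
    iters = [iter(b) for b in per_rank_batches]
    steps = 0
    while True:
        for it in iters:
            if next(it, None) is None:  # this rank ran out -> stop all
                return steps
        steps += 1

# Ranks with 3, 5, and 4 batches all stop after 3 steps.
print(joint_iteration([range(3), range(5), range(4)]))  # -> 3
```

Stopping all ranks together avoids the hang that occurs when some ranks enter a collective operation while others have already exited their dataloader loop.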

Bug Fixes
1. Remove close flush for mosaicml logger (3446)
Previously, the MosaicML Logger sporadically raised an error when the python interpreter was shutting down as it attempted to flush data on `Event.CLOSE` using futures, which cannot be scheduled at that time. Instead, we now only block on finishing existing data upload on `Event.CLOSE`, avoiding scheduling new futures.
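The pattern can be sketched with `concurrent.futures` (a simplified illustration, not the logger's actual code):

```python
from concurrent.futures import ThreadPoolExecutor, wait

# Sketch of the fix: on close, wait for uploads that were already
# scheduled instead of scheduling new ones, which can fail while the
# interpreter is shutting down.
class BufferedUploader:
    def __init__(self):
        self._pool = ThreadPoolExecutor(max_workers=2)
        self._futures = []
        self.sent = []

    def log(self, data):
        self._futures.append(self._pool.submit(self.sent.append, data))

    def close(self):
        # Block on in-flight work only; do NOT submit a final flush.
        wait(self._futures)
        self._pool.shutdown()

u = BufferedUploader()
u.log('metrics-1')
u.log('metrics-2')
u.close()
```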

What's Changed
* Update numpy requirement from <1.27.0,>=1.21.5 to >=1.21.5,<2.1.0 by dependabot in https://github.com/mosaicml/composer/pull/3406
* Restore dev version by karan6181 in https://github.com/mosaicml/composer/pull/3417
* Save checkpoint to disk for API with new save layout by eracah in https://github.com/mosaicml/composer/pull/3399
* Patch PyTorch 2.3.1 by mvpatel2000 in https://github.com/mosaicml/composer/pull/3419
* Fixes some typing issues by dakinggg in https://github.com/mosaicml/composer/pull/3418
* Fix style by b-chu in https://github.com/mosaicml/composer/pull/3420
* Bump coverage[toml] from 7.5.3 to 7.5.4 by dependabot in https://github.com/mosaicml/composer/pull/3422
* Update psutil requirement from <6,>=5.8.0 to >=5.8.0,<7 by dependabot in https://github.com/mosaicml/composer/pull/3424
* Add support for variable length dataloaders in DDP by JAEarly in https://github.com/mosaicml/composer/pull/3416
* Hsdp + MoE CI tests by KuuCi in https://github.com/mosaicml/composer/pull/3378
* Bumping MLflow version to 2.14.1 by JackZ-db in https://github.com/mosaicml/composer/pull/3425
* Skip HSDP + TP pytests that require torch 2.3 or above by KuuCi in https://github.com/mosaicml/composer/pull/3426
* Remove CodeQL workflow by mvpatel2000 in https://github.com/mosaicml/composer/pull/3429
* Remove save overwrite by mvpatel2000 in https://github.com/mosaicml/composer/pull/3431
* Fixes to TP Docs by snarayan21 in https://github.com/mosaicml/composer/pull/3430
* Lower the system metrics logging frequency to reduce MLflow server's load by chenmoneygithub in https://github.com/mosaicml/composer/pull/3436
* Update paramiko requirement from <3,>=2.11.0 to >=3.4.0,<4 by dependabot in https://github.com/mosaicml/composer/pull/3439
* Bump CI testing version by mvpatel2000 in https://github.com/mosaicml/composer/pull/3433
* Fix docstring for EVAL_AFTER_ALL/EVAL_BEFORE_ALL by mvpatel2000 in https://github.com/mosaicml/composer/pull/3445
* Remove close flush for mosaicml logger by mvpatel2000 in https://github.com/mosaicml/composer/pull/3446
* Remove MosaicMLLambdaEvalClient by aspfohl in https://github.com/mosaicml/composer/pull/3432
* Relax hf hub pin by dakinggg in https://github.com/mosaicml/composer/pull/3435
* Pytest skip 2 by KuuCi in https://github.com/mosaicml/composer/pull/3448
* bump version v0.23.5 by XiaohanZhangCMU in https://github.com/mosaicml/composer/pull/3450


**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.23.4...v0.23.5

0.23.4

Bug Fixes

**1. Patch PyTorch 2.3.1 (https://github.com/mosaicml/composer/pull/3419)**

Fixes missing import when monkeypatching device mesh functions in PyTorch 2.3.1. This is necessary for MoE training.

**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.23.3...v0.23.4

0.23.3

New Features

1. Update MLflow logger to use the new time-dimension API to view images in MLflow (3286)

We've enhanced the MLflow logger's `log_image` function to use the new API with time-dimension support, enabling images to be viewed in MLflow.

2. Add logging buffer time to MLflow logger (3401)

We've added the `logging_buffer_seconds` argument to the MLflow logger, which specifies how many seconds to buffer before sending logs to the MLflow tracking server.
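A toy sketch of time-based buffering (the `BufferedLogger` class and `clock` parameter here are hypothetical illustrations; only the `logging_buffer_seconds` argument name comes from these release notes):

```python
import time

# Toy illustration of time-based log buffering (not the MLflow
# logger's implementation): records accumulate locally and are only
# sent once `buffer_seconds` have elapsed since the last flush.
class BufferedLogger:
    def __init__(self, send, buffer_seconds=5.0, clock=time.monotonic):
        self.send = send
        self.buffer_seconds = buffer_seconds
        self.clock = clock
        self._buf = []
        self._last_flush = clock()

    def log(self, record):
        self._buf.append(record)
        if self.clock() - self._last_flush >= self.buffer_seconds:
            self.flush()

    def flush(self):
        if self._buf:
            self.send(list(self._buf))  # one batched request
            self._buf.clear()
        self._last_flush = self.clock()
```

Batching like this trades a few seconds of logging latency for far fewer requests to the tracking server.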

Bug Fixes
1. Only require `databricks-sdk` when on Databricks platform (3389)

Previously, the MLflow logger always imported `databricks-sdk`. Now, the SDK is only required when running on the Databricks platform and using Databricks secrets to access managed MLflow.
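A sketch of the gating logic (the `DATABRICKS_RUNTIME_VERSION` environment-variable check is an assumption for illustration; Composer's actual platform detection may differ):

```python
import importlib.util
import os

# Assumption: Databricks runtimes expose DATABRICKS_RUNTIME_VERSION.
def needs_databricks_sdk():
    return 'DATABRICKS_RUNTIME_VERSION' in os.environ

# Import the SDK only when it is actually needed, so non-Databricks
# users don't need it installed at all.
def import_databricks_sdk_if_needed():
    if not needs_databricks_sdk():
        return None
    if importlib.util.find_spec('databricks') is None:
        raise ImportError('databricks-sdk is required on Databricks')
    import databricks.sdk
    return databricks.sdk
```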

2. Skip extra dataset state load during job resumption (3393)

Previously, when loading a checkpoint with `train_dataloader`, the `dataset_state` would load first, and if `train_dataloader` was set again afterward, `load_state_dict` would be called with a `None` value. Now, we've added a check in the `train_dataloader` setter to skip this redundant load.
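The guard can be sketched as a property setter that consumes the dataset state exactly once (hypothetical attribute names, not Composer's actual `State` object):

```python
# Sketch of the new check: re-assigning the dataloader no longer
# re-applies a dataset state that was already consumed (and is None).
class StateSketch:
    def __init__(self):
        self._train_dataloader = None
        self.dataset_state = {'epoch': 3}  # loaded from a checkpoint
        self.loads = []                    # records load_state_dict calls

    @property
    def train_dataloader(self):
        return self._train_dataloader

    @train_dataloader.setter
    def train_dataloader(self, loader):
        self._train_dataloader = loader
        if self.dataset_state is not None:  # the added guard
            self.loads.append(self.dataset_state)
            self.dataset_state = None       # consumed exactly once

s = StateSketch()
s.train_dataloader = 'loader-from-checkpoint'
s.train_dataloader = 'loader-set-by-user'  # no second load with None
```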

3. Fix auto-microbatching on CUDA 12.4 (3400)

In CUDA 12.4, the out-of-memory error message has changed to `CUDA error: out of memory`. Previously, our logic hardcoded checks for `CUDA out of memory` when using `device_train_microbatch_size="auto"`. Now, we check for both `CUDA out of memory` and `CUDA error: out of memory`.
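A simplified sketch of the broadened check:

```python
# The two message variants the auto-microbatching logic now matches
# (a simplified sketch, not Composer's exact code).
OOM_PATTERNS = ('CUDA out of memory', 'CUDA error: out of memory')

def is_cuda_oom(error_message: str) -> bool:
    return any(p in error_message for p in OOM_PATTERNS)

print(is_cuda_oom('RuntimeError: CUDA out of memory. Tried to allocate...'))  # -> True
print(is_cuda_oom('RuntimeError: CUDA error: out of memory'))                 # -> True (CUDA 12.4)
print(is_cuda_oom('RuntimeError: device-side assert triggered'))              # -> False
```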

4. Fix MLflow logging to Databricks workspace file paths that start with the `/Shared/` prefix (3410)

Previously, MLflow logging prepended `/Users/` to all user-provided logging paths on the Databricks platform, including paths starting with `/Shared/`. This was incorrect, since `/Shared/` indicates a shared workspace. Now, the `/Users/` prefix is not prepended to paths starting with `/Shared/`.
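A sketch of the corrected path handling (simplified; the `/Users/` short-circuit for already-namespaced paths is an illustrative assumption, and Composer's real logic lives in its MLflow integration):

```python
# Sketch of the corrected behavior: only default a path into the
# /Users/ namespace when it is not already an absolute workspace path.
def resolve_workspace_path(path: str) -> str:
    if path.startswith('/Shared/') or path.startswith('/Users/'):
        return path                       # already a workspace path
    return '/Users/' + path.lstrip('/')   # default to the user namespace

print(resolve_workspace_path('/Shared/experiments/run1'))  # unchanged
print(resolve_workspace_path('alice@example.com/run1'))    # -> /Users/alice@example.com/run1
```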

What's Changed
* Bump CI from 0.0.7 to 0.0.8 by KuuCi in https://github.com/mosaicml/composer/pull/3383
* Fix backward compatibility caused by missing eval metrics class by bigning in https://github.com/mosaicml/composer/pull/3385
* Bump version v0.23.2 by bigning in https://github.com/mosaicml/composer/pull/3386
* Restore dev version by bigning in https://github.com/mosaicml/composer/pull/3388
* Only requires `databricks-sdk` when inside the Databricks platform by antoinebrl in https://github.com/mosaicml/composer/pull/3389
* Update packaging requirement from <24.1,>=21.3.0 to >=21.3.0,<24.2 by dependabot in https://github.com/mosaicml/composer/pull/3392
* Bump cryptography from 42.0.6 to 42.0.8 by dependabot in https://github.com/mosaicml/composer/pull/3391
* Skip extra dataset state load by mvpatel2000 in https://github.com/mosaicml/composer/pull/3393
* Remove FSDP restriction from PyTorch 1.13 by mvpatel2000 in https://github.com/mosaicml/composer/pull/3395
* Check for 'CUDA error: out of memory' when auto-microbatching by JAEarly in https://github.com/mosaicml/composer/pull/3400
* Add tokens to iterations by b-chu in https://github.com/mosaicml/composer/pull/3374
* Busy wait utils in dist by dakinggg in https://github.com/mosaicml/composer/pull/3396
* Add buffering time to mlflow logger by chenmoneygithub in https://github.com/mosaicml/composer/pull/3401
* Add missing import for PyTorch 2.3.1 device mesh slicing by mvpatel2000 in https://github.com/mosaicml/composer/pull/3402
* Add pynvml to mlflow dep group by dakinggg in https://github.com/mosaicml/composer/pull/3404
* min/max flagging added to system_metrics_monitor with only non-redundant, necessary gpu metrics logged by JackZ-db in https://github.com/mosaicml/composer/pull/3373
* Simplify launcher world size parsing by mvpatel2000 in https://github.com/mosaicml/composer/pull/3398
* Optionally use `flash-attn`'s CE loss for metrics by snarayan21 in https://github.com/mosaicml/composer/pull/3394
* log image fix by jessechancy in https://github.com/mosaicml/composer/pull/3286
* [ckpt-rewr] Save state dict API by eracah in https://github.com/mosaicml/composer/pull/3372
* Revert "Optionally use `flash-attn`'s CE loss for metrics (3394)" by snarayan21 in https://github.com/mosaicml/composer/pull/3408
* CPU tests image fix by snarayan21 in https://github.com/mosaicml/composer/pull/3409
* Add setter for epoch in iteration by b-chu in https://github.com/mosaicml/composer/pull/3407
* Move pillow dep as required by mvpatel2000 in https://github.com/mosaicml/composer/pull/3412
* fixing mlflow logging to Databricks workspace file paths with /Shared/ prefix by JackZ-db in https://github.com/mosaicml/composer/pull/3410
* Bump version v0.23.3 by karan6181 in https://github.com/mosaicml/composer/pull/3414

New Contributors
* JackZ-db made their first contribution in https://github.com/mosaicml/composer/pull/3373

**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.23.2...v0.23.3

0.23.2

Bug Fixes
* Fix backward compatibility issue caused by missing eval metrics class

What's Changed
* Fix backward compatibility issue caused by missing eval metrics class by bigning in https://github.com/mosaicml/composer/pull/3385

**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.23.1...release/v0.23.2
