Mosaicml

Latest version: v0.29.0


0.1.3

What's Changed
* Add docker images for diffusion repo by j316chuck in https://github.com/mosaicml/diffusion/pull/174
* Add option for default prompts/negative prompts in eval by coryMosaicML in https://github.com/mosaicml/diffusion/pull/148
* Add image generator to generate images for use with geneval by coryMosaicML in https://github.com/mosaicml/diffusion/pull/172
* Add composer model class for running with precomputed CLIP and T5 text latents by coryMosaicML in https://github.com/mosaicml/diffusion/pull/171
* Update latent precomputation script with batching by jazcollins in https://github.com/mosaicml/diffusion/pull/170
* Update dependencies for use with torch 2.4.1 and composer 0.25.0 by coryMosaicML in https://github.com/mosaicml/diffusion/pull/176
* Add option for stream weights in the image caption latents dataloader by coryMosaicML in https://github.com/mosaicml/diffusion/pull/178
* Update evaluation and inference code to handle other precisions and models by coryMosaicML in https://github.com/mosaicml/diffusion/pull/179

New Contributors
* j316chuck made their first contribution in https://github.com/mosaicml/diffusion/pull/174

**Full Changelog**: https://github.com/mosaicml/diffusion/compare/v0.1.2...v0.1.3

0.1.2

Streaming `v0.1.2` is released! Install via `pip`:

```bash
pip install --upgrade mosaicml-streaming==0.1.2
```


What's Changed
* Fixed contributing page link by karan6181 in https://github.com/mosaicml/streaming/pull/61
* Add Distributed test and supported multi device unittest by karan6181 in https://github.com/mosaicml/streaming/pull/57
* Added template and adhere to standard coding practice by karan6181 in https://github.com/mosaicml/streaming/pull/62
* Bump pytest from 7.1.3 to 7.2.0 by dependabot in https://github.com/mosaicml/streaming/pull/63
* Bump pypandoc from 1.9 to 1.10 by dependabot in https://github.com/mosaicml/streaming/pull/65
* Add code coverage report and moved scripts outside of src by karan6181 in https://github.com/mosaicml/streaming/pull/66
* Bump sphinxext-opengraph from 0.6.3 to 0.7.2 by dependabot in https://github.com/mosaicml/streaming/pull/67
* Add Google Cloud Storage support by karan6181 in https://github.com/mosaicml/streaming/pull/68
* Create and push release branch as part of workflow by karan6181 in https://github.com/mosaicml/streaming/pull/69
* Add test CI badge in README by karan6181 in https://github.com/mosaicml/streaming/pull/70
* Add unit test for download, encodings, hashing, and others by karan6181 in https://github.com/mosaicml/streaming/pull/72
* Bump version to 0.1.2 by karan6181 in https://github.com/mosaicml/streaming/pull/75


**Full Changelog**: https://github.com/mosaicml/streaming/compare/v0.1.1...v0.1.2

0.1.1

Streaming v0.1.1 is released! Install via `pip`:

```bash
pip install --upgrade mosaicml-streaming==0.1.1
```


What's Changed
* Streaming datasets V2 by knighton in https://github.com/mosaicml/streaming/pull/2
* Initial Docs Site by bandish-shah in https://github.com/mosaicml/streaming/pull/3
* Added a ADE20K and COCO2017 data conversion scripts by karan6181 in https://github.com/mosaicml/streaming/pull/5
* Added pre-commit config by karan6181 in https://github.com/mosaicml/streaming/pull/6
* Added pre-commit config for a License Header by karan6181 in https://github.com/mosaicml/streaming/pull/7
* Convert relative imports to absolute imports by karan6181 in https://github.com/mosaicml/streaming/pull/8
* C4 dataset by knighton in https://github.com/mosaicml/streaming/pull/4
* Add a ADE20K streaming dataset class by karan6181 in https://github.com/mosaicml/streaming/pull/9
* PyPi mods for setup.py by bandish-shah in https://github.com/mosaicml/streaming/pull/10
* Disable local shard deletion by knighton in https://github.com/mosaicml/streaming/pull/12
* Add a COCO streaming dataset class by karan6181 in https://github.com/mosaicml/streaming/pull/13
* Add docstrings. by knighton in https://github.com/mosaicml/streaming/pull/14
* Added unittest for Writer and Reader by karan6181 in https://github.com/mosaicml/streaming/pull/16
* added new streaming logos by ejyuen in https://github.com/mosaicml/streaming/pull/15
* Update package version code for unification by karan6181 in https://github.com/mosaicml/streaming/pull/17
* Fix wait-for-unzip race by knighton in https://github.com/mosaicml/streaming/pull/18
* Added algolia search to streaming docs site by nqn in https://github.com/mosaicml/streaming/pull/19
* Add a pre-commit GitHub workflow by karan6181 in https://github.com/mosaicml/streaming/pull/21
* Added pydocstyle and docformatter in pre-commit config by karan6181 in https://github.com/mosaicml/streaming/pull/20
* Improve algorithmic complexity of sample-to-shard lookup from O(log N) to O(1) by knighton in https://github.com/mosaicml/streaming/pull/22
* Add enwiki-20200101 streaming dataset by knighton in https://github.com/mosaicml/streaming/pull/23
* Add submodules to api reference doc by karan6181 in https://github.com/mosaicml/streaming/pull/24
* Initial Docs site content by bandish-shah in https://github.com/mosaicml/streaming/pull/11
* Add unittest for compression by karan6181 in https://github.com/mosaicml/streaming/pull/25
* Fix hang when compression is used but compressed files are not retained by knighton in https://github.com/mosaicml/streaming/pull/26
* Add long_description for packaging by bandish-shah in https://github.com/mosaicml/streaming/pull/29
* Update tutorial notebooks to have it run end-to-end by karan6181 in https://github.com/mosaicml/streaming/pull/30
* Adjustment for last partition bug by knighton in https://github.com/mosaicml/streaming/pull/27
* Fix preprocessing for English Wikipedia dataset by knighton in https://github.com/mosaicml/streaming/pull/28
* Fix enwiki dataset by dskhudia in https://github.com/mosaicml/streaming/pull/31
* Skip pre-commit check for enwiki convert skip to have code parity by karan6181 in https://github.com/mosaicml/streaming/pull/32
* Update doc and fixed reference links by karan6181 in https://github.com/mosaicml/streaming/pull/33
* Parallel tfrecord creation, validate sample counts vs MDS by knighton in https://github.com/mosaicml/streaming/pull/34
* Bump up the version to 0.0.1b by karan6181 in https://github.com/mosaicml/streaming/pull/35
* Add NLP synthetic dataset jupyter notebook tutorial by karan6181 in https://github.com/mosaicml/streaming/pull/36
* Add README and CONTRIBUTING guide by karan6181 in https://github.com/mosaicml/streaming/pull/38
* Typos + copy editing in README by dblalock in https://github.com/mosaicml/streaming/pull/40
* Re-factor docs tutorials to top-level examples by bandish-shah in https://github.com/mosaicml/streaming/pull/39
* Fixed typos and update documentation by karan6181 in https://github.com/mosaicml/streaming/pull/42
* Add CodeQL security scanner and Dependabot workflow by karan6181 in https://github.com/mosaicml/streaming/pull/43
* Bump gitpython from 3.1.28 to 3.1.29 by dependabot in https://github.com/mosaicml/streaming/pull/46
* Bump myst-parser from 0.16.1 to 0.18.1 by dependabot in https://github.com/mosaicml/streaming/pull/47
* Add bug report and feature request template by karan6181 in https://github.com/mosaicml/streaming/pull/48
* mlperf enwiki conversion code mild cleanup by knighton in https://github.com/mosaicml/streaming/pull/41
* Add Build publish to PyPI and create GitHub release workflow by karan6181 in https://github.com/mosaicml/streaming/pull/50
* Added writer unittest and update existing test by karan6181 in https://github.com/mosaicml/streaming/pull/52
* Bump version to 0.1.0 by karan6181 in https://github.com/mosaicml/streaming/pull/53
* Fixed dead image link in pypi home page by karan6181 in https://github.com/mosaicml/streaming/pull/54
* Add TorchVision VisionDataset inheritance. by knighton in https://github.com/mosaicml/streaming/pull/55
* bump version to 0.1.1b0 by karan6181 in https://github.com/mosaicml/streaming/pull/56
* Fixed rendering of pypi image by karan6181 in https://github.com/mosaicml/streaming/pull/59
* Bump version to 0.1.1 by karan6181 in https://github.com/mosaicml/streaming/pull/60

New Contributors
* knighton made their first contribution in https://github.com/mosaicml/streaming/pull/2
* bandish-shah made their first contribution in https://github.com/mosaicml/streaming/pull/3
* karan6181 made their first contribution in https://github.com/mosaicml/streaming/pull/5
* ejyuen made their first contribution in https://github.com/mosaicml/streaming/pull/15
* nqn made their first contribution in https://github.com/mosaicml/streaming/pull/19
* dskhudia made their first contribution in https://github.com/mosaicml/streaming/pull/31
* dblalock made their first contribution in https://github.com/mosaicml/streaming/pull/40
* dependabot made their first contribution in https://github.com/mosaicml/streaming/pull/46

**Full Changelog**: https://github.com/mosaicml/streaming/commits/v0.1.1

0.1

We've spun off Streaming datasets into its own [repository](https://github.com/mosaicml/streaming)! Streaming datasets is a high-performance drop-in replacement for the PyTorch `IterableDataset`, enabling users to stream training data from cloud-based object stores. Streaming ships with built-in support for popular open-source datasets (ADE20K, C4, COCO, Enwiki, ImageNet, etc.).

To get started, install the Streaming PyPI package:

```bash
pip install mosaicml-streaming
```


You can use the streaming Dataset class with the PyTorch native DataLoader class as follows:
```python
import torch
from streaming import Dataset

dataloader = torch.utils.data.DataLoader(dataset=Dataset(remote='s3://...'))
```


For more information, please check out the [Streaming docs](https://docs.mosaicml.com/projects/streaming/en/latest/).
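For a slightly fuller picture, here is a minimal sketch of plugging the streaming `Dataset` into an ordinary PyTorch training loop. The `local` cache directory, `shuffle` flag, and batch size below are illustrative assumptions; check the Streaming docs for the exact arguments supported by your installed version.

```python
import torch
from streaming import Dataset

# Assumptions: `local` points at a writable cache directory and `shuffle`
# enables shard-aware shuffling; argument names may differ across versions.
dataset = Dataset(remote='s3://my-bucket/my-dataset', local='/tmp/my-dataset-cache', shuffle=True)
dataloader = torch.utils.data.DataLoader(dataset, batch_size=32)

for batch in dataloader:
    pass  # run your forward/backward pass on `batch` here
```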

1. **✔👉 Simplified Checkpointing Interface**

With this release we’ve greatly simplified configuration of loading and saving checkpoints in Composer.

To save checkpoints to S3, all you need to do is:
- Set `save_folder` to the full URI of your save directory destination (e.g. `'s3://my-bucket/{run_name}/checkpoints'`)
- Optionally, set `save_filename` to the pattern you want for your checkpoint file names

```python
from composer.trainer import Trainer

# Checkpoint saving to S3.
trainer = Trainer(
    model=model,
    save_folder="s3://my-bucket/{run_name}/checkpoints",
    run_name='my-run',
    save_interval="1ep",
    save_filename="ep{epoch}.pt",
    save_num_checkpoints_to_keep=0,  # delete all checkpoints locally
    ...
)

trainer.fit()
```


Likewise, to load checkpoints from S3, all you have to do is:
- Set `load_path` to the full URI of your desired checkpoint file (e.g. `'s3://my-bucket/my-run/checkpoints/ep13.pt'`)

```python
from composer.trainer import Trainer

# Checkpoint loading from S3.
new_trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    max_duration="10ep",
    load_path="s3://my-bucket/my-run/checkpoints/ep13.pt",
)

new_trainer.fit()
```


For more information, please see our [Checkpointing guide](https://docs.mosaicml.com/en/v0.11.0/trainer/checkpointing.html).

1. **Improved Distributed Experience**

We’ve made it easier to write your own custom distributed entry points by exposing our distributed API. You can now leverage all of our helpful distributed functions and contexts.

For example, let's say we need to download a dataset in a distributed training application. To avoid race conditions where different ranks try to write the dataset to the same place, we need to ensure that only rank 0 downloads the dataset first:

```python
import datetime
from composer.trainer.devices import DeviceGPU
from composer.utils import dist

dist.initialize(DeviceGPU(), datetime.timedelta(seconds=30))  # Initialize distributed module

if dist.get_local_rank() == 0:  # Download dataset on rank zero
    dataset = download_my_dataset()
dist.barrier()  # All ranks wait until dataset is downloaded

# Create and train your model!
```


For more information, please check out our [Distributed API docs](https://docs.mosaicml.com/en/v0.11.0/api_reference/composer.utils.dist.html).

Bug Fixes
* fix loss and eval_forward for HF models (1597)
* add more robust casting to int for fsdp min_params (1608)
* Deepspeed Docs Typo (1605)
* Fix mmdet typo (1618)
* Blurpool idempotent (1625)
* When model is not on `meta` device, initialization should occur on compute device not CPU (1623)
* Auto resumption (1615)
* Adjust speed monitor (1645)
* Hot fix console logging (1643)
* Lazy Logging + pretty print dict for hparams (1653)
* Fix many failing notebook tests (1646)

What's Changed
* Bump coverage[toml] from 6.4.4 to 6.5.0 by dependabot in https://github.com/mosaicml/composer/pull/1583
* Bump furo from 2022.9.15 to 2022.9.29 by dependabot in https://github.com/mosaicml/composer/pull/1584
* Add English Wikipedia 2020-01-01 dataset by knighton in https://github.com/mosaicml/composer/pull/1572
* Add pull request template by dakinggg in https://github.com/mosaicml/composer/pull/1588
* Bump ipykernel from 6.15.3 to 6.16.0 by dependabot in https://github.com/mosaicml/composer/pull/1587
* Update importlib-metadata requirement from <5,>=4.11.0 to >=5.0,<6 by dependabot in https://github.com/mosaicml/composer/pull/1585
* Bump sphinx-argparse from 0.3.1 to 0.3.2 by dependabot in https://github.com/mosaicml/composer/pull/1586
* Add step explicitly to ImageVisualizer logging calls by dakinggg in https://github.com/mosaicml/composer/pull/1591
* Image viz test by dakinggg in https://github.com/mosaicml/composer/pull/1592
* Remove unused fixture by mvpatel2000 in https://github.com/mosaicml/composer/pull/1594
* Fixes RandAugment API by mvpatel2000 in https://github.com/mosaicml/composer/pull/1596
* fix loss and eval_forward for HF models by dskhudia in https://github.com/mosaicml/composer/pull/1597
* Remove tensorflow-io from setup.py by eracah in https://github.com/mosaicml/composer/pull/1577
* Fixes enwiki for the newly processed wiki dataset by dskhudia in https://github.com/mosaicml/composer/pull/1600
* Change install to all by mvpatel2000 in https://github.com/mosaicml/composer/pull/1599
* Remove log level and should_log_artifact by dakinggg in https://github.com/mosaicml/composer/pull/1603
* Add more robust casting to int for fsdp min_params by dblalock in https://github.com/mosaicml/composer/pull/1608
* Deepspeed Docs Typo by mvpatel2000 in https://github.com/mosaicml/composer/pull/1605
* Object store logger refactor by dakinggg in https://github.com/mosaicml/composer/pull/1601
* Bump gitpython from 3.1.27 to 3.1.28 by dependabot in https://github.com/mosaicml/composer/pull/1609
* Bump tabulate from 0.8.10 to 0.9.0 by dependabot in https://github.com/mosaicml/composer/pull/1610
* Log the number of GPUs and nodes Composer running on. by eracah in https://github.com/mosaicml/composer/pull/1604
* Update MLPerfCallback for v2.1 by hanlint in https://github.com/mosaicml/composer/pull/1607
* Remove object store cls by dakinggg in https://github.com/mosaicml/composer/pull/1606
* Add LAMB Optimizer by hanlint in https://github.com/mosaicml/composer/pull/1613
* Mmdet adapter by A-Jacobson in https://github.com/mosaicml/composer/pull/1545
* Fix mmdet typo by Landanjs in https://github.com/mosaicml/composer/pull/1618
* update torchmetrics requirement by hanlint in https://github.com/mosaicml/composer/pull/1620
* Add distributed sampler error by mvpatel2000 in https://github.com/mosaicml/composer/pull/1598
* Landan/deeplabv3 ade20k example by Landanjs in https://github.com/mosaicml/composer/pull/1593
* Upgrade CodeQL Action to version 2 by karan6181 in https://github.com/mosaicml/composer/pull/1628
* Blurpool idempotent by mvpatel2000 in https://github.com/mosaicml/composer/pull/1625
* Defaulting streaming dataset version to 2 by karan6181 in https://github.com/mosaicml/composer/pull/1616
* Abhi/fsdp bugfix 0 11 by abhi-mosaic in https://github.com/mosaicml/composer/pull/1623
* Remove warning when `master_port` is auto selected by abhi-mosaic in https://github.com/mosaicml/composer/pull/1629
* Remove unused import by dakinggg in https://github.com/mosaicml/composer/pull/1630
* Usability improvements to `initialize_dist()` by growlix in https://github.com/mosaicml/composer/pull/1619
* Remove Graph in Auto Grad Accum by mvpatel2000 in https://github.com/mosaicml/composer/pull/1631
* Auto resumption by dakinggg in https://github.com/mosaicml/composer/pull/1615
* add stop method by hanlint in https://github.com/mosaicml/composer/pull/1627
* S3 Checkpoint Saving By URI by eracah in https://github.com/mosaicml/composer/pull/1614
* S3 Checkpoint loading from URI by eracah in https://github.com/mosaicml/composer/pull/1624
* Add mvpatel2000 as codeowner for algos by mvpatel2000 in https://github.com/mosaicml/composer/pull/1640
* Adjust speed monitor by mvpatel2000 in https://github.com/mosaicml/composer/pull/1645
* Adding in FSDP Docs by bcui19 in https://github.com/mosaicml/composer/pull/1621
* Attempt to fix flaky doctest by dakinggg in https://github.com/mosaicml/composer/pull/1647
* Fix Missing Underscores in FSDP Docs by bcui19 in https://github.com/mosaicml/composer/pull/1648
* Fixed html path for make host command for docs by karan6181 in https://github.com/mosaicml/composer/pull/1642
* Fix hyperparameters logged to console even when progress_bar and log_to_console are False by eracah in https://github.com/mosaicml/composer/pull/1643
* Fix ImageNet Example normalization values by Landanjs in https://github.com/mosaicml/composer/pull/1641
* Python log level by dakinggg in https://github.com/mosaicml/composer/pull/1651
* Changed default logging to WARN for doctests by eracah in https://github.com/mosaicml/composer/pull/1644
* Add Event.AFTER_LOAD by mvpatel2000 in https://github.com/mosaicml/composer/pull/1652
* Lazy Logging + pretty print dict for hparams by eracah in https://github.com/mosaicml/composer/pull/1653
* Fix todo in memory monitor by mvpatel2000 in https://github.com/mosaicml/composer/pull/1654
* Tests for Idempotent Surgery by mvpatel2000 in https://github.com/mosaicml/composer/pull/1639
* Remove c4 dataset by mvpatel2000 in https://github.com/mosaicml/composer/pull/1635
* Update torchmetrics by hanlint in https://github.com/mosaicml/composer/pull/1656
* Search index filtered by project by nqn in https://github.com/mosaicml/composer/pull/1549
* FSDP Tests by bcui19 in https://github.com/mosaicml/composer/pull/1650
* Add composer version to issue template by dakinggg in https://github.com/mosaicml/composer/pull/1657
* Fix many failing notebook tests by dakinggg in https://github.com/mosaicml/composer/pull/1646
* Re-build the Docker images to resolve pip version error by bandish-shah in https://github.com/mosaicml/composer/pull/1655


**Full Changelog**: https://github.com/mosaicml/composer/compare/v0.10.1...v0.11.0

0.1.0

This is the first release of MosaicML's LLM Foundry!

Our efficient code for training, evaluating, and deploying LLMs outgrew our [examples repository](https://github.com/mosaicml/examples), so we've migrated to a brand new repository dedicated to everything LLMs. Keep watching this space and see the [top-level README](https://github.com/mosaicml/llm-foundry) and our [blog post](www.mosaicml.com/blog/mpt-7b) for more details on this announcement!

Model releases

In addition to all the open-source code released here, we're releasing four open-source models that we hope will be useful to the community. All models were trained on the [MosaicML platform](https://www.mosaicml.com/training), using [Composer](https://github.com/mosaicml/composer) and [Streaming](https://github.com/mosaicml/streaming). If you're interested in training your own models, or using these models with our [optimized inference stack](https://www.mosaicml.com/inference), please [reach out](https://forms.mosaicml.com/demo)!

- `mpt-7b`: This is our base **7-billion parameter** model, trained for **1 trillion tokens**. This model is released with an Apache-2.0 (commercial use permitted) license.
- `mpt-7b-storywriter`: All of the models use ALiBi to allow them to extrapolate to longer sequence lengths than they saw during training, but storywriter is our **long context** model, further pretrained on 65k-token excerpts of a fiction subset of the books3 corpus. This model is released with an Apache-2.0 (commercial use permitted) license.
- `mpt-7b-instruct`: This model is **instruction finetuned** on a dataset we also release, derived from Databricks' [Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and Anthropic’s [Helpful and Harmless](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets. This model is released with a CC-By-SA-3.0 (commercial use permitted) license.
- `mpt-7b-chat`: This model is trained to be able to **chat** by further training on the [ShareGPT-Vicuna](https://huggingface.co/datasets/jeffwan/sharegpt_vicuna), [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3), [Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca), [Helpful and Harmless](https://huggingface.co/datasets/Anthropic/hh-rlhf), and [Evol-Instruct](https://huggingface.co/datasets/victor123/evol_instruct_70k) datasets. This model is released with a CC-By-NC-SA-4.0 (non-commercial use only) license.

Features

Training

We release fully featured code for efficiently training any HuggingFace LLM (including our optimized [MPT](https://github.com/mosaicml/llm-foundry/tree/main/llmfoundry/models/mpt)) using FSDP, Composer, and Streaming. Seamlessly scale to multi-GPU and multi-node training, stream your data from one cloud, train on a different cloud, write checkpoints to a third cloud, send your training logs to Weights&Biases, and much more. See the [README](https://github.com/mosaicml/llm-foundry/tree/main/scripts/train) for more detailed instructions on getting started with pretraining and finetuning!

Our MPT model is equipped with the latest advancements in training large transformers (e.g. ALiBi, the LION optimizer, FlashAttention), and is designed to be easily hackable, configurable, and extendable!
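
To illustrate the configurability point, here is a hypothetical sketch of instantiating a small MPT model directly from its config class. The `MPTConfig`/`MPTForCausalLM` names and the `attn_config` fields below reflect the layout of `llmfoundry.models.mpt` at the time of writing, but treat the exact import paths and field names as assumptions and defer to the repository.

```python
# Hypothetical sketch: instantiate a small MPT model for local experimentation.
# Import path and config fields are assumptions; see llmfoundry/models/mpt for the source of truth.
from llmfoundry.models.mpt import MPTConfig, MPTForCausalLM

config = MPTConfig(
    d_model=512,
    n_heads=8,
    n_layers=8,
    expansion_ratio=4,
    max_seq_len=2048,
    vocab_size=50368,
    attn_config={'attn_impl': 'torch', 'alibi': True},  # ALiBi enables length extrapolation
)
model = MPTForCausalLM(config)
print(f'parameters: {sum(p.numel() for p in model.parameters()):,}')
```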

Evaluation

Our [evaluation framework](https://github.com/mosaicml/llm-foundry/tree/main/scripts/eval) makes it easy to fully re-evaluate any HuggingFace model. We also include [copies of the processed data for many popular benchmarks](https://github.com/mosaicml/llm-foundry/tree/main/scripts/eval/local_data) to make it easy to replicate our evals and to perform your own! We welcome the addition of new benchmarks to our suite. In our previous benchmarks, our setup is 8x faster than other eval frameworks on a single GPU and seamlessly achieves linear scaling with multiple GPUs. Built-in support for FSDP makes it possible to evaluate large models and use larger batch sizes for further acceleration.

Inference

MPT is designed to be fast, easy, and cheap to deploy for inference. To begin with, all MPT models are subclassed from the HuggingFace `PreTrainedModel` base class, which means that they are fully compatible with the HuggingFace ecosystem. You can upload MPT models to the HuggingFace Hub, generate outputs with standard pipelines like `model.generate(...)`, build HuggingFace Spaces (see some of ours [here](https://huggingface.co/mosaicml#spaces)!), and more.
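
Since MPT checkpoints live on the HuggingFace Hub, a standard `transformers` workflow is enough to try them out. The snippet below is a minimal sketch; it assumes the `mosaicml/mpt-7b` Hub repo and uses `trust_remote_code=True` because MPT ships custom modeling code.

```python
import torch
import transformers

# Minimal sketch: load MPT-7B from the HuggingFace Hub and generate a completion.
# `mosaicml/mpt-7b` is the assumed Hub repo name.
model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b',
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained('mosaicml/mpt-7b')

inputs = tokenizer('MosaicML Streaming is', return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```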

What about performance? With MPT’s optimized layers (including FlashAttention and low precision layernorm), the out-of-the-box performance of MPT-7B on GPUs when using `model.generate(...)` is 1.5x-2x faster than other 7B models like LLaMa-7B. This makes it easy to build fast and flexible inference pipelines with just HuggingFace and PyTorch.

Finally, for the best hosting experience, deploy your MPT models directly on MosaicML’s [Inference service](https://www.mosaicml.com/inference). Start with our managed endpoints for models like MPT-7B-Instruct, and/or deploy your own custom model endpoints for optimal cost and data privacy. Check out the [Inference blog post](https://www.mosaicml.com/blog/inference-launch) for more details!

0.0.1
