## Current status
**:warning: As of July 2023, we have paused active development of TorchData and have paused new releases. We have learned a lot from building it and from hearing from users, but we also believe we need to re-evaluate the technical design and approach given how much the industry has changed since we began the project. During the rest of 2023 we will be re-evaluating our plans in this space. Please reach out if you have suggestions or comments (use [1196](https://github.com/pytorch/data/issues/1196) for feedback).**
## Bug Fixes
- MPRS request/response cycle for workers (https://github.com/pytorch/data/commit/40dd648bdd2b7b9c078ba3d2f47316b6dd4446d3)
- Sequential reading service checkpointing (https://github.com/pytorch/data/commit/8d452cf4d0688fdce478089fe77cba52fc27e1c3)
- Cancel future object and always run callback in FullSync during shutdown (1171)
- DataPipe, Ensure Prefetcher shuts down properly (1166)
- DataPipe, Fix FullSync shutdown hanging issue while paused (1153)
- DataPipe, Fix a word in WebDS DataPipe (1156)
- DataPipe, Add handler argument to iopath DataPipes (1154)
- Prevent in_memory_cache from yielding from source_dp when it is fully cached (1160)
- Fix pin_memory to support single-element batch (1158)
- DataLoader2, Remove delegation for 'pause', 'limit', and 'resume' (1067)
- DataLoader2, Handle MapDataPipe by converting to IterDataPipe internally by default (1146)
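The last fix means DataLoader2 accepts a MapDataPipe directly by converting it to an IterDataPipe before iteration. A minimal stdlib sketch of that conversion pattern (a conceptual illustration, not the torchdata implementation; the class and function names here are hypothetical):

```python
class MapStyle:
    """A minimal map-style source: indexable, with a known length."""
    def __init__(self, data):
        self.data = data

    def __getitem__(self, idx):
        return self.data[idx]

    def __len__(self):
        return len(self.data)


def to_iterable(map_ds):
    """Yield samples in index order, turning a map-style source into an
    iterable one -- the kind of conversion DataLoader2 now applies by default."""
    for idx in range(len(map_ds)):
        yield map_ds[idx]


samples = list(to_iterable(MapStyle(["a", "b", "c"])))  # ["a", "b", "c"]
```

Iterating in index order keeps the output deterministic; shuffling, if desired, is left to downstream DataPipes.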
## New Features
- Implement InProcessReadingService (1139)
- Enable miniepoch for MultiProcessingReadingService (1170)
- DataPipe, Implement pause/resume for FullSync (1130)
- DataLoader2, Save and restore initial seed generator (998)
- Add ThreadPoolMapper (1052)
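ThreadPoolMapper (1052) applies a function to each element using a pool of threads while preserving the source order. A minimal stdlib sketch of that pattern (illustrating the concept, not the torchdata API; the function name here is hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor


def threadpool_map(source, fn, max_workers=4):
    """Yield fn(x) for each x in source, computed concurrently on a
    thread pool while preserving the order of the source iterable."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Executor.map runs tasks concurrently but yields results
        # in input order, which is the guarantee ThreadPoolMapper gives.
        yield from pool.map(fn, source)


squares = list(threadpool_map(range(5), lambda x: x * x))  # [0, 1, 4, 9, 16]
```

Thread-based mapping is most useful for I/O-bound transforms (e.g. downloads or disk reads), where threads overlap waiting time without the cost of extra processes.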