Completes the preprocessing -> pretraining -> model usage pipeline by adding the full reference scripts for preprocessing the entire Sth-Sth-v2 dataset (for pretraining the MVP, R3M, and Voltron models), as well as the PyTorch XLA pretraining script (again, covering all models).
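For orientation, here is a minimal sketch of the kind of frame-extraction step the preprocessing pipeline performs; it is not the actual script from #8, and the paths, function name, and default parameters below are hypothetical.

```python
# Hypothetical sketch of the core frame-extraction step for Sth-Sth-v2 clips;
# the real pipeline lives in the PR #8 preprocessing scripts. Paths are placeholders.
from pathlib import Path

import cv2


def extract_frames(video_path: Path, out_dir: Path, resolution: int = 224, max_frames: int = 16) -> None:
    """Uniformly sample up to `max_frames` frames from a clip and dump them as resized JPEGs."""
    out_dir.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(str(video_path))
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))

    # Uniform temporal stride over the clip (at least 1 to avoid a zero stride).
    stride = max(total // max_frames, 1)
    idx, saved = 0, 0
    while saved < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % stride == 0:
            frame = cv2.resize(frame, (resolution, resolution))
            cv2.imwrite(str(out_dir / f"frame_{saved:03d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()


# Example (hypothetical paths):
#   extract_frames(Path("sth-sth-v2/videos/12345.webm"), Path("processed/12345"))
```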
A "standard" PyTorch GPU/DDP pretraining implementation is in the works, but hopefully the logic transfers!
## What's Changed
* Add Preprocessing Pipeline for Sth-Sth-v2 by siddk in https://github.com/siddk/voltron-robotics/pull/8
* Add XLA Pretraining Script by siddk in https://github.com/siddk/voltron-robotics/pull/10
**Full Changelog**: https://github.com/siddk/voltron-robotics/compare/v0.0.1...v1.0.0