Tutel

Latest version: v0.1.4


0.1.4

1. Enhance communication features: all-to-all (a2a) overlap with computation, support for different granularities of group creation, etc.
2. Add a single-thread CPU implementation for correctness checking & reference;
3. Refine the JIT compiler interface for flexible usability: `jit::inject_source` and `jit::jit_execute`;
4. Enhance examples: fp64 support, CUDA AMP, checkpointing, etc.
5. Support execution inside `torch.distributed.pipeline`.
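The a2a overlap in item 1 hides all-to-all communication latency by running it concurrently with computation. A toy, purely conceptual sketch of the pattern using Python threads (real MoE runtimes overlap NCCL all-to-all with GPU kernels on separate CUDA streams; the function names here are illustrative, not Tutel's API):

```python
import threading

def overlapped_step(chunks, communicate, compute):
    """Toy communication/computation overlap: while chunk i is being
    computed, chunk i+1's communication runs in a background thread."""
    results = []
    comm_thread = threading.Thread(target=communicate, args=(chunks[0],))
    comm_thread.start()
    for i in range(len(chunks)):
        comm_thread.join()                  # wait until chunk i's data arrived
        if i + 1 < len(chunks):             # start next chunk's communication early
            comm_thread = threading.Thread(target=communicate, args=(chunks[i + 1],))
            comm_thread.start()
        results.append(compute(chunks[i]))  # overlaps with the prefetch above
    return results
```

The same pipelining idea applies per micro-chunk of the token batch, which is what makes the overlap worthwhile when communication and compute take comparable time.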

How to Setup:
```sh
python3 -m pip install --user https://github.com/microsoft/tutel/archive/refs/tags/v0.1.4.tar.gz
```


Contributors: yzygitzh, ghostplant, EricWangCN

0.1.3

1. Add Tutel Launcher Support based on Open MPI;
2. Support Establishing Data/Model Parallelism in Initialization;
3. Support a Single Expert Evenly Sharded across Multiple GPUs;
4. Support a List of Gates and Forwarding the MoE Layer with a Specified Gating Index;
5. Fix NVRTC Compatibility when Enabling `USE_NVRTC=1`;
6. Other Implementation Enhancements & Correctness Checks.

How to Setup:
```sh
python3 -m pip install --user https://github.com/microsoft/tutel/archive/refs/tags/v0.1.3.tar.gz
```


Contributors: ghostplant, EricWangCN, guoshzhao.

0.1.2

1. General-purpose top-k gating with `{'type': 'top', 'k': 2}`;
2. Add Megatron-LM Tensor Parallel as a gating type;
3. Add [deepspeed-based & megatron-based helloworld example](https://github.com/microsoft/tutel/tree/v0.1.x/tutel/examples) for fair comparison;
4. Add torch.bfloat16 datatype support for single-GPU;
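The general-purpose top-k gating in item 1 selects, for each token, the k experts with the highest gate scores and renormalizes their weights. A minimal pure-Python sketch of that selection step (illustrative only, not Tutel's implementation; in Tutel it is configured via `{'type': 'top', 'k': 2}`):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of gate logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_gate(gate_logits, k=2):
    """Return the k expert indices with the highest gate probability,
    plus their weights renormalized to sum to 1 (a common MoE convention)."""
    probs = softmax(gate_logits)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:k]
    norm = sum(probs[i] for i in chosen)
    return chosen, [probs[i] / norm for i in chosen]
```

Generalizing k this way is what lets one gate implementation cover top-1, top-2, and higher-k routing instead of a separate class per value of k.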

How to Setup:
```sh
python3 -m pip install --user https://github.com/microsoft/tutel/archive/refs/tags/v0.1.2.tar.gz
```


Contributors: ghostplant, EricWangCN, foreveronehundred.

0.1.1

1. Enable fp16 support for AMD GPUs.
2. Use NVRTC for JIT compilation when available.
3. Add a new `system_init` interface for initializing NUMA settings on distributed GPUs.
4. Extend gating types: Top3Gate & Top4Gate.
5. Allow higher-level code to change the capacity value in the Tutel fast dispatcher.
6. Add a custom AllToAll extension for older PyTorch versions without built-in AllToAll operator support.
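The capacity value in item 5 bounds how many tokens each expert may receive per batch; tokens routed to a full expert are typically dropped (or rerouted). A hypothetical sketch of capacity-limited dispatch (illustrative only, not Tutel's fast dispatcher):

```python
def dispatch_with_capacity(token_expert_ids, num_experts, capacity):
    """Assign each token to its chosen expert until that expert reaches
    `capacity`. Returns per-expert token-index lists and dropped tokens."""
    buckets = [[] for _ in range(num_experts)]
    dropped = []
    for token_idx, expert_id in enumerate(token_expert_ids):
        if len(buckets[expert_id]) < capacity:
            buckets[expert_id].append(token_idx)
        else:
            dropped.append(token_idx)  # expert is full: token is dropped
    return buckets, dropped
```

Exposing capacity to higher-level code lets users trade token-drop rate against the padded buffer size that fixed-capacity dispatch requires.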

How to Setup:
```sh
python3 -m pip install --user https://github.com/microsoft/tutel/archive/refs/tags/v0.1.1.tar.gz
```


Contributors: jspark1105, ngoyal2707, guoshzhao, ghostplant.

0.1.0

The first version of Tutel for efficient MoE implementation.

How to Setup:
```sh
python3 -m pip install --user https://github.com/microsoft/tutel/archive/refs/tags/v0.1.0.tar.gz
```
