ray-project / distml

Distributed ML Optimizer

License: other

Programming Languages

  • Python
  • Shell

Introduction

DistML is a Ray extension library that supports large-scale distributed ML training on heterogeneous multi-node, multi-GPU clusters. The library is under active development; more advanced training strategies and auto-parallelization features are being added.

DistML currently supports:

  • Distributed training strategies (see the sketch after this list)

    • Data parallelism
      • AllReduce strategy
      • Sharded parameter server strategy
      • BytePS strategy
    • Pipeline parallelism
      • Micro-batch pipeline parallelism
  • DL Frameworks:

    • PyTorch
    • JAX
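
As a concrete illustration, here is a minimal sketch of driving data-parallel training with the AllReduce strategy from Python. The import paths, class names, and method signatures (AllReduceStrategy, TorchTrainingOperator, register, register_data, world_size) are assumptions inferred from the feature list above, not confirmed DistML API; consult the repository for the actual entry points.

import ray
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Assumed import paths; check the distml source tree for the real ones.
from distml.strategy.allreduce_strategy import AllReduceStrategy
from distml.operator.torch_operator import TorchTrainingOperator


class ToyOperator(TorchTrainingOperator):
    def setup(self, config):
        # Build a toy model, optimizer, and data loader, then hand them
        # to the operator. The register/register_data calls are assumed.
        model = nn.Linear(10, 1)
        optimizer = torch.optim.SGD(model.parameters(), lr=config["lr"])
        data = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
        train_loader = DataLoader(data, batch_size=32)
        self.model, self.optimizer, self.criterion = self.register(
            model=model, optimizer=optimizer, criterion=nn.MSELoss())
        self.register_data(train_loader=train_loader, validation_loader=None)


ray.init()
strategy = AllReduceStrategy(
    training_operator_cls=ToyOperator,  # assumed constructor argument
    operator_config={"lr": 1e-2},
    world_size=2,                       # two data-parallel replicas
)
strategy.train()     # gradients averaged across replicas via allreduce
strategy.shutdown()

Under this pattern, each of the world_size workers holds a full model replica on its own GPU, and the strategy synchronizes gradients after every step.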

Installation

Install Dependencies

Install CuPy for your CUDA version by following https://docs.cupy.dev/en/stable/install.html.
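
CuPy ships prebuilt wheels per CUDA toolkit. For example (verify the package name for your toolkit against the guide above):

pip install cupy-cuda11x   # CUDA 11.2 or later
pip install cupy-cuda12x   # CUDA 12.x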

Install from source for development

pip install -e .
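
A quick import check confirms the editable install is visible on your path (assuming the package's top-level module is named distml):

python -c "import distml"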