TopoOpt: Optimizing the Network Topology for Distributed DNN Training
We explore a novel approach for building DNN training clusters using commodity optical devices. Our proposal, called TopoOpt, co-optimizes the distributed training process across three dimensions: computation, communication, and network topology. TopoOpt uses a novel alternating optimization technique and a group theory-inspired algorithm to find the best network topology and routing plan, together with the parallelization strategy, for distributed DNN training. To motivate our proposal, we measure the communication patterns of distributed DNN workloads at a large online service provider. Experiments with a 12-node prototype demonstrate the feasibility of TopoOpt. Simulations on real distributed training models show that, compared to similar-cost FatTree interconnects, TopoOpt reduces DNN training time by up to 3x.
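To make the alternating-optimization idea concrete, the sketch below alternates between two sub-problems: with the topology fixed, pick the cheapest parallelization strategy; with that strategy's traffic demand fixed, rebuild the topology. Everything in it is an illustrative assumption (the toy hop-count cost model, the greedy degree-limited topology builder, the two candidate strategies, and the constants N and DEGREE); it is not TopoOpt's actual group theory-inspired algorithm or API.

```python
# Toy illustration of alternating optimization between parallelization
# strategy and network topology. All cost models, strategies, and
# constants here are assumptions for illustration, not TopoOpt's method.

from collections import deque
from itertools import combinations

N, DEGREE = 4, 2  # workers and per-node optical port count (assumed)

def hop_counts(edges):
    """All-pairs hop counts via BFS on an undirected graph."""
    adj = {i: set() for i in range(N)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    dist = {}
    for s in range(N):
        d, q = {s: 0}, deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in d:
                    d[w] = d[u] + 1
                    q.append(w)
        dist[s] = d
    return dist

def iteration_cost(demand, edges):
    """Toy cost: total traffic weighted by hop count (lower is better)."""
    dist = hop_counts(edges)
    return sum(demand[u][v] * dist[u].get(v, N) for u in range(N) for v in range(N))

def best_topology(demand):
    """Greedy degree-limited topology: connect heaviest-demand pairs first."""
    pairs = sorted(combinations(range(N), 2),
                   key=lambda p: -(demand[p[0]][p[1]] + demand[p[1]][p[0]]))
    deg, edges = [0] * N, []
    for u, v in pairs:
        if deg[u] < DEGREE and deg[v] < DEGREE:
            edges.append((u, v)); deg[u] += 1; deg[v] += 1
    return edges

# Two hypothetical parallelization strategies with different traffic patterns:
# a ring all-reduce (data parallel) vs. heavy neighbor exchange (model parallel).
STRATEGIES = {
    "ring_allreduce": [[10 if (v - u) % N == 1 else 0 for v in range(N)] for u in range(N)],
    "model_parallel": [[25 if abs(u - v) == 1 else 0 for v in range(N)] for u in range(N)],
}

def alternating_optimize(rounds=5):
    edges = [(i, (i + 1) % N) for i in range(N)]  # start from a simple ring
    best = None
    for _ in range(rounds):
        # Step 1: fix the topology, pick the cheapest parallelization strategy.
        name, demand = min(STRATEGIES.items(),
                           key=lambda kv: iteration_cost(kv[1], edges))
        # Step 2: fix that strategy's demand, rebuild the topology for it.
        edges = best_topology(demand)
        cost = iteration_cost(demand, edges)
        if best is not None and cost >= best[2]:
            break  # converged: no further improvement
        best = (name, edges, cost)
    return best

print(alternating_optimize())
```

Running the sketch converges in a couple of rounds on this toy instance; the real problem additionally co-optimizes routing and uses search techniques suited to much larger clusters.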