
The Limits and Potentials of Local SGD for Distributed Heterogeneous Learning with Intermittent Communication

Abstract

Local SGD is a popular optimization method in distributed learning, often outperforming mini-batch SGD in practice. Despite this practical success, proving the efficiency of local SGD has been difficult, creating a significant gap between theory and practice. We provide new lower bounds for local SGD under existing first-order data heterogeneity assumptions, showing that these assumptions cannot capture local SGD's effectiveness. We also demonstrate the min-max optimality of accelerated mini-batch SGD under these assumptions. Our findings emphasize the need for improved modeling of data heterogeneity. Under higher-order assumptions, we provide new upper bounds that verify the dominance of local SGD over mini-batch SGD when data heterogeneity is low.
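To illustrate the algorithm the abstract studies, the following is a minimal sketch of local SGD with intermittent communication: each of M machines runs H local SGD steps on its own objective, and the iterates are averaged at every communication round. The toy problem (heterogeneous quadratics with per-machine optima) and all hyperparameters are assumptions for illustration, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(0)

M, H, R = 4, 10, 50   # machines, local steps per round, communication rounds
lr, d = 0.1, 5        # step size and dimension (illustrative choices)

# Machine m holds a heterogeneous quadratic f_m(x) = 0.5 * ||x - c_m||^2;
# the global objective is their average, minimized at mean(c_m).
centers = rng.normal(size=(M, d))
opt = centers.mean(axis=0)

def local_sgd():
    x = np.zeros(d)
    for _ in range(R):                  # one communication round
        local_iterates = []
        for m in range(M):
            xm = x.copy()
            for _ in range(H):          # H local steps without communication
                grad = xm - centers[m] + 0.01 * rng.normal(size=d)  # noisy gradient
                xm -= lr * grad
            local_iterates.append(xm)
        x = np.mean(local_iterates, axis=0)  # intermittent communication: average
    return x

x_final = local_sgd()
print(np.linalg.norm(x_final - opt))
```

Mini-batch SGD would instead communicate after every gradient step (H = 1 with M*H gradients per round); the paper's results compare these two regimes under different heterogeneity assumptions.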

Conference paper

Conference on Learning Theory (COLT)

Publication date

2024-06-30

Last modified

2024-10-14