
Stabilized Proximal Point Methods for Federated Optimization

Abstract

In developing efficient optimization algorithms, it is crucial to account for communication constraints, a significant challenge in modern federated learning settings. The best-known communication complexity among non-accelerated algorithms is achieved by DANE, a distributed proximal-point algorithm that solves a local subproblem on each client in every iteration and can exploit Hessian similarity across clients. However, to achieve this communication efficiency, the local subproblems must be solved increasingly accurately as communication rounds progress. Inspired by the extra-gradient method, in this work we (i) propose a novel distributed algorithm, S-DANE. This method adopts a more stabilized reference point in the proximal step and matches DANE's deterministic communication complexity. Moreover, the required accuracy of the local subproblems stays fixed throughout the optimization process, leading to improved local computation efficiency. The method also supports partial client participation and arbitrary stochastic local solvers, making it attractive in practice. We further (ii) accelerate S-DANE and show that the resulting algorithm achieves the best-known communication complexity among all existing methods for convex distributed optimization, while retaining the efficient local-computation property.
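For orientation, here is a minimal sketch of the round-r update in this family of methods. It assumes n clients with local losses f_i, the global objective f = (1/n) Σ_i f_i, and a prox parameter μ > 0. Setting the reference point v^r = x^r recovers the classical DANE subproblem; S-DANE's stabilized choice of v^r (whose precise update rule is given in the paper and not reproduced here) is what the abstract refers to:

% DANE-style local subproblem at round r, with prox center v^r.
% With v^r = x^r this is the standard DANE step; in S-DANE, v^r is a
% more stable reference point maintained separately (update rule omitted).
\[
  x_i^{r+1} \;\approx\; \arg\min_{x}\;\Big\{\, f_i(x)
    \;+\; \big\langle \nabla f(v^r) - \nabla f_i(v^r),\, x \big\rangle
    \;+\; \tfrac{\mu}{2}\,\lVert x - v^r \rVert^2 \,\Big\},
  \qquad
  x^{r+1} \;=\; \tfrac{1}{n}\sum_{i=1}^{n} x_i^{r+1}
\]

The "≈" reflects that each subproblem only needs to be solved to some accuracy; per the abstract, S-DANE keeps this accuracy requirement fixed across rounds rather than tightening it over time.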

Conference paper

Conference on Neural Information Processing Systems (NeurIPS)

Publication date

2024-10-31

Last modified

2024-10-10