CISPA
Revisiting Consensus Error: A Fine-grained Analysis of Local SGD under Second-order Data Heterogeneity

conference contribution
posted on 2025-11-04, 10:44 authored by Kumar Kshitij Patel, Ali Zindari, Sebastian Stich, Lingxiao Lingxiao
Local SGD, or Federated Averaging, is one of the most widely used algorithms for distributed optimization. Although it often outperforms alternatives such as mini-batch SGD, existing theory has not fully explained this advantage under realistic assumptions about data heterogeneity. Recent work has suggested that a second-order heterogeneity assumption may suffice to justify the empirical gains of local SGD. We confirm this conjecture by establishing new upper and lower bounds on the convergence of local SGD. These bounds demonstrate how a low second-order heterogeneity, combined with third-order smoothness, enables local SGD to interpolate between heterogeneous and homogeneous regimes while maintaining communication efficiency. Our main technical contribution is a refined analysis of the consensus error, a central quantity in such results. We validate our theory with experiments on a distributed linear regression task.
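The validation experiment described above can be sketched as follows. This is a minimal illustration of local SGD (Federated Averaging) on a heterogeneous distributed linear regression problem, not the authors' actual experimental setup: each client holds data generated from a slightly perturbed ground-truth model (a hypothetical choice to induce heterogeneity), runs several local SGD steps, and then the models are averaged at each communication round.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: M clients, each with n samples of d-dimensional
# linear regression data. Heterogeneity is induced by giving each client
# a slightly perturbed version of the shared ground-truth model.
M, n, d = 4, 200, 5
w_star = rng.normal(size=d)
X = [rng.normal(size=(n, d)) for m in range(M)]
y = [X[m] @ (w_star + 0.1 * rng.normal(size=d)) for m in range(M)]

def stochastic_grad(m, w, batch=16):
    """Mini-batch gradient of the least-squares loss on client m."""
    idx = rng.integers(0, n, size=batch)
    Xb, yb = X[m][idx], y[m][idx]
    return Xb.T @ (Xb @ w - yb) / batch

def local_sgd(rounds=50, local_steps=10, lr=0.01):
    w = np.zeros(d)                       # shared iterate after each round
    for _ in range(rounds):
        local_models = []
        for m in range(M):
            wm = w.copy()
            for _ in range(local_steps):  # local steps, no communication
                wm -= lr * stochastic_grad(m, wm)
            local_models.append(wm)
        w = np.mean(local_models, axis=0) # communication: average models
    return w

w_hat = local_sgd()
print(np.linalg.norm(w_hat - w_star))
```

With small perturbations (low heterogeneity), the averaged iterate lands close to the shared ground truth despite communicating only once every `local_steps` gradient steps, which is the communication-efficiency regime the bounds in the paper characterize.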

History

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

Conference on Neural Information Processing Systems (NeurIPS)

CISPA Affiliation

  • Yes

BibTeX

@conference{Patel:Zindari:Stich:Lingxiao:2025,
  title     = "Revisiting Consensus Error: A Fine-grained Analysis of Local SGD under Second-order Data Heterogeneity",
  author    = "Patel, Kumar Kshitij and Zindari, Ali and Stich, Sebastian and Lingxiao, Lingxiao",
  booktitle = "Conference on Neural Information Processing Systems (NeurIPS)",
  year      = 2025,
  month     = 12
}
