CISPA

ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!

Version 2 2023-12-14, 12:32
Version 1 2023-11-29, 18:21
conference contribution
posted on 2023-12-14, 12:32 authored by Konstantin Mishchenko, Grigory Malinovsky, Peter Richtarik, Sebastian Stich
We introduce ProxSkip—a surprisingly simple and provably efficient method for minimizing the sum of a smooth (f) and an expensive nonsmooth proximable (ψ) function. The canonical approach to solving such problems is via the proximal gradient descent (ProxGD) algorithm, which is based on the evaluation of the gradient of f and the prox operator of ψ in each iteration. In this work we are specifically interested in the regime in which the evaluation of prox is costly relative to the evaluation of the gradient, which is the case in many applications. ProxSkip allows for the expensive prox operator to be skipped in most iterations: while its iteration complexity is O(κ log(1/ε)), where κ is the condition number of f, the number of prox evaluations is only O(√κ log(1/ε)). Our main motivation comes from federated learning, where evaluation of the gradient operator corresponds to taking a local GD step independently on all devices, and evaluation of prox corresponds to (expensive) communication in the form of gradient averaging. In this context, ProxSkip offers an effective acceleration of communication complexity. Unlike other local gradient-type methods, such as FedAvg, SCAFFOLD, S-Local-GD and FedLin, whose theoretical communication complexity is worse than, or at best matching, that of vanilla GD in the heterogeneous data regime, we obtain a provable and large improvement without any heterogeneity-bounding assumptions.
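As an illustration of the skipping idea described in the abstract, below is a minimal Python sketch of a ProxSkip-style iteration. The quadratic choice of f, the ℓ1 choice of ψ (so that its prox is soft-thresholding), and all parameter values (stepsize gamma, skipping probability p, control variate h) are assumptions made for this sketch only and are not part of the record above.

import numpy as np

# ProxSkip-style sketch (illustrative assumptions: f is a least-squares loss,
# psi(x) = lam * ||x||_1, so prox_psi is componentwise soft-thresholding).
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam = 0.1

def grad_f(x):
    # gradient of the smooth part f(x) = ||Ax - b||^2 / (2n)
    return A.T @ (A @ x - b) / n

def prox_psi(v, step):
    # prox of step * psi: soft-thresholding with threshold step * lam
    return np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)

L = np.linalg.eigvalsh(A.T @ A / n).max()  # smoothness constant of f
gamma = 1.0 / L                            # stepsize
p = 0.2                                    # probability of actually calling the prox
x = np.zeros(d)
h = np.zeros(d)                            # control variate that makes skipping safe

for t in range(2000):
    x_hat = x - gamma * (grad_f(x) - h)    # shifted gradient step (always cheap)
    if rng.random() < p:                   # rare, "expensive" prox step
        x = prox_psi(x_hat - (gamma / p) * h, gamma / p)
    else:                                  # prox skipped in most iterations
        x = x_hat
    h = h + (p / gamma) * (x - x_hat)      # control-variate update

print("objective:", 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum())

In the federated reading from the abstract, grad_f would be replaced by one local GD step taken independently on each device, and prox_psi by the expensive gradient-averaging communication step, which is then triggered only with probability p per iteration.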

History

Preferred Citation

Konstantin Mishchenko, Grigory Malinovsky, Sebastian Stich and Peter Richtarik. ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally! In: International Conference on Machine Learning (ICML). 2022.

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

International Conference on Machine Learning (ICML)

Legacy Posted Date

2022-07-22

Open Access Type

  • Green

BibTeX

@inproceedings{cispa_all_3733,
  title     = "ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!",
  author    = "Mishchenko, Konstantin and Malinovsky, Grigory and Stich, Sebastian U. and Richtarik, Peter",
  booktitle = "{International Conference on Machine Learning (ICML)}",
  year      = "2022",
}
