
Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees

Version 2 2023-12-14, 12:33
Version 1 2023-11-29, 18:24
Conference contribution
Posted on 2023-12-14, 12:33, authored by Anastasia Koloskova, Hadrien Hendrikx, and Sebastian Stich
Gradient clipping is a popular modification to standard (stochastic) gradient descent that, at every iteration, limits the gradient norm to a certain value c > 0. It is widely used, for example, for stabilizing the training of deep learning models (Goodfellow et al., 2016) or for enforcing differential privacy (Abadi et al., 2016). Despite the popularity and simplicity of the clipping mechanism, its convergence guarantees often require specific values of c and strong noise assumptions. In this paper, we give convergence guarantees that show precise dependence on arbitrary clipping thresholds c and show that our guarantees are tight with both deterministic and stochastic gradients. In particular, we show that (i) for deterministic gradient descent, the clipping threshold only affects the higher-order terms of convergence, and (ii) in the stochastic setting, convergence to the true optimum cannot be guaranteed under the standard noise assumption, even with arbitrarily small step-sizes. We give matching upper and lower bounds for convergence of the gradient norm when running clipped SGD, and illustrate these results with experiments.
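For illustration, the following is a minimal sketch (not the paper's exact algorithm or experimental setup) of the clipping mechanism the abstract describes: the stochastic gradient is rescaled so its norm never exceeds the threshold c before the usual SGD step. The function and parameter names (clipped_sgd_step, lr, c) and the toy quadratic objective are illustrative choices, not taken from the paper.

import numpy as np

def clipped_sgd_step(x, grad, lr=0.1, c=1.0):
    # Clip the gradient: scale it down so that its norm is at most c,
    # then take a plain gradient step with step-size lr.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, c / norm) if norm > 0 else grad
    return x - lr * clipped

# Toy usage: minimize f(x) = 0.5 * ||x||^2 with noisy gradients.
rng = np.random.default_rng(0)
x = np.array([5.0, -3.0])
for _ in range(200):
    stochastic_grad = x + rng.normal(scale=0.5, size=x.shape)  # true gradient plus noise
    x = clipped_sgd_step(x, stochastic_grad, lr=0.05, c=1.0)
print(x)  # approaches the origin, up to a noise- and clipping-induced floor

With noiseless gradients, the clipping only slows the early iterations (when gradient norms exceed c); with stochastic gradients, the iterates may stall at a bias floor, which is the phenomenon the paper's lower bounds make precise.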

History

Preferred Citation

Anastasia Koloskova, Hadrien Hendrikx and Sebastian Stich. Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees. In: International Conference on Machine Learning (ICML). 2023.

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

International Conference on Machine Learning (ICML)

Legacy Posted Date

2023-06-30

Open Access Type

  • Green

BibTeX

@inproceedings{cispa_all_3976,
  title     = "Revisiting Gradient Clipping: Stochastic bias and tight convergence guarantees",
  author    = "Koloskova, Anastasia and Hendrikx, Hadrien and Stich, Sebastian U.",
  booktitle = "{International Conference on Machine Learning (ICML)}",
  year      = "2023",
}
