Most Activation Functions Can Win the Lottery Without Excessive Depth

conference contribution
posted on 2023-11-29, 18:22, authored by Rebekka Burkholz
The strong lottery ticket hypothesis has highlighted the potential for training deep neural networks by pruning, which has inspired interesting practical and theoretical insights into how neural networks can represent functions. For networks with ReLU activation functions, it has been proven that a target network of depth L can be approximated by a subnetwork of a randomly initialized neural network that has twice the target's depth, 2L, and is wider by a logarithmic factor. We show that a network of depth L + 1 is sufficient. This result indicates that we can expect to find lottery tickets at realistic, commonly used depths while requiring only logarithmic overparametrization. Our novel construction approach applies to a large class of activation functions and is not limited to ReLUs.
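Proofs in this line of work typically reduce the approximation of each individual target weight to a subset-sum problem over randomly initialized weights: with high probability, O(log(1/eps)) i.i.d. bounded random candidates contain a subset whose sum lies within eps of any bounded target, and this is where the logarithmic width overhead comes from. The Python sketch below illustrates that one-weight building block. It is a minimal illustration of the general subset-sum idea, not code from the paper; the helper best_subset_sum and all parameter choices are hypothetical.

import itertools

import numpy as np

rng = np.random.default_rng(0)

def best_subset_sum(candidates, target):
    # Brute-force search for the subset of `candidates` whose sum is
    # closest to `target`. The kept indices play the role of unpruned
    # weights; every other weight is pruned to zero.
    best_err, best_idx = abs(target), ()  # empty subset as baseline
    for r in range(1, len(candidates) + 1):
        for idx in itertools.combinations(range(len(candidates)), r):
            err = abs(candidates[list(idx)].sum() - target)
            if err < best_err:
                best_err, best_idx = err, idx
    return best_idx, best_err

# Illustrative setting: a logarithmic number of i.i.d. Uniform[-1, 1]
# samples suffices, with high probability, to approximate any target
# in [-1, 1] by a subset sum.
n = 15
candidates = rng.uniform(-1.0, 1.0, size=n)
for target in (0.3, -0.77, 0.999):
    idx, err = best_subset_sum(candidates, target)
    print(f"target={target:+.3f}  kept {len(idx)}/{n}  error={err:.2e}")

The brute-force search is exponential in n, but the argument is existential rather than algorithmic: n only needs to grow logarithmically in the desired precision, so the randomly initialized network needs only logarithmically more width per target weight.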

History

Preferred Citation

Rebekka Burkholz. Most Activation Functions Can Win the Lottery Without Excessive Depth. In: Conference on Neural Information Processing Systems (NeurIPS). 2022.

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

Conference on Neural Information Processing Systems (NeurIPS)

Legacy Posted Date

2022-10-12

Open Access Type

  • Green

BibTeX

@inproceedings{cispa_all_3803,
  title     = "Most Activation Functions Can Win the Lottery Without Excessive Depth",
  author    = "Burkholz, Rebekka",
  booktitle = "{Conference on Neural Information Processing Systems (NeurIPS)}",
  year      = "2022",
}
