CISPA

Federated Learning from Small Datasets.

conference contribution
posted on 2024-08-26, 10:49 authored by Michael Kamp, Jonas Fischer, Jilles Vreeken
Federated learning allows multiple parties to collaboratively train a joint model without having to share any local data. It enables applications of machine learning in settings where data is inherently distributed and undisclosable, such as in the medical domain. Joint training is usually achieved by aggregating local models. When local datasets are small, locally trained models can vary greatly from a globally good model. Bad local models can arbitrarily deteriorate the quality of the aggregate model, causing federated learning to fail in these settings. We propose a novel approach that avoids this problem by interleaving model aggregation and permutation steps. During a permutation step we redistribute local models across clients through the server, while preserving data privacy, so that each local model trains on a daisy chain of local datasets. This enables successful training in data-sparse domains. Combined with model aggregation, this approach enables effective learning even when the local datasets are extremely small, while retaining the privacy benefits of federated learning.
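The interleaving of local training, permutation (daisy-chaining), and aggregation described in the abstract can be sketched on a toy problem. This is a minimal illustrative simulation, not the authors' implementation: the function names, the scalar mean-estimation task, the round schedule (permute every `d` rounds, aggregate every `b` rounds), and plain federated averaging are all assumptions made for the example.

```python
import random

def local_update(model, dataset, lr=0.1):
    """One gradient-descent step on a 1-D mean-estimation toy task."""
    grad = sum(model - x for x in dataset) / len(dataset)
    return model - lr * grad

def aggregate(models):
    """Federated averaging of scalar models (illustrative assumption)."""
    return sum(models) / len(models)

def daisy_chain_fl(datasets, rounds=100, d=1, b=10, seed=0):
    """Each round: train locally on each client's (small) dataset.
    Every b rounds, aggregate models into a joint model; otherwise,
    every d rounds, permute models across clients (the daisy chain),
    so each model goes on to train on another client's data."""
    rng = random.Random(seed)
    models = [0.0] * len(datasets)
    for t in range(1, rounds + 1):
        models = [local_update(m, ds) for m, ds in zip(models, datasets)]
        if t % b == 0:                       # aggregation step
            avg = aggregate(models)
            models = [avg] * len(models)
        elif t % d == 0:                     # permutation (daisy-chain) step
            rng.shuffle(models)
    return aggregate(models)

# Four clients, each with an extremely small local dataset
datasets = [[4.8, 5.1], [5.3], [4.9, 5.2], [5.0]]
print(daisy_chain_fl(datasets))  # converges near the mean of the client means, ~5.075
```

Because both shuffling and averaging preserve the mean of the model population, the joint model converges toward a globally good estimate even though no single client has enough data on its own; without the permutation steps, each model would only ever see its own tiny dataset between aggregations.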

History

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

International Conference on Learning Representations (ICLR)

Journal

ICLR

BibTeX

@conference{Kamp:Fischer:Vreeken:2023,
  title   = "Federated Learning from Small Datasets",
  author  = "Kamp, Michael and Fischer, Jonas and Vreeken, Jilles",
  year    = 2023,
  month   = 2,
  journal = "ICLR"
}
