
Detection and Attribution of Models Trained on Generated Data

conference contribution
posted on 2024-04-03, 11:21 authored by Ge Han, Ahmed Salem, Zheng Li, Shanqing Guo, Michael Backes, Yang Zhang
Generative Adversarial Networks (GANs) have become widely used in model training, as the data they generate can improve performance and/or protect sensitive information. However, this also raises potential risks: malicious GANs may compromise or sabotage models by poisoning their training data. It is therefore important to verify the origin of a model's training data for accountability purposes. In this work, we take the first step in the forensic analysis of models trained on GAN-generated data. Specifically, we first detect whether a model is trained on GAN-generated or real data. We then attribute models trained on GAN-generated data to their respective source GANs. We conduct extensive experiments on three datasets, using four popular GAN architectures and four common model architectures. Empirical results show the remarkable performance of our detection and attribution methods. Furthermore, we conduct a more in-depth study and reveal that models trained on different data sources exhibit different decision boundaries and behaviours.
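The abstract describes a two-stage pipeline: first detect whether a target model was trained on real or GAN-generated data, then attribute "generated" models to a source GAN. The toy sketch below illustrates that structure only; it is not the paper's method. The feature vectors (stand-ins for signals extracted from a target model, e.g. its posteriors on a probe set), the GAN names, and the nearest-centroid rule are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def model_features(center, n=20):
    """Stand-in for features extracted from a target model; here just
    noisy points around a per-source center (an assumption, not the paper's
    feature extractor)."""
    return center + 0.1 * rng.standard_normal((n, center.size))

# Synthetic per-source centers: one "real" source and two hypothetical GANs.
centers = {
    "real":  np.array([0.0, 0.0, 0.0]),
    "gan_a": np.array([2.0, 0.0, 0.0]),
    "gan_b": np.array([0.0, 2.0, 0.0]),
}
data = {src: model_features(c) for src, c in centers.items()}

def nearest_centroid(train, x):
    """Classify x to the label whose training centroid is closest."""
    return min(train, key=lambda lbl: np.linalg.norm(x - train[lbl].mean(axis=0)))

def detect_and_attribute(x):
    # Stage 1: detection — real vs. generated (all GAN data pooled).
    stage1 = {"real": data["real"],
              "generated": np.vstack([data["gan_a"], data["gan_b"]])}
    verdict = nearest_centroid(stage1, x)
    if verdict == "real":
        return "real", None
    # Stage 2: attribution — which GAN produced the training data.
    stage2 = {g: data[g] for g in ("gan_a", "gan_b")}
    return "generated", nearest_centroid(stage2, x)

probe = centers["gan_a"] + 0.1 * rng.standard_normal(3)
print(detect_and_attribute(probe))
```

The key design point mirrored from the abstract is the cascade: attribution is only attempted once detection has flagged the model as trained on generated data.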

History

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Page Range

4875-4879

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Open Access Type

  • Not Open Access

BibTeX

@inproceedings{Han:Salem:Li:Guo:Backes:Zhang:2024,
  title     = "Detection and Attribution of Models Trained on Generated Data",
  author    = "Han, Ge and Salem, Ahmed and Li, Zheng and Guo, Shanqing and Backes, Michael and Zhang, Yang",
  booktitle = "IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)",
  year      = 2024,
  month     = 4,
  pages     = "4875--4879",
  publisher = "Institute of Electrical and Electronics Engineers (IEEE)",
  doi       = "10.1109/icassp48485.2024.10446824"
}
