
Amplifying Membership Exposure via Data Poisoning

conference contribution
posted on 2023-11-29, 18:25 authored by Yufei Chen, Chao Shen, Yun Shen, Cong Wang, Yang ZhangYang Zhang
As in-the-wild data are increasingly involved in the training stage, machine learning applications become more susceptible to data poisoning attacks. Such attacks typically lead to test-time accuracy degradation or controlled misprediction. In this paper, we investigate a third type of exploitation of data poisoning: increasing the risk of privacy leakage for benign training samples. To this end, we demonstrate a set of data poisoning attacks that amplify the membership exposure of a targeted class. We first propose a generic dirty-label attack for supervised classification algorithms. We then propose an optimization-based clean-label attack for the transfer learning scenario, whereby the poisoning samples are correctly labeled and look “natural” so as to evade human moderation. We extensively evaluate our attacks on computer vision benchmarks. Our results show that the proposed attacks can substantially increase membership inference precision with minimal degradation of overall test-time model performance. To mitigate the potential negative impacts of our attacks, we also investigate feasible countermeasures.
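To make the high-level idea concrete, below is a minimal, self-contained Python sketch, not the authors' implementation: it trains a classifier with and without a simple dirty-label poisoning step aimed at one class, then measures membership inference precision on that class with a basic confidence-thresholding attack. The synthetic data, the logistic-regression victim model, the dirty_label_poison and mi_precision helpers, and the 0.9 confidence threshold are all illustrative assumptions rather than details taken from the paper.

# Illustrative sketch (not the paper's method): dirty-label poisoning that
# relabels copies of target-class points, followed by a simple
# confidence-thresholding membership inference attack on that class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic 3-class task; class 0 plays the role of the targeted class.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, y_train = X[:1000], y[:1000]      # members (training set)
X_out, y_out = X[1000:], y[1000:]          # non-members from the same distribution

def dirty_label_poison(X_tr, y_tr, target_class=0, n_poison=200):
    """Duplicate target-class points with small noise and assign wrong labels."""
    idx = rng.choice(np.flatnonzero(y_tr == target_class), n_poison, replace=True)
    X_p = X_tr[idx] + rng.normal(scale=0.1, size=(n_poison, X_tr.shape[1]))
    other = [c for c in np.unique(y_tr) if c != target_class]
    y_p = rng.choice(other, n_poison)
    return np.vstack([X_tr, X_p]), np.concatenate([y_tr, y_p])

def mi_precision(model, X_in, y_in, X_no, y_no, target_class=0, thresh=0.9):
    """Confidence-threshold membership inference, restricted to the target class."""
    def guessed_member(Xs, ys):
        conf = model.predict_proba(Xs)[np.arange(len(ys)), ys]
        return conf > thresh
    in_mask, out_mask = y_in == target_class, y_no == target_class
    tp = guessed_member(X_in[in_mask], y_in[in_mask]).sum()    # members caught
    fp = guessed_member(X_no[out_mask], y_no[out_mask]).sum()  # non-members misflagged
    return tp / max(tp + fp, 1)

# Same attack against a clean model and a poisoned model.
clean = LogisticRegression(max_iter=2000).fit(X_train, y_train)
X_pois, y_pois = dirty_label_poison(X_train, y_train)
poisoned = LogisticRegression(max_iter=2000).fit(X_pois, y_pois)

for name, model in [("clean", clean), ("poisoned", poisoned)]:
    print(name, "MI precision on class 0:",
          round(mi_precision(model, X_train, y_train, X_out, y_out), 3))

On a victim model that memorizes its training set more strongly than this toy logistic regression (e.g., an overparameterized neural network), the gap between member and non-member confidence on the targeted class is where the amplification effect described in the abstract would manifest.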

History

Preferred Citation

Yufei Chen, Chao Shen, Yun Shen, Cong Wang and Yang Zhang. Amplifying Membership Exposure via Data Poisoning. In: Conference on Neural Information Processing Systems (NeurIPS). 2022.

Primary Research Area

  • Reliable Security Guarantees

Name of Conference

Conference on Neural Information Processing Systems (NeurIPS)

Legacy Posted Date

2022-11-20

Open Access Type

  • Unknown

BibTeX

@inproceedings{cispa_all_3876,
  title     = "Amplifying Membership Exposure via Data Poisoning",
  author    = "Chen, Yufei and Shen, Chao and Shen, Yun and Wang, Cong and Zhang, Yang",
  booktitle = "{Conference on Neural Information Processing Systems (NeurIPS)}",
  year      = "2022",
}
