
SimSCOOD: Systematic Analysis of Out-of-Distribution Generalization in Fine-tuned Source Code Models

conference contribution
posted on 2024-04-03, 11:20 authored by Hossein Hajipour, Ning Yu, Cristian-Alexandru Stalcu, Mario Fritz


Large code datasets have become increasingly accessible for pre-training source code models. However, for the fine-tuning phase, obtaining representative training data that fully covers the code distribution for specific downstream tasks remains challenging due to the task-specific nature and limited labeling resources. Moreover, fine-tuning pre-trained models can result in forgetting previously acquired pre-training knowledge. These issues lead to out-of-distribution (OOD) generalization problems, with unexpected model inference behaviors that have not yet been systematically studied. In this paper, we contribute the first systematic approach that simulates various OOD scenarios along different dimensions of source code data properties and studies fine-tuned model behaviors in such scenarios. We investigate the behaviors of models under different fine-tuning methodologies, including full fine-tuning and Low-Rank Adaptation (LoRA) fine-tuning. Our comprehensive analysis, conducted on four state-of-the-art pre-trained models and applied to two code generation tasks, exposes multiple failure modes attributed to OOD generalization issues. Additionally, our analysis uncovers that LoRA fine-tuning consistently exhibits significantly better OOD generalization performance than full fine-tuning across various scenarios.
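The LoRA fine-tuning method compared in the abstract keeps the pre-trained weights frozen and learns only a low-rank additive update. A minimal NumPy sketch of the core idea (class and parameter names here are illustrative, not taken from the paper or any specific library):

```python
import numpy as np

class LoRALinear:
    """Linear layer with a Low-Rank Adaptation (LoRA) update.

    The frozen pre-trained weight W (d_out x d_in) is augmented with a
    trainable low-rank product B @ A of rank r << min(d_out, d_in), so
    fine-tuning touches only r * (d_out + d_in) parameters.
    """

    def __init__(self, W, rank=4, alpha=8, seed=0):
        rng = np.random.default_rng(seed)
        d_out, d_in = W.shape
        self.W = W                                        # frozen
        self.A = rng.normal(0.0, 0.01, size=(rank, d_in)) # trainable
        self.B = np.zeros((d_out, rank))                  # trainable, zero-init
        self.scale = alpha / rank

    def __call__(self, x):
        # Base path plus scaled low-rank adapter path.
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T

W = np.eye(3)                      # stand-in for a pre-trained weight
layer = LoRALinear(W, rank=2)
x = np.ones((1, 3))
# Because B is initialized to zero, the adapter contributes nothing
# before training, so the layer reproduces the frozen model exactly.
print(np.allclose(layer(x), x @ W.T))
```

Zero-initializing `B` is the standard LoRA choice: it guarantees the adapted model starts identical to the pre-trained one, which is also why only the low-rank matrices need to be stored per downstream task.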

History

Primary Research Area

  • Trustworthy Information Processing

Name of Conference

Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)

Journal

Findings of the Association for Computational Linguistics: NAACL 2024

Open Access Type

  • Gold

BibTeX

@conference{Hajipour:Yu:Stalcu:Fritz:2024,
  title     = "SimSCOOD: Systematic Analysis of Out-of-Distribution Generalization in Fine-tuned Source Code Models",
  author    = "Hajipour, Hossein and Yu, Ning and Stalcu, Cristian-Alexandru and Fritz, Mario",
  year      = 2024,
  month     = 6,
  booktitle = "Findings of the Association for Computational Linguistics: NAACL 2024"
}
