Regression tests should consistently produce the same outcome when executed against the same version of the system under test. Recent studies, however, show a different picture: in many cases simply changing the order in which tests execute is enough to produce different test outcomes. These studies also identify the presence of dependencies between tests as one likely cause of this behavior. Test dependencies affect the quality of tests and of related development activities, such as regression test selection, prioritization, and parallelization, which assume that tests are independent. Therefore, developers must promptly identify and resolve problematic test dependencies.
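A minimal, hypothetical JUnit 4 sketch of such a dependency (the Parser class and the test names are illustrative, not taken from the paper): the two tests communicate through shared static state, so testParse passes only when testInit happens to run first.

import static org.junit.Assert.*;
import org.junit.Test;

public class ParserTest {
    // Shared mutable state: the root cause of the order dependency.
    static Parser parser;

    @Test
    public void testInit() {
        parser = new Parser();
        assertNotNull(parser);
    }

    @Test
    public void testParse() {
        // Throws a NullPointerException (and thus fails) if testInit has not
        // run yet, e.g. when a runner reorders or parallelizes the suite.
        assertTrue(parser.parse("a + b"));
    }

    // Stub parser so the sketch is self-contained.
    static class Parser {
        boolean parse(String input) {
            return input != null && !input.isEmpty();
        }
    }
}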
This paper presents PRADET, a novel approach for detecting problematic dependencies that is both effective and efficient. PRADET uses a systematic, data-driven process to detect problematic test dependencies significantly faster and more precisely than prior work. PRADET scales to analyze large projects with thousands of tests that existing tools cannot analyze in a reasonable amount of time, and it found 27 previously unknown dependencies.
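For contrast with the approaches the paper improves on, the following sketch shows the naive way to surface such dependencies: repeatedly run the tests in shuffled orders and flag any test whose outcome flips between runs. This is not PRADET's algorithm; the NaiveOrderChecker class, the reuse of the hypothetical ParserTest above, and the explicit state reset (standing in for the fresh JVM a real tool would fork per order) are all assumptions made for illustration.

import org.junit.runner.JUnitCore;
import org.junit.runner.Request;
import org.junit.runner.Result;
import java.util.*;

public class NaiveOrderChecker {
    public static void main(String[] args) {
        List<String> methods = Arrays.asList("testInit", "testParse");
        Map<String, Set<Boolean>> outcomes = new HashMap<>();
        Random rnd = new Random(42);
        for (int run = 0; run < 10; run++) {
            // Reset the shared state to simulate a fresh JVM per order;
            // a real tool would fork a new JVM instead.
            ParserTest.parser = null;
            List<String> order = new ArrayList<>(methods);
            Collections.shuffle(order, rnd);
            for (String m : order) {
                Result r = new JUnitCore().run(Request.method(ParserTest.class, m));
                outcomes.computeIfAbsent(m, k -> new HashSet<>())
                        .add(r.wasSuccessful());
            }
        }
        // A test whose pass/fail outcome varies across orders is order-dependent.
        outcomes.forEach((m, results) -> {
            if (results.size() > 1) {
                System.out.println(m + " depends on test execution order");
            }
        });
    }
}

Exhaustively exploring orders this way quickly becomes infeasible for suites with thousands of tests, which is exactly the scalability gap the abstract says PRADET targets.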
Preferred Citation
Alessio Gambi, Jonathan Bell and Andreas Zeller. Practical Test Dependency Detection. In: International Conference on Software Testing, Verification and Validation (ICST). 2018.
Primary Research Area
Empirical and Behavioral Security
Name of Conference
International Conference on Software Testing, Verification and Validation (ICST)
Legacy Posted Date
2018-02-14
Open Access Type
Green
BibTeX
@inproceedings{cispa_all_1495,
  title     = "Practical Test Dependency Detection",
  author    = "Gambi, Alessio and Bell, Jonathan and Zeller, Andreas",
  booktitle = "{International Conference on Software Testing, Verification and Validation (ICST)}",
  year      = "2018",
}