International Conference on Software Engineering

Date:
Updated on 08/04/2021
The Rmod project-team has a paper accepted at ICSE (International Conference on Software Engineering), one of the most important conferences in the field.
This paper presents some of the team's work on Rotten Green Tests.
RMOD team. © Inria / Photo Raphaël de Bengy

L’article "Rotten Green Tests", rédigé par plusieurs scientifiques de l’équipe-projet Rmod* est publié pour  l'ISCE (International conference on Software Engineering) qui aura lieu en mai 2019 à Montréal. 

Abstract of the paper

Unit tests are a tenet of agile programming methodologies, and are widely used to improve code quality and prevent code regression. A green (passing) test is usually taken as a robust sign that the code under test is valid. However, some green tests contain assertions that are never executed. We call such tests Rotten Green Tests. Rotten Green Tests represent a case worse than a broken test: they report that the code under test is valid, but in fact do not test that validity. We describe an approach to identify rotten green tests by combining simple static and dynamic call-site analyses. Our approach takes into account test helper methods, inherited helpers, and trait compositions, and has been implemented in a tool called DrTest. DrTest reports no false negatives, yet it still reports some false positives due to conditional use or multiple test contexts. Using DrTest we conducted an empirical evaluation of 19,905 real test cases in mature projects of the Pharo ecosystem. The results of the evaluation show that the tool is effective; it detected 294 tests as rotten: green tests that contain assertions that are not executed. Some rotten tests have been "sleeping" in Pharo for at least 5 years.
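To illustrate the idea outside the paper's Pharo setting, here is a minimal, hypothetical sketch in Python: the test below passes (it is green) even though its assertion is never executed, because the loop that contains it never runs. The function discount and the test data are invented for this illustration and do not come from the paper or from DrTest.

import unittest

def discount(price, is_member):
    # Hypothetical code under test, invented for this illustration.
    return price * 0.9 if is_member else price

class TestDiscount(unittest.TestCase):
    def test_member_discount(self):
        members = []  # accidentally left empty, so the loop body never runs
        for _ in members:
            # This assertion is never executed, yet the test is reported
            # as green: a "rotten green test" in the paper's terminology.
            self.assertEqual(discount(100, True), 90)

if __name__ == "__main__":
    unittest.main()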
Authors: Julien Delplanque, Stéphane Ducasse, Guillermo Polito, Andrew P. Black, Anne Etien

*The Rmod project-team is a joint Inria project-team with the CRIStAL laboratory (Centrale Lille, CNRS, Université de Lille).