Authors
Mariam El Mezouar, Feng Zhang, Ying Zou
Publication date
2016/12
Conference
Proceedings of the 26th Annual International Conference on Computer Science and Software Engineering (CASCON '16)
Publisher
ACM
Description
Software entities (e.g., files or classes) do not have the same density of defects and therefore do not require the same amount of inspection effort. With limited resources, it is critical to reveal as many defects as possible. To satisfy this need, effort-aware defect prediction models have been proposed. However, the performance of prediction models is commonly affected by the large amount of possible variability in the training data. Prior studies have examined whether using a subset of the original training data (i.e., local models) can improve the performance of prediction models in the context of defect prediction and effort estimation, in comparison with global models (i.e., models trained on the whole dataset). However, no consensus has been reached, and the comparison has not been performed in the context of effort-aware defect prediction. In this study, we compare local and global effort-aware defect prediction models using 15 projects from the widely used AEEEM and PROMISE datasets. We observe that although there is at least one local model that can outperform the global model, there always exists another local model that performs very poorly, in all the projects. We further find that the poorly performing local model is built on the subset of the training set with a low ratio of defective entities. By excluding such a subset of the training set and building a local effort-aware model with the remaining training set, the local model still underperforms the global model in 11 out of the 15 studied projects. A close inspection of the failure of local effort-aware models reveals that the major challenge comes from defective entities with small size (i.e., few lines of …
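The abstract contrasts global models (trained on the whole dataset) with local models (trained on subsets of the training data) under effort-aware evaluation. Below is a minimal sketch of that setup, assuming k-means clustering to form the subsets, a logistic-regression learner, and a defect-probability-per-LOC ranking as the effort-aware ordering; none of these specific choices are stated in the abstract, so they are illustrative only.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def train_global(X_train, y_train):
    # Global model: a single learner fit on the entire training set.
    return LogisticRegression(max_iter=1000).fit(X_train, y_train)

def train_local(X_train, y_train, n_clusters=3):
    # Local models: partition the training set (here with k-means) and fit
    # one learner per cluster; a cluster containing only one class
    # (e.g., few or no defective entities) gets no usable model.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)
    models = {}
    for c in range(n_clusters):
        mask = km.labels_ == c
        models[c] = (LogisticRegression(max_iter=1000).fit(X_train[mask], y_train[mask])
                     if len(np.unique(y_train[mask])) == 2 else None)
    return km, models

def predict_local(km, models, fallback, X_test):
    # Route each test entity to the model of its nearest training cluster;
    # fall back to the global model when a cluster had no usable local model.
    labels = km.predict(X_test)
    prob = np.empty(len(X_test))
    for c, m in models.items():
        mask = labels == c
        if mask.any():
            prob[mask] = (m if m is not None else fallback).predict_proba(X_test[mask])[:, 1]
    return prob

def effort_aware_rank(prob, loc):
    # Effort-aware ranking: inspect entities in decreasing order of predicted
    # defect probability per line of code, so small, likely-defective entities
    # are reviewed first.
    return np.argsort(-(prob / np.maximum(loc, 1)))

Ranking the test entities with effort_aware_rank(predict_local(...), loc) versus effort_aware_rank(train_global(...).predict_proba(X_test)[:, 1], loc) mirrors the local-versus-global comparison the study performs, although the paper's actual learners, clustering method, and effort-aware measures may differ.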
Scholar articles
M El Mezouar, F Zhang, Y Zou - Proceedings of the 26th Annual International …, 2016