Authors
Arghavan Moradi Dakhel, Amin Nikanjam, Vahid Majdinasab, Foutse Khomh, Michel C Desmarais
Publication date
2024/7/1
Journal
Information and Software Technology
Volume
171
Pages
107468
Publisher
Elsevier
Description
Context
One of the critical phases in the software development life cycle is software testing. Testing helps identify potential bugs and reduce maintenance costs. The goal of automated test generation tools is to ease the development of tests by suggesting efficient bug-revealing tests. Recently, researchers have leveraged Large Language Models (LLMs) of code to generate unit tests. While the code coverage of generated tests is usually assessed, the literature has acknowledged that coverage is weakly correlated with the efficiency of tests in bug detection.
Objective
To address this limitation, in this paper we introduce MuTAP (Mutation Test case generation using Augmented Prompt), which leverages mutation testing to improve the effectiveness of LLM-generated test cases at revealing bugs.
Methods
Our goal is achieved by augmenting prompts with surviving mutants, as those …
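The loop below is a minimal, hypothetical sketch (in Python) of the mutation-guided prompt augmentation the abstract outlines: generate tests with an LLM, run mutation testing, and feed any surviving mutants back into the prompt until no mutants survive or a round budget is exhausted. The names generate_tests, run_mutation_testing, and augment_prompt are illustrative placeholders, not MuTAP's actual interface.

from typing import Callable, List


def augment_prompt(base_prompt: str, surviving_mutants: List[str]) -> str:
    # Append the surviving mutants so the LLM can target them explicitly.
    mutant_block = "\n\n".join(surviving_mutants)
    return (
        f"{base_prompt}\n\n"
        "# The following mutated versions of the code are NOT killed by the current tests.\n"
        "# Generate additional test cases that distinguish them from the original code:\n"
        f"{mutant_block}"
    )


def mutation_guided_test_generation(
    code_under_test: str,
    generate_tests: Callable[[str], str],                    # LLM call: prompt -> test code
    run_mutation_testing: Callable[[str, str], List[str]],   # (code, tests) -> surviving mutants
    max_rounds: int = 3,
) -> str:
    # Iteratively re-prompt the LLM with surviving mutants.
    prompt = f"Write pytest unit tests for the following function:\n{code_under_test}"
    tests = generate_tests(prompt)

    for _ in range(max_rounds):
        survivors = run_mutation_testing(code_under_test, tests)
        if not survivors:  # every mutant killed: stop augmenting
            break
        prompt = augment_prompt(prompt, survivors)
        tests = generate_tests(prompt)

    return tests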