AUTOMATED GENERATION OF SOFTWARE TEST CASES BY MEANS OF LARGE LANGUAGE MODELS USING PROMPT ENGINEERING

DOI: 10.31673/2412-4338.2025.048914

Abstract

Software testing is a critical phase of the development lifecycle, ensuring the reliability and correctness of software systems. Traditional test case generation can be time-consuming and labor-intensive, often requiring significant manual effort. With the rapid development of generative artificial intelligence, large language models (LLMs) offer new opportunities to automate this process. This paper investigates the application of various LLMs to automate the generation of software code and test cases, evaluating their effectiveness in achieving complete code coverage across a variety of software tasks, including data processing tasks. The size of the LLM is found to significantly affect the quality of the generated code and tests: the Gemma3:12B, GPT-5.1, and Gemini models show the best results, while small models (Gemma3:1B, Gemma2:7B) are more error-prone and cover edge cases more poorly than large ones. Test-Driven-Development and Data-analysis prompts are found to significantly improve test coverage and code quality for data processing tasks. The robustness of numerical computations is also shown to correlate with LLM size. These results highlight the potential of generative AI to streamline software testing workflows, freeing developers to focus on higher-order problems. However, the study also highlights the need for further improvements to these tools to increase their robustness and reliability. This work is a fundamental step towards using generative AI to transform software development and testing practices.
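To illustrate the kind of Test-Driven-Development prompting the abstract refers to, the sketch below composes a TDD-style prompt for an LLM. The function name, prompt wording, and task text are illustrative assumptions, not the exact prompts used in the study.

```python
def build_tdd_prompt(task_description: str, language: str = "Python") -> str:
    """Compose a Test-Driven-Development style prompt: the model is asked
    to write the test suite first, then the implementation.

    Illustrative sketch only; the paper's actual prompts may differ.
    """
    return (
        f"You are an experienced {language} developer practicing TDD.\n"
        f"Task: {task_description}\n"
        "Step 1: Write a pytest test suite covering normal inputs, "
        "edge cases, and invalid inputs.\n"
        "Step 2: Only after the tests, write the implementation that "
        "makes all tests pass.\n"
        "Aim for complete branch coverage, including boundary values."
    )

# Example usage with a hypothetical data-processing task:
prompt = build_tdd_prompt("Parse a CSV row and return a list of floats")
print(prompt)
```

The resulting string would then be sent to an LLM (local or hosted); asking for tests before code is what distinguishes the TDD prompt from a plain code-generation prompt.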

Keywords: Automated test case generation, AI-powered software testing, Generative AI for testing, AI-based test automation, AI-based test case generation

Published

2025-12-29

Section

Articles