Modeling of software fault detection and correction processes based on the correction lag

10 citations · 9 Mendeley readers

Abstract

This study presents a software reliability growth model that integrates the fault detection process with the fault correction process. Although a few studies have addressed the modeling of these two processes jointly, most treat the correction lag purely theoretically, as a time delay. In this study, the correction lag is instead characterized by the number of remaining uncorrected faults, a quantity that can be observed directly in actual test data. Analysis of its trend over time shows that a Gamma curve is well suited to representing the correction lag function, and the proposed model is derived on this basis. Two real software-testing data sets are used to evaluate the models. Experimental results indicate that the proposed model not only outperforms other models on both the fault detection and fault correction processes but also describes the correction lag better. Finally, a revised software cost model is presented based on the proposed model; an analysis of the determination of software release time shows that the new cost model is more practical than the traditional approach. © 2009 Asian Network for Scientific Information.
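The abstract does not reproduce the model's equations, but its description suggests the following structure. As a rough sketch (the functional forms below are assumptions for illustration, not the paper's actual formulas): let m_d(t) and m_c(t) be the expected cumulative numbers of detected and corrected faults at time t; the correction lag is then the count of detected-but-uncorrected faults, and the paper models its time course with a Gamma-shaped curve.

```latex
% One plausible reading of the model structure (assumed, not taken
% from the paper): detection and correction as counting processes,
% with the correction lag L(t) = detected-but-uncorrected faults.
\begin{align}
  L(t)   &= m_d(t) - m_c(t), \\
  L(t)   &\approx A\, t^{k-1} e^{-t/\theta}
            \quad \text{(unnormalized Gamma-shaped curve)}, \\
  m_c(t) &= m_d(t) - A\, t^{k-1} e^{-t/\theta}.
\end{align}
```

With a concrete (again assumed) choice for detection, such as the Goel-Okumoto form m_d(t) = a(1 − e^{−bt}), both curves can be fitted to cumulative detection and correction counts. A minimal Python sketch, using hypothetical weekly data and the assumed forms above:

```python
# Sketch only (not the authors' code): fit an assumed Goel-Okumoto
# detection curve and a Gamma-shaped correction lag to hypothetical
# cumulative fault counts. All forms and parameters are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def m_d(t, a, b):
    # Assumed detection mean value function (Goel-Okumoto).
    return a * (1.0 - np.exp(-b * t))

def lag(t, A, k, theta):
    # Assumed Gamma-shaped lag: detected-but-uncorrected faults.
    return A * t**(k - 1) * np.exp(-t / theta)

def m_c(t, a, b, A, k, theta):
    # Corrected faults = detected faults minus the current lag.
    return m_d(t, a, b) - lag(t, A, k, theta)

# Hypothetical weekly cumulative detected / corrected fault counts.
t = np.arange(1, 13, dtype=float)
detected  = np.array([12, 25, 34, 45, 52, 58, 62, 66, 68, 70, 71, 72], float)
corrected = np.array([ 8, 18, 27, 37, 46, 53, 58, 63, 66, 68, 70, 71], float)

# Fit the detection curve first, then the correction curve with a, b fixed.
(a, b), _ = curve_fit(m_d, t, detected, p0=[80.0, 0.2])
(A, k, theta), _ = curve_fit(lambda t, A, k, th: m_c(t, a, b, A, k, th),
                             t, corrected, p0=[5.0, 2.0, 3.0])
print(f"a={a:.1f}, b={b:.3f}, lag peak ~ {(k - 1) * theta:.1f} weeks")
```

Under these assumptions the fitted lag peaks at t = (k − 1)θ, giving a rough indication of when the backlog of detected-but-uncorrected faults is largest, which is the kind of observable behavior the abstract says the model captures.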

Citation (APA)

Shu, Y., Liu, H., Wu, Z., & Yang, X. (2009). Modeling of software fault detection and correction processes based on the correction lag. Information Technology Journal, 8(5), 735–742. https://doi.org/10.3923/itj.2009.735.742
