To understand requirements traceability in practice, we present a preliminary study that identifies questions from requirements repositories and examines their answering status. An investigation of four open-source projects yields 733 requirements questions, of which 43% were answered successfully, 35% were answered unsuccessfully, and 22% were not answered at all. We evaluate the accuracy of a state-of-the-art natural language processing tool in identifying the requirements questions and illuminate automated ways to classify their answering status.
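The abstract does not name the NLP tool or its detection rules, so the following is only a minimal illustrative sketch, assuming a naive heuristic (question mark or leading interrogative word) as a stand-in for question identification in repository text; it is not the authors' method.

# Hypothetical sketch: naive heuristic for flagging question sentences
# in requirements-repository text. The paper's actual NLP tool is not
# specified in the abstract; this stand-in is for illustration only.

def looks_like_question(sentence: str) -> bool:
    """Return True if the sentence resembles a question."""
    question_words = ("who", "what", "when", "where", "why", "how",
                      "which", "can", "could", "should", "would",
                      "is", "are", "does", "do")
    s = sentence.strip()
    if s.endswith("?"):
        return True
    first = s.split(maxsplit=1)[0].lower() if s else ""
    return first in question_words

if __name__ == "__main__":
    samples = [
        "How is the login requirement traced to its test cases?",
        "The system shall encrypt all stored passwords.",
        "Which stakeholder approved this change",
    ]
    for s in samples:
        print(f"{looks_like_question(s)!s:5}  {s}")

A real pipeline would go further and classify each detected question's answering status (answered successfully, answered unsuccessfully, or unanswered), which the heuristic above does not attempt.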
Gupta, A., Wang, W., Niu, N., & Savolainen, J. (2018). Poster: Answering the requirements traceability questions. In Proceedings - International Conference on Software Engineering (pp. 444–445). IEEE Computer Society. https://doi.org/10.1145/3183440.3195049