Improving Bug Severity Prediction With Domain-Specific Representation Learning


Abstract

Automating bug severity assignment can improve bug triagers' efficiency across the software-maintenance life-cycle and thereby the quality of software products. Mainstream approaches to bug severity prediction mainly use neural networks because of their automated learning ability. However, two problems cause existing approaches to fail on some bugs: 1) they cannot learn the internal knowledge of bug reports; and 2) supervised training struggles to capture the global context of bug reports. To resolve these two problems, we propose KICL, a bug severity prediction approach that combines pre-trained language models with domain-specific pre-training strategies, i.e., Knowledge-Intensified pre-training and contrastive-learning pre-training. Specifically, Knowledge-Intensified pre-training lets KICL learn project-specific bug-report tokens and thus deeply understand the internal knowledge of bug reports, while contrastive learning enables sequence-level learning, so KICL understands bug reports from the perspective of the global context. After pre-training, KICL is fine-tuned for bug severity prediction. To evaluate its effectiveness, we compare KICL against six baseline approaches on a public dataset. The experimental results show that KICL outperforms all baselines by up to 30.68% in weighted average F1-score, achieving new state-of-the-art results for bug severity prediction.
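The abstract does not spell out the contrastive objective KICL uses; a common choice for sequence-level contrastive pre-training is an NT-Xent (InfoNCE-style) loss, where two augmented views of the same bug report form a positive pair and all other reports in the batch serve as negatives. The sketch below is illustrative only, assuming that setup; the function name, shapes, and temperature are not from the paper.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Illustrative NT-Xent contrastive loss over two views of a batch.

    z1, z2: (batch, dim) embeddings of two augmented views of the same
    bug reports; row i of z1 and row i of z2 form a positive pair, and
    every other row in the combined batch is a negative.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2B, dim)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                       # (2B, 2B) similarity logits
    np.fill_diagonal(sim, -np.inf)                    # mask each row's self-similarity
    B = z1.shape[0]
    # row i's positive sits at i+B (and vice versa) after concatenation
    pos = np.concatenate([np.arange(B, 2 * B), np.arange(B)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    # cross-entropy of the positive against all candidates in the batch
    loss = -(sim[np.arange(2 * B), pos] - logsumexp)
    return loss.mean()
```

Pulling positive pairs together and pushing apart unrelated reports is what gives the encoder the global, sequence-level view of a bug report that the abstract attributes to the contrastive stage; the pre-trained encoder would then be fine-tuned with a standard classification head for severity labels.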

Citation (APA)

Wei, Y., Zhang, C., & Ren, T. (2023). Improving Bug Severity Prediction With Domain-Specific Representation Learning. IEEE Access, 11, 62829–62839. https://doi.org/10.1109/ACCESS.2023.3279205
