CSDA: A novel attention-based LSTM approach for code search

Abstract

Previous studies have proposed semantic-based approaches for code search over large-scale codebases, which bridge the semantic gap between natural language and source code. However, these studies either failed to identify an effective method for semantic representation or did not distinguish among semantic features. In this study, we propose CSDA (Code Search based on Description Attention), a novel attention-based LSTM neural network that effectively improves code search performance. The proposed model can focus on different parts of a semantic feature when multiple aspects of a source code snippet are used as input. Rather than assigning the same weight to every part of the semantic vector, CSDA takes the semantics of the natural language description into account, so that subtle differences hidden in a code snippet can be discriminated and associated with the corresponding queries. We compare CSDA with CODEnn, an existing state-of-the-art approach that uses a joint embedding technique for code search. Our experimental evaluation demonstrates that CSDA outperforms CODEnn, achieving higher success rates and mean reciprocal ranks. This study provides significant insights into the use of semantic representation methods in deep-learning-based code search approaches.
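
The central mechanism the abstract describes, attention weights over the code representation that are conditioned on the semantics of the query description, can be illustrated with a small sketch. The following PyTorch-style example is purely illustrative and is not the authors' implementation: the additive-attention formulation, the layer dimensions, and the cosine-similarity ranking step are all assumptions made for exposition.

```python
# Hypothetical sketch of description-guided attention for code search.
# Not the CSDA implementation; architecture details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DescriptionAttentionSearcher(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 128, hidden_dim: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Separate LSTM encoders for code tokens and description tokens.
        self.code_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.desc_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Additive attention: score(h_t, d) = v^T tanh(W_h h_t + W_d d)
        self.w_h = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_d = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def encode_desc(self, desc_tokens: torch.Tensor) -> torch.Tensor:
        # Use the final hidden state as the description vector.
        _, (h_n, _) = self.desc_lstm(self.embed(desc_tokens))
        return h_n[-1]                                        # (batch, hidden_dim)

    def encode_code(self, code_tokens: torch.Tensor, desc_vec: torch.Tensor) -> torch.Tensor:
        h, _ = self.code_lstm(self.embed(code_tokens))        # (batch, T, hidden_dim)
        # Attention weights depend on the description, so different queries
        # highlight different parts of the same code snippet.
        scores = self.v(torch.tanh(self.w_h(h) + self.w_d(desc_vec).unsqueeze(1)))
        alpha = F.softmax(scores, dim=1)                      # (batch, T, 1)
        return (alpha * h).sum(dim=1)                         # (batch, hidden_dim)

    def forward(self, code_tokens: torch.Tensor, desc_tokens: torch.Tensor) -> torch.Tensor:
        desc_vec = self.encode_desc(desc_tokens)
        code_vec = self.encode_code(code_tokens, desc_vec)
        # Rank candidate snippets by similarity between code and query vectors.
        return F.cosine_similarity(code_vec, desc_vec, dim=-1)
```

In a setup like this, the same code snippet receives a different attention distribution for each query, which is the property the abstract attributes to CSDA; the paper's actual architecture and training objective may differ.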

Citation (APA)

Ren, L., Shan, S., Wang, K., & Xue, K. (2020). CSDA: A novel attention-based LSTM approach for code search. In Journal of Physics: Conference Series (Vol. 1544). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/1544/1/012056
