An adaptable infrastructure to generate training datasets for decompilation issues

Abstract

The conventional decompilation approach is based on a combination of heuristics and pattern matching. This approach depends on the processor architecture, the code generation templates used by the compiler, and the optimization level. In addition, there are specific scenarios where heuristics and pattern matching fail to infer high-level information, such as the return type of a function. Since AI has previously been applied in similar scenarios, we have designed an adaptable infrastructure that facilitates the use of AI techniques to overcome the decompilation issues detected. The proposed infrastructure automatically generates training datasets. Its architecture follows the Pipes and Filters architectural pattern, which makes it easy to adapt the infrastructure to different kinds of decompilation scenarios and to parallelize the implementation. The generated datasets can be processed by any AI engine; the resulting predictive model is then added to the decompiler as a plug-in. © Springer International Publishing Switzerland 2014.
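The Pipes and Filters design described in the abstract lends itself to a short illustration. The following Python sketch is a hypothetical rendering of such a dataset-generation pipeline, not the paper's actual implementation: the Sample record and the compile_filter and label_filter stages are illustrative assumptions standing in for real compilation, feature-extraction, and labeling steps.

from dataclasses import dataclass
from typing import Callable, Iterable, Iterator, Optional

# Hypothetical record flowing through the pipeline: a source function,
# its compiled form, and the label to be predicted (here, a return type).
@dataclass
class Sample:
    source: str
    assembly: Optional[str] = None
    return_type: Optional[str] = None

# A filter consumes a stream of samples and yields a (possibly transformed)
# stream, so filters compose freely and independent stages can run in parallel.
Filter = Callable[[Iterable[Sample]], Iterator[Sample]]

def compile_filter(samples: Iterable[Sample]) -> Iterator[Sample]:
    """Stand-in for invoking a real compiler on each source function."""
    for s in samples:
        s.assembly = f"; compiled form of: {s.source}"
        yield s

def label_filter(samples: Iterable[Sample]) -> Iterator[Sample]:
    """Stand-in for extracting the ground-truth label from the source code."""
    for s in samples:
        s.return_type = s.source.split()[0]  # naive: first token as the type
        yield s

def run_pipeline(source: Iterable[Sample], filters: "list[Filter]") -> Iterator[Sample]:
    """Connect the filters in sequence: each one's output pipes into the next."""
    stream: Iterable[Sample] = source
    for f in filters:
        stream = f(stream)
    return iter(stream)

if __name__ == "__main__":
    functions = [Sample("int f(void)"), Sample("double g(int)")]
    for sample in run_pipeline(functions, [compile_filter, label_filter]):
        print(sample.return_type, "->", sample.assembly)

Because each filter only consumes and produces a stream of samples, stages can be swapped to target other decompilation issues, and independent stages can be distributed across processes, which matches the adaptability and parallelism the abstract highlights.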

Citation

Escalada, J., & Ortin, F. (2014). An adaptable infrastructure to generate training datasets for decompilation issues. In Advances in Intelligent Systems and Computing (Vol. 276, pp. 85–94). Springer. https://doi.org/10.1007/978-3-319-05948-8_9
