Experimentation is critical to understanding malware operation and to evaluating potential defenses. However, constructing the controlled environments needed for this experimentation is both time-consuming and error-prone. In this study, we highlight several common mistakes made by researchers and conclude that existing evaluations of malware detection techniques often lack both flexibility and transparency. For instance, we show that small variations in the malware’s behavioral parameters can have a significant impact on the evaluation results. These variations, if unexplored, may lead to overly optimistic conclusions and detection systems that are ineffective in practice. To overcome these issues, we propose a framework to model malware behavior and guide systematic parameter selection. We evaluate our framework using a synthetic botnet executed within the CyberVAN testbed. Our study is intended to foster critical evaluation of proposed detection techniques and stymie unintentionally erroneous experimentation.
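The sensitivity the abstract describes can be illustrated with a minimal, purely hypothetical sketch (not the paper's framework or detector): a synthetic bot beacons at a parameterized interval with configurable jitter, and a naive detector flags traffic whose inter-arrival variance is low. Sweeping one behavioral parameter flips the detector's verdict, which is why unexplored parameter variation can yield overly optimistic results.

```python
import random

# Hypothetical sketch, not the paper's framework: a synthetic bot that
# beacons at a fixed interval plus uniform jitter, and a naive detector
# that flags hosts whose inter-arrival variance falls below a threshold.

def beacon_times(interval, jitter, n=50, seed=0):
    """Generate n beacon timestamps: fixed interval plus uniform jitter."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += interval + rng.uniform(-jitter, jitter)
        times.append(t)
    return times

def detected(times, var_threshold=1.0):
    """Flag traffic as bot-like when inter-arrival variance is low."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return var < var_threshold

# Sweep one behavioral parameter (jitter); a small change flips the verdict:
# low jitter is flagged, higher jitter evades this detector entirely.
results = {j: detected(beacon_times(30.0, j)) for j in (0.5, 5.0)}
```

A systematic evaluation would sweep such parameters (interval, jitter, payload size, and so on) rather than testing a single fixed configuration, which is the kind of parameter selection the proposed framework is meant to guide.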
Celik, Z. B., McDaniel, P., & Bowen, T. (2018). Malware modeling and experimentation through parameterized behavior. Journal of Defense Modeling and Simulation, 15(1), 31–48. https://doi.org/10.1177/1548512917721755