Leveraging Practitioners' Feedback to Improve a Security Linter


Abstract

Infrastructure-as-Code (IaC) is a technology that enables the management and distribution of infrastructure through code instead of manual processes. In 2020, Palo Alto Networks' Unit 42 announced the discovery of over 199K vulnerable IaC templates in their "Cloud Threat" Report. This report highlights the importance of tools that prevent vulnerabilities from reaching production. Unfortunately, we observed through a comprehensive study that the state-of-the-art security linter for IaC scripts is not yet reliable, suffering from high false positive rates. Our approach to tackling this problem was to leverage community expertise to improve the tool's precision. More precisely, we interviewed professional developers to collect their feedback on the root causes of imprecision in the state-of-the-art security linter for Puppet. Based on that feedback, we developed a linter that adjusts 7 rules of an existing ruleset and adds 3 new rules. We then conducted a new study with 131 practitioners, which helped us significantly improve the tool's precision. An important takeaway from this paper is that obtaining professional feedback is fundamental to improving the rules' precision and extending the rulesets, which is critical for the usefulness and adoption of lightweight tools such as IaC security linters.
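For context, the sketch below (not taken from the paper; class and file names are hypothetical) illustrates the kind of Puppet code an IaC security linter inspects, and why practitioners report false positives: a hardcoded credential is a genuine security smell, while a placeholder value may trigger the same rule spuriously.

# Hypothetical Puppet manifest illustrating what an IaC security linter flags.
class profile::app_config {
  # Real smell: a credential committed to the manifest.
  file { '/etc/app/secrets.conf':
    ensure  => file,
    mode    => '0600',
    content => "db_password = S3cr3tPassw0rd\n",
  }

  # Likely false positive: a placeholder value that a naive "hardcoded secret"
  # rule may still report, the kind of imprecision practitioners point out.
  file { '/etc/app/app.conf':
    ensure  => file,
    content => "api_key = CHANGE_ME\n",
  }
}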


Citation (APA)
Reis, S., Abreu, R., D’Amorim, M., & Fortunato, D. (2022). Leveraging Practitioners’ Feedback to Improve a Security Linter. In ACM International Conference Proceeding Series. Association for Computing Machinery. https://doi.org/10.1145/3551349.3560419
