An increasing number of data-driven decision aids are being developed to provide humans with advice to improve decision-making around important issues such as personal health and criminal justice. For algorithmic systems to support human decision-making effectively, people must be willing to use them. Yet, prior work suggests that accuracy and privacy concerns may both deter potential users and limit the efficacy of these systems. We expand upon prior research by empirically modeling how accuracy and privacy influence intent to adopt algorithmic systems. We focus on an algorithmic system designed to aid people in a globally relevant decision context with tangible consequences: the COVID-19 pandemic. We use statistical techniques to analyze surveys of 4,615 Americans to (1) evaluate the effect of both accuracy and privacy concerns on reported willingness to install COVID-19 apps; (2) examine how different groups of users weigh accuracy relative to privacy; and (3) develop what are, to our knowledge, the first empirical models of how the amount of benefit (e.g., error rate) and degree of privacy risk in a data-driven decision aid may influence willingness to adopt.
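The abstract does not give the authors' exact model specification, so the following is only a minimal sketch of the kind of analysis it describes: a logistic regression of reported willingness to install on a decision aid's error rate and privacy risk, fit here to synthetic data. All variable names, coefficients, and the regression form are hypothetical illustrations, not the paper's actual model.

```python
# Illustrative sketch only: the abstract does not specify the authors' model,
# so this assumes a simple logistic regression of install intent on a decision
# aid's error rate and privacy risk, fit to synthetic survey-style data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 4615  # matches the survey sample size reported in the abstract

# Hypothetical predictors: the app's error rate and a 0-1 privacy-risk score.
error_rate = rng.uniform(0.0, 0.5, n)
privacy_risk = rng.uniform(0.0, 1.0, n)

# Synthetic ground truth: willingness to install falls as error rate and
# privacy risk rise (these coefficients are made up for illustration).
logit = 1.5 - 4.0 * error_rate - 2.0 * privacy_risk
willing = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Fit the model; the fitted coefficients estimate how much an increase in
# error rate or privacy risk lowers the log-odds of adoption.
X = sm.add_constant(np.column_stack([error_rate, privacy_risk]))
model = sm.Logit(willing, X).fit(disp=False)
print(model.summary(xname=["const", "error_rate", "privacy_risk"]))
```

In a setting like the paper's, such coefficients would quantify the trade-off respondents make between an app's accuracy benefit and its privacy risk when deciding whether to install it.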
Kaptchuk, G., Goldstein, D. G., Hargittai, E., Hofman, J. M., & Redmiles, E. M. (2022). How Good is Good Enough? Quantifying the Impact of Benefits, Accuracy, and Privacy on Willingness to Adopt COVID-19 Decision Aids. Digital Threats: Research and Practice, 3(3). https://doi.org/10.1145/3488307