Mental health apps bring unprecedented benefits and risks to individual and public health. A thorough evaluation of these apps involves considering two aspects that are often neglected: the algorithms they deploy and the functions those algorithms perform. We focus on mental health apps based on black box algorithms, explore their forms of opacity, discuss the implications of that opacity, and propose how their outcomes should be used in mental healthcare, self-care practices, and research. We argue that there is a relevant distinction between the functions performed by algorithms in mental health apps, and we focus on two: analysis and the generation of advice. When performing analytic functions, such as identifying patterns and making predictions about people's emotions, thoughts, and behaviors, black box algorithms may outperform other algorithms in providing information that helps identify early signs of relapse, supports diagnostic processes, and improves research by generating outcomes that lead to a better understanding of mental health. However, when carrying out the function of providing mental health advice, black box algorithms have the potential to deliver unforeseen advice that may harm users. We argue that the outcomes of these apps may be trustworthy as a complementary source of information, but we urge caution about black box algorithms that give advice directly to users. To reap the benefits of mental health apps based on black box algorithms and avoid unintended consequences, we critically need to know whether these algorithms are fulfilling the function of providing mental health advice.
Manríquez Roa, T., & Biller-Andorno, N. (2023). Black box algorithms in mental health apps: An ethical reflection. Bioethics, 37(8), 790–797. https://doi.org/10.1111/bioe.13215