Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support

  • Kretzschmar K
  • Tyroll H
  • Pavarini G
  • Manzini A
  • Singh I

This article is free to access.

Abstract

Over the last decade, there has been an explosion of digital interventions that aim to either supplement or replace face-to-face mental health services. More recently, a number of automated conversational agents have also been made available, which respond to users in ways that mirror a real-life interaction. What are the social and ethical concerns that arise from these advances? In this article, we discuss, from a young person's perspective, the strengths and limitations of using chatbots in mental health support. We also outline what we consider to be minimum ethical standards for these platforms, including issues surrounding privacy and confidentiality, efficacy, and safety, and review three existing platforms (Woebot, Joy, and Wysa) according to our proposed framework. It is our hope that this article will stimulate ethical debate among app developers, practitioners, young people, and other stakeholders, and inspire ethically responsible practice in digital mental health.

Citation (APA)

Kretzschmar, K., Tyroll, H., Pavarini, G., Manzini, A., & Singh, I. (2019). Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support. Biomedical Informatics Insights, 11, 117822261982908. https://doi.org/10.1177/1178222619829083
