Kretzschmar K, Tyroll H, Pavarini G, Manzini A, Singh I. (2019). Can your phone be your therapist? Young people’s ethical perspectives on the use of fully automated conversational agents (chatbots) in mental health support. Biomedical Informatics Insights. 11: 1–9. doi: 10.1177/1178222619829083
Twenty-four members and 3 co-leaders of an English health ethics student advisory group (aged 14-18) participated in group discussions of adolescent perspectives on computerized conversational agents (chatbots) in mental health interventions at a 4-day digital health conference in London. Participants proposed 3 standards for chatbot design: respect for user privacy, evidence-based intervention, and protection of user safety. Participants also evaluated 3 popular, commercial cognitive behavioral therapy (CBT)-based chatbots—Wysa, Woebot, and Joy—against these 3 standards. Wysa, a mobile application (app), allowed users anonymous in-app chatbot dialogue, whereas Woebot, an app and Facebook Messenger add-on, and Joy, a Facebook Messenger add-on, exposed users’ names and chatbot conversations to third-party access. While Woebot and Wysa pledged to protect user data from outside companies, Joy reserved the right to fully exploit user content. Woebot alone had empirical data (significant reduction of depression symptoms, high engagement levels) to support its alleged benefits. Participants believed chatbots’ discreet intervention, commercial availability, and low smartphone data burden may be ideal for adolescents. Participants emphasized the need for chatbots to be transparent about their capabilities, limitations, and target audience. Participants also recommended that chatbots urge users to pursue conventional therapy and discourage use by individuals with severe mental illnesses. Increased demand for chatbots may help offset deficits in current mental health care models, and discussion results suggest that prioritizing privacy, safety, and evidence-driven design in chatbot interventions may be beneficial. If validated, chatbots may offer individuals with mental health difficulties an accessible, affordable, lower-stigma tool.