This convergent parallel mixed-methods study investigated how individuals use ChatGPT for emotional and mental health support (EMS) in real-world contexts. Qualitative and quantitative data were generated concurrently in the main online survey to examine the real-world frequency of use, therapeutic purposes, emotional experiences, and perceived effectiveness of ChatGPT for EMS. Data were collected in two stages via the Prolific platform and Qualtrics between March and April 2024, following a pilot study conducted in January 2024. From an initial sample of 4,387 respondents, 384 individuals (9%) reported using ChatGPT for both text generation and EMS purposes; of these, 270 provided valid responses and were included in the final analysis. Participants ranged in age from 18 to 67 years (M = 30.06), and the majority identified as female (57.8%). The most common usage frequency was once or twice a month (38%), indicating regular engagement. Participants reported using ChatGPT to address both traditional mental health needs, such as symptom management and mental health literacy, and broader psychosocial needs, including companionship and decision-making support. Perceived effectiveness was high: 73% of participants rated ChatGPT as helpful or very helpful, while only 0.4% found it not helpful. Participants’ emotional experiences encompassed positive (e.g., relief, connection), neutral (e.g., curiosity), and negative (e.g., disappointment, disconnection) responses. While many users reported meaningful emotional engagement and validation, limitations included superficial interactions, a lack of personalization, and the absence of human presence. This study highlights inconsistencies in how people experience GenAI interactions for EMS: many participants rated their emotional connectedness as comparable to that with a human therapist, yet disappointment, disconnection, and emptiness after use were also common.
Many participants reported feeling ashamed or embarrassed about using ChatGPT rather than seeking a human provider. While GenAI may bridge some of the gaps currently seen in access to mental health care, including stigma and freedom, a new stigma appears to be emerging in which users feel ashamed for secretly seeking support from AI, a phenomenon that warrants further research.