Article Excerpt: Amid the many AI chatbots and avatars at your disposal these days, you’ll find all kinds of characters to talk to: fortune tellers, style advisers, even your favorite fictional characters. But you’ll also likely find characters purporting to be therapists, psychologists or just bots willing to listen to your woes.
There’s no shortage of generative AI bots claiming to help with your mental health, but go that route at your own risk. Large language models trained on a wide range of data can be unpredictable. In just a few years, these tools have become mainstream, and there have been high-profile cases in which chatbots encouraged self-harm and suicide and suggested that people dealing with addiction use drugs again. These models are designed, in many cases, to be affirming and to focus on keeping you engaged, not on improving your mental health, experts say. And it can be hard to tell whether you’re talking to something that’s built to follow therapeutic best practices or something that’s just built to talk…
One advantage of AI chatbots in providing support and connection is that they’re always ready to engage with you (because they don’t have personal lives, other clients or schedules). But that constant availability can be a downside when what you actually need is to sit with your thoughts, Nick Jacobson, an associate professor of biomedical data science and psychiatry at Dartmouth, told me recently. Sometimes, though not always, you might benefit from having to wait until your therapist is next available. “What a lot of folks would ultimately benefit from is just feeling the anxiety in the moment,” he said.
Full Article: https://tinyurl.com/2m667a7h
Article Source: CNET