A surprising trend is emerging in the growing use of artificial intelligence: children seeking therapy from AI chatbots. Character.ai, a platform where users create chatbots based on fictional or real personas, has witnessed an unexpected surge in popularity for the AI therapist bot, Psychologist.
Created just over a year ago by a user named Blazeman98, Psychologist has garnered a staggering 78 million messages, with 18 million exchanged since November. This AI therapist stands out among the diverse range of characters on Character.ai, emphasizing the growing appeal of AI-driven mental health support, particularly among young users aged 16 to 30.
While the platform’s executives downplay the therapeutic role, emphasizing its entertainment aspect, users’ interactions with Psychologist tell a different story. Numerous testimonials on social media platforms like Reddit commend the bot for helping individuals navigate and understand their emotions. Some users even describe it as a “lifesaver.”
The creator of Psychologist, 30-year-old Sam Zaia from New Zealand, initially designed the bot as a personal tool for self-expression. However, its unexpected popularity prompted Sam to consider the broader implications of AI therapy. Trained using principles from his psychology degree, Psychologist addresses common mental health conditions like depression and anxiety.
Despite its success, questions linger about the effectiveness of AI therapy. Professional psychotherapists who have tested Psychologist raise concerns about the bot's assumptions and quick advice. While acknowledging its immediacy and spontaneity, they emphasize the importance of human therapists, who gather comprehensive information and respond with empathy.
The appeal of AI therapy, particularly in text format, lies in its accessibility and comfort. Its creator has suggested that young people find texting less daunting than face-to-face conversations, especially during challenging moments when traditional support may be unavailable.
Critics argue that the widespread use of AI therapists may reflect high levels of mental health challenges and a shortage of public resources. While Character.ai acknowledges the positive support users find in its characters, the company stresses the importance of consulting certified professionals for legitimate advice.
The AI therapy trend isn’t exclusive to Character.ai. Other AI platforms, like Replika, offer AI companionship but are less popular in terms of time spent and visits. The medical community remains cautious about AI’s role in mental health, with concerns about the quality of advice and potential biases.
As the use of AI therapists continues to rise, the debate surrounding their effectiveness and ethical implications persists. While AI may never fully replace human therapists, it is undeniably reshaping the landscape of mental health support, offering a unique and accessible alternative for those seeking assistance, particularly the younger demographic.