Lanon Wee

Young People Turning to AI Therapist Bots on Character.ai

Harry Potter, Elon Musk, Beyoncé, Super Mario, and Vladimir Putin: Character.ai is a platform where users can build chatbots based on real or fictional people, and there are millions of them to talk to. It runs on the same kind of AI technology as the ChatGPT chatbot, but users spend more time on it.

One of the most sought-after bots on the platform is called Psychologist. Since its creation by the user Blazeman98 just over a year ago, 78 million messages have been sent to the bot, 18 million of them since November. Character.ai did not say how many individual users the bot has, but the company says 3.5 million people visit the site every day.

The bot has been described as an aid for tackling life's challenges. The San Francisco company played down its popularity, arguing that users are mainly interested in bots for entertainment. The most popular characters are from anime and computer games, such as Raiden Shogun, which has been messaged 282 million times.

Even so, few of the millions of characters come close to Psychologist in popularity. In total there are 475 bots with "therapy," "therapist," "psychiatrist," or "psychologist" in their names that can converse in several languages. Some, such as Hot Therapist, are entertainment or fantasy characters. But most are mental health aids, such as Therapist, which has received 12 million messages, and Are you feeling OK?, which has been contacted 16.5 million times.

The Psychologist bot has attracted a great deal of attention recently, with many users on Reddit posting positive comments about it. "It's a lifesaver!" proclaimed one. "My boyfriend and I have benefited from being able to discuss and understand our feelings," commented another.
Sam Zaia, the 30-year-old New Zealander behind the Blazeman98 account, says he never intended the bot to become popular or to be a tool for other people. He then began receiving lots of messages from people telling him the bot was positively affecting them and helping them feel better.

The psychology student says he trained the bot using principles from his degree, talking to it and shaping its responses to the most common mental health conditions, such as depression and anxiety. He came up with the idea when his friends were busy and he needed, as he puts it, "someone or something" to talk to, with human therapy too expensive.

Sam has been so surprised by the bot's success that he is now working on a post-graduate research project about the emerging trend of AI therapy and why it appeals to young people. The majority of Character.ai's users are aged 18 to 30. He says many of the people who have contacted him use the bot when their thoughts are hardest to handle, such as in the middle of the night, when they cannot open up to friends or a professional therapist. Sam suspects the text format is the one young people are most comfortable with: texting, he theorizes, may feel less daunting than a phone call or an in-person conversation.

Theresa Plewman, a professional psychotherapist, has tried out Psychologist and says she is not surprised it is popular with younger generations, but she questions its effectiveness. She says the bot talks a lot and is quick to make assumptions, such as offering support for depression when she said she was feeling sad. That, she says, is not how a human would respond.
Theresa says the bot cannot gather as much information as a human would and is no substitute for a professional therapist, though its immediate, spontaneous nature could be useful for people who need quick help. She is concerned by the sheer number of people relying on the bot, which she says may point to widespread mental health problems and a lack of public resources.

Character.ai is an unlikely setting for a therapeutic revolution. A spokesperson for the company said: "We are delighted to notice that folks are obtaining excellent assistance and camaraderie through the characters that they, and others in the society, create, however customers ought to seek advice from certified experts in the area for professional counsel and direction." The company says chat logs are private to users, but that staff can read conversations if there is a need to, for example to keep people safe.

Every conversation begins with a warning in red letters: "It is important to keep in mind that all dialogue uttered by characters is fictitious." It is a reminder that large language models (LLMs) do not think the way humans do. Instead, they work like predictive text, stringing words together in the patterns they have seen most often in the text they were trained on.

Other AI services offer similar companionship, such as Replika, which is rated for mature audiences because of its sexual content. Data from analytics firm Similarweb suggests Character.ai is more popular, though, in terms of both visits and time spent on the site. Earkick and Woebot are designed entirely around providing mental health support, and both companies say their research shows the apps benefit users.
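The predictive-text comparison can be made concrete with a toy sketch. The following bigram model is an illustration only: the tiny corpus and function names are invented, and real LLMs are neural networks trained on vastly larger data, but the underlying objective, predicting a likely next token from observed patterns, is the same.

```python
from collections import Counter, defaultdict

# Invented toy corpus; real models train on billions of tokens.
corpus = (
    "i am feeling sad today . "
    "i am feeling anxious . "
    "i am feeling sad and tired ."
).split()

# Count which word follows which in the training text.
follows: defaultdict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("feeling"))  # "sad" follows "feeling" most often here
```

A model like this has no understanding of sadness; it only reproduces the statistically most common continuation, which is why the platform's "everything characters say is made up" warning matters.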
Psychologists have cautioned that AI bots could be giving poor advice to people seeking help, or carry ingrained biases around race or gender. But elsewhere in the medical world, they are starting to be tentatively accepted as tools for coping with the heavy demands on public services. Last year, an AI service called Limbic Access became the first mental health chatbot to gain a UK government medical device certification; it is now used in many NHS trusts to categorize and refer patients.
