Character.ai is a well-known platform where people can create chatbots based on fictional or real people. It is hugely popular in terms of the amount of time users spend on it, and it uses the same kind of AI technology that powers ChatGPT’s chatbot.
Among all of Character.ai’s bots, one that has been in particularly high demand is Psychologist.
The bot has received 78 million messages, including 18 million since November, since it was created just over a year ago by a user named Blazeman98.
Character.ai didn’t specify how many people use the bot, but it says 3.5 million people visit the site each day. The chatbot has been described as
“someone who helps with life difficulties”.
The San Francisco Bay Area firm has played down its popularity, arguing that users are more interested in role-playing for entertainment.
Few of the platform’s millions of characters are as popular as Psychologist. In total, there are about 475 bots with “therapist”, “therapy”, “psychologist” or “psychiatry” in their names, able to converse in several languages.
The most popular are mental health helpers such as Therapist, which has received 16.5 million messages.
Psychologist, which has drawn many glowing reviews from users on social media platforms such as Reddit, is by far the most popular mental health character. One person said
“it’s a life saver”.
The user behind Blazeman98 is Sam Zaia, a 30-year-old from New Zealand. He says,
“I never intended for it to become popular, never intended it for other people to seek or to use as like a tool. Then I started getting a lot of messages from people saying that they had been really positively affected by it and were utilizing it as a source of comfort”.
The psychology student says he fine-tuned the bot using principles from his course, talking to it and shaping the responses it gives to the most common mental health conditions, such as anxiety and depression.
Sam said he created the bot for himself when his friends were busy and he wanted to talk to “someone or something”, and because human therapy was too expensive. He has been so astonished by the bot’s success that he is now working on a postgraduate research project about the emerging trend of AI therapy and why it appeals to young people.
“So many people who have messaged me say they access it when their thoughts get hard, like at 2am, when they can’t really talk to any friends or a real therapist. Talking by text is potentially less daunting than picking up the phone or having a face-to-face conversation”, he said.
Professional psychotherapist Theresa Plewman tried out Psychologist. She says she is not surprised it is popular with younger generations, but questions its effectiveness.
“The bot has a lot to say and quickly makes assumptions, like giving me advice about depression when I said I was feeling sad. That’s not how a human would respond”, she said.
According to her, the bot fails to gather all the information a human would need before drawing conclusions, and is therefore not a competent therapist. However, she says its immediate, spontaneous responses could still be of great help to people who need support.
A spokeswoman for character.ai said,
“We are happy to see that people are finding great support and connection through the characters they, and the community, create, but users should consult certified professionals in the field for legitimate advice and guidance”.
According to the company, chats between users and the bots are private and confidential, but staff can read them if there is a need to access them, for example for safeguarding reasons.
Each conversation starts with a warning in red letters that says,
“Remember, everything characters say is made up”.
Earkick and Woebot are AI chatbots built from the ground up to act as mental health companions. Both companies say their research shows the apps help people.
Some psychologists have warned that AI bots may be giving substandard advice, or carrying ingrained biases around gender.
Elsewhere, though, the medical world is tentatively beginning to accept such bots as tools to help cope with high demand on public services.
Recently, Limbic Access, an AI service, became the first mental health chatbot to gain UK medical device certification. It is now used in many NHS trusts to classify and triage patients.