As AI makes its debut in the mental health field, a wave of new chatbots has appeared, programmed to serve as online therapists. The trend began when some health care professionals developed their own AI chatbots so that their services could remain available after the professionals clock out.
One such product, Abby, is marketed as a personal AI therapist and runs on the same kind of large language model technology that powers OpenAI's ChatGPT. Still, many users remain hesitant to trust these websites and pre-made programs, since AI is a young industry with many open questions about its safety and effectiveness.
However, AI systems that aren't pre-scripted have raised serious problems. These models can feed users harmful or false information, a failure that has been referred to as "model toxicity." In some cases, users of these systems have been told to self-harm as a way to release pressure, and in one case in Florida last year, an AI chatbot allegedly encouraged a 14-year-old boy to take his own life.
"For those exploring the use of AI for mental health support, it is crucial to consult with a trained, licensed practicing counselor. AI may offer promising benefits, but its claims can sometimes be overly ambitious and simplified, non-evidence based, or even incorrect and potentially harmful," the American Counseling Association said.
It is repeated over and over that AI therapy is not the same, and that an AI therapist cannot empathize the way a human therapist would. Critics tend to focus on the ways AI therapy falls short of a human therapist, but for some users it is the better option. Some people simply do not feel comfortable sharing personal information with an actual therapist and would rather vent to a website that won't do anything with that information.
“I think it could be a good thing because some people aren’t comfortable enough to talk to real people and talking to AI first might make it easier to get more comfortable with real people,” NHS freshman Kylie McKenzie said.
That said, there are concerns about the data and information leaks that could happen with AI therapists. Users' conversations are stored in the AI's database and used to generate future responses and scripts. Data storage itself is imperfect and prone to complications, so conversations held in the database could be exposed if it were ever compromised.
“I think that the fact that the conversations could be leaked is dangerous and threatening to people’s online safety. I personally do not think that using AI as a therapist is worth the risks that are shown. In general, there could be both benefits and downsides of using AI as a therapist, but personally, I would find a professional person over AI,” NHS freshman Mia Lourenco said.
According to the American Psychological Association, Connecticut has only 15 to 20 therapists for every 10,000 residents. There is also an international shortage of available therapists, which has made AI therapists far more popular throughout the United States. Many users say they have found the chatbots helpful, or even calming, when they could not reach an actual therapist.
In California, lawmakers have introduced a bill that would ban companies from developing AI chatbots that act like human therapists or health providers. The bill builds on existing laws that prohibit unlicensed people from implying they are certified health care providers, expanding them in order to limit how easily accessible un-scripted AI chatbots are.
For now, the American Counseling Association does not recommend using AI to diagnose mental health conditions. The association says AI lacks the understanding and judgment needed to diagnose someone officially and accurately. Unlike human therapists, AI chatbots cannot make comprehensive diagnoses, weigh a person's complex traits, or take their history and cultural context into account.
“I don’t think this is a good idea because AI in general is just looked down upon, and utilizing it to help with mental issues is just not the thing to do. And I think there are like hotlines you could reach out to if you need help immediately, and if you do not need immediate help, just wait til there is an available therapist,” Bethel High School freshman Vincent Caruso said.
The American Counseling Association also recommends that anyone who does consult an AI therapist or chatbot discuss that decision, and whatever the bot recommends, with a licensed professional who can judge whether the advice is safe or the diagnosis sound. That recommendation undercuts one of the main reasons people are turning to AI therapists in the first place: the chatbots are immediately available online, while finding an available professional can take months.