AI-powered mental health chatbots developed as a therapy support tool | 60 Minutes
Artificial intelligence is being used as a way to help those dealing with depression, anxiety and eating disorders, but some therapists worry some chatbots could offer harmful advice.
“60 Minutes” is the most successful television broadcast in history. Offering hard-hitting investigative reports, interviews, feature segments and profiles of people in the news, the broadcast began in 1968 and is still a hit, over 50 seasons later, regularly making Nielsen’s Top 10.
Subscribe to the “60 Minutes” YouTube channel: http://bit.ly/1S7CLRu
Watch full episodes: http://cbsn.ws/1Qkjo1F
Get more “60 Minutes” from “60 Minutes: Overtime”: http://cbsn.ws/1KG3sdr
Follow “60 Minutes” on Instagram: http://bit.ly/23Xv8Ry
Like “60 Minutes” on Facebook: http://on.fb.me/1Xb1Dao
Follow “60 Minutes” on Twitter: http://bit.ly/1KxUsqX
Subscribe to our newsletter: http://cbsn.ws/1RqHw7T
Download the CBS News app: http://cbsn.ws/1Xb1WC8
Try Paramount+ free: https://bit.ly/2OiW1kZ
For video licensing inquiries, contact: licensing@veritone.com
25 Comments
I’m using an AI chatbot and it really helps.
lol, Woebot is inferior to ChatGPT. Her business is dead.
Care: Mental Health & Therapy has had an incredibly positive impact on my life. This AI-powered app has been amazing in helping me deal with past traumas and track my mood. It offers constant support, which is a lifesaver during tough times. The scientific approach and personalized therapy sessions have significantly improved my well-being. I highly recommend this app to anyone in need of mental health support!
No thanks. I'd rather be alone
As a therapist: we need AI for the documentation side of therapy, not for the delivery of sessions to the client.
AI is highly leftist and fully supports CDC and WHO without question. Also supports ALL protected classes to the point of apologizing for them without any criticism, ever. So this should be perfect for the transmental folks who are highly confused anyway.
Do these companies have malpractice insurance?
First, I understand people's reaction that you don't want to talk to a bot when you are feeling down. However, for people with "lighter" complaints, there are many benefits to talking to an app, like 24/7 availability and low cost. In general, we have to face the fact that healthcare systems are overstretched (for whatever reason) and that we have technology available that can work really well.
The chatbots should only be used for mild mental health complaints. For severe and especially suicidal patients they are outright dangerous and shouldn't be used.
See also my podcast where I talk with a researcher who built her own chatbot and has a critical view on when and how chatbots should be used. https://youtu.be/GkVHPUMwBf8?si=jeBqfeG6Szmz0ico
Whoever thought of this just doesn’t understand therapy
Chatbots can't refuse to do what they are told on moral grounds.
I appreciate the advocacy for mental health apps and recognize some of their potential benefits. They can provide a valuable resource, especially considering the stigma surrounding mental health and the limited availability of therapists at certain times. However, there is irreplaceable value that in-person or teletherapy contact offers. Speaking with a real person, whether face-to-face or via video, allows for a deeper connection and understanding that AI cannot replicate. We are also talking about EMPATHY! AI cannot offer empathy and presence the way we human therapists can. This human element is crucial for many individuals in their mental health journey.
One of the first AI apps was ELIZA, an AI therapist made in the 1960s.
…and what Professional License and responsibility do they have?! How are these records kept confidential…oh yeah…there is no license so it doesn't matter! This is irresponsible.
Says "correct." But shakes her head No.
It's a tool for me to filter out the emotion and get to the real problem. I think the more we grow up, the bigger our egos get, and they get in the way of solving the real problem. As a 26-year-old, I feel like it's harder to find friends who listen to our problems or have the same problems. I've realized grown-ups find it harder to open up about their worries.
The bots can help ease the worry or problem in the moment and help us keep moving. They help us get to a good ending until we have the right time and place to talk things out with friends and family.
It's really adding value so far.
Now I understand better why the 988 line sells their conversations with callers to data companies.
I wonder if the chatbot is biased against people with borderline personality diagnoses. If they are being trained by human therapists, they're probably going to pick up all of the same biases.
AI is a good source for mental health support.
People who need mental health support don't care whether that support comes from a human or a machine.
And AI can provide it anytime, anywhere, no appointments needed.
I speak from experience.
Woebot has got to be the most depressing name they could’ve thought of LOL
This 60 Minutes episode examines chatbots that use AI to support mental health treatment. It offers an overview of how these tools are being integrated into mental health services.
I downloaded it. Turns out it's only available in the US, and only through an invite code. Meaning: absolutely useless. There may be nothing in the app itself besides the login screen. It's all a scam.
The scary part is not the AI, but the people controlling them.
This is useless if you have to go to a clinic or insurance company to get access.
Make it free… totally free and give everyone access…. then they might have something.
What about all the research showing that the therapeutic relationship between client and therapist is a major factor in successful treatment? Might be kinda hard to form a relationship with a computer. What code of ethics does this AI operate by? None? Oh, great! Anyone using this should fear for their data and confidentiality. On top of that, based on the research and the AI having no actual human experience, this is not going to work. Can these people who have no background as therapists please stop trying to make this a thing? On day 1 you learn that for effective therapy, human relationships and experiences are necessary from BOTH parties in the therapeutic relationship.