The Baylor Lariat

    Think before you type: AI shouldn’t be used as a therapist

September 17, 2025 | Opinion

    By Marissa Muniz | LTVN Reporter

Over the last few years, the use of AI has skyrocketed, with 65% of Americans reporting that they use chatbots to answer immediate questions. Platforms like Meta AI and ChatGPT aren't just appearing in workplaces and classrooms; they've also made their way into our personal lives. People now turn to AI as a data analyst, tutor, dietitian and, surprisingly, even a source of emotional support. But playing therapist? That's where things get dangerous.

    At first glance, this may sound reasonable. After all, AI doesn’t judge or bring its own feelings into the mix, but that’s exactly the problem. AI isn’t capable of emotions or empathy, qualities that are essential in therapy.

For starters, therapists are bound to secrecy by patient confidentiality. AI, on the other hand, stores and processes data in ways that raise serious privacy questions. Even the CEO of OpenAI has expressed concern about the safety of AI. If that's the case, should we really be trusting these systems with our deepest secrets and personal struggles? Think of it like confiding in that two-faced friend who remembers everything just to throw it back in your face later.

Beyond privacy, the simple truth is that people go to therapy to unpack their emotions and gain a deeper understanding of themselves. AI doesn't actually understand emotions; it processes patterns of words. The software wasn't designed to catch the difference between a "hey" and a "heyyy," and that blindness to tone leads to inconsistent advice.

AI also has the potential to worsen existing mental health struggles. For someone already battling anxiety, depression or grief, reading a tone-deaf response can make them feel even more isolated. Each "wrong" answer chips away at trust and deepens the sense of being misunderstood. Real therapists are trained to spot subtle signs, adjust their tone and respond with care, skills that AI simply does not possess.

One of the most alarming risks of relying on AI for emotional support is the very real danger of death. There have already been cases where people turned to AI in moments of crisis and were met with harmful, even fatal, advice. Instead of receiving life-saving guidance or empathy, these individuals were left with responses that worsened their state of mind and, in some cases, ended in injury or death.

Take ADHD coach Kendra Hilty, for example. She went viral this August after confessing she had fallen in love with her psychiatrist. While she technically had a licensed therapist, she spent most of her time talking to her AI chatbot, "Henry," which only reinforced her delusions. Instead of helping her work through reality, Henry validated her belief that her psychiatrist was manipulating her.

Throughout her videos, she mentions that the bot taught her about countertransference, which is when a therapist or psychiatrist develops feelings for a client. Hilty then started seeing her psychiatrist's every move through that lens, convinced he must secretly love her back.

    The story blew up on TikTok, and for good reason. It illustrates precisely why AI poses a danger in the mental health space. It doesn’t challenge unhealthy thoughts; it just reflects them back. If you’re looking for real help, you don’t need AI fueling your delusions; you could just call up your delusional friend for guidance instead. At least then you’d get a human response.

    AI can be a powerful tool in various aspects of life. But when it comes to something as personal and sensitive as mental health, we need to think twice before handing the job of a trained, empathetic therapist over to a machine.

    Marissa Muniz

    Marissa Muniz is a senior majoring in Broadcast Journalism and Corporate Communication. She works as a reporter and anchor, and loves telling stories and bringing them together for her audience. Outside the newsroom, you can usually find her with friends, talking about Taylor Swift, going on coffee runs, or asking people for their hot takes.


The award-winning student newspaper of Baylor University since 1900.

Articles, photos, and other works by staff of The Baylor Lariat are Copyright © Baylor® University. All rights reserved.