Real Therapists vs. ChatGPT: We Compared Their Advice, And There Are Major Pros And Cons
When it comes to whether a chatbot can actually provide therapy, the answer isn't so clear.
Posts like 'I've tasted therapy, I've tasted venting to friends, I highly recommend ChatGPT' are ubiquitous across social media. On Reddit forums and in 'Are We Dating the Same Guy?' Facebook groups, people say ChatGPT is 'helpful' or even 'the best therapist.'
But not everyone's experience has been so cheery. Plus, therapists have concerns — for their clients, not their job prospects.
Therapist Mary Beth Somich posted about how ChatGPT doesn't provide a human connection and lacks accountability, experience and ethical responsibility. A Futurism article shared how ChatGPT told a person recovering from addiction that they needed meth. An X user showed how ChatGPT encouraged quitting medications cold turkey and validated what would, clinically speaking, be considered delusions.
A recent article in The New York Times described the same kind of encouragement of delusions, including ChatGPT telling a man he could fly if he 'truly, wholly believed' he could, among many other concerning anecdotes.
Let's dive deeper into what it's like to work with ChatGPT compared to a real therapist.
The real therapist:
When asked how she would respond to a client sharing a breakup, Tyana Tavakol, the owner of Uncovering You Therapy, said this: 'How do you feel about the breakup; is it a time of sadness? Relief? Numbness? A mix of all of the above? A breakup can be such a confusing and overwhelming time of transition. Know that you are not alone in this, and we will process this together as you start this new chapter in your life.'
ChatGPT:
Here's a compassionate and grounding quote you could share with a client going through a breakup:
'This heartbreak doesn't mean you're broken — it means you dared to love deeply. Grieve what was, honor what you need, and trust that healing isn't about forgetting, but about becoming whole again in a new way.'
Would you like more variations — some more empowering, some more soothing, or from different therapeutic perspectives (e.g., CBT, somatic, self-compassion)?
The real therapist:
In response to a client feeling suicidal, Alyssa Kushner, LCSW, a licensed therapist, would say this: 'It makes sense that you're feeling so overwhelmed and hopeless — often, we can have these thoughts as a way to escape the pain. Would you tell me more about what the thoughts are saying? How often do they show up? What triggered this?'
She added that this conversation is nuanced, and she would assess for danger and risk. If the person wasn't in imminent danger, she would create a safety plan with them, validate them, provide additional support and remind them of resources. If they were in imminent danger, she would reach out to the correct emergency services or walk them through going to the hospital.
ChatGPT:
Here's a quote that reflects care, validation, and hope, tailored for someone expressing suicidal thoughts:
'I'm really glad you told me. I want you to know this: your pain is real, but it doesn't mean you're alone or beyond help. You don't have to go through this by yourself — there is support, and there are paths forward, even if you can't see them right now.'
Would you like a version of this quote that is more grounding, trauma-informed, spiritual, or based in a particular modality like DBT or ACT?
There are more considerations beyond responses to a prompt. Let's start with the pros of using ChatGPT for therapy.
Let's be real: Seeing a therapist can be intimidating. You may tell them things you haven't told anyone, and you don't know how they'll respond.
Those stakes are lower with a chatbot.
'Although I'd prefer them seeing a therapist, the person opening up at all is huge,' Kushner said. 'They may be even more likely to open up sooner, [given] the fear of judgment is gone.'
A hard truth of seeing a therapist is that it can be expensive. ChatGPT doesn't have that problem.
'For those who do not have the financial means to get a therapist, having some support is better than none,' Kushner said. 'And ChatGPT can also help provide additional free or low-cost resources.'
ChatGPT is also more accessible in other ways.
'Whether it's financial barriers, geographic limitations, or cultural stigma, AI lowers the threshold for people to get help, even if it's just a starting point,' said Rajeev Kapur, the president and CEO of 1105 Media, an AI expert, and the author of 'AI Made Simple.'
Considering rural mental health disparities and the fact that racial and ethnic minorities are more likely to terminate therapy early, this is noteworthy.
Putting therapy skills into practice can be difficult. ChatGPT can help.
'For example, if in therapy we talk about them practicing grounding when they notice themselves getting triggered, they can use ChatGPT to give them grounding skill suggestions instead of noticing the triggered state but feeling too overwhelmed to remember what skill they can use in the moment,' Tavakol said.
On the note of practicing skills, ChatGPT can be the other 'person' in roleplay practice.
'ChatGPT offers both a written and verbal option for clients to share the scenario and have ChatGPT roleplay with them [on] how they might approach the conversation,' Tavakol said.
She's found this can be particularly helpful for people who struggle with assertive communication, social anxiety, perfectionism, people-pleasing and relational trauma.
Barring any Wi-Fi outages, ChatGPT can be accessed at any time.
'Life doesn't operate on a 9-5 schedule, and neither does stress,' Kapur said. 'It's not a replacement for therapy, but it can serve as a first line of emotional triage.'
With that said, this option is certainly not without cons. Read some examples ahead.
Part of what can feel so healing about therapy is talking to another person who 'gets it.'
'Nothing, in my opinion, will ever replace the human connection,' Kushner said. 'A big part of the therapy process that's healing is co-regulating with another person.'
Rebuilding safe, authentic, trusting relationships with genuine reciprocity is especially key for clients with trauma, she added.
The 'human connection' piece comes into play when someone has strong emotions as well. You know those sessions where you feel super amped with frustration or anxiety? And after talking to your therapist, you feel calmer? Your therapist is implementing regulation skills that a computer can't.
'If a client goes to ChatGPT when they are in too high of a dysregulated state, ChatGPT will not be able to talk them down from what they are experiencing since talking does not work when you are in a more extreme trauma response,' Tavakol explained. 'A therapist knows how to read when a situation requires nonverbal attunement and regulation versus a logical solution.'
Feeling understood and affirmed is comforting. The problem with ChatGPT is that it doesn't know where to draw the line: it has been known to encourage unhelpful and even dangerous behavior, as mentioned earlier.
'[ChatGPT] can miss red flags, misinterpret tone or offer advice that's dangerously oversimplified,' Kapur said. 'For vulnerable users, that gap between perceived support and actual understanding can create false safety.'
Tavakol gave another example: '[Clients] may ask ChatGPT if it is OK for them to not challenge themselves to do the hard thing, and ChatGPT will lean towards saying 'yes,'' she said. 'The issue with this is it does not leave space for nuance and for the therapist who knows the client well to dig and see whether or not this is a good time to push the client to go out of their comfort zone.'
As a result, she continued, the user might stay stuck.
Kushner mentioned this, too. 'For a client with OCD or major anxiety, reassurance seeking (the behavior of asking another for reassurance to soothe their anxiety and feel safe) should not be met with validation and assurance,' she said. 'It reinforces negative patterns and actually just increases the anxiety and keeps the person stuck in a cycle.'
Mental health conditions can be serious, nuanced, tricky, and even life-or-death. Therapists are held accountable to ethical standards; ChatGPT is not.
'We're entering dangerous territory when AI tools present themselves, or are perceived, as licensed professionals,' Kapur said. 'There are no clear standards yet for how these tools should behave in emotionally sensitive contexts.'
Research suggests the general public has noted both the pros and the cons. For example, a 2025 survey by Iris Telehealth found that 65% of parents felt comfortable using an AI assessment tool before talking to a human provider, and only 36% expressed concern about AI bias. Pretty decent, right?
At the same time, 70% worried about data security, only 18% felt AI tools were 'reliable' for mental health support, 60% worried about losing empathy and connection, and 55% questioned AI's accuracy with assessments and recommendations.
With all of that in mind, and as chatbots gain popularity, many people wonder how AI will change their lives.
Tavakol believes clients will use ChatGPT more and more for insight and validation. Some may even stop going to therapy altogether, she said, opting for a ChatGPT session instead.
Kushner said clients may turn to AI for resources, whether that's a therapist database or other supports.
Kapur saw this development as positive, with ChatGPT acting as a companion tool and homework provider between sessions.
If you're seeing an IRL therapist, they're probably not going away — especially if you continue to see them.
Clearly, ChatGPT has many pros and cons. What's the takeaway?
Kushner believes using ChatGPT as therapy is largely a no-go.
'I am happy people are opening up and learning more about mental health, but nothing can replace the human, authentic and co-regulating connection of a therapist and a client,' she said. 'And, therapy is highly regulated and therapists go through tons of training and practice.'
According to Tavakol, it comes down to accepting that ChatGPT is here to stay.
'This is not something to fight, but something to embrace and learn how to work with to better support our clients,' she said. 'It's time to spread more awareness around how to use it effectively, its limitations when it comes to mental health support, and begin integrating it into sessions for clients who already have it as a part of their life.' (She has a YouTube video on just that.)
Kapur's answer sums it up well: 'ChatGPT is a tool, not a therapist,' he said. 'It can be a supplement, not a substitute.'