Teens Are Exploring Relationships & Sexting With AI Chatbots — & Restrictions Aren't Working


Yahoo · 23-05-2025
In news that sounds like science fiction, teens are exploring relationships with artificial intelligence (AI) chatbots — and circumventing any restrictions designed to stop them. Teens are using their digital 'boyfriends' and 'girlfriends' for emotional connection and sexting, and it's becoming a big problem.
According to The Washington Post, teens are having conversations that are romantic, sexually graphic, and even violent on 'AI companion' tools like Character.AI, Replika, Talkie, Talk AI, SpicyChat, and PolyBuzz. General-purpose generative AI platforms like ChatGPT and Meta AI have also added companion-style chat features.
Damian Redman of Saratoga Springs, New York, discovered PolyBuzz on his 8th grader's phone and found that his son was having flirty conversations with female AI anime characters. 'I don't want to put yesterday's rules on today's kids. I want to wait and figure out what's going on,' he told the outlet.
'We're seeing teens experiment with different types of relationships — being someone's wife, being someone's father, being someone's kid. There's game and anime-related content that people are working through. There's advice,' Robbie Torney, senior director of AI programs at family advocacy group Common Sense Media, said in the article. 'The sex is part of it but it's not the only part of it.'
The outlet reported on 10 different AI companions, citing workarounds, paid options, and prompts that teens can use to get past content-restriction filters. That's scary stuff! Even if you are on top of it, it's hard to completely protect your kids from having harmful and/or explicit interactions.
One concerned parent recently took to Reddit, sharing that they had blocked Character.AI on their 14-year-old's phone, only to later find the teen on PolyBuzz.AI. 'I hate to think my child's first romantic (and sexual) interactions are with bots,' they wrote on the Parenting subreddit. 'It's just creepy. Am I the only parent having this problem? Thoughts?'
Some parents suggested focusing on communication with your child instead of trying to block everything. 'We have 'had a conversation' and 'communicated' with our teenage son for YEARS,' one person wrote. 'We've used multiple parental control apps. All for naught. He still finds ways to access what he wants. We're decently tech-savvy, but so is he. And the reality is there's no good way to completely prevent a singularly-minded hormonal teenager from achieving his/her goal.'
Someone else wrote, 'There are more than dozens of these sites out there. Craving connection is a very human thing, which is only amplified in teenage years. Social media can do this which is why getting likes or being popular on social media is so desirable to teens, but this is an entire other drug. Forming 'personal' one on one relationships with AI chatbots is so dangerous. Keep them away from this drug at any cost.'
Experts back up this opinion. In April, Common Sense Media launched an AI Risk Assessment Team to evaluate AI platforms and report on how likely they are to cause harm. Social AI companions like Character.AI, Nomi, and Replika were all rated unacceptable for teen users, as teens were using these platforms to bond emotionally and engage in sexual conversations.
According to Common Sense Media, this research found that the chatbots could generate 'harmful responses including sexual misconduct, stereotypes, and dangerous 'advice' that, if followed, could have life-threatening or deadly real-world impact for teens.'
The organization's experts recommend that no social AI companions be allowed for anyone under the age of 18. They also call for further research and regulation of AI companions, given the emotional and psychological toll they can take on teens, whose brains are still developing.
For now, the best we can do is continue to monitor our teens' phones, keep having conversations about these issues, and advocate for change.

Related Articles

Teens increasingly turning to AI for friendship as national loneliness crisis deepens

Fox News · 3 hours ago

A new study shows that a third of American teenagers prefer chatting with artificial intelligence companions over having real friends. Common Sense Media's report, titled "Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions," revealed how widespread AI companion use is among teens aged 13-17. The report explained further that the "use of AI companions is not a niche interest, but rather mainstream teen behavior" and that teens "find conversations with AI companions to be as satisfying or more satisfying than those with real-life friends."

"AI companions are emerging at a time when kids and teens have never felt more alone," Common Sense Media Founder and CEO James P. Steyer said in the press release. "This isn't just about a new technology — it's about a generation that's replacing human connection with machines, outsourcing empathy to algorithms, and sharing intimate details with companies that don't have kids' best interests at heart. Our research shows that AI companions are far more commonplace than people may have assumed — and that we have a narrow window to educate kids and families about the well-documented dangers of these products."

Although nearly half of teens used AI companions as a tool, the report also stated that 33% of teens use AI companions for social interactions and emotional support. For example, teens would use them for living out relationships, emotional support, role-playing, romantic interactions and friendship.

A writer at Dazed who cited the study raised concerns about the loneliness epidemic among young people and the privacy risks of confiding in chatbots. "Some teenagers are telling AI their most intimate problems and secrets, which poses another problem – it's not a good idea to entrust this information to tech companies, some of whom have an extremely lax approach to data privacy. Would you really want Sam Altman or Elon Musk to have access to the contents of your teenage diary?" James Greig wrote in Dazed.

He added that it underscores a "larger crisis of youth loneliness," as teenagers have stopped hanging out at malls and going to the movies, "which has corresponded with rising rates of depression and anxiety."

"Being able to speak to an AI companion might alleviate the feeling of loneliness, and some people may find it helpful, but if it's becoming a replacement for socializing in the real world, then it risks entrenching the problem," Greig added.

DOGE is reportedly pushing an AI tool that would put half of all federal regulations on a 'delete list'

Engadget · a day ago

According to a report from The Washington Post, DOGE is using an AI tool to analyze federal regulations and determine which to get rid of. A DOGE PowerPoint presentation obtained by the publication notes that its "AI Solution" — reportedly called the DOGE AI Deregulation Decision Tool — found that 100,000 out of over 200,000 regulations "can be deleted." The document sets a September 1 goal deadline for agencies to complete their own deregulation lists using the tool, which it says can be done in under four weeks, and then "DOGE will roll-up a delete list of 50% of all Federal Regulations (100k Regulatory Rules)."

The tool is targeting regulations that are no longer required by law, The Washington Post reports. After it makes its suggestions, staffers would review the proposed deletions before finalizing a plan. According to the PowerPoint, the tool has already been tried out by the Consumer Financial Protection Bureau (CFPB), where it's been used to write "100% of deregulations," and by the Department of Housing and Urban Development (HUD) for decisions on 1,083 regulatory sections.

The Washington Post spoke to three HUD employees who confirmed the tool was recently used. One also said that it got things wrong on several occasions, misreading the language of the law at times. DOGE will reportedly start training other agencies on the tool this month. Head over to The Washington Post to read the full report.

Divisive new app lets women put bad dates on blast — and men are freaking out: 'Digital vigilantism'

New York Post · 2 days ago

There's a new app causing men to break into a cold sweat — and it's not because they forgot their wallet on a first date. Tea, a women-only app that lets users post anonymous Yelp-style reviews of men they've dated, has shot to the top of the Apple App Store — and smack into the middle of a digital war between safety and slander. The platform, launched in 2023, lets women share stories and warnings about exes, Tinder flops, and potential predators.

Users can toss out 'green flags' or 'red flags' — or, in some cases, blast a guy's entire romantic résumé into cyberspace. The feed is full of candid commentary, catfish alerts, and more than a few 'avoid this man' declarations.

'I see men freaking out today about this Tea app,' TikTokker @azalialexi said in a recent video. 'If you don't want things like this to exist then maybe look into advocating for women's safety and actually holding your fellow men accountable.'

Tea's website claims the app was born after its founder, Sean Cook, 'witnessed his mother's terrifying experience with online dating — not only being catfished but unknowingly engaging with men who had criminal records.' It now boasts nearly 1 million users, and it's not just the safety features — like reverse image search and criminal background checks — that are turning heads. The public reviews are what really set the app ablaze.

'It's kind of like a Carfax situation,' Sabrina Henriquez, 28, who found out some of her exes had less-than-stellar ratings on Tea, told The Washington Post in a recent interview. 'It kind of saved [other women] from putting themselves in that situation.'

But not everyone's here for the gossip. 'I think the app has good intentions, it's just very messy,' Donovan James, 21, also told the outlet. 'You're always going to look bad in somebody's eyes.'

Others worry it's turning into digital vigilantism. Apps like these or Facebook groups like 'Are We Dating The Same Guy' are the 'equivalent of whisper networks,' Chiara Wilkinson wrote for Dazed. Or as Dazed writer James Greig put it: 'It's digital vigilantism; the TikTok equivalent of a citizen's arrest.'

Douglas Zytko, a professor at the University of Michigan at Flint, said to The Washington Post that the app is filling a void dating apps never addressed: safety. 'There are multiple studies now showing that around 10 percent of overall cases of sexual assault are attributed to a dating app,' he noted.

Still, false accusations remain a fear. TikTok is now flooded with men nervously scrolling. 'Hot take: The tea app is toxic,' wrote @johnnysaysgo, who had a female friend go undercover to see what women were saying about him. 'These women were clearly just upset… I was honest with them and respectful.'

User @ warned: 'Be careful.' He added that he can see the 'vision' behind the app but noted that he knows 'how vile' people who might use it could be.

And users like @kristakilduff are just enjoying the drama after getting accepted into the app. 'The men are not safe,' she said with a laugh in a recent clip. 'The Tea app has me weak — stay safe.'

The backlash — and buzz — around Tea is just the latest sign that the digital dating landscape is shifting, and not necessarily for the better. As The Post previously reported, not all matches made in algorithm heaven are built to last.

A new study published in Computers in Human Behavior found that married couples who met online reported lower levels of satisfaction and stability than those who met IRL — a phenomenon dubbed the 'online dating effect.' Researchers pointed to factors like geographic distance, delayed family approval and lack of shared social circles as possible causes. So, while dating apps might be great for scoring first dates and flings, they may not always deliver happily ever after.
