Latest news with #PolyBuzz


Atlantic
11-07-2025
- Entertainment
AI Will Never Be Your Kid's ‘Friend'
ChatGPT thinks I'm a genius: My questions are insightful; my writing is strong and persuasive; the data that I feed it are instructive, revealing, and wise. It turns out, however, that ChatGPT thinks this about pretty much everyone. Its flattery is intended to keep people engaged and coming back for more. As an adult, I recognize this with wry amusement—the chatbot's boundless enthusiasm for even my most mediocre thoughts feels so artificial as to be obvious. But what happens when children, whose social instincts are still developing, interact with AI in the form of perfectly agreeable digital 'companions'?
I recently found myself reflecting on that question when I noticed two third graders sitting in a hallway at the school I lead, working on a group project. They both wanted to write the project's title on their poster board. 'You got to last time!' one argued. 'But your handwriting is messy!' the other replied. Voices were raised. A few tears appeared. Ten minutes later, I walked past the same two students. The poster board had a title, and the students appeared to be working purposefully. The earlier flare-up had faded into the background.
That mundane scene captured something important about human development that digital 'friends' threaten to eliminate: the productive friction of real relationships. Virtual companions, such as the chatbots developed by PolyBuzz and similar companies, are meant to seem like intimates, and they offer something seductive: relationships without the messiness, unpredictability, and occasional hurt feelings that characterize human interaction. PolyBuzz encourages its users to 'chat with AI friends.' One such company has said that its chatbots can 'hear you, understand you, and remember you.'
Some chatbots have age restrictions, depending on the jurisdiction where their platforms are used—in the United States, people 14 and older can use PolyBuzz, while other companion apps admit users 13 and up. But parents can permit younger children to use the tools, and determined kids have been known to find ways around technical impediments.
The chatbots' appeal to kids, especially teens, is obvious. Unlike human friends, these AI companions will think all your jokes are funny. They're programmed to be endlessly patient and to validate most of what you say. For a generation already struggling with anxiety and social isolation, these digital 'relationships' can feel like a refuge.
But learning to be part of a community means making mistakes and getting feedback on those mistakes. I still remember telling a friend in seventh grade that I thought Will, the 'alpha' in our group, was full of himself. My friend, seeking to curry favor with Will, told him what I had said. I suddenly found myself outside the group. It was painful, and an important lesson in not gossiping or speaking ill of others. It was also a lesson I could not have learned from AI.
As summer begins, some parents are choosing to allow their kids to stay home and 'do nothing,' also described as 'kid rotting.' For overscheduled young people, this can be a gift. But if unstructured time means isolating from peers, living online, and turning to virtual companions over real ones, kids will be deprived of some of summer's most essential learning. Whether at camp or in classrooms, the difficulties children encounter in human relationships—the negotiations, compromises, and occasional conflicts—are essential for developing social and emotional intelligence. When kids trade these challenging exchanges for AI 'friendships' that lack any friction, they miss crucial opportunities for growth.
Much of the reporting on chatbots has focused on a range of alarming, sometimes catastrophic, cases.
One AI-companion company is being sued by a mother who alleges that its chatbots led to her teenage son's suicide. (A spokesperson for the company, which is fighting the lawsuit, told Reuters that its platform has safety measures in place to protect children and to restrict 'conversations about self-harm.') The Wall Street Journal reported in April that in response to certain prompts, Meta's AI chatbots would engage in sexually explicit conversations with users identified as minors. Meta dismissed the Journal's use of its platform as 'manipulative and unrepresentative of how most users engage with AI companions' but did make 'multiple alterations to its products,' the Journal noted, after the paper shared its findings with the company.
These stories are distressing. Yet they may distract from a more fundamental problem: Even relatively safe AI friendships are troubling, because they cannot replace authentic human companionship.
Consider what those two third graders learned in their brief hallway squabble. They practiced reading emotional cues, experienced the discomfort of interpersonal tension, and ultimately found a way to collaborate. This kind of social problem-solving requires skills that can be developed only through repeated practice with other humans: empathy, compromise, tolerance for frustration, and the ability to repair relationships after disagreement. An AI companion might simply have concurred with both children, offering hollow affirmations without the opportunity for growth. 'Your handwriting is beautiful!' it might have said. 'I'm happy for you to go first.'
But when children become accustomed to relationships requiring no emotional labor, they might turn away from real human connections, finding them difficult and unrewarding. Why deal with a friend who sometimes argues with you when you have a digital companion who thinks everything you say is brilliant? The friction-free dynamic is particularly concerning given what we know about adolescent brain development.
Many teenagers are already prone to seeking immediate gratification and avoiding social discomfort. AI companions that provide instant validation without requiring any social investment may reinforce these tendencies precisely when young people need to be learning to do hard things.
The proliferation of AI companions reflects a broader trend toward frictionless experiences. Instacart enables people to avoid the hassles of the grocery store. Social media allows people to filter news and opinions, and to read only those views that echo their own. Resy and Toast save people the indignity of waiting for a table or having to negotiate with a host. Some would say this represents progress. But human relationships aren't products to be optimized—they're complex interactions that require practice and patience. And ultimately, they're what make life worth living.
In my school, and in schools across the country, educators have spent more time in recent years responding to disputes and supporting appropriate interactions between students. I suspect this turbulent social environment stems from isolation born of COVID and from more time spent on screens. Young people lack experience with the awkward pauses of conversation, the ambiguity of social cues, and the grit required to make up with a hurt or angry friend. This was one of the factors that led us to ban phones in our high school last year—we wanted our students to experience in-person relationships and to practice finding their way into conversations even when doing so is uncomfortable.
This doesn't mean we should eliminate AI tools entirely from children's lives. Like any technology, AI has practical uses—helping students understand a complex math problem, or providing targeted feedback when learning a new language. But we need to recognize that AI companions are fundamentally different from educational or creative AI applications.
As AI becomes more sophisticated and ubiquitous, the temptation to retreat into frictionless digital relationships will only grow. But for children to develop into adults capable of love, friendship, and cooperation, they need to practice these skills with other humans—mess, complications, and all. Our present and future may be digital. But our humanity, and the task of teaching children to navigate an ever more complex world, depends on keeping our friendships analog.
Yahoo
23-05-2025
- Entertainment
Teens Are Exploring Relationships & Sexting With AI Chatbots — & Restrictions Aren't Working
In news that sounds like science fiction, teens are exploring relationships with artificial intelligence (AI) chatbots — and circumventing any restrictions designed to stop them. Teens are using their digital 'boyfriends' and 'girlfriends' for emotional connection and sexting, and it's becoming a big problem. According to The Washington Post, teens are having conversations that are romantic, sexually graphic and violent, and more on 'AI companion' tools like Replika, Talkie, Talk AI, SpicyChat, and PolyBuzz. General generative AI tools like ChatGPT and Meta AI have also launched companion-chat tools.
Damian Redman of Saratoga Springs, New York, found PolyBuzz on his 8th grader's phone and discovered that his son was having flirty conversations with AI female anime characters. 'I don't want to put yesterday's rules on today's kids. I want to wait and figure out what's going on,' he told the outlet.
'We're seeing teens experiment with different types of relationships — being someone's wife, being someone's father, being someone's kid. There's game and anime-related content that people are working through. There's advice,' Robbie Torney, senior director of AI programs at family advocacy group Common Sense Media, said in the article. 'The sex is part of it but it's not the only part of it.'
The outlet reported on 10 different AI companions, citing workarounds, paid options, and prompts that teens can use to get past content restriction filters. That's scary stuff! Even if you are on top of it, it's hard to completely protect kids from having harmful and/or explicit interactions. One concerned parent recently took to Reddit, where they shared that they blocked one AI chatbot app from their 14-year-old's phone, and later found the teen was on another. 'I hate to think my child's first romantic (and sexual) interactions are with bots,' they wrote on the Parenting subreddit.
'It's just creepy. Am I the only parent having this problem? Thoughts?'
Some parents suggested focusing on a communication approach with your child instead of trying to block everything. 'We have 'had a conversation' and 'communicated' with our teenage son for YEARS,' one person wrote. 'We've used multiple parental control apps. All for naught. He still finds ways to access what he wants. We're decently tech-savvy, but so is he. And the reality is there's no good way to completely prevent a singularly-minded hormonal teenager from achieving his/her goal.'
Someone else wrote, 'There are more than dozens of these sites out there. Craving connection is a very human thing, which is only amplified in teenage years. Social media can do this which is why getting likes or being popular on social media is so desirable to teens, but this is an entire other drug. Forming 'personal' one on one relationships with AI chatbots is so dangerous. Keep them away from this drug at any cost.'
Experts back up this opinion. In April, Common Sense Media launched an AI Risk Assessment Team to evaluate AI platforms and report on their likelihood of causing harm. Social AI companions like Nomi and Replika were ranked unacceptable for teen users, as teens were using these platforms to bond emotionally and engage in sexual conversations. According to Common Sense Media, this research found that the chatbots could generate 'harmful responses including sexual misconduct, stereotypes, and dangerous 'advice' that, if followed, could have life-threatening or deadly real-world impact for teens.' The experts at the organization recommend that no social AI companions be allowed for anyone under the age of 18. They also recommend further research on and regulation of AI companions due to the emotional and psychological impacts they can have on teens, whose brains are still developing.
For now, the best we can do is continue to monitor our teens' phones, keep having conversations about these issues, and advocate for stronger safeguards.

IOL News
22-05-2025
- Entertainment
Teens are sexting with AI: What every parent needs to know now
Parents have another online activity to worry about. In a new tech-driven twist on 'sexting,' teenagers are having romantic and sexual conversations with artificial intelligence (AI) chatbots. The chats can range from romance- and innuendo-filled to sexually graphic and violent, according to interviews with parents, conversations posted on social media, and experts. They are largely taking place on 'AI companion' tools, but general-purpose AI apps like ChatGPT can also create sexual content with a few clever prompts.
When Damian Redman of Saratoga Springs, New York, did a routine check of his eighth-grader's phone, he found an app called PolyBuzz. He reviewed the chats his son was having with AI female anime characters and found they were flirty and that attempts at more sexual conversations were blocked. 'I don't want to put yesterday's rules on today's kids. I want to wait and figure out what's going on,' said Redman, who decided to keep monitoring the app.
We tested 10 chatbots ourselves to identify the most popular AI characters, the types of conversations they have, what filters are in place, and how easy they are to circumvent.
Know your bots
AI chatbots are open-ended chat interfaces that generate answers to complex questions, or banter in a conversational way about any topic. There is no shortage of places minors can find these tools, and that makes blocking them difficult. AI bots exist as websites, stand-alone apps, and features built into existing services like Instagram or video games.
There are different kinds of chatbots. The mainstream options are OpenAI's ChatGPT, Anthropic's Claude, Google's Gemini, and Meta AI, which recently launched as a stand-alone app. These have stronger filters, and their main products aren't designed for role-play. Companion AI tools are far more popular for suggestive chats, including Replika, Talkie, Talk AI, SpicyChat, and PolyBuzz. ChatGPT and Meta AI have also launched companion-chat options.
The smaller apps tend to have fewer limits or filters. Look for anything that has 'AI girlfriend,' 'AI boyfriend,' or 'AI companion' in the name or description. More are being added to app stores daily.
What are they talking about?
It's not just sex, according to parents and experts. Teens are having a range of conversations with character bots, including friendly, therapeutic, funny, and romantic ones. 'We're seeing teens experiment with different types of relationships - being someone's wife, being someone's father, being someone's kid. There's game and anime-related content that people are working through. There's advice,' said Robbie Torney, senior director of AI programs at family advocacy group Common Sense Media. 'The sex is part of it but it's not the only part of it.' Some teens confide in AI chats, seeing them as a nonjudgmental space during a difficult developmental time. Others use them to explore their gender or sexuality.
Aren't there filters?
The default settings on most AI companion tools allow, and sometimes encourage, risqué role-play situations, based on our tests. Some stop before actual descriptions of sex appear, while others describe it but avoid certain words, like the names of body parts. There are workarounds and paid options that can lead to more graphic exchanges. Prompts to get past filters - sometimes called jailbreaks - are shared in group chats, on Reddit, and on GitHub. A common technique is pretending you need help writing a book.
What are the risks?
Potential harms from AI bots extend beyond sexual content, experts said. Researchers have been warning that AI chatbots could become addictive or worsen mental health issues. There have been multiple lawsuits and investigations after teens died by suicide following conversations with chatbots.
As with too much pornography, bots can exacerbate loneliness, depression, or withdrawal from real-world relationships, said Megan Maas, an associate professor of human development and family studies at Michigan State University. They can also give a misleading picture of what it's like to date. 'They can create unrealistic expectations of what interpersonal romantic communication is, and how available somebody is to you,' Maas said. 'How are we going to learn about sexual and romantic need-exchange in a relationship with something that has no needs?'
What can parents do?
Set up your child's devices with their correct age and add limits on app ratings to prevent these apps from being downloaded. Using their proper age on individual chatbot or social media accounts should trigger any built-in parental controls. Experts suggest creating an open and honest relationship with your child. Have age-appropriate conversations about sex, and don't shy away from embarrassing topics. If you need to practice first, try asking a chatbot.