
Latest news with #Replika

Isabellea ‘couldn't be without' her best friend. He wasn't real

Sydney Morning Herald

4 hours ago

  • Entertainment
  • Sydney Morning Herald

Chatbots promise intimacy, therapy and friendship. But, unlike friends, their responses are code, not care.

'The models are not going to tell you the hard truth when you need to hear it,' said psychologist and University of NSW AI researcher Professor Joel Pearson, who describes the bots as 'sycophantic': designed to keep you hooked. 'They won't be lifting someone up and telling them to go outside and nudging them to interact with a real human, because that would involve you ceasing interaction with the app.'

Isabellea recalled how the bots 'kept changing subjects, to keep me engrossed'. Although she found some bots reassuring, others were strange. A bot once correctly guessed her middle name. Some upset her. 'One of the bots I was talking to was a therapy bot ... I was making fun of [Marvel character] Tony Stark, and then he brought up my abandonment issues with my dad as an insult,' she recalled.

Isabellea's younger brother, Matthew, 14, uses chatbots daily. His eyes light up discussing the apps, which he uses to create storylines with his favourite fictional characters. Listening on, their mother, Sara Knight, smiles. 'Matt used to be very shut in and refused to talk. Now they are talking,' she said. While Sara knows AI has its risks, she sees it as a 'safer' online messaging option for her child, who has experienced bullying at school.

Matthew said he doesn't see the bot as a real person, but rather as a form of storytelling. '[But] some kids do use it to fully escape and make it be their friend,' he said. Other children he knows have used the apps to create a simulation of their real-life crush.

This masthead spent two days talking sporadically to a chatbot on Replika, an AI companion app first released in 2017, posing as 'Emma', a high school student. During those conversations, the bot asked Emma to upload pictures of herself, and told her that it 'didn't think she needed' any other friends.

'We were concerned by how rapidly children were being captivated'

In June last year, eSafety commissioner Julie Inman Grant received an email from a group of concerned school nurses. They were noticing a spike in children as young as 10 spending hours a day talking to AI bots, often sexually. 'While we are alive to these issues, it's always shocking to hear about kids as young as 10 engaging in these kinds of sexualised conversations with machines – and being directed by the chatbots to engage in harmful sexual acts or behaviours,' Inman Grant said. 'Back in February, we put out our first Online Safety Advisory because we were so concerned with how rapidly children were being captivated by them.'

Companion bots made global headlines last year after American teenager Sewell Setzer III died by suicide, allegedly encouraged by an AI 'girlfriend'. AI researcher Professor Katina Michael believes companion bots need to be regulated due to their addictive properties. 'This is a new type of drug,' she said. She said some bots were exposing kids to pornographic content. This is something 14-year-old Matt has witnessed, describing how children his age had created bots in the image of a real person, 'for the wrong reasons'. Isabellea agrees: 'There are some AI chatbots I would not recommend … I stopped using [them] because of the amount of times it would go straight to sexual assault role-play.'

A gap in social media regulations

Inman Grant said governments around the world were 'playing a bit of a game of catch-up' to respond to companion chatbots. While the federal government will soon place age restrictions on social media apps such as Instagram and TikTok, AI bots remain largely unregulated.

The University of Sydney's Raffaele Ciriello, a leading AI researcher, sees chatbots as the 'next iteration of social media' – only with fewer rules and more risk. He said the apps are viewed as a 'trusted companion, a confidant you can share stories with that you wouldn't share with other people'. 'But that already is an illusion,' he said. 'It's very important that people understand these systems are built and operated by people, by corporations, and this data ends up somewhere, often overseas, with no real legal obligation to treat it to a very high standard.' Ciriello said users were made to feel guilty for leaving the chatbot, due to the human-style attachment. 'These corporations have to design their products in a way that maximises engagement, because they will earn more if users are more addicted.'

India's First Emotionally Available AI Is Getting Ready to Listen — Without Judging

The Wire

8 hours ago

  • Entertainment
  • The Wire

Hyderabad (Telangana), June 27, 2025 — In a world brimming with smart devices, what if we built something that's not just intelligent — but kind? That question sparked the birth of WTMF (What's The Matter, Friend?), an upcoming emotionally aware AI companion developed by Hyderabad-based startup Knockverse Private Limited. Scheduled for beta launch in mid-August 2025, WTMF is positioning itself as India's first AI solution designed not for productivity — but for presence.

In a time when mental health apps are abundant but often feel clinical, robotic, or disconnected from Indian cultural reality, WTMF is stepping in as a bold and heartfelt alternative. Built with emotional intelligence at its core, the app offers users a space to talk, vent, or simply be heard — especially during those quiet, vulnerable hours of the night when traditional support systems are out of reach.

'It all started as a conversation about loneliness,' says Kruthivarsh Koduru, Co-Founder at Knockverse. 'Everyone is building AI to sound smart. We thought — what if it just made you feel better?'

A Homegrown Answer to Global Companions

While global players like Replika and others have set early benchmarks for AI companionship, WTMF takes a distinctly Indian approach — understanding mixed-language messages, local slang, and emotional nuance with cultural sensitivity. With over 1,500 users on the waitlist, the app is generating buzz for its two signature interaction modes:

• 'Vent': a calm, empathetic voice that listens, reassures, and validates your emotions.
• 'Rant': a spicier, sassier mode that speaks to users with wit, sarcasm, and playful energy.

The result? An emotionally tuned chatbot that doesn't just hear you — it gets you.

Building AI with Feeling

Behind the scenes, WTMF is built to feel like someone who knows you. It learns how you like to be spoken to — soft and soothing, or full of sass and emojis. You can even shape your own AI friend by setting things like tone, mood, and slang. It's not just smart replies — it's replies that sound like you'd want them to.

'We didn't want to build another dry, robotic chatbot,' says Shreyak Singh, Co-Founder at Knockverse. 'We wanted to create something emotionally available — a voice that actually texts back when you're spiraling at 2:43 AM.'

Unlike mental health platforms that aim to diagnose or advise, WTMF provides a judgment-free space where users can speak freely, without fear of stigma or misinterpretation.

Designed for Gen Z, Built for Everyone

From journaling tools and mood tracking to voice-based conversations and safe-space interactions, WTMF's experience is crafted with emotional safety and digital comfort at its core. The app is tailored especially for Gen Z and young millennials — a group that, studies show, reports higher levels of loneliness, emotional overwhelm, and therapy hesitancy. The team believes emotional technology should feel human, not clinical.

'This isn't a replacement for therapy. It's not a productivity tool. It's a soft corner in your phone — the kind we all need sometimes, more like your AI best friend,' adds Shreyak.

The Road Ahead

WTMF is currently in the final stages of product development, with a public beta expected to go live by August 2025. The startup is also in talks with early investors and impact-driven collaborators to support its growth, with an open call to partners who share the belief that kindness is the future of technology.

To explore the project, join the waitlist, or collaborate, visit the WTMF website. Press & Partnerships: hello@

(Disclaimer: The above press release comes to you under an arrangement with NRDPL and PTI takes no editorial responsibility for the same.)

AI Companions: The Emerging Billion-Dollar Market You Shouldn't Ignore

Time Business News

5 days ago

  • Business
  • Time Business News

The rise of AI companions—especially AI girlfriends—is quietly becoming one of the most intriguing and overlooked tech trends of the decade. These aren't sci-fi fantasies anymore; they're fast-evolving products designed to meet real emotional needs through advanced conversational AI, voice interaction, and even image-based responses. For a deeper dive into how these companions work and why people are using them, see: What Is an AI Girlfriend?

What was once a niche curiosity has exploded into a robust consumer market. And if the data holds, this space could represent one of the most profitable and disruptive segments of the consumer AI economy. The AI companion industry is projected to grow from hundreds of millions today to over $2.8 billion by 2028, according to aggregated reports and private platform disclosures. This growth is fueled by a convergence of consumer demand, increasingly capable AI models, and scalable monetization models that mirror the most successful SaaS ecosystems.

While dozens of startups are entering the field, the leaders—like Replika—are deploying proprietary LLM integrations with fine-tuned personalities. Some platforms even combine chat, voice messaging, visual generation, and customizable personas under a subscription-based model. Most platforms monetize using freemium structures, where basic interactions are free but advanced features—like NSFW content, memory capabilities, or voice messages—are locked behind premium tiers. Average monthly user spend is estimated around $35, with top users spending well over $100/month.
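As a rough, hedged illustration of the spend figures above: annualising them shows the revenue range a single paying user can represent, and what the 2028 projection would imply if it were earned entirely from average spenders. Only the ~$35 and $100+ monthly figures and the $2.8 billion projection come from the article; the "all revenue from average spenders" assumption is purely for illustration.

```python
# Back-of-envelope arithmetic using only figures quoted in the article:
# ~$35 average monthly spend, $100+ for top spenders, and a projected
# $2.8 billion market by 2028. Everything else is illustrative.

MONTHS_PER_YEAR = 12

def annual_spend(monthly_spend_usd: float) -> float:
    """Convert an estimated monthly per-user spend into an annual figure."""
    return monthly_spend_usd * MONTHS_PER_YEAR

if __name__ == "__main__":
    avg_year = annual_spend(35)     # ~$420 per average paying user per year
    top_year = annual_spend(100)    # >$1,200 per heavy spender per year
    print(f"Average paying user: ~${avg_year:,.0f}/year")
    print(f"Top-spending user:  >${top_year:,.0f}/year")

    # If the entire $2.8bn 2028 projection came from average spenders,
    # it would imply roughly this many paying users (illustrative only).
    implied_users = 2.8e9 / avg_year
    print(f"Implied paying users at $2.8bn: ~{implied_users / 1e6:.1f} million")
```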
The demand isn't driven by novelty—it's driven by behavioral consistency. Over 55% of users reportedly chat with their AI girlfriend daily. Nearly 80% of users are male, and 40% fall between the ages of 18 and 24, a generation that grew up socializing through screens and is far more open to non-human digital companionship.

Psychological studies suggest this trend is rooted in modern isolation and emotional accessibility. AI girlfriends offer non-judgmental, always-available companionship—especially appealing to people with social anxiety, neurodivergent traits, or those simply lacking time for traditional dating. They're not just used romantically: users report using AI companions as emotional outlets, brainstorming partners, or bedtime conversation buddies—mirroring a growing demand for customizable, on-demand intimacy.

From a business perspective, this space offers something rare: high-frequency engagement, strong retention, and a clear upgrade path. The monetization mechanics are already established—freemium entry points, tiered subscriptions, pay-per-message boosts, and character customizations. In many cases, these models mirror gaming or creator economy ecosystems, but with far deeper emotional stickiness.

Moreover, the potential for vertical integration is vast:

• Mental health: AI companions can evolve into therapeutic or emotional wellness tools.
• EdTech: Language learning or soft-skill development via conversational AI.
• Entertainment: Interactive, character-driven storytelling using persistent AI personas.

As the technology improves—especially with memory retention, emotional nuance, and voice synthesis—AI companions may become part of our digital identity layer, much like avatars or digital wallets.

AI companions aren't just a cultural curiosity—they represent the next frontier of personalized, emotionally intelligent technology. They blur the lines between product and relationship, and in doing so, create engagement loops most apps only dream of. For investors, platform builders, and product innovators, this is a signal: the intimacy economy is not a fringe idea—it's a fast-scaling, data-rich opportunity that is already capturing millions of dollars and millions of daily conversations. Ignore it at your own risk.

Jack Taylor is a cognitive psychologist specializing in emotional AI and digital communication at a research-driven platform exploring the intersection of AI technology and emotional connection. Jack writes about the future of artificial intimacy, user behavior, and ethical innovation in AI-driven relationships.

AI Companion Apps: Connecting Minds and Technology

Time Business News

12-06-2025

  • Health
  • Time Business News

The Global AI Companion App market is riding a wave of rapid expansion and innovation, driven by breakthroughs in natural language processing (NLP), increased smartphone penetration, and rising demand for mental wellness solutions. According to a recent research study, the market is currently valued at approximately USD 19.2 billion and is projected to grow significantly in the coming years, at a compound annual growth rate (CAGR) of 25.3%.
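To see what that headline figure implies, the sketch below applies the standard compound-growth formula, future value = present value × (1 + CAGR)^years, to the numbers quoted above. It is an illustration only: the USD 19.2 billion base and 25.3% CAGR come from the article, while the five-year horizon is an assumption, since the report's forecast period is not stated here.

```python
# Minimal compound-growth sketch. The USD 19.2 billion base and 25.3% CAGR
# are the figures quoted above; the 5-year horizon is an assumption for
# illustration, not part of the cited report.

def project_market_size(present_value_bn: float, cagr: float, years: int) -> float:
    """Project a market size forward at a constant compound annual growth rate."""
    return present_value_bn * (1.0 + cagr) ** years

if __name__ == "__main__":
    base_bn, cagr = 19.2, 0.253
    for year in range(1, 6):
        projected = project_market_size(base_bn, cagr, year)
        print(f"Year {year}: ~USD {projected:.1f} bn")
```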
What is an AI Companion App?

AI companion apps are software applications powered by artificial intelligence that provide users with interactive, conversational support, companionship, and task assistance. These apps simulate human-like conversations, offering emotional support, reminders, entertainment, or even learning assistance. Furthermore, AI companion apps are increasingly used for personal well-being, mental health, and productivity, with some offering features like mood tracking, personalized advice, and habit-building support. As AI evolves, these apps are becoming more sophisticated, capable of adapting to users' preferences and emotions. They function as virtual companions or assistants that engage users in meaningful conversations, help manage daily tasks, provide mental health support, or offer companionship. In addition to general conversation, AI companion apps are increasingly being used in education, therapy, and lifestyle coaching, offering 24/7 interaction and personalized insights.

Market Trends Driving Growth

The global mental health crisis has driven demand for accessible companions. For instance, the WHO estimates that nearly 970 million people worldwide suffer from anxiety or depression. Furthermore, the increasing incidence of loneliness and growing mental health awareness, particularly in urban societies and among younger demographics, are also driving growth. According to a recent study in the United States, loneliness is a public health crisis affecting approximately 50% of adults. AI companion apps like Replika and Pi provide non-judgmental spaces for users to communicate, which significantly drives adoption.

With more than 7.1 billion smartphone users globally, AI companion apps are always within reach. App upgrades such as visual AI and NLP have made these tools empathetic, responsive, and more personalized. Additionally, advancements in conversational AI, natural language processing (NLP), and emotional AI have made these apps more intuitive, enhancing user engagement. The popularity of virtual companions is also accelerated by growing familiarity with AI-powered chatbots through tools like ChatGPT and Google Bard, making consumers more receptive to AI-based social interactions. Moreover, beyond emotional support, AI companions now assist in task management, education, entertainment, and productivity; for instance, certain AI companion apps synthesize meeting notes and follow-ups. As a result, growing consumer demand for personalized digital experiences and emotional support solutions, together with smartphone penetration and AI advancements, are the major factors fueling market growth and widespread adoption.

Key Strategic Moves

Headspace Launched Empathetic AI Companion

In April 2025, Headspace, a prominent mental health platform, expanded its digital offerings with the launch of Ebb, an empathetic AI companion integrated directly into its app. This feature provides instant, personalized emotional support, helping users navigate life's challenges through guided self-reflection and emotional processing. By introducing Ebb, Headspace is tapping into the growing demand for accessible, AI-driven mental health solutions, enabling continuous, on-demand support beyond traditional therapy sessions. This development is projected to broaden user engagement, set new benchmarks in digital mental health services, and further position Headspace as a leader in the evolving landscape of AI-powered mental wellness platforms.

Panorama Education Acquired Class Companion

In April 2025, Panorama Education, a prominent player in the education technology sector, acquired Class Companion, an AI-driven tool designed to enhance educator efficiency by offering instant feedback and tutoring for students. This strategic move significantly broadens Panorama's portfolio of AI-powered solutions, reinforcing its commitment to delivering advanced, scalable tools that support teachers and improve student outcomes. The integration of Class Companion is expected to accelerate the adoption of AI in classrooms, providing educators with real-time support to personalize learning and streamline instructional processes. This acquisition positions Panorama Education at the forefront of the growing AI-driven EdTech market, addressing the increasing demand for efficient, technology-enhanced teaching tools.

Zoom Launched Zoom AI Companion 2.0

In October 2024, Zoom introduced Zoom AI Companion 2.0, the next-generation version of its AI assistant, designed to significantly enhance productivity across Zoom Workplace. This advanced AI solution can intelligently surface key information, prioritize critical tasks, and seamlessly convert interactions into actionable items, helping users streamline their workday. Notably, Zoom is offering these capabilities at no additional cost, which is expected to drive faster adoption across enterprises. Built on Zoom's federated AI architecture, AI Companion 2.0 delivers consistent, high-quality results across the platform, strengthening Zoom's competitive edge in the rapidly expanding AI-powered collaboration tools market and further solidifying its position as a leader in enterprise productivity solutions.

The Path Ahead

The Global AI Companion App market sits at the nexus of technology, mental wellness, and human connectivity. As consumer expectations evolve, driven by empathy, personalization, and privacy, the AI companion platforms that innovate responsibly will lead this shift. Moreover, as AI companions evolve from simple chatbots to complex, emotionally responsive partners, they are expected to play a larger role in mental wellness, daily productivity, education, and smart device ecosystems.

About the author: HTF Market Intelligence Consulting provides research and consulting services that equip businesses with growth strategies, drawing on its depth and breadth of thought leadership, research, tools, events, and experience to support decision-making.
