
Latest news with #emotionalSupport

ChatGPT conversations could be shared with court

Russia Today

2 days ago

  • Business

The tech industry has yet to resolve how to protect user privacy in sensitive interactions with AI, CEO of industry leader OpenAI Sam Altman has admitted. Current systems lack adequate safeguards for confidential conversations, he warned, amid a surge in the use of AI chatbots by millions of users – including children – for therapy and emotional support.

Speaking on the This Past Weekend podcast in an episode published last week, Altman said users should not expect legal confidentiality when using ChatGPT, citing the absence of a legal or policy framework governing AI. 'People talk about the most personal sh** in their lives to ChatGPT,' he said. Many AI users – particularly young people – treat the chatbot like a therapist or life coach, turning to it for advice on relationship and emotional issues, Altman said. However, unlike conversations with lawyers or therapists, which are protected by legal privilege or confidentiality, no such protections currently exist for interactions with AI. 'We haven't figured that out yet for when you talk to ChatGPT,' he added.

Altman said the issue of confidentiality and privacy in AI interactions needs urgent attention. 'So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up,' he said.

OpenAI says it deletes free-tier ChatGPT conversations after 30 days; however, some chats may be retained for legal or security reasons. The company is facing a lawsuit from The New York Times alleging copyright infringement in the use of Times articles to train its AI models. The case has compelled OpenAI to preserve conversations from millions of ChatGPT users, except those of enterprise clients, an order the company has appealed, citing 'overreach.'

Recent research has linked ChatGPT use to psychosis in some users. According to researchers, concerns are growing that AI chatbots could exacerbate psychiatric conditions as they are increasingly used in personal and emotional contexts.

Beloved ballpark therapy bunny, Alex the Great, has died after complications from cancer

Al Arabiya

22-07-2025

  • Sport

SAN FRANCISCO (AP) — A beloved therapy bunny named Alex The Great, who provided snuggles and comfort from ballparks to NBA arenas, airports, farmers markets, and even Easter egg hunts and NASCAR races, has died. He was 4.

A floppy-eared Flemish giant who was larger than life in both size and spirit, Alex suffered complications from cancer treatment and died early Monday, his owners said. The rabbit had undergone care at the renowned UC Davis Veterinary Hospital in recent days. Alex appeared June 4 sporting his signature cap for Padres-Giants at Oracle Park. At 4 months old, he attended his first Giants game in April 2021, believed to be the first bunny in the stands at the waterfront ballpark. He loved wearing bow ties and riding in his remote-controlled car, which Alex did in November 2021 following an Arizona Fall League appearance at Scottsdale Stadium, where he saw now-Angels catcher Logan O'Hoppe as a rising prospect.

Owners Kei Kato and Josh Row saved Alex from a slaughterhouse, but really it was the bunny who saved them. They took Alex on all their trips, and he spent hours at San Francisco International Airport with a golden retriever friend offering travelers emotional support. 'He saved us and saved so many people,' Kato said via text message Monday. 'All the stories people are sharing are so overwhelming.'

Fans stopped in awe when they saw Alex The Great wherever he went, often surprised by his size and always eager to snap a photo or selfie. Kato and Row were thrilled to share him with the world because Alex had brought them so much love and joy, and they wanted to spread that to anyone who might need a lift or a smile, or a chance to pet Alex's soft orange fur or give him a hug.

'We remember him well for his surprise frequent visits to the ballpark,' Giants CEO Larry Baer said in a text message. 'We remember the comfort he brought those who loved him and the joy he brought so many.'

Kato lost her brewery restaurant during the pandemic, and adopting Alex provided her with a new purpose. He helped Kato deal with the anxiety and stress of no longer having her main source of income and the fulfillment her business brought. 'I lost it all because of COVID so I've been really stressed a lot,' Kato said at the ballpark that spring night in 2021. 'We support local. I was a local. He's well trained too.'

When Alex became such a hit on the big screen, quick-thinking Daniel Kurish of the Marlins media relations staff went to find the bunny in the seventh inning to deliver some Miami gear. Less than a month later, in May 2021, Alex appeared at a Suns-Warriors game at Chase Center. Of course, they loved him there too. He'd also pop up outside the arena in Thrive City every now and then to greet fans before games.

'Let his legend continue,' Kato and Row wrote on Alex's social media. 'He was very loved.'

UAE experts warn AI may feel like 'real' therapist, delays mental health help

Khaleej Times

12-07-2025

  • Health

For 27-year-old Sara (name changed upon request), ChatGPT was a good resource for work-related help. She used it for fact-checking, clarifying ideas, and getting help on the go. However, it soon began to shift into something more. 'I began using it during emotionally tough situations at work, in the family and even in relationships,' she said. 'I love how it analyses everything like it reads my mind. It gives me another perspective, and I love the reassurance I get from it. It makes me feel like my feelings are valid.'

Over time, Sara started using ChatGPT to reflect on her habits and personality. 'It became like a coach, helping me understand myself better,' she said. 'I'd love to go to therapy one day even if it's just for self-awareness. However, therapy can be expensive and out of budget sometimes. It's really comforting to have something private, discreet, and available 24/7. Especially when I feel a panic attack coming on.'

Experts say the growing trend of young people turning to ChatGPT for mental support is 'not surprising' but extremely 'concerning' for a number of reasons. 'It's not surprising that more young people are turning to AI platforms like ChatGPT for emotional support,' said Dr Alexandre Machado, Clinical Neuropsychologist at Hakkini mental health clinic in Dubai. 'It's easy, anonymous, and always available, kind of like having a friend in your pocket.'

Concerns, hidden dangers

However, the real danger lies hidden, said Dr Waleed Alomar, specialist psychiatrist at Medcare Royal Speciality Hospital in Al Qusais. 'It's concerning that some chatbots are presenting themselves as therapists,' he said. 'While users might [initially] be aware that they are chatting with a bot, many, particularly young people, can easily get carried away and start to feel like they are speaking to a real person or even a licensed professional.'

He added that this is an issue because artificial intelligence does not always recognise the line between everyday sadness and a serious mental health issue. 'Since chatbots lack the credentials to diagnose or treat serious mental health conditions, they cannot connect users with human care when a person genuinely needs a mental health expert's support,' he said. 'While a chatbot may offer a brief sense of relief, it might also delay people from pursuing the professional help they truly need, leaving them feeling even more isolated.'

His comments were echoed by Dr Alexandre, who said there were plenty of case studies showing how dangerous the trend can be. 'For example, a man in Belgium ended his life after being influenced by a bot, and a young boy in the UK once tried to assassinate the queen based on AI advice,' he said. 'These cases show how dangerous it can be to rely on unregulated AI for emotional support.'

Benefits of 'instant' support

Despite the obvious concerns, the experts agree that there are some benefits to having AI as a mental health support tool. 'AI tools are accessible anytime, which they may find especially helpful during those late-night hours when emotions can feel overwhelming,' said Dr Waleed. 'For a generation that has grown up with on-demand services, having support available 'anytime' is a real breakthrough. Also, using AI for mental health support provides a sense of anonymity and a non-judgmental space.'

Dr Alexandre added that while the tools cannot replace a therapist, they can help in some situations. 'But it's important to remember that AI can't adapt like a human can,' he said. 'Use it as a tool, but don't let it take over.'

It's Hard for Britney Spears To See Her Ex Happy With Someone Else, Says 'Source'

Yahoo

10-07-2025

  • Entertainment

Britney Spears has reportedly been having a difficult time letting go of her ex, Sam Asghari, following their split. The former couple finalized their divorce in December 2024, but the pop icon has struggled to move on from the model, who has since found love. A source claimed that the singer still relies on him for emotional support, but he isn't having any of it. Additionally, she doesn't want to see him happy with someone else, but he won't look back.

Britney Spears' ex, Sam Asghari, seemingly doesn't want to hear from her anymore as she continues relying on him even after their split. A source told RadarOnline that she has been a 'trainwreck' and on a 'self-destructive path' since they parted ways. He has reportedly grown 'sick of listening to her' complaining about her family. The insider alleged, 'That's why he left in the first place.' Moreover, Asghari, currently dating Los Angeles real estate agent Brooke Irvine, 'doesn't want to hear it anymore.' The source claimed that he has 'moved on and found love.'

However, it appears it's hard for Spears to see her ex-husband 'finding happiness with somebody else.' She has seemingly struggled to leave their relationship in the past, but the actor and trainer 'is having none of it.' Some allegedly believe 'it's fueling her recent bizarre behavior.'

A second insider further weighed in on Britney Spears' split from Sam Asghari and the impact it has had on her. They claimed she 'has been on this self-destructive path for a while' and expressed concern about the recent state of affairs. 'This is getting seriously scary,' the source stated, adding that she only trusts her ex-husband. As for Asghari, he won't be 'looking back,' remains 'deliriously happy' with his girlfriend, and doesn't 'want to wreck his future with her.' The former couple was married for a little over a year before splitting and eventually divorcing.

The New Corporate Memo: Let AI Ease The Pain

Gizmodo

06-07-2025

  • Business

A troubling new trend is crystallizing in the tech industry. A company at the forefront of AI development lays off thousands of its human employees, then encourages them to seek comfort from the very technology supplanting them. It's the automation of suffering, and it's happening now.

This week, Matt Turnbull, an Executive Producer at Xbox Game Studios Publishing, became a case study. Following Microsoft's decision to cut thousands of jobs from its gaming division, Turnbull took to LinkedIn. With what seems like good intentions, he encouraged former employees to turn to AI tools like ChatGPT and Copilot to manage the emotional and logistical fallout.

'These are really challenging times, and if you're navigating a layoff or even quietly preparing for one, you're not alone and you don't have to go it alone,' his post began. 'I know these types of tools engender strong feelings in people, but I'd be remiss in not trying to offer the best advice I can under the circumstances.' He continued: 'I've been experimenting with ways to use LLM AI tools (like ChatGPT or Copilot) to help reduce the emotional and cognitive load that comes with job loss.'

AI Is Not in the Memo, but It Haunts Every Layoff at Xbox

The message landed with a surreal thud. Microsoft, which just ended your employment, was now outsourcing your emotional support to a bot. The July layoffs hit Xbox Game Studios. Alongside the job cuts, Microsoft announced that ambitious titles like Perfect Dark and Everwild are being canceled, and at least one studio, The Initiative, one of Microsoft's newer, high-profile studios, is being closed entirely.

In his now-deleted post, captured by Aftermath, Turnbull even offered prompt templates to help the newly unemployed start their conversations with the AI.

'These folks are absolute sociopaths,' wrote Julien Eveillé (@PATALOON) on July 4, 2025.

He categorized the prompts like a self-help guide for the digital age:

  • Career Planning
  • Resume & LinkedIn Help
  • Networking & Outreach
  • Emotional Clarity & Confidence

The message is clear: AI is your new therapist and outplacement service, rolled into one. Where a hefty severance package from a large corporation once included connections to human career coaches, AI now appears to be the cheaper, more scalable solution. While the prompts themselves may be useful, the gesture feels hollow coming from a leader at the company responsible for the layoffs. This is a stark redefinition of corporate care: outsourced, AI-assisted, and quietly depersonalized. It's a chilling reframing of the social contract, where even empathy is routed through software.

This is the tech world's cynical feedback loop. The same industry obsessed with automating jobs is now positioning its products as the cure for the emotional damage it inflicts. Microsoft, which has invested over $13 billion in OpenAI, has a direct financial stake in this solution. When an executive at a Microsoft-owned studio promotes ChatGPT or its own Copilot as the first resource for the unemployed, it blurs the line between genuine concern and brand alignment. Empathy becomes a use case. Trauma becomes another customer journey.

Traditionally, outplacement services offered a human touch. As LLMs become more powerful, the corporate pressure to automate post-layoff support will only grow. A chatbot can rewrite your resume, coach you for interviews, and talk you down from a mental spiral, at least in theory. But what gets lost in that shift? What happens to the human dignity of grief, reflection, and real connection during a time of professional crisis?

Even Turnbull acknowledged the tension in his post: 'No AI tool is a replacement for your voice or your lived experience. But at a time when mental energy is scarce, these tools can help get you unstuck faster, calmer, and with more clarity.'

Turnbull's post isn't an isolated incident; it's a flare signaling a major cultural shift in tech, where recovery is being privatized, individualized, and automated. There's a strange, unnerving optimism embedded in all this: the belief that you can prompt your way out of pain. But pain isn't a productivity issue. And a layoff isn't a user experience problem. If the only support a worker receives is from a chatbot trained on the internet's vast archive of trauma, we are witnessing the dawn of something much darker than a downturn. We are seeing the first wave of algorithmic grief management, sanctioned by the very forces that deemed human workers disposable in the first place.
