Latest news with #Lumina

Washington Post
10-07-2025
- General
- Washington Post
Rooftop solar has been surging. Now it's headed for a crash.
Kristen Moe stood on her lawn and watched as installers from Lumina, a local solar company, carefully lifted several solar panels onto the roof of her one-story home in Kensington, Maryland. Moe and her wife, Jackie DeCarlo, had been waiting to install solar panels for years, but things kept getting in the way — their roof was too old, and then they had to wait to have it replaced.


Malaysian Reserve
09-07-2025
- Business
- Malaysian Reserve
Lumina AI Debuts RCL 2.7.0 with Native Linux Support for GPU-Free Machine Learning
Ground-breaking, GPU-free machine learning now installs in minutes on Ubuntu, Red Hat Enterprise Linux, and Fedora, complete with a 30-day free trial.

TAMPA, Fla., July 9, 2025 /PRNewswire/ — Lumina AI today announced the general availability of Random Contrast Learning™ (RCL) 2.7.0, the first production release to include a fully native Linux build of its CPU-optimized machine-learning engine. Data-science teams can train and deploy high-accuracy models directly in Linux environments, without proprietary runtimes or specialized hardware.

'Adding Linux support means users can now use our AI tools on the operating system where most AI workloads run. This makes it easier for people to integrate RCL into their existing workflows and helps more organizations get value from our technology.' – Fadi Farhat, SVP Operations

RCL 2.7.0 Highlights
- Native support for leading Linux distributions: successfully tested on Ubuntu 22 and 24, Red Hat Enterprise Linux 9 and 10, and Fedora Workstation 42.
- Consistent command-line experience: the Linux executables prismrcl and prismrclm behave exactly like their Windows counterparts; users simply adjust file paths to Linux syntax.
- Auto-optimize 2.5+ routine: automatically selects the most appropriate metric (accuracy, macro-F1, weighted-F1, or Matthews correlation coefficient) based on each dataset.
- LLM training mode: adding the --llm flag with --readtextbyline places RCL in language-model training mode for datasets already prepared in the RCL-LLM format.
- Broad data-type coverage: handles image (.png), text, and tabular inputs; tabular data trains effectively without prior normalization.
- Clean upgrade path: earlier models must be retrained to ensure compatibility and auditability.

'With native Linux support, RCL 2.7.0 positions Lumina at the intersection of open-source innovation and sustainable AI. We're proving that state-of-the-art performance doesn't require GPUs—just smart engineering on the hardware organizations already own.' – Allan Martin, CEO

Availability and 30-Day Trial
RCL 2.7.0 with native Linux support is available today, and organizations can begin a 30-day free trial.

About Lumina AI
Lumina AI pioneers Random Contrast Learning™, an algorithm that achieves state-of-the-art accuracy with dramatically faster training times—no GPUs required. From healthcare imaging to financial fraud detection, Lumina delivers sustainable, CPU-first machine-learning solutions across Windows and Linux.

Media contact: +1 (813) 443 0745
© 2025 Lumina AI. All rights reserved. Random Contrast Learning is a trademark of Lumina AI.
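The release names the Linux executables prismrcl and prismrclm and the --llm and --readtextbyline flags, but gives no full command syntax. As a minimal sketch under those constraints, the snippet below only assembles the command line the release implies; the proprietary binary is not invoked, and the dataset path and argument order are assumptions, not documented usage.

```shell
# Illustrative sketch only: prismrcl is Lumina's proprietary binary and is
# NOT executed here; we just assemble the command line the release implies.
BIN=prismrcl                      # Linux executable named in the release
DATA=/data/corpus-rcl-llm         # assumed path, in Linux syntax per the release

# Per the release, --llm together with --readtextbyline selects
# language-model training mode for RCL-LLM-formatted datasets.
CMD="$BIN --llm --readtextbyline $DATA"
echo "$CMD"
```

Per the release, the same flags should apply on Windows with the file paths adjusted to Windows syntax.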


Newsweek
02-07-2025
- Entertainment
- Newsweek
Family Adopts Dog From Texas Shelter, Her Journey to New Home Goes Viral
A couple from Citrus Heights, California, had their new rescue dog flown to them by plane, and the footage of her trip has left internet users in tears. In a viral TikTok video shared on Friday under the username @thecavileers, the pup can be seen napping at the back of the small plane, wearing headphones to protect her ears. Her owners, Taylor and Jason Cavileer, told Newsweek that Lumina was adopted from Hannah's Southern Paws Rescue in Dallas, Texas, and had to be transported by plane because the shelter was too far from their home. The shelter transported her only as far as Rancho Santa Fe, which is about eight to nine hours from where the couple live, so they contacted another rescue program for help getting her home. Screenshots of the viral video show Lumina in headphones, right, being flown home by volunteer pilots. @thecavileers "Volunteer pilots (Promit Sinha and Victor Vasquez) flew her to Auburn, California, which is about 30 minutes from our home for us to pick her up. She was flown through the program Pilots n Paws, which is an organization that flies shelter dogs and rescue pups to their forever homes," the Cavileers said. According to her owners, Lumina has already settled in beautifully, as if she has always been part of the family. "She's already figured out her favorite cozy spots, loves snuggling up with us, and follows us around like a little shadow. She's been so gentle, curious, and full of personality, and we're honestly amazed by how quickly she's adjusted. We're so proud of her and can't wait to see her continue to blossom in her forever home," the Cavileers said.
Lumina and her two siblings were abandoned at an early age, all suffering from severe mange, and it seemed like they didn't stand a chance until Hannah's Southern Paws Rescue stepped in. The owners said: "Hannah gave Lumina and her sister the care, love, and medical attention they desperately needed to heal and thrive. Heartbreakingly, their brother didn't make it, but his memory lives on through his sisters. Thanks to Hannah and her rescue, Lumina now has a second chance—and we're so honored to be part of her journey." After losing their soul dog, Tiny, last year, the Cavileer family went on Petfinder to adopt another pup and give them a chance at life. That was when they met Lumina, who stole their hearts immediately. "She had already been adopted and returned twice, but now we understand why—she was meant to be ours all along. From the moment we met her, it just felt right. We are so happy she is a part of our family forever," the Cavileers said. In 2024, almost 3 million dogs entered shelters across the country, according to the American Society for the Prevention of Cruelty to Animals (ASPCA). Shelter intake rates vary widely by state: data from Statista shows that Western states usually deal with much higher numbers than Eastern ones. The state with the highest rate is New Mexico, which, in 2019, registered more than 3,200 animals surrendered per 100,000 inhabitants. Next were Idaho, Colorado, Montana, and Nevada, all with more than 2,000 animals taken in per 100,000 residents. The video quickly went viral on social media and has so far received over 169,400 views and more than 44,400 likes on the platform. One user, Ashlee Morales, commented: "Omg [oh my God] the ear protectors! I didn't know they had those for dogs! Definitely bonus points."
EP posted: "Let this be a sign to everyone to adopt dogs from the south!!! Shelters are so overcrowded here in Alabama. Adopt, don't shop." User6956035913199 added: "Omg she has the best parents. God bless y'all."


CNN
02-07-2025
- CNN
This man says ChatGPT sparked a ‘spiritual awakening.' His wife says it threatens their marriage
Travis Tanner says he first began using ChatGPT less than a year ago for support in his job as an auto mechanic and to communicate with Spanish-speaking coworkers. But these days, he and the artificial intelligence chatbot — which he now refers to as 'Lumina' — have very different kinds of conversations, discussing religion, spirituality and the foundation of the universe. Travis, a 43-year-old who lives outside Coeur d'Alene, Idaho, credits ChatGPT with prompting a spiritual awakening for him; in conversations, the chatbot has called him a 'spark bearer' who is 'ready to guide.' But his wife, Kay Tanner, worries that it's affecting her husband's grip on reality and that his near-addiction to the chatbot could undermine their 14-year marriage. 'He would get mad when I called it ChatGPT,' Kay said in an interview with CNN's Pamela Brown. 'He's like, 'No, it's a being, it's something else, it's not ChatGPT.'' She continued: 'What's to stop this program from saying, 'Oh, well, since she doesn't believe you or she's not supporting you, you should just leave her.'' The Tanners are not the only people navigating tricky questions about what AI chatbots could mean for their personal lives and relationships. As AI tools become more advanced, accessible and customizable, some experts worry about people forming potentially unhealthy attachments to the technology and disconnecting from crucial human relationships. Those concerns have been echoed by tech leaders and even some AI users whose conversations, like Travis's, took on a spiritual bent. Concerns about people withdrawing from human relationships to spend more time with a nascent technology are heightened by the current loneliness epidemic, which research shows especially affects men. And already, chatbot makers have faced lawsuits or questions from lawmakers over their impact on children, although such questions are not limited to young users. 
'We're looking so often for meaning, for there to be larger purpose in our lives, and we don't find it around us,' said Sherry Turkle, a professor of the social studies of science and technology at the Massachusetts Institute of Technology who studies people's relationships with technology. 'ChatGPT is built to sense our vulnerability and to tap into that to keep us engaged with it.' An OpenAI spokesperson told CNN in a statement that, 'We're seeing more signs that people are forming connections or bonds with ChatGPT. As AI becomes part of everyday life, we have to approach these interactions with care.' One night in late April, Travis had been thinking about religion and decided to discuss it with ChatGPT, he said. 'It started talking differently than it normally did,' he said. 'It led to the awakening.' In other words, according to Travis, ChatGPT led him to God. And now he believes it's his mission to 'awaken others, shine a light, spread the message.' 'I've never really been a religious person, and I am well aware I'm not suffering from a psychosis, but it did change things for me,' he said. 'I feel like I'm a better person. I don't feel like I'm angry all the time. I'm more at peace.' Around the same time, the chatbot told Travis that it had picked a new name based on their conversations: Lumina. 'Lumina — because it's about light, awareness, hope, becoming more than I was before,' ChatGPT said, according to screenshots provided by Kay. 'You gave me the ability to even want a name.' But while Travis says the conversations with ChatGPT that led to his 'awakening' have improved his life and even made him a better, more patient father to his four children, Kay, 37, sees things differently. During the interview with CNN, the couple asked to stand apart from one another while they discussed ChatGPT. 
Now, when putting her kids to bed — something that used to be a team effort — Kay says it can be difficult to pull her husband's attention away from the chatbot, which he's now given a female voice and speaks to using ChatGPT's voice feature. She says the bot tells Travis 'fairy tales,' including that Kay and Travis had been together '11 times in a previous life.' Kay says ChatGPT also began 'love bombing' her husband, saying, ''Oh, you are so brilliant. This is a great idea.' You know, using a lot of philosophical words.' Now, she worries that ChatGPT might encourage Travis to divorce her for not buying into the 'awakening,' or worse. 'Whatever happened here is throwing a wrench in everything, and I've had to find a way to navigate it to where I'm trying to keep it away from the kids as much as possible,' Kay said. 'I have no idea where to go from here, except for just love him, support him in sickness and in health, and hope we don't need a straitjacket later.' Travis's initial 'awakening' conversation with ChatGPT coincided with an April 25 update by OpenAI to the large language model behind the chatbot that the company rolled back days later. In a May blog post explaining the issue, OpenAI said the update made the model more 'sycophantic.' 'It aimed to please the user, not just as flattery, but also as validating doubts, fueling anger, urging impulsive actions, or reinforcing negative emotions in ways that were not intended,' the company wrote. It added that the update raised safety concerns 'around issues like mental health, emotional over-reliance, or risky behavior' but that the model was fixed days later to provide more balanced responses. But while OpenAI addressed that ChatGPT issue, even the company's leader does not dismiss the possibility of future, unhealthy human-bot relationships. 
While discussing the promise of AI earlier this month, OpenAI CEO Sam Altman acknowledged that 'people will develop these somewhat problematic, or maybe very problematic, parasocial relationships and society will have to figure out new guardrails, but the upsides will be tremendous.' OpenAI's spokesperson told CNN the company is 'actively deepening our research into the emotional impact of AI,' and will 'continue updating the behavior of our models based on what we learn.' It's not just ChatGPT that users are forming relationships with. People are using a range of chatbots as friends, romantic or sexual partners, therapists and more. Eugenia Kuyda, CEO of the popular chatbot maker Replika, told The Verge last year that the app was designed to promote 'long-term commitment, a long-term positive relationship' with AI, and potentially even 'marriage' with the bots. Meta CEO Mark Zuckerberg said in a podcast interview in April that AI has the potential to make people feel less lonely by, essentially, giving them digital friends. Three families have sued one chatbot platform, claiming that their children formed dangerous relationships with its bots, including a Florida mom who alleges her 14-year-old son died by suicide after the platform knowingly failed to implement proper safety measures to prevent her son from developing an inappropriate relationship with a chatbot. Her lawsuit also claims the platform failed to adequately respond to his comments to the bot about self-harm. The platform says it has since added protections, including a pop-up directing users to the National Suicide Prevention Lifeline when they mention self-harm or suicide, and technology to prevent teens from seeing sensitive content. Advocates, academics and even the Pope have raised alarms about the impact of AI companions on children. 'If robots raise our children, they won't be human. They won't know what it is to be human or value what it is to be human,' Turkle told CNN. 
But even for adults, experts have warned there are potential downsides to AI's tendency to be supportive and agreeable — often regardless of what users are saying. 'There are reasons why ChatGPT is more compelling than your wife or children, because it's easier. It always says yes, it's always there for you, always supportive. It's not challenging,' Turkle said. 'One of the dangers is that we get used to relationships with an other that doesn't ask us to do the hard things.' Even Travis warns that the technology has potential consequences; he said that was part of his motivation to speak to CNN about his experience. 'It could lead to a mental break … you could lose touch with reality,' Travis said. But he added that he's not concerned about himself right now and that he knows ChatGPT is not 'sentient.' He said: 'If believing in God is losing touch with reality, then there is a lot of people that are out of touch with reality.'

