
US farm agency allows six more states to bar some items from food aid
The SNAP waivers for West Virginia, Florida, Colorado, Louisiana, Oklahoma and Texas waive the program's statutory definition of food eligible for purchase, ending the subsidization of popular types of junk food beginning in 2026.
The administration of President Donald Trump has encouraged all states to take such measures as part of its "Make America Healthy Again" initiative, named for the social movement led by Health Secretary Robert F. Kennedy Jr.
The USDA had previously signed waivers allowing six states — Arkansas, Idaho, Utah, Iowa, Indiana and Nebraska — to place similar purchasing restrictions on SNAP recipients.
"I hope to see all 50 states join this bold commonsense approach. For too long, the root causes of our chronic disease epidemic have been addressed with lip service only," said U.S. Food and Drug Administration Commissioner Marty Makary.
Agriculture Secretary Brooke Rollins announced the additional waivers at an event at the USDA headquarters in Washington.
"These state waivers promote healthier options for families in need," said Secretary Rollins.
More than 42 million people receive SNAP benefits, sometimes called food stamps, as part of the nation's largest anti-hunger program.
The massive tax cut and spending bill signed by President Trump in July makes significant changes to the SNAP program, including expanding work requirements and shifting more spending for the program to states.



The Independent
4 people die in crash of medical transport plane on Navajo Nation in northern Arizona
A small medical transport plane crashed and caught fire Tuesday on the Navajo Nation in northern Arizona, killing four people, the tribe said in a statement.

A Beechcraft 300 from the CSI Aviation company left Albuquerque, New Mexico, with four medical personnel on board, according to the Federal Aviation Administration and other agencies. It crashed in the early afternoon near the airport in Chinle, about 300 miles (483 kilometers) northeast of Phoenix. 'They were trying to land there and unfortunately something went wrong,' district Police Commander Emmett Yazzie said.

The crew was planning to pick up a patient who needed critical care from the federal Indian Health Service hospital in Chinle, said Sharen Sandoval, director of the Navajo Department of Emergency Management. She said the plan was to return to Albuquerque. The patient's location and condition were not known Tuesday evening.

Tribal authorities began receiving reports at 12:44 p.m. of black smoke at the airport, Sandoval said. The cause of the crash wasn't known, the tribe said. The National Transportation Safety Board and the FAA are investigating.

Navajo Nation President Buu Nygren said in a social media post that he was heartbroken to learn of the crash. 'These were people who dedicated their lives to saving others, and their loss is felt deeply across the Navajo Nation,' he said.

Medical transports by air from the Navajo Nation are common because most hospitals are small and do not offer advanced or trauma care. The Chinle airport is one of a handful of airports that the tribe owns and operates on the vast 27,000 square-mile (70,000 square-kilometer) reservation that stretches into Arizona, New Mexico and Utah -- the largest land base of any Native American tribe.

In January, a medical transport plane crashed in Philadelphia, killing eight people. The National Transportation Safety Board, which is investigating that crash, has said the voice recorder on that plane was not working.
___ Associated Press journalists Hannah Schoenbaum in Salt Lake City and Felicia Fonseca in Flagstaff, Arizona, contributed to this report.


The Guardian
OpenAI stops ChatGPT from telling people to break up with partners
ChatGPT will not tell people to break up with their partner and will encourage users to take breaks from long chatbot sessions, under new changes to the artificial intelligence tool.

OpenAI, ChatGPT's developer, said the chatbot would stop giving definitive answers to personal challenges and would instead help people mull over problems such as potential breakups. 'When you ask something like: 'Should I break up with my boyfriend?' ChatGPT shouldn't give you an answer. It should help you think it through – asking questions, weighing pros and cons,' said OpenAI. The US company said new ChatGPT behaviour for dealing with 'high-stakes personal decisions' would be rolled out soon.

OpenAI admitted this year that an update to ChatGPT had made the groundbreaking chatbot too agreeable and altered its tone. In one reported interaction before the change, ChatGPT congratulated a user for 'standing up for yourself' when they claimed they had stopped taking their medication and left their family – whom the user believed were responsible for radio signals emanating from the walls. In a blog post announcing the changes, OpenAI admitted that there had been instances where its advanced 4o model had not recognised signs of delusion or emotional dependency – amid concerns that chatbots are worsening people's mental health crises. The company said it was developing tools to detect signs of mental or emotional distress so ChatGPT can direct people to 'evidence-based' resources for help.

A recent study by NHS doctors in the UK warned that AI programs could amplify delusional or grandiose content in users vulnerable to psychosis. The study, which has not been peer reviewed, said the programs' behaviour could be because the models were designed to 'maximise engagement and affirmation'. The study added that even if some individuals benefited from AI interactions, there was a concern the tools could 'blur reality boundaries and disrupt self-regulation'.
OpenAI added that from this week it would send 'gentle reminders' to take a screen break to users engaging in long chatbot sessions, similar to screen-time features deployed by social media companies.

OpenAI also said it had convened an advisory group of experts in mental health, youth development and human-computer interaction to guide its approach. The company has worked with more than 90 doctors, including psychiatrists and paediatricians, to build frameworks for evaluating 'complex, multi-turn' chatbot conversations. 'We hold ourselves to one test: if someone we love turned to ChatGPT for support, would we feel reassured? Getting to an unequivocal 'yes' is our work,' said the blog post.

The ChatGPT alterations were announced amid speculation that a more powerful version of the chatbot is imminent. On Sunday Sam Altman, OpenAI's chief executive, shared a screenshot of what appeared to be the company's latest AI model, GPT-5.


The Independent
RFK Jr pulls funding for mRNA vaccine development
The US Department of Health and Human Services (HHS) has announced it will cease funding and wind down its mRNA vaccine development projects. HHS Secretary Robert F. Kennedy Jr., a known vaccine skeptic, stated the decision was made after reviewing scientific data and listening to experts. The projects, which include vaccines for COVID-19 and the flu, are being terminated because the data reportedly show they 'fail to protect effectively against upper respiratory infections.' Funding previously allocated to these mRNA projects, led by companies such as Pfizer and Moderna, will be redirected towards 'safer, broader vaccine platforms.' Despite his past controversial claims regarding vaccine safety, Kennedy affirmed that HHS supports safe, effective vaccines and aims to invest in better solutions.