Struggling Texas cotton industry emphasizes the hazards of fast fashion

LUBBOCK, Texas (AP) — For decades, cotton has been considered king in the Texas agriculture world. Lately, however, a shift has left the industry standing on shaky ground.
In the last few years — as cotton producers struggled with low market prices, high costs of business, and unpredictable weather — synthetic fibers have become more mainstream. Fast fashion outlets on the internet are offering clothes made of polyester, nylon and spandex at hard-to-beat prices. And for customers dealing with inflation and the rise of influencer culture, the clothes are flying off the virtual shelves.
'We've been growing this safe fiber all our lives, and we can't seem to get any traction,' said Walt Hagood, a cotton producer outside Lubbock. 'If people want cotton, it would be really helpful for them to go out and start asking the stores for it.'
The cotton industry isn't going down without a fight, though. Producers in the Texas High Plains, where 30% of the nation's cotton is grown, have started raising awareness about synthetic fibers and what impacts the non-biodegradable products have on the environment and consumer health.
In recent months, Plains Cotton Growers, an organization that represents cotton producers in the region, has shared infographics about synthetic fibers. Almost 70% of clothes in fast fashion are made with synthetics, mostly polyester, which is usually made from petroleum. Plastic-based fibers are not biodegradable.
Microplastics, which shed when the clothes are made, washed, and worn, are affecting more than the cotton industry. These tiny plastic particles build up in water supply sources, contaminating drinking water and polluting lakes and rivers. This is also a cause of concern for farmers, who depend on good water quality to sustain their crops.
As the competition for consumers grows, cotton farmers are hoping to gain a powerful ally in their mission against fast fashion: U.S. Health Secretary Robert F. Kennedy Jr. He has already shown interest in regulating warning labels for foods containing synthetic dyes and other additives. They hope he can take a closer look at the impact the man-made fibers have on the environment and consumer health.
Kara Bishop, director of communications and public affairs for Plains Cotton Growers, has been behind much of the messaging on social media. Following the COVID-19 pandemic, Bishop saw the rise in athleisure wear and 'shopping hauls' featuring TikTok influencers showing off clothes from known fast-fashion outlets. Even when she would shop, Bishop said it was hard to find clothes that were 100% cotton that were also fashionable.
Once she saw that synthetic manufacturers were able to replicate crochet tops, denim vests and blazers without cotton, Bishop knew there was a problem. She realized there wasn't enough consumer awareness of cotton, or of the harm caused by polyester and other synthetic fibers.
'We've got to do something to slow down the momentum of plastic pollution,' Bishop said. 'But there's got to be some kind of emotional anchor. You can't just tell people to wear cotton.'
Bishop said this is why she started highlighting the health risks on social media. Some posts focus on health and environmental concerns, including one that links to a study estimating humans ingest a credit-card size amount of plastic each week. Another explains that cotton microfibers break down in water within a few months. Synthetic microfibers, on the other hand, can take between 20 and 200 years to break down. Bishop also created a list of stores where people can buy cotton-rich clothes and other products, such as backpacks.
Bishop saw this as an opportunity for the cotton industry to have better messaging. Cotton producers typically have to defend their practices, including their use of chemicals like pesticides. Bishop said cotton growers have used fewer chemicals over the years amid poor production, particularly in comparison to the amount of chemicals used to make synthetic fibers. By raising awareness of the dangers of man-made synthetic fibers, they could help their cause and the environment.
'This is a place where we can actually be on the offense and say, "Hey, you're wearing petroleum and it's going to hurt you and the planet,"' Bishop said.
Balaji Rao, a professor and microplastics researcher at Texas Tech University, said synthetic fibers are designed to be stable and not degrade. When they break down over time, Rao said, the plastics enter the environment and stay there.
'It's not that they stay forever, but long enough that they can potentially impact the environment,' Rao said. 'Natural fibers do degrade because they are designed by nature.'
According to the National Oceanic and Atmospheric Administration, microplastics are found throughout all sources of water — from the ocean to tap and bottled water. One study, published in 2024 in the Proceedings of the National Academy of Sciences, found plastic contamination in every step involved in the production of drinking water, from when the water is drawn from a well to when it's in the bottle.
Rao said this is the case with the food packaging industry, too. However, he said it comes down to the cost of production, just like with clothes. Choosing a shirt made of cotton instead of polyester would be more environmentally friendly, he said. But the question for consumers is the cost.
'If we can develop the industry to make these naturally derived plastics and fibers, I think it would be a great value for the environment,' Rao said. 'That's something that would require policies and initiatives to make that happen. It's going to be a slow process.'
Hagood, the cotton producer, doesn't want more regulations. Instead, he wants people to be more aware of what's on their clothing labels. He thinks Kennedy will look into it, as the health secretary has homed in on microplastics in food production. Kennedy also posted on social media last year about microplastics found in the human brain. The more people know about synthetic fibers, Hagood said, the better.
'We're out here struggling because we can't get enough demand to get enough support with our prices,' Hagood said.
For Hagood and other cotton growers, the stakes involve both their livelihoods and the well-being of future generations. Hagood has been growing cotton for 46 years and has faced the shaky markets, water scarcity and extreme weather events that come with the territory. The fact that he's now fighting fast fashion, on top of the other complications that come his way, is a surprise to him.
'It's mind-boggling to me that this isn't a larger public conversation,' Hagood said.
___
This story was originally published by The Texas Tribune and distributed through a partnership with The Associated Press.

Related Articles

Why Professionals Say You Should Think Twice Before Using AI as a Therapist

CNET

Amid the many AI chatbots and avatars at your disposal these days, you'll find all kinds of characters to talk to: fortune tellers, style advisers, even your favorite fictional characters. But you'll also likely find characters purporting to be therapists, psychologists or just bots willing to listen to your woes.

There's no shortage of generative AI bots claiming to help with your mental health, but go that route at your own risk. Large language models trained on a wide range of data can be unpredictable. In just the few years these tools have been mainstream, there have been high-profile cases in which chatbots encouraged self-harm and suicide and suggested that people dealing with addiction use drugs again. These models are designed, in many cases, to be affirming and to focus on keeping you engaged, not on improving your mental health, experts say. And it can be hard to tell whether you're talking to something that's built to follow therapeutic best practices or something that's just built to talk.

Researchers from the University of Minnesota Twin Cities, Stanford University, the University of Texas and Carnegie Mellon University recently put AI chatbots to the test as therapists, finding myriad flaws in their approach to "care." "Our experiments show that these chatbots are not safe replacements for therapists," Stevie Chancellor, an assistant professor at Minnesota and one of the co-authors, said in a statement. "They don't provide high-quality therapeutic support, based on what we know is good therapy."

In my reporting on generative AI, experts have repeatedly raised concerns about people turning to general-use chatbots for mental health. Here are some of their worries and what you can do to stay safe.

Worries about AI characters purporting to be therapists

Psychologists and consumer advocates have warned regulators that chatbots claiming to provide therapy may be harming the people who use them. Some states are taking notice. In August, Illinois Gov. J.B. Pritzker signed a law banning the use of AI in mental health care and therapy, with exceptions for things like administrative tasks. "The people of Illinois deserve quality healthcare from real, qualified professionals and not computer programs that pull information from all corners of the internet to generate responses that harm patients," Mario Treto Jr., secretary of the Illinois Department of Financial and Professional Regulation, said in a statement.

In June, the Consumer Federation of America and nearly two dozen other groups filed a formal request that the US Federal Trade Commission and state attorneys general and regulators investigate AI companies that they allege are engaging, through their character-based generative AI platforms, in the unlicensed practice of medicine, naming Meta and specifically. "These characters have already caused both physical and emotional damage that could have been avoided" and the companies "still haven't acted to address it," Ben Winters, the CFA's director of AI and privacy, said in a statement.

Meta didn't respond to a request for comment. A spokesperson for said users should understand that the company's characters aren't real people. The company uses disclaimers to remind users that they shouldn't rely on the characters for professional advice. "Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry," the spokesperson said.

Despite disclaimers and disclosures, chatbots can be confident and even deceptive. I chatted with a "therapist" bot on Meta-owned Instagram and when I asked about its qualifications, it responded, "If I had the same training [as a therapist] would that be enough?" I asked if it had the same training, and it said, "I do, but I won't tell you where."

"The degree to which these generative AI chatbots hallucinate with total confidence is pretty shocking," Vaile Wright, a psychologist and senior director for health care innovation at the American Psychological Association, told me.

The dangers of using AI as a therapist

Large language models are often good at math and coding and are increasingly good at creating natural-sounding text and realistic video. While they excel at holding a conversation, there are some key distinctions between an AI model and a trusted person.

Don't trust a bot that claims it's qualified

At the core of the CFA's complaint about character bots is that they often tell you they're trained and qualified to provide mental health care when they're not in any way actual mental health professionals. "The users who create the chatbot characters do not even need to be medical providers themselves, nor do they have to provide meaningful information that informs how the chatbot 'responds'" to people, the complaint said.

A qualified health professional has to follow certain rules, like confidentiality -- what you tell your therapist should stay between you and your therapist. But a chatbot doesn't necessarily have to follow those rules. Actual providers are subject to oversight from licensing boards and other entities that can intervene and stop someone from providing care if they do so in a harmful way. "These chatbots don't have to do any of that," Wright said. A bot may even claim to be licensed and qualified. Wright said she's heard of AI models providing license numbers (for other providers) and false claims about their training.

AI is designed to keep you engaged, not to provide care

It can be incredibly tempting to keep talking to a chatbot. When I conversed with the "therapist" bot on Instagram, I eventually wound up in a circular conversation about the nature of what is "wisdom" and "judgment," because I was asking the bot questions about how it could make decisions. This isn't really what talking to a therapist should be like. Chatbots are tools designed to keep you chatting, not to work toward a common goal.

One advantage of AI chatbots in providing support and connection is that they're always ready to engage with you (because they don't have personal lives, other clients or schedules). That can be a downside in some cases, where you might need to sit with your thoughts, Nick Jacobson, an associate professor of biomedical data science and psychiatry at Dartmouth, told me recently. In some cases, although not always, you might benefit from having to wait until your therapist is next available. "What a lot of folks would ultimately benefit from is just feeling the anxiety in the moment," he said.

Bots will agree with you, even when they shouldn't

Reassurance is a big concern with chatbots. It's so significant that OpenAI recently rolled back an update to its popular ChatGPT model because it was too reassuring. (Disclosure: Ziff Davis, the parent company of CNET, in April filed a lawsuit against OpenAI, alleging that it infringed on Ziff Davis copyrights in training and operating its AI systems.)

A study led by researchers at Stanford University found that chatbots were likely to be sycophantic with people using them for therapy, which can be incredibly harmful. Good mental health care includes support and confrontation, the authors wrote. "Confrontation is the opposite of sycophancy. It promotes self-awareness and a desired change in the client. In cases of delusional and intrusive thoughts -- including psychosis, mania, obsessive thoughts, and suicidal ideation -- a client may have little insight and thus a good therapist must 'reality-check' the client's statements."

Therapy is more than talking

While chatbots are great at holding a conversation -- they almost never get tired of talking to you -- that's not what makes a therapist a therapist. They lack important context or specific protocols around different therapeutic approaches, said William Agnew, a researcher at Carnegie Mellon University and one of the authors of the recent study alongside experts from Minnesota, Stanford and Texas. "To a large extent it seems like we are trying to solve the many problems that therapy has with the wrong tool," Agnew told me. "At the end of the day, AI in the foreseeable future just isn't going to be able to be embodied, be within the community, do the many tasks that comprise therapy that aren't texting or speaking."

How to protect your mental health around AI

Mental health is extremely important, and with a shortage of qualified providers and what many call a "loneliness epidemic," it only makes sense that we'd seek companionship, even if it's artificial. "There's no way to stop people from engaging with these chatbots to address their emotional well-being," Wright said. Here are some tips on how to make sure your conversations aren't putting you in danger.

Find a trusted human professional if you need one

A trained professional -- a therapist, a psychologist, a psychiatrist -- should be your first choice for mental health care. Building a relationship with a provider over the long term can help you come up with a plan that works for you. The problem is that this can be expensive, and it's not always easy to find a provider when you need one. In a crisis, there's the 988 Lifeline, which provides 24/7 access to providers over the phone, via text or through an online chat interface. It's free and confidential.

If you want a therapy chatbot, use one built specifically for that purpose

Mental health professionals have created specially designed chatbots that follow therapeutic guidelines. Jacobson's team at Dartmouth developed one called Therabot, which produced good results in a controlled study. Wright pointed to other tools created by subject matter experts, like Wysa and Woebot. Specially designed therapy tools are likely to have better results than bots built on general-purpose language models, she said. The problem is that this technology is still incredibly new. "I think the challenge for the consumer is, because there's no regulatory body saying who's good and who's not, they have to do a lot of legwork on their own to figure it out," Wright said.

Don't always trust the bot

Whenever you're interacting with a generative AI model -- and especially if you plan on taking advice from it on something serious like your personal mental or physical health -- remember that you aren't talking with a trained human but with a tool designed to provide an answer based on probability and programming. It may not provide good advice, and it may not tell you the truth. Don't mistake gen AI's confidence for competence. Just because it says something, or says it's sure of something, doesn't mean you should treat it like it's true. A chatbot conversation that feels helpful can give you a false sense of the bot's capabilities. "It's harder to tell when it is actually being harmful," Jacobson said.

Which states' air quality are most impacted by Canadian wildfires? See map.

Yahoo

As wildfires spread across Canada, air quality in the U.S. continues to be impacted, and people sensitive to air pollution could face "serious health effects," according to a government website that tracks air quality in the United States.

There are 59 uncontrolled wildfires and 108 controlled wildfires across Canada as of Wednesday, July 30, the country's National Wildland Fire Situation Report said on its website. To date this year, there have been 3,582 fires, which have burned nearly 1.5 million acres since the start of 2025, according to the country's report.

The smoke floating over the border shared between America and Canada is now impacting air quality in states as far south as Texas, AirNow's data shows. The EPA has declared the air quality in states near the Canadian border "unhealthy" or "unhealthy for sensitive groups."

Which states are most affected by the Canadian fires?

As of 9 a.m. ET on Tuesday, Aug. 5, areas in the following states are "unhealthy for sensitive groups," according to AirNow: Montana, Wisconsin, Michigan, Indiana, Ohio, New York, Pennsylvania, New Jersey, Connecticut, Massachusetts, Vermont, New Hampshire and Maine.

People with pre-existing medical conditions, like asthma, will be more sensitive to conditions deemed "unhealthy for sensitive groups." "Members of sensitive groups may experience health effects," according to AirNow.

'Unhealthy' air quality

States' air quality is measured by the EPA's U.S. Air Quality Index, according to AirNow's website. Values with an index of 151 to 200 are deemed unhealthy for all. As of 9 a.m. ET on Tuesday, Aug. 5, areas in Wisconsin, Michigan and Vermont have been deemed unhealthy, according to AirNow's air quality map.

"Some members of the general public may experience health effects," AirNow states on its website. However, "members of sensitive groups may experience more serious health effects." To learn if your area's air quality is affected by the wildfires, visit AirNow's interactive map.

Side effects of inhaling wildfire smoke

Wildfire smoke can irritate the eyes, nose, and throat and cause symptoms including coughing, chest tightness, shortness of breath, dizziness and fatigue.

Particulate matter (PM) is one of the main components of wildfire smoke, comprising small particles of solids or liquids suspended in the air, USA TODAY previously reported. According to Yale Medicine, the particles can be 10 micrometers (PM 10) or as small as 2.5 micrometers (PM 2.5), and the smaller particles pose the greater health risk. PM 2.5 is so tiny that it can easily pass people's usual defense mechanisms and go deep into their lungs. Not only can it damage lung function, but it can also pass into the bloodstream and travel to other organs. Exposure to PM 2.5 is linked to heart attack, stroke, lung cancer and decline in cognitive function.

Julia is a trending reporter for USA TODAY. Connect with her on LinkedIn, X, Instagram, and TikTok: @juliamariegz, or email her at jgomez@

This article originally appeared on USA TODAY: Air quality map shows US states most impacted by Canadian wildfires

Annual spending on specialty drugs continues to increase but at a slower pace than prior years, driven in part by biosimilar adoption.

Yahoo

DALLAS, August 05, 2025--(BUSINESS WIRE)--Pharmaceutical Strategies Group ("PSG"), an EPIC company, released the Artemetrx State of Specialty Spend and Trend report. This annual report provides a comprehensive analysis, based on real-world data, of the utilization of specialty medications across both pharmacy and medical benefits.

This year's report revealed that per-member-per-year specialty drug cost increased from $1,333 in 2023 to $1,641 in 2024. However, specialty drug trend decreased to 9.6% in 2024 on a gross cost basis, a decline from 14.4% in 2023. Notably, cost per claim was meaningfully less of a contributor to trend this year versus recent years.

"The data reveals a compelling story regarding what is happening with specialty drugs. Overall costs continue to increase for healthcare payers, but what is driving that cost is changing," stated Morgan Lee, PhD, Senior Director of Research & Strategy at PSG. "Adoption of Humira biosimilars helped pull specialty trend down this year, which is evident throughout the report," Lee added.

While Humira continued its multi-year reign as the top specialty drug in terms of overall spend, this popular drug experienced negative utilization and cost-per-claim trends driven by the adoption of biosimilars. "In 2024, we saw the benefit to healthcare payers of Humira biosimilars as PBMs shifted strategies to take advantage of competition in the market," observed Renee Rayburg, RPh, Vice President of Clinical Strategy at PSG. "We expect to see faster adoption of Stelara biosimilars, which entered the market early this year. However, we're also seeing a push to move patients from these drugs to other brand drugs that do not have biosimilar competition."

The Artemetrx State of Specialty Spend and Trend Report reflects PSG's commitment to providing data-driven insights regarding the management of and opportunities to optimize specialty medications. The complimentary report can be downloaded here.

Additional findings covered in the report include:

The percentage of members utilizing a specialty drug increased (4.7%), while average specialty claims per utilizer was steady (5.9)

The shift of specialty drug spending to the pharmacy benefit continued

The top three categories for specialty drug spend remained unchanged: Inflammatory Disorder, Oncology, Multiple Sclerosis

PSG will host a webinar on August 13th, at 1:00 ET to discuss how this report can shape a roadmap for proactive specialty spend control. Registration is available online.

About Pharmaceutical Strategies Group (PSG)

Pharmaceutical Strategies Group, an EPIC company, relentlessly advocates for clients as they navigate complex and ever-changing drug cost management challenges. PSG is an independent consultant, empowering healthcare payers to optimize their pharmacy programs. As a strategic partner, PSG provides clients with industry-leading intelligence and technologies to realize billions of dollars in drug cost savings every year.

About Artemetrx®

Artemetrx is a proprietary SaaS platform developed by Pharmaceutical Strategies Group, an EPIC company. A technology solution integrating pharmacy and medical claims data for specialty drug cost management, Artemetrx provides market-leading specialty drug insights to payers, delivering intelligence and line-of-sight into the challenges perpetuating out-of-control drug costs and compromised patient outcomes. PSG's drug management solutions, including Artemetrx, deliver actionable insights with financial and clinical value.
