NAACP files intent to sue Elon Musk's xAI company over supercomputer air pollution
The xAI data center began operating last year, powered by pollution-emitting gas turbines, without first applying for a permit. Officials have said an exemption allowed the turbines to run for up to 364 days without a permit, but Southern Environmental Law Center attorney Patrick Anderson said at a news conference that no such exemption exists for turbines, and that regardless, more than 364 days have now passed.
The SELC is representing the NAACP in its legal challenge against xAI and its permit application, now being considered by the Shelby County Health Department.
Musk's xAI said the turbines will be equipped with technology to reduce emissions, and that the company is already boosting the city's economy by investing billions of dollars in the supercomputer facility, paying millions in local taxes and creating hundreds of jobs. It also is spending $35 million to build a power substation and $80 million to build a water recycling plant to support Memphis Light, Gas and Water, the local utility.
Opponents say the supercomputing center is stressing the power grid, and experts say the turbines emit carbon dioxide, smog-forming nitrogen oxides that cause lung irritation, and the carcinogen formaldehyde.
The chamber of commerce in Memphis made a surprise announcement in June 2024 that xAI planned to build a supercomputer in the city. The data center quickly set up shop in an industrial park in south Memphis, near factories and a gas-fired power plant operated by the Tennessee Valley Authority.
The SELC has claimed the use of the turbines violates the Clean Air Act, and that residents who live near the xAI facility already face cancer risks at four times the national average. The group also has sent a petition to the Environmental Protection Agency.
Critics say xAI installed the turbines without any oversight or notice to the community. The SELC also hired a firm to fly over the site, and the aerial survey showed 35 turbines on the property, not the 15 the company requests in its permit.
The permit itself says emissions from the site 'will be an area source for hazardous air pollutants.' A permit would allow the health department, which has received 1,700 public comments about the permit, to monitor air quality near the facility.
At a community gathering hosted by the county health department in April, many of the people speaking in opposition cited the additional pollution burden in a city that already received an 'F' grade for ozone pollution from the American Lung Association.
A statement read by xAI's Brent Mayo at the meeting said the company wants to 'strengthen the fabric of the community,' and estimated that tax revenues from the data center are likely to exceed $100 million by next year.
'This tax revenue will support vital programs like public safety, health and human services, education, firefighters, police, parks and so much more,' said the statement, a copy of which was obtained by the Associated Press.
The company also apparently wants to expand: The chamber of commerce said in March that xAI had purchased a 1-million-square-foot property at a second location, not far from the current facility.
One nearby neighborhood dealing with decades of industrial pollution is Boxtown, a tight-knit community founded by freed slaves in the 1860s. It was named Boxtown after residents used material dumped from railroad boxcars to fortify their homes. The area features houses, wooded areas and wetlands, and its residents are mostly working class.
Boxtown won a victory in 2021 against two corporations that sought to build an oil pipeline through the area. Valero and Plains All American Pipeline canceled the project after protests by residents and activists led by State Rep. Justin J. Pearson, who called it a potential danger to the community and an aquifer that provides clean drinking water to Memphis.
Pearson, who represents nearby neighborhoods, said 'clean air is a human right' as he called for people in Memphis to unite against xAI.
'There is not a person, no matter how wealthy or how powerful, that can deny the fact that everybody has a right to breathe clean air,' said Pearson, who compared the fight against xAI to David and Goliath.
'We're all right to be David, because we know how the story ends,' he said.
Sainz writes for the Associated Press. AP writer Travis Loller contributed to this report from Nashville, Tenn.
