Advanced AI models generate up to 50 times more CO₂ emissions than more common LLMs when answering the same questions
The more accurate we try to make AI models, the bigger their carbon footprint, with some models producing up to 50 times more carbon dioxide emissions than others, a new study has revealed.
Reasoning models, such as Anthropic's Claude, OpenAI's o3 and DeepSeek's R1, are specialized large language models (LLMs) that dedicate more time and computing power to produce more accurate responses than their predecessors.
Yet, despite some impressive results, these models have been shown to face severe limitations when cracking complex problems. Now, a team of researchers has highlighted another constraint on their performance: an exorbitant carbon footprint. The findings were published June 19 in the journal Frontiers in Communication.
"The environmental impact of questioning trained LLMs is strongly determined by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions," study first author Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences in Germany, said in a statement. "We found that reasoning-enabled models produced up to 50 times more CO₂ emissions than concise response models."
To answer the prompts given to them, LLMs break language into tokens: word chunks that are converted into strings of numbers before being fed into neural networks. During training, these networks learn the probabilities of certain patterns of tokens appearing, and they then use those probabilities to generate responses.
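As a toy illustration only (real LLMs use learned subword tokenizers such as byte-pair encoding, not whitespace splits), the idea of mapping word chunks to numbers can be sketched like this:

```python
def toy_tokenize(text, vocab):
    """Map each word chunk to a numeric ID, growing the vocabulary as needed."""
    ids = []
    for chunk in text.lower().split():
        if chunk not in vocab:
            vocab[chunk] = len(vocab)  # assign the next unused ID
        ids.append(vocab[chunk])
    return ids

vocab = {}
print(toy_tokenize("the cat sat on the mat", vocab))  # [0, 1, 2, 3, 0, 4]
```

Note that the repeated word "the" maps to the same ID both times it appears; a real tokenizer works the same way, just over a learned subword vocabulary.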
Reasoning models go further, attempting to boost accuracy with a process known as "chain-of-thought." This technique breaks one complex problem into smaller, more digestible intermediate steps that follow a logical flow, mimicking how humans might work through the same problem.
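A hypothetical pair of answers (not taken from the study) shows why chain-of-thought responses cost more: the intermediate steps are themselves generated text, and every extra token means extra computation.

```python
concise_answer = "23 x 17 = 391."
cot_answer = (
    "First, split the problem: 23 x 17 = 23 x 10 + 23 x 7. "
    "23 x 10 = 230 and 23 x 7 = 161. "
    "Adding them, 230 + 161 = 391."
)

# Rough token proxy: whitespace-separated chunks (real tokenizers differ).
print(len(concise_answer.split()), len(cot_answer.split()))
```

The chain-of-thought answer uses several times as many chunks to reach the same result, which is exactly the pattern the study measured at scale.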
However, these models have significantly higher energy demands than conventional LLMs, posing a potential economic bottleneck for companies and users wishing to deploy them. Yet, despite some research into the environmental impacts of growing AI adoption more generally, comparisons between the carbon footprints of different models remain relatively rare.
To examine the CO₂ emissions produced by different models, the scientists behind the new study asked 14 LLMs 1,000 questions across different topics. The different models had between 7 and 72 billion parameters.
The computations were performed using the Perun framework (which analyzes LLM performance and energy use) on an NVIDIA A100 GPU. The team then converted energy usage into CO₂ by assuming that each kilowatt-hour of energy produces 480 grams of CO₂.
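Under the paper's assumed grid intensity of 480 grams of CO₂ per kilowatt-hour, the conversion is a single multiplication; a minimal sketch:

```python
GRID_INTENSITY_G_PER_KWH = 480  # grid carbon intensity assumed in the study

def energy_to_co2_grams(kwh: float) -> float:
    """Convert measured energy use (kWh) into grams of CO2."""
    return kwh * GRID_INTENSITY_G_PER_KWH

print(energy_to_co2_grams(2.0))  # 2 kWh of GPU time -> 960.0 g of CO2
```

The figure would scale up or down with the real carbon intensity of whatever grid powers the hardware, which is why the authors stress their numbers are estimates.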
Their results show that, on average, reasoning models generated 543.5 tokens per question compared to just 37.7 tokens for more concise models. These extra tokens — amounting to more computations — meant that the more accurate reasoning models produced more CO₂.
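From the averages reported above, reasoning models generated roughly 14 times as many tokens per question as concise models:

```python
reasoning_tokens = 543.5  # average tokens per question, reasoning models
concise_tokens = 37.7     # average tokens per question, concise models

ratio = reasoning_tokens / concise_tokens
print(round(ratio, 1))  # 14.4
```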
The most accurate model was the 72-billion-parameter Cogito model, which answered 84.9% of the benchmark questions correctly. Cogito also emitted three times as much CO₂ as similarly sized models made to generate more concise answers.
"Currently, we see a clear accuracy-sustainability trade-off inherent in LLM technologies," said Dauner. "None of the models that kept emissions below 500 grams of CO₂ equivalent [total greenhouse gases released] achieved higher than 80% accuracy on answering the 1,000 questions correctly."
But the issues go beyond accuracy. Questions that required lengthy reasoning, such as those in algebra or philosophy, produced up to six times more emissions than straightforward look-up queries.
The researchers' calculations also show that emissions depended on the model chosen. To answer 60,000 questions, DeepSeek's 70-billion-parameter R1 model would produce as much CO₂ as a round-trip flight between New York and London. Alibaba Cloud's 72-billion-parameter Qwen 2.5 model, however, could answer the same questions with similar accuracy for about a third of the emissions.
The study's findings aren't definitive; emissions vary with the hardware used and with the energy grid supplying its power, the researchers emphasized. But the results should prompt AI users to think before deploying the technology, they noted.
"If users know the exact CO₂ cost of their AI-generated outputs, such as casually turning themselves into an action figure, they might be more selective and thoughtful about when and how they use these technologies," Dauner said.