
What AI's insatiable appetite for power means for our future
By Kurt Knutsson, CyberGuy Report
Published June 20, 2025
Every time you ask ChatGPT a question, generate an image or let artificial intelligence summarize your email, something big is happening behind the scenes. Not on your device, but in sprawling data centers filled with servers, GPUs and cooling systems that require massive amounts of electricity.
The modern AI boom is pushing our power grid to its limits. ChatGPT alone processes roughly 1 billion queries per day, each requiring data center resources far beyond what's on your device.
In fact, the energy needed to support artificial intelligence is rising so quickly that it has already delayed the retirement of several coal plants in the U.S., with more delays expected. Some experts warn that the AI arms race is outpacing the infrastructure meant to support it. Others argue it could spark long-overdue clean energy innovation.
AI isn't just reshaping apps and search engines. It's also reshaping how we build, fuel and regulate the digital world. The race to scale up AI capabilities is accelerating faster than most infrastructure can handle, and energy is becoming the next major bottleneck.
Here's a look at how AI is changing the energy equation, and what it might mean for our climate future.
Sign up for my FREE CyberGuy Report
Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you'll get instant access to my Ultimate Scam Survival Guide – free when you join.
Why AI uses so much power, and what drives the demand
Running artificial intelligence at scale requires enormous computational power. Unlike traditional internet activity, which mostly involves pulling up stored information, AI tools perform intensive real-time processing. Whether training massive language models or responding to user prompts, AI systems rely on specialized hardware like GPUs (graphics processing units) that consume far more power than legacy servers. GPUs are designed to handle many calculations in parallel, which is perfect for the matrix-heavy workloads that power generative AI and deep learning systems.
To give you an idea of scale: one Nvidia H100 GPU, commonly used in AI training, consumes up to 700 watts on its own. Training a single large AI model like GPT-4 may require thousands of these GPUs running continuously for weeks. Multiply that across dozens of models and hundreds of data centers, and the numbers escalate quickly. A traditional data center rack might use around 8 kilowatts (kW) of power. An AI-optimized rack using GPUs can demand 45-55 kW or more. Multiply that across an entire building or campus of racks, and the difference is staggering.
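To see how those figures compound, here is a rough back-of-envelope calculation using the numbers above. The cluster size and training duration are illustrative assumptions, not figures from any specific model:

```python
# Back-of-envelope power math using the figures quoted above.
# GPU count and training duration are illustrative assumptions.

H100_WATTS = 700                 # peak draw of one Nvidia H100 GPU
GPUS_PER_TRAINING_RUN = 10_000   # assumed cluster size for a large model
TRAINING_WEEKS = 4               # assumed continuous training time

hours = TRAINING_WEEKS * 7 * 24
# Energy for the GPUs alone, in megawatt-hours (ignores cooling overhead)
training_mwh = H100_WATTS * GPUS_PER_TRAINING_RUN * hours / 1e6

# Rack comparison from the article
traditional_rack_kw = 8
ai_rack_kw = 50                  # midpoint of the 45-55 kW range

print(f"GPU energy for one training run: {training_mwh:,.0f} MWh")
print(f"An AI rack draws roughly {ai_rack_kw / traditional_rack_kw}x a traditional rack")
```

Even under these conservative assumptions, a single training run consumes thousands of megawatt-hours before cooling is counted.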
Cooling all that hardware adds another layer of energy demand. Keeping AI servers from overheating accounts for 30-55% of a data center's total power use. Advanced cooling methods like liquid immersion are helping, but scaling those across the industry will take time.
On the upside, AI researchers are developing more efficient ways to run these systems. One promising approach is the "mixture of experts" model architecture, which activates only a portion of the full model for each task. This method can significantly reduce the amount of energy required without sacrificing performance.
How much power are we talking about?
In 2023, global data centers consumed about 500 terawatt-hours (TWh) of electricity. That is enough to power every home in California, Texas and Florida combined for an entire year. By 2030, the number could triple, with AI as the main driver.
To put it into perspective, the average home uses about 30 kilowatt-hours per day. One terawatt-hour is a billion times larger than a kilowatt-hour. That means 1 TWh could power 33 million homes for a day.
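The unit conversion above is easy to verify. A quick sketch, using the article's own 30 kWh-per-day household figure:

```python
# Sanity-check the article's unit math.
KWH_PER_HOME_PER_DAY = 30    # average U.S. home, per the article
KWH_PER_TWH = 1e9            # 1 terawatt-hour = 1 billion kilowatt-hours

# How many homes 1 TWh could power for one day
homes_powered_one_day = KWH_PER_TWH / KWH_PER_HOME_PER_DAY

# 2023 global data center consumption, per the article
datacenter_twh_2023 = 500
home_days = datacenter_twh_2023 * homes_powered_one_day

print(f"1 TWh powers about {homes_powered_one_day / 1e6:.1f} million homes for a day")
print(f"500 TWh is about {home_days / 1e9:.1f} billion home-days of electricity")
```

That works out to roughly 33 million home-days per terawatt-hour, matching the figure above.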
AI's energy demand is outpacing the power grid
The demand for AI is growing faster than the energy grid can adapt. In the U.S., data center electricity use is expected to surpass 600 TWh by 2030, tripling current levels. Meeting that demand requires the equivalent of adding 14 large power plants to the grid. Large AI data centers can each require 100–500 megawatts (MW), and the largest facilities may soon exceed 1 gigawatt (GW), which is about as much as a nuclear power plant or a small U.S. state. One 1 GW data center could consume more power than the entire city of San Francisco. Multiply that by a few dozen campuses across the country, and you start to see how quickly this demand adds up.
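Scaling those projections gives a sense of the load involved. This is a sketch based on the figures quoted above; the average-draw calculation assumes round-the-clock consumption:

```python
# How 600 TWh/year of U.S. data center demand maps onto grid capacity.
US_DATACENTER_TWH_2030 = 600     # projected U.S. data center use by 2030
HOURS_PER_YEAR = 8760

# Average continuous draw implied by 600 TWh/year, in gigawatts
avg_draw_gw = US_DATACENTER_TWH_2030 * 1000 / HOURS_PER_YEAR  # TWh -> GWh

# A single 1 GW campus running flat out for a year, in TWh
one_gw_campus_twh = 1 * HOURS_PER_YEAR / 1000

print(f"Implied average draw: {avg_draw_gw:.1f} GW")
print(f"One 1 GW campus uses {one_gw_campus_twh:.2f} TWh per year")
```

In other words, 600 TWh a year is the equivalent of nearly 70 gigawatt-scale facilities running continuously.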
To keep up, utilities across the country are delaying coal plant retirements, expanding natural gas infrastructure and shelving clean energy projects. In states like Utah, Georgia and Wisconsin, energy regulators have approved new fossil fuel investments directly linked to data center growth. By 2035, data centers could account for 8.6% of all U.S. electricity demand, up from 3.5% today.
Despite public pledges to support sustainability, tech companies are inadvertently driving a fossil fuel resurgence. For the average person, this shift could increase electricity costs, strain regional energy supplies and complicate state-level clean energy goals.
Can big tech keep its green energy promises?
Tech giants Microsoft, Google, Amazon and Meta all claim they are working toward a net-zero emissions future. In simple terms, this means balancing the amount of greenhouse gases they emit with the amount they remove or offset, ideally bringing their net contribution to climate change down to zero.
These companies purchase large amounts of renewable energy to offset their usage and invest in next-generation energy solutions. For example, Microsoft has a contract with fusion start-up Helion to supply clean electricity by 2028.
However, critics argue these clean energy purchases do not reflect the reality on the ground. Because the grid is shared, even if a tech company buys solar or wind power on paper, fossil fuels often fill the gap for everyone else.
Some researchers say this model is more beneficial for company accounting than for climate progress. While the numbers might look clean on a corporate emissions report, the actual energy powering the grid still includes coal and gas, even for companies like Microsoft, Google and Amazon that have pledged to run their data centers on 100% renewable energy.
Some critics argue that voluntary pledges alone are not enough. Unlike traditional industries, there is no standardized regulatory framework requiring tech companies to disclose detailed energy usage from AI operations. This lack of transparency makes it harder to track whether green pledges are translating into meaningful action, especially as workloads shift to third-party contractors or overseas operations.
The future of clean energy for AI and its limits
To meet soaring energy needs without worsening emissions, tech companies are investing in advanced energy projects. These include small nuclear reactors built directly next to data centers, deep geothermal systems and nuclear fusion.
While promising, these technologies face enormous technical and regulatory hurdles. Fusion, for example, has never reached commercial break-even, meaning it has yet to produce more energy than it consumes. Even the most optimistic experts say we may not see scalable fusion before the 2030s.
Beyond the technical barriers, many people have concerns about the safety, cost and long-term waste management of new nuclear systems. While proponents argue these designs are safer and more efficient, public skepticism remains a real hurdle. Community resistance is also a factor. In some regions, proposals for nuclear microreactors or geothermal drilling have faced delays due to concerns over safety, noise and environmental harm. Building new data centers and associated power infrastructure can take up to seven years, due to permitting, land acquisition and construction challenges.
Google recently activated a geothermal project in Nevada, but it only generates enough power for a few thousand homes. The next phase may be able to power a single data center by 2028. Meanwhile, companies like Amazon and Microsoft continue building sites that consume more power than entire cities.
Will AI help or harm the environment?
This is the central debate. Advocates argue that AI could ultimately help accelerate climate progress by optimizing energy grids, modeling emissions patterns and inventing better clean technology. Microsoft and Google have both cited these uses in their public statements. But critics warn that the current trajectory is unsustainable. Without major breakthroughs or stricter policy frameworks, the energy cost of AI may overwhelm climate gains. A recent forecast estimated that AI could add 1.7 gigatons of carbon dioxide to global emissions between 2025 and 2030, roughly 4% more than the entire annual emissions of the U.S.
Water use, rare mineral demand and land-use conflicts are also emerging concerns as AI infrastructure expands. Large data centers often require millions of gallons of water for cooling each year, which can strain local water supplies. The demand for critical minerals like lithium, cobalt and rare earth elements — used in servers, cooling systems and power electronics — creates additional pressure on supply chains and mining operations. In some areas, communities are pushing back against land being rezoned for large-scale tech development.
Rapid hardware turnover is also adding to the environmental toll. As AI systems evolve quickly, older GPUs and accelerators are replaced more frequently, creating significant electronic waste. Without strong recycling programs in place, much of this equipment ends up in landfills or is exported to developing countries.
The question isn't just whether AI can become cleaner over time. It's whether we can scale the infrastructure needed to support it without falling back on fossil fuels. Meeting that challenge will require tighter collaboration between tech companies, utilities and policymakers. Some experts warn that AI could either help fight climate change or make it worse, and the outcome depends entirely on how we choose to power the future of computing.
Kurt's key takeaways
AI is revolutionizing how we work, but it is also transforming how we use energy. Data centers powering AI systems are becoming some of the world's largest electricity consumers. Tech companies are betting big on futuristic solutions, but the reality is that many fossil fuel plants are staying online longer just to meet AI's rising energy demand. Whether AI ends up helping or hurting the climate may depend on how quickly clean energy breakthroughs catch up and how honestly we measure progress.
Is artificial intelligence worth the real-world cost of fossil resurgence? Let us know your thoughts by writing to us at Cyberguy.com/Contact.
For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.
Copyright 2025 CyberGuy.com. All rights reserved.
https://www.foxnews.com/tech/what-ais-insatiable-appetite-power-means-our-future