
How much energy does your AI prompt use? I went to a data center to find out.

Mint · 27-06-2025


There they were. Golden boxes, louder than a toddler on a red-eye, hotter than a campfire in a heat wave, pricier than a private Caribbean island. Yes, real, working Nvidia GPUs. I was under strict "look, don't touch" orders—as if I'd, what, lick the mesh metal enclosure. Just standing there, I could hear and feel the electricity being devoured.

We've all heard about AI's insatiable energy appetite. By 2028, data centers like this one I visited in Ashburn, Va., could consume up to 12% of all U.S. electricity, according to a report from the Energy Department and Lawrence Berkeley National Lab. And yes, we're the problem, it's us. (Insert Taylor Swift-related groan here.) Every time we ask AI to write an email, draw an anime-style George Washington or generate a video of a cat doing a back flip, we're triggering another roar in those massive halls of GPUs.

What I wanted to know was: How much power do my AI tasks actually use? The equivalent of charging a phone? A laptop? Cooking a steak on an electric grill? Powering my house? After digging into the research, visiting a data center, bugging just about every major AI company and, yes, firing up that grill, I got some answers. But not enough. Tech companies need to tell us more about the energy they're using on our behalf.

Let's start with a recent, popular example: "a video of a cat diving off an Olympic diving board." The moment you hit enter, that prompt gets routed to a massive data center. When it arrives, it kicks off inference, where pretrained AI models interpret and respond to your request. In most cases, rows of powerful Nvidia graphics processing units get to work turning your weird idea into a weirder reality. Rival chips from companies like Amazon, Google or Groq are also starting to be used for inference. The model training itself happens earlier, also on Nvidia chips.
The facility where I saw that "SuperPod" of Nvidia H100 GPUs was run by Equinix, one of the world's largest operators of data centers that provide cloud infrastructure—and now, AI. Chris Kimm, Equinix senior vice president of customer success, said that while AI training can happen just about anywhere, inference is best done geographically closer to users to deliver the best speed and efficiency.

Figuring out how much energy your individual AI prompts use would be a lot easier if the major AI companies actually shared the darn info. Google, Microsoft and Meta declined. Google and Meta pointed me to their sustainability reports. OpenAI shared something: Chief Executive Sam Altman said that the average ChatGPT query uses about 0.34 watt-hours of energy. OpenAI wouldn't break out details on text, image or video energy usage.

Researchers have stepped in to fill the gap. Sasha Luccioni, the AI and climate lead at open-source AI platform Hugging Face, has run tests to estimate the energy required to generate different types of content. Along with other researchers, she also maintains an AI Energy Score leaderboard. Since the top AI players use their own proprietary models, she relies on open-source alternatives. The energy required to generate content varies widely depending on the model and GPU setup. Compare Luccioni's findings with charging a typical smartphone, which uses around 10 watt-hours of energy:

  • Text: A lightweight, single-GPU Llama model from Meta used about 0.17 watt-hours, while a larger Llama model running across multiple GPUs used 1.7 watt-hours.
  • Images: Generating a single 1024 x 1024 image with one GPU also used 1.7 watt-hours.
  • Video: This is the most intensive. Even making 6-second, standard-definition videos used anywhere between 20 and 110 watt-hours.

I wanted to better understand the stakes—literally. So I grabbed an electric grill from Home Depot, a power meter and my video producer, David Hall.
About 10 minutes and 220 watt-hours later, we had a thin, medium-well steak. Translation: The energy it took to cook a decent dinner was about the same as generating two AI videos, at the high end. (Watch the video above for more steak breakdowns.)

Remember the short AI film I made using Google Veo and Runway a few weeks ago? We generated about a thousand 8-second, 720p clips for our film. Going by these estimates, we might have used roughly 110,000 watt-hours. That's nearly 500 steaks!

But, as I said, Luccioni doesn't have the power-consumption data for the commercial AI tools, and her numbers aren't a perfect match: On the one hand, our video was higher quality than the 6-second, 480p clips in Luccioni's research. On the other hand, the popular video models are likely optimized for greater efficiency, experts say. "Until we get access to these models," Luccioni said, "all we can do is estimate."

Her tests also use Nvidia's last-generation Hopper chips. Nvidia has seen a jump in energy efficiency with its latest Blackwell Ultra chips, according to Josh Parker, the company's head of sustainability. "We're using 1/30th of the energy for the same inference workloads that we were just a year ago," Parker said. That said, plenty are still using those older chips. The pod I saw at Equinix's facility cost over $9 million in Nvidia hardware alone. You don't just toss that in the dumpster when new ones come out.

And I've only covered electricity. These hot GPUs also require a lot of water to stay cool, but that's a whole other story.

Data-center providers and tech companies I spoke to all said the same thing: Demand for these GPU-filled buildings keeps multiplying. Just driving through Ashburn, I saw five massive data centers going up. The companies also stressed the improving efficiency of models and chips, and their efforts to shift to cleaner, more renewable energy sources. No matter how efficient things get, more of us are using AI.
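If you want to run the back-of-envelope math yourself, here is a minimal sketch using only the estimates quoted in the article (Altman's per-query figure, Luccioni's high-end video estimate, the 10 watt-hour phone charge and the 220 watt-hour steak). The function name and structure are my own, for illustration; these are rough estimates, not measured figures for any commercial model.

```python
# Back-of-envelope energy comparisons built from figures quoted in the article.
PHONE_CHARGE_WH = 10      # charging a typical smartphone
STEAK_WH = 220            # ~10 minutes on an electric grill
CHATGPT_QUERY_WH = 0.34   # average ChatGPT query, per Sam Altman
VIDEO_CLIP_WH_HIGH = 110  # high-end estimate for a short SD video clip

def equivalents(total_wh: float) -> dict:
    """Express an energy total in everyday units."""
    return {
        "phone_charges": total_wh / PHONE_CHARGE_WH,
        "steaks": total_wh / STEAK_WH,
        "chatgpt_queries": total_wh / CHATGPT_QUERY_WH,
    }

# The author's AI short film: ~1,000 clips at the high-end estimate.
film_wh = 1_000 * VIDEO_CLIP_WH_HIGH  # 110,000 Wh
print(equivalents(film_wh))
# steaks comes out to 500.0, matching the article's "nearly 500 steaks"
```

The same function makes the single-prompt numbers concrete: one high-end video clip is about 11 phone charges, while a ChatGPT query is a rounding error by comparison.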
We could all buy more efficient air conditioners, but if the planet keeps getting hotter, we're going to crank the AC more—and burn more energy. Luccioni hopes we at least consider energy use when we use these tools, maybe think twice about generating a dozen cat videos. And it's on the companies to start sharing real numbers, so that we can make informed choices. Back to Virginia, and those screaming GPUs. Turns out, they weren't generating Olympic kitty videos. They were owned by Bristol Myers Squibb—and they were searching for new cures to diseases. Not all AI prompts are what you'd call a waste of energy.

The AI talent wars are white hot

Yahoo · 11-06-2025


Tech startups and behemoths are duking it out for the best AI talent. Databricks' VP of AI described the AI talent search as "looking for LeBron James." As CEOs try to poach top AI researchers and engineers, companies' AI infrastructure can also be a factor.

The AI talent wars are reaching a fever pitch. Upstart AI research labs and tech industry giants alike have heightened their efforts to recruit top AI talent over the past few years. Naveen Rao, Databricks' vice president of AI, has equated the scramble for top-tier AI talent to "looking for LeBron James," estimating there are fewer than 1,000 researchers capable of building frontier AI models. Startups that lack the financial resources to offer pay packages as attractive as their Big Tech peers' are turning to hackathons to find budding talent in the AI sector.

The race to attract the best AI talent has led CEOs to personally get involved in recruiting efforts, some of which aren't successful. And it's not always about the amount of money being offered. Companies' troves of top-of-the-line computing chips — data centers lined with Nvidia H100 GPUs, for example — can play a part.

Perplexity CEO Aravind Srinivas last year described a situation in which he was trying to poach an AI researcher from Meta. He was rebuffed, he said, with the researcher telling him, "Come back to me when you have 10,000 H100 GPUs." Getting the highly coveted graphics processing units from Nvidia would "cost billions and take five to 10 years to get," Srinivas said at the time. "You have to offer such amazing incentives and immediate availability of compute. And we're not talking of small compute clusters here," he added.

Speaking of Meta, Mark Zuckerberg has become personally involved in the hiring fray. An AI tech worker previously told BI they were surprised to see Zuckerberg appear in an email chain about a position for which they were being recruited. Now, it sounds like Zuckerberg is only leaning into recruiting efforts.
Bloomberg reported this week that Zuckerberg has been hosting top AI candidates at his home for meals in an effort to recruit them, and The New York Times said Meta has offered seven- to nine-figure compensation packages. Meta did not immediately respond to a request for comment from BI.

It's not just Zuckerberg, either. Another tech worker with an AI background previously told BI that OpenAI cofounder and CEO Sam Altman called them to personally make the case for them to join the company. "There is definitely some competing CEO emailing going on," Dan Portillo, founder of Sweat Equity Ventures and cofounder of The General Partnership, previously told BI.

"Leaders of companies are operating on a feeling that there's a window in time that's open right now, and one of the attributes of this moment is the aggressiveness of CEOs and cofounders saying, 'We will use every advantage we have to win employees and to win business,'" Tribe AI CEO Jaclyn Rice Nelson said at the time.

Companies are also looking to college campuses to try to convince undergrad students and Ph.D. candidates with AI knowledge to join their teams, dangling the promise of huge salaries and generous research funding.

The AI race has led to swirling questions around the technology's long-term impact on the job market. For the AI researchers and engineers capable of building the very same models ushering in that change, the only question is deciding which job offer is best.

Read the original article on Business Insider.

