
Sam Altman's Lies About ChatGPT Are Growing Bolder
In a Tuesday blog post, Altman cited internal figures for how much energy and water a single ChatGPT query uses. The OpenAI CEO claimed a single prompt requires around 0.34 Wh, equivalent to what 'a high-efficiency lightbulb would use in a couple of minutes.' As for the water needed to cool the data centers that process AI queries, Altman suggested a student asking ChatGPT to write their essay for them uses just '0.000085 gallons of water, roughly one-fifteenth of a teaspoon.'
Altman offered no evidence for these claims and failed to mention where his data comes from. Gizmodo reached out to OpenAI for comment but did not hear back. If we take the AI monger at his word, we need only some simple math to check how much water that actually is. OpenAI has claimed that, as of December 2024, ChatGPT had 300 million weekly active users generating 1 billion messages per day. Based on the company's and Altman's own metrics, the chatbot would use 85,000 gallons of water per day, or a little more than 31 million gallons per year. ChatGPT is hosted in Microsoft data centers, which already use quite a lot of water. The tech giant has plans for 'closed-loop' centers that don't consume extra water for cooling, but those projects won't be piloted for at least another year.
Fresh numbers shared by @sama earlier today:
300M weekly active ChatGPT users
1B user messages sent on ChatGPT every day
1.3M devs have built on OpenAI in the US
— OpenAI Newsroom (@OpenAINewsroom) December 4, 2024
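Taking Altman's per-query figure and OpenAI's stated message volume at face value, the back-of-envelope math is easy to reproduce (a quick sketch using the numbers cited above, not independently verified data):

```python
# Altman's claimed per-prompt water use and OpenAI's stated daily
# message volume, taken at face value from the figures above.
gallons_per_query = 0.000085
messages_per_day = 1_000_000_000

daily_gallons = gallons_per_query * messages_per_day
yearly_gallons = daily_gallons * 365

print(f"{daily_gallons:,.0f} gallons per day")    # 85,000
print(f"{yearly_gallons:,.0f} gallons per year")  # ~31 million
```

That is where the 85,000-gallons-a-day and 31-million-gallons-a-year figures come from; the result is only as good as the two inputs Altman and OpenAI supplied.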
These data centers were already water- and power-hungry before the advent of generative AI. For Microsoft, water use spiked from 2021 to 2022 after the tech giant struck its deal with OpenAI. A study from University of California researchers published in late 2023 estimated that the older GPT-3 version of ChatGPT drank about 0.5 liters of water for every 10 to 50 queries. Even at that range's most optimistic end, 50 queries per half-liter, OpenAI's older model would be going through 10 million liters of water per day, or about 2.6 million gallons; at the pessimistic end, it's 50 million liters. And that's for an older model, not today's far more powerful (and far more demanding) GPT-4.1 and its o3 reasoning model.
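Scaling the study's range to OpenAI's claimed 1 billion daily messages gives a wide band (a sketch assuming the article's figures, not measured data):

```python
# Range implied by the UC study's figure of ~0.5 L of water per
# 10-50 GPT-3 queries, scaled to 1 billion messages per day.
LITERS_PER_US_GALLON = 3.785
messages_per_day = 1_000_000_000
liters_per_batch = 0.5

optimistic = messages_per_day / 50 * liters_per_batch   # 50 queries per 0.5 L
pessimistic = messages_per_day / 10 * liters_per_batch  # 10 queries per 0.5 L

print(f"{optimistic:,.0f} to {pessimistic:,.0f} liters per day")
print(f"({optimistic / LITERS_PER_US_GALLON:,.0f} to "
      f"{pessimistic / LITERS_PER_US_GALLON:,.0f} gallons per day)")
```

Even the low end of that band, roughly 2.6 million gallons a day, dwarfs the 85,000 gallons implied by Altman's new figure.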
The size of a model affects how much energy it uses. Multiple studies have examined the environmental cost of training these models, and since they have to be retrained continually as they grow more advanced, the electricity cost will keep escalating. Altman's figures also don't say how queries break down across OpenAI's different ChatGPT products, including the $200-a-month Pro subscription that unlocks its most advanced models. They likewise ignore the fact that generating AI images takes far more energy than processing text queries.
Altman's entire post is full of Big Tech optimism shrouded in talking points that make little to no sense. He claims that data center production will be 'automated,' so the cost of AI 'should eventually converge to near the cost of electricity.' Even if we charitably assume Altman means the expansion of AI will somehow offset the electricity needed to run it, we're still left holding today's bag and dealing with rising global temperatures. Multiple companies have tried to solve AI's water and electricity problem, with some landing on plans to sink data centers into the ocean or build nuclear power plants just to feed AI the electricity it needs. Long before any nuclear plant can be built, these companies will keep burning fossil fuels.
The OpenAI CEO's entire blog post encapsulates bullheaded Big Tech oligarch thinking. He said that 'entire classes of jobs' will go the way of the dodo, but that it doesn't matter since 'the world will be getting so much richer so quickly that we'll be able to seriously entertain new policy ideas we never could before.' Altman and other tech oligarchs have suggested universal basic income as a way of offsetting AI's impact. He likely knows it won't work. He's never been serious enough about the idea to stump for it as hard as he has for cozying up to President Donald Trump to ensure there's no future regulation of the AI industry.
'We do need to solve the safety issues,' Altman said, but in his telling that shouldn't stop us from expanding AI into every aspect of our lives. He suggests we ignore the warming planet because AI will solve that niggling issue in due course. But if temperatures rise, demanding even more water and electricity to cool these data centers, I doubt AI can work fast enough to fix anything before it's too late. Ignore all that, though; just pay attention to that still-unrevealed Jony Ive doohickey that may or may not gaslight you as the world burns.


