
AI chatbots that use reasoning emit more carbon than those that respond concisely, study finds
A study has found that carbon emissions from chat-based generative AI can be six times higher when responding to complex prompts, such as abstract algebra or philosophy, than to simpler prompts, such as high school history.

"The environmental impact of questioning trained (large-language models) is strongly determined by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions," said first author Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences, Germany. "We found that reasoning-enabled models produced up to 50 times more (carbon dioxide) emissions than concise response models," Dauner added.

The study, published in the journal Frontiers in Communication, evaluated how 14 large-language models (which power chatbots), including DeepSeek and Cogito, process information before responding to 1,000 benchmark questions -- 500 multiple-choice and 500 subjective. Each model answered 100 questions on each of the five subjects chosen for the analysis -- philosophy, high school world history, international law, abstract algebra, and high school mathematics.

"Zero-token reasoning traces appear when no intermediate text is needed (e.g. Cogito 70B reasoning on certain history items), whereas the maximum reasoning burden (6,716 tokens) is observed for the Deepseek R1 7B model on an abstract algebra prompt," the authors wrote.

Tokens are the units of text a conversational AI generates while processing a user's prompt in natural language; the more tokens a model produces, the more energy it consumes and the higher its carbon dioxide emissions.

Chatbots equipped with an ability to reason, or 'reasoning models', produced 543.5 'thinking' tokens per question, whereas concise models -- producing one-word answers -- required just 37.7 tokens per question, the researchers found. Thinking tokens are additional tokens that reasoning models generate before producing an answer, they explained.

However, more thinking tokens do not guarantee correct responses: as the team noted, elaborate detail is not always essential for correctness.

"None of the models that kept emissions below 500 grams of CO₂ equivalent achieved higher than 80 per cent accuracy on answering the 1,000 questions correctly," Dauner said. "Currently, we see a clear accuracy-sustainability trade-off inherent in (large-language model) technologies," he added.

The most accurate performance came from the reasoning model Cogito, which answered nearly 85 per cent of questions correctly while producing three times more carbon dioxide emissions than similar-sized models generating concise answers.

"In conclusion, while larger and reasoning-enhanced models significantly outperform smaller counterparts in terms of accuracy, this improvement comes with steep increases in emissions and computational demand," the authors wrote. "Optimising reasoning efficiency and response brevity, particularly for challenging subjects like abstract algebra, is crucial for advancing more sustainable and environmentally conscious artificial intelligence technologies," they wrote.
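To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. It is not code from the study: only the two average token counts (543.5 and 37.7 tokens per question) come from the paper, while the per-token energy cost and grid carbon intensity below are placeholder assumptions chosen purely to illustrate how emissions scale with the number of generated tokens.

```python
# Back-of-the-envelope sketch (not from the study): relates token counts to CO2.
# ENERGY_PER_TOKEN_KWH and GRID_CO2_G_PER_KWH are placeholder assumptions,
# used only to show that emissions grow in proportion to generated tokens.

TOKENS_PER_QUESTION = {
    "reasoning_model": 543.5,   # average 'thinking' tokens per question (reported in the study)
    "concise_model": 37.7,      # average tokens per question for concise answers (reported)
}

ENERGY_PER_TOKEN_KWH = 2e-6     # hypothetical energy cost of generating one token, in kWh
GRID_CO2_G_PER_KWH = 400        # hypothetical grid carbon intensity, in g CO2-eq per kWh


def co2_grams(tokens: float) -> float:
    """Estimate grams of CO2-equivalent emitted while generating `tokens` tokens."""
    return tokens * ENERGY_PER_TOKEN_KWH * GRID_CO2_G_PER_KWH


if __name__ == "__main__":
    for model, tokens in TOKENS_PER_QUESTION.items():
        print(f"{model}: {tokens:.1f} tokens -> {co2_grams(tokens):.4f} g CO2-eq per question")

    ratio = TOKENS_PER_QUESTION["reasoning_model"] / TOKENS_PER_QUESTION["concise_model"]
    print(f"Token ratio (reasoning vs concise): {ratio:.1f}x")
```

Note that the roughly 14-fold gap in token counts alone does not explain the up-to-50-fold difference in emissions the authors report; larger reasoning-enabled models also tend to draw more energy per token, which widens the gap further.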

Related Articles


India Today, 4 hours ago
How can India build AI like ChatGPT? By doing what Mark Zuckerberg is doing
There are times when we get epoch-defining technologies. Fire, the wheel, the manufacturing of paper, the steam engine, antibiotics, the printing press, the telegram, electricity, airplanes, silicon chips, the WWW. It has happened again and again, and while the bar is high and, most of the time, the next big thing is merely hype, breakthroughs do happen. Generative AI feels like one of these epoch-defining technologies. It is not perfect, but it is the beginning of something. AI will embed itself in our lives to become a layer on top of which the world will move.

It is the potency of this idea that has started an arms race among tech companies, and not just companies but among countries. When we say countries, we are mostly talking about two: the US and China. I would love to see another name there — India.

In 2017, I headed to Google I/O, which resulted in this piece titled 'I/O 2017 shows Google is no longer a search company, it's an AI company'. I spent my 15-hour flight reading Homo Deus by Yuval Noah Harari. Unlike Sapiens, which looked at humanity's past, this one tried to imagine human affairs in the coming years. Harari made a number of observations in that book. One has stayed with me ever since that flight. 'In the early twenty-first century, the train of progress is again pulling out of the station,' Harari wrote. 'This will probably be the last train ever to leave the station called Homo Sapiens. Those who miss this train will never get a second chance.'

We are already beginning to see that some of this is happening. Over the last few decades, technologies, including military tech, have coalesced around a few places. One is obviously Silicon Valley. Then there are a few Chinese and other Asian cities. But it is generative AI, such as ChatGPT and DeepSeek, that is truly going to accelerate the trend. The potential inherent in modern AI, when combined with enough compute and robotics, is such that it will fundamentally alter the world. And this is without taking into account where it ends up going. Even if all AI development freezes right now and there is no new technological breakthrough, we already have enough in terms of core tech to remake the world.

But it's not going to freeze at the moment. The world - or at least some US and Chinese companies - is racing towards creating AI systems that would be as good as humans, or better, at most tasks. The race towards AGI - Artificial General Intelligence - is real, and so is the risk that whoever gets to AGI first will zoom ahead of everyone else in perpetuity. This is the reason why Harari also warned in Homo Deus that 'in the twenty-first century, those who ride the train of progress will acquire divine abilities of creation and destruction, while those left behind will face extinction.'

What has this got to do with Mark Zuckerberg? Unlike OpenAI or Google, or even DeepSeek, his company Meta has not exactly been an AI pioneer so far. Precisely. That is why we need to talk about Mark Zuckerberg. Over the last few months — I am assuming around the time Meta launched its lacklustre Llama 4 in April — Zuckerberg decided that this was it. He woke up and, as they say on the 'Gram, chose violence. Now Zuckerberg is personally assembling a team of crack AI researchers. It is as if he believes that nothing else matters in the future except AI, that without a good AI system in place, his companies like Meta and WhatsApp will not only miss the train but will be left behind.

Not only has Zuckerberg decided to build a crack team of AI researchers, he has decided to build it irrespective of the cost.
No cost is too high. Pissing off people is okay, including OpenAI CEO Sam Altman, who is seemingly pissed off at how aggressively Zuckerberg is trying to poach his people. In the last few weeks, tens of top OpenAI engineers have left the company for Zuckerberg's team. This includes Trapit Bansal, an IITian who was supposedly a key figure at OpenAI.

There are reports that not only is Zuck handpicking his hires, he is also throwing an unimaginable amount of money at each of them. The reported salaries run into tens of millions of dollars - Rs 80 crore to Rs 400 crore. Some chosen ones are likely getting over a hundred million. This comes just days after Meta acqui-hired - a process where a company buys another one just to get the people working in it - by putting in $14 billion for a 49 per cent stake in an AI company called Scale AI.

It is possible that Zuckerberg's efforts may come to naught. Or he may succeed. We don't know. Even Zuckerberg wouldn't know. But he wants to take a swing. And what a swing he is taking! The way he is going about building an AI system after falling behind has some lessons for India.

The Indian government should be taking a lead in developing AGI. But so contested is the scene right now, mainly for AI researchers, that merely talking about it is not going to cut it. It needs a plan and a willingness to push for it irrespective of the cost. Most significantly, it needs infrastructure and people. India has neither.

Here is a sobering fact: Zuckerberg just spent $14 billion to get a handful of AI researchers, whereas the Indian government is hoping to spend a little over $1 billion on AI over five years, going by the 2024 Budget. This year, in the Budget, AI merely got a passing reference and an allocation of around Rs 500 crore, a figure that is likely less than what Mark Zuckerberg has offered top AI researchers.

When I look at what companies and governments in the US and China are doing, I find India's AI rhetoric empty. Beyond platitudes and empty words, India has not made any serious attempt to get on the AI train. Now, it risks missing it. We have a few startups. Krutrim and Sarvam AI come to mind. But these are not a patch on what the likes of Zuckerberg are cooking. At the same time, India's IT giants are happy doing what they always do — bureaucratic SaaS and coolie-like IT service work, without ever thinking about deep tech and fundamental research.

In 2023, while Sam Altman was visiting India, he ruffled feathers by saying that it was impossible for India, or Indian companies, to build something like ChatGPT. He knew what would be needed to build a top-class AI system. For AGI, India would need infrastructure and an ecosystem that it currently doesn't have. This ecosystem can only be enabled and created by the government. It's the same with talent: the Indian government needs to reach out to AI engineers and researchers and somehow convince them to build AGI in India. It needs to do what Mark Zuckerberg is doing, which is writing emails and bringing people on board. In other words, India needs its AI Manhattan Project to get AGI or an AI system comparable to what OpenAI, Google or China's DeepSeek have. Nothing less will do.

(Javed Anwer is Technology Editor, India Today Group Digital. Latent Space is a weekly column on tech, world, and everything in between. The name comes from the science of AI and, to reflect it, Latent Space functions in the same way: by simplifying the world of tech and giving it a context)

- Ends

(Views expressed in this opinion piece are those of the author)


Hans India, 5 hours ago
Baidu to Open Source ERNIE AI Model, Shaking Up Global AI Arena
In a bold and unexpected move, Chinese tech giant Baidu has announced the open-sourcing of its flagship ERNIE generative AI model, marking a pivotal moment in the rapidly evolving global AI competition. The company confirmed that the rollout would begin gradually starting Monday. While not as abrupt or headline-grabbing as the recent debut of DeepSeek, Baidu's decision is already making waves in the AI community and prompting responses from key industry stakeholders worldwide.

The development comes as a surprise, given Baidu's long-held stance favouring proprietary development. The company has traditionally maintained strict control over its AI tools and infrastructure, resisting the open-source wave that has swept through parts of the tech world. 'Baidu has always been very supportive of its proprietary business model and was vocal against open-source, but disruptors like DeepSeek have proven that open-source models can be as competitive and reliable as proprietary ones,' said Lian Jye Su, Chief Analyst at technology research firm Omdia, speaking to CNBC earlier.

Although the move might not have the dramatic impact DeepSeek generated, experts are calling it an important step in AI's broader evolution. 'This isn't just a China story. Every time a major lab open-sources a powerful model, it raises the bar for the entire industry,' said Sean Ren, Associate Professor of Computer Science at the University of Southern California and Samsung's AI Researcher of the Year. Ren pointed out that open-source models challenge industry norms, especially for closed-source providers like OpenAI and Anthropic. 'While most consumers don't care whether a model's code is open-sourced, they do care about lower costs, better performance, and support for their language or region. Those benefits often come from open models, which give developers and researchers more freedom to iterate, customize, and deploy faster,' he explained.

From a pricing standpoint, industry analysts see Baidu's move as a potential game-changer. Alec Strasmore, founder of AI advisory Epic Loot, compared the shift to a price war. 'Baidu just threw a Molotov into the AI world,' he declared. 'OpenAI, Anthropic, DeepSeek — all these guys who thought they were selling top-notch champagne are about to realise that Baidu will be giving away something just as powerful.' He continued, 'This isn't a competition; it's a declaration of war on pricing.' According to Strasmore, startups and smaller developers may soon rethink paying premium prices for AI access.

This new strategy isn't entirely unanticipated. Earlier in March, Baidu claimed that its latest model, ERNIE X1, could match DeepSeek's R1 in performance while costing half as much. CEO Robin Li also hinted at the company's global ambitions during an April developer event. 'Our releases aim to empower developers to build the best applications — without having to worry about model capability, costs, or development tools,' Li said at the time.

However, not all experts believe Baidu's open-source shift will immediately shake the Western market. Cliff Jurkiewicz, VP of Global Strategy at applied AI firm Phenom, suggested the news might not even register in the U.S. tech scene. 'The news of Baidu going open source probably lands with a big thud,' he commented. 'Most people in the United States don't even know it's a Chinese tech company.' Drawing parallels with the early Android ecosystem, Jurkiewicz explained that while open systems provide flexibility, they can also be challenging to manage.
'When Android first emerged, its standout feature was that it was configurable and customisable. But it was almost too much work… Android, out of the box, is plain and vanilla, so it has to be customised, and that's a real challenge,' he noted. As Baidu begins its rollout, all eyes are now on how this strategic pivot will reshape the global AI landscape — from affordability and accessibility to the core philosophies of AI development.


India Today, 6 hours ago
After DeepSeek, China's Baidu to open source its Ernie AI chatbot
Baidu is set to open source its Ernie generative AI model, a major development in the ongoing global AI competition. The company has confirmed that the open-sourcing of its large language model will begin with a gradual rollout starting Monday. While it may not be as disruptive as the emergence of DeepSeek, Baidu's move is already sparking debate within the AI community and is being closely watched by industry leaders across the world.

The decision comes as a surprise to many, especially given the company's long-standing preference for a proprietary approach to AI development. Baidu had previously opposed the open-source model, favouring internal control over its tools and infrastructure. 'Baidu has always been very supportive of its proprietary business model and was vocal against open-source, but disruptors like DeepSeek have proven that open-source models can be as competitive and reliable as proprietary ones,' Lian Jye Su, chief analyst with technology research and advisory group Omdia, previously told CNBC.

While some experts believe Baidu's move may not have the same dramatic effect as DeepSeek's launch, others argue that it is an important milestone in the broader evolution of artificial intelligence. 'This isn't just a China story. Every time a major lab open-sources a powerful model, it raises the bar for the entire industry,' said Sean Ren, associate professor of computer science at the University of Southern California and Samsung's AI Researcher of the Year.

Ren added that open-source models put pressure on companies like OpenAI and Anthropic to justify their closed platforms, premium APIs, and subscription-based pricing models. 'While most consumers don't care whether a model's code is open-sourced, they do care about lower costs, better performance, and support for their language or region. Those benefits often come from open models, which give developers and researchers more freedom to iterate, customize, and deploy faster,' he said.

Industry insiders are also pointing to the broader impact Baidu's move could have on pricing. Alec Strasmore, founder of AI advisory Epic Loot, likened the development to a direct challenge to the commercial dominance of current AI leaders. 'Baidu just threw a Molotov into the AI world,' Strasmore said. 'OpenAI, Anthropic, DeepSeek, all these guys who thought they were selling top-notch champagne are about to realise that Baidu will be giving away something just as powerful,' he added, comparing Baidu's move to budget retail giant Costco creating its own high-quality alternative. 'This isn't a competition; it's a declaration of war on pricing,' he said, adding that the open-source release of Ernie could encourage startups and developers to stop paying top dollar for AI access.

Baidu's ambitions are clear. In March, the company claimed its latest ERNIE X1 model could match the performance of DeepSeek's R1 at half the price. CEO Robin Li also hinted earlier this year that Baidu's open-source strategy aims to support developers across the globe. 'Our releases aim to empower developers to build the best applications — without having to worry about model capability, costs, or development tools,' Li said during a developer event in April.

However, not everyone is convinced the news will immediately disrupt the global AI landscape. Cliff Jurkiewicz, vice president of global strategy at applied AI firm Phenom, said Baidu's announcement may not generate much reaction in markets like the US. 'The news of Baidu going open source probably lands with a big thud,' Jurkiewicz said.
'Most people in the United States don't even know it's a Chinese tech company.'

He also drew comparisons between Baidu's move and the early days of Android. 'When Android first emerged, its standout feature was that it was configurable and customisable. But it was almost too much work in the sense that people just wanted the thing to function correctly,' he said. 'Android, out of the box, is plain and vanilla, so it has to be customised, and that's a real challenge,' Jurkiewicz added.

- Ends