The AI Valuation Paradox: Balancing Hype With Real-World Impact

Forbes | 09-04-2025
Tomas Milar is the Founder and CEO of Eqvista, an equity management platform.
For the past year, soaring artificial intelligence (AI) startup valuations have been justified by rapid revenue growth, driven by various industries recognizing AI's potential to reshape operations and enhance productivity.
A prime example is OpenAI's valuation, which grew more than tenfold in just three years, from $14 billion in 2021 to $157 billion in 2024, fueled by ChatGPT's success and its impressive projected earnings. The market's confidence in AI is evident in the lofty average revenue multiple of 23.4x commanded by AI startups.
However, we may soon witness a decline in the high funding levels AI startups currently attract, driven by the rise of low-cost, asset-light alternatives. While this in itself is a strong reason for AI startup valuations to deflate, I believe the exaggeration of current AI capabilities leaves room for further corrections.
Recently, we have seen AI startups secure valuations that were thousands of times their annual revenues. For example, xAI and Infinite Reality were valued at $40 billion and $12.25 billion, respectively.
Even considering the growth potential of AI startups, such valuation multiples are excessive. Such outliers can skew the data and create the impression that most AI companies can reach similar heights, when in reality far more AI startups close their doors before achieving that kind of market success.
AI startups have distinguished themselves from their predecessors with an unprecedented ability to generate revenue. According to Stripe (paywall), today's leading AI startups that have reached an annualized revenue of $30 million have done so five times faster than past SaaS companies.
At the same time, we must acknowledge that AI startups are much more capital-hungry than other tech startups.
OpenAI faces significant operational costs from its flagship product, ChatGPT, spending approximately $700,000 daily (paywall)—over $255 million annually. While these operational costs are offset comfortably by its $3.6 billion annualized revenue (paywall), OpenAI faces intense competition from tech giants such as Google as well as emerging players such as Anthropic.
To maintain its competitive edge, OpenAI must spend an additional $5 billion annually to train new models. This is an expense that will likely continue until OpenAI establishes itself as the undisputed market leader. To put things into perspective, OpenAI's total funds raised stand at $21.9 billion (registration required).
However, recent advancements by new entrants and the limitations of existing AI models cast doubt on both the funding needs and valuations of AI startups.
DeepSeek, the Chinese AI startup, has disrupted the U.S. AI startup ecosystem by demonstrating that premier AI models can be built without exorbitant capital expenditure. Although various experts dispute the figure, the company claims a total training cost of $5.6 million for DeepSeek-R1, a model that delivers performance comparable to OpenAI's ChatGPT.
Comparing the two companies' training budgets, DeepSeek's entire claimed spend would fund less than half a day of OpenAI's model training.
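To make that comparison concrete, here is a minimal back-of-the-envelope sketch using the figures cited above (roughly $5 billion a year in OpenAI training spend and DeepSeek's claimed $5.6 million); these are the article's estimates, not audited numbers, and a 365-day year is assumed.

```python
# Back-of-the-envelope comparison using the figures cited in this article.
openai_annual_training_spend = 5_000_000_000   # ~$5B per year, as stated above
deepseek_claimed_training_cost = 5_600_000     # $5.6M claimed for DeepSeek-R1

openai_daily_training_spend = openai_annual_training_spend / 365
days_funded_by_deepseek_budget = deepseek_claimed_training_cost / openai_daily_training_spend

print(f"OpenAI daily training spend: ~${openai_daily_training_spend:,.0f}")
print(f"DeepSeek's claimed budget funds ~{days_funded_by_deepseek_budget:.2f} days of it")
# Roughly $13.7M per day, and about 0.41 days: that is, less than half a day.
```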
We are already seeing the AI leaders being challenged. After the release of DeepSeek-R1, between January 23 and 25, ChatGPT lost 41.3 million views.
Thus, some investors are questioning if high-performing AI models really cost as much as advertised.
The reasons to believe that AI startups are overvalued are plentiful. Firstly, we haven't yet achieved true artificial general intelligence (AGI), which by definition is capable of performing any intellectual task a human can. What we have right now is a very narrow version of AI that can reliably carry out certain tasks, such as natural language processing or image recognition, but has limited application elsewhere.
Secondly, AI's commercial viability remains questionable. A Boston Consulting Group (BCG) report analyzing 1,000 companies that adopted AI found that only 4% generated substantial value, while only 22% had progressed beyond the proof-of-concept stage to generate any value at all. Notably, the companies that stood out were already well-positioned for success due to strong nonfinancial factors, such as patents filed and employee satisfaction.
Thirdly, various studies note that the capabilities of AI in tasks such as logical reasoning, chemical compound discovery and code writing have been exaggerated.
Thus, only a few AI startups that achieve significant breakthroughs, such as closing the gap between advertised and actual capabilities and enhancing commercial viability, are likely to survive and justify their valuations, while the majority perish.
While some AI startups' valuations are astronomical multiples of their annual revenues, these cases represent a small group of outliers. Once we exclude the outliers, we can observe reasonable valuation multiples across all stages.
However, a widely recognized cause for concern for AI startups is their struggle to achieve profitability due to their asset-heavy nature and the high costs associated with operations and training.
Additionally, AI's real-world impact remains limited, with narrow applications, questionable commercial viability and sometimes-exaggerated capabilities. As low-cost alternatives emerge, investors are increasingly scrutinizing whether U.S. AI startups can maintain their competitive edge.
The information provided here is not investment, tax or financial advice. You should consult with a licensed professional for advice concerning your specific situation.

Related Articles

Don't hold your breath for OpenAI's new model to run on your phone's Snapdragon chip

Android Authority | 27 minutes ago

TL;DR:
• The new OpenAI model can now run directly on some devices with Snapdragon chips.
• It's the first time an OpenAI reasoning model has been made available for on-device use.
• This could mean faster, more private AI features on your phone, just not yet.

When you use an AI model like ChatGPT, it runs in the cloud rather than on your phone or laptop, but Qualcomm seems eager to change that. The company has announced that OpenAI's first open-source reasoning model, with the less-than-catchy name 'gpt-oss-20b,' is now capable of running directly on Snapdragon-powered devices.

In a press release, Qualcomm says this is the first time OpenAI has made one of its models available for on-device use. Previously, the company's most advanced models could only run on powerful cloud infrastructure, but with help from Qualcomm's AI Engine and AI Stack, this 20-billion-parameter model has been tested locally. However, that doesn't mean your phone is ready for it.

Qualcomm: "We believe that on-device AI capability will increase rapidly, opening the door to private, low-latency, personalized agentic experiences."

Despite references to Snapdragon devices, this isn't aimed at smartphones just yet. The model is still pretty beefy and requires 24GB of RAM, with Qualcomm's integration work appearing targeted at developer-grade platforms, not the chip in your pocket. It's more about Snapdragon-powered PCs than a simple AI upgrade for your Android device.

Still, Qualcomm calls this a milestone moment, with potential benefits in areas like privacy, speed, and personalization. Because everything runs directly on the device, there's no need to send data elsewhere, and tasks like reasoning or assistant-style interactions can happen faster and offline. While OpenAI is initially targeting developers, if it is scaled, it could impact how AI tools behave on your Snapdragon phone in the future. Think faster responses and no delays if your internet connection is playing up. It could also open the door for future apps that use local AI without sacrificing privacy.

Developers can now access the model through platforms like Hugging Face and Ollama, with Qualcomm saying more deployment info will appear soon on its AI Hub.
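For developers who want to experiment, the model is distributed through platforms like Hugging Face and Ollama, as noted above. The sketch below is a hypothetical example of loading it with the Hugging Face transformers library; the repository id openai/gpt-oss-20b and the generation settings are assumptions for illustration, and you would need a machine with roughly 24GB of memory, per Qualcomm's figure.

```python
# Hypothetical sketch: running the open-weight gpt-oss-20b model locally.
# Assumes the checkpoint is published on Hugging Face as "openai/gpt-oss-20b"
# (not confirmed by the article) and that ~24GB of memory is available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" spreads the weights across available hardware (needs accelerate).
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain the benefits of on-device inference in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```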

Groq and HUMAIN Launch OpenAI's New Open Models Day Zero

Yahoo | an hour ago

Available worldwide with real-time performance, low cost, and local support in Saudi Arabia

PALO ALTO, Calif. and RIYADH, Saudi Arabia, Aug. 5, 2025 /CNW/ -- Groq, the pioneer in fast inference, and HUMAIN, a PIF company and Saudi Arabia's leading AI services provider, today announced the immediate availability of OpenAI's two open models on GroqCloud. The launch delivers gpt-oss-120B and gpt-oss-20B with full 128K context, real-time responses, and integrated server-side tools, live on Groq's optimized inference platform from day zero.

Groq has long supported OpenAI's open-source efforts, including large-scale deployment of Whisper. This launch builds on that foundation, bringing their newest models to production with global access and local support through HUMAIN.

"OpenAI is setting a new high performance standard in open source models," said Jonathan Ross, CEO of Groq. "Groq was built to run models like this, fast and affordably, so developers everywhere can use them from day zero. Working with HUMAIN strengthens local access and support in the Kingdom of Saudi Arabia, empowering developers in the region to build smarter and faster."

"Groq delivers the unmatched inference speed, scalability, and cost-efficiency we need to bring cutting-edge AI to the Kingdom," said Tareq Amin, CEO at HUMAIN. "Together, we're enabling a new wave of Saudi innovation—powered by the best open-source models and the infrastructure to scale them globally. We're proud to support OpenAI's leadership in open-source AI."

Built for full model capabilities

To make the most of OpenAI's new models, Groq delivers extended context and built-in tools like code execution and web search. Web search helps provide real-time relevant information, while code execution enables reasoning and complex workflows. Groq's platform delivers these capabilities from day zero with a full 128K-token context length.

Unmatched price-performance

Groq's purpose-built stack delivers the lowest cost per token for OpenAI's new models while maintaining speed and accuracy. gpt-oss-120B is currently running at 500+ tokens per second and gpt-oss-20B at 1,000+ tokens per second on GroqCloud. Groq is offering OpenAI's latest open models at the following pricing:

• gpt-oss-120B: $0.15 per million input tokens and $0.75 per million output tokens
• gpt-oss-20B: $0.10 per million input tokens and $0.50 per million output tokens

Note: For a limited time, tool calls used with OpenAI's open models will not be charged.

Global from day zero

Groq's global data center footprint across North America, Europe, and the Middle East ensures reliable, high-performance AI inference wherever developers operate. Through GroqCloud, OpenAI's open models are now available worldwide with minimal latency.

About Groq

Groq is the AI inference platform redefining price performance. Its custom-built LPU and cloud have been specifically designed to run powerful models instantly, reliably, and at the lowest cost per token—without compromise. Over 1.9 million developers trust Groq to build fast and scale smarter. Contact: pr-media@

About HUMAIN

HUMAIN, a PIF company, is a global artificial intelligence company delivering full-stack AI capabilities across four core areas: next-generation data centers, hyper-performance infrastructure & cloud platforms, advanced AI models (including the world's most advanced Arabic multimodal LLMs), and transformative AI solutions that combine deep sector insight with real-world execution. HUMAIN's end-to-end model serves both public and private sector organisations, unlocking exponential value across all industries, driving transformation and strengthening capabilities through human-AI synergies. With a growing portfolio of sector-specific AI products and a core mission to drive IP leadership and talent supremacy world-wide, HUMAIN is engineered for global competitiveness and national distinction.

SOURCE Groq
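To put the listed per-token prices in perspective, here is a minimal sketch estimating the cost and generation time of a single request at the gpt-oss-120B rates and throughput quoted above; the request size (2,000 input tokens, 500 output tokens) is an arbitrary example, not a Groq benchmark.

```python
# Rough cost/latency estimate for one request, using the published
# gpt-oss-120B pricing and the "500+ tokens per second" figure quoted above.
PRICE_PER_M_INPUT = 0.15     # $ per million input tokens (gpt-oss-120B)
PRICE_PER_M_OUTPUT = 0.75    # $ per million output tokens (gpt-oss-120B)
THROUGHPUT_TPS = 500         # tokens per second, per the announcement

input_tokens = 2_000         # example request size (assumed, for illustration)
output_tokens = 500

cost = (input_tokens / 1e6) * PRICE_PER_M_INPUT + (output_tokens / 1e6) * PRICE_PER_M_OUTPUT
generation_seconds = output_tokens / THROUGHPUT_TPS

print(f"Estimated cost: ${cost:.6f}")                             # ~$0.000675
print(f"Estimated generation time: ~{generation_seconds:.1f}s")   # ~1.0s
```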

Amazon may bring ads to Alexa+ in least surprising move ever

Tom's Guide | an hour ago

Up until now, Amazon has mostly avoided stuffing Alexa with ads, but according to a report from Mashable, Amazon CEO Andy Jassy broached the idea of delivering ads in Alexa+ during the company's recent earnings call. Alexa+ is the company's premium AI assistant, which is supposed to be more naturally conversational than the previous Alexa.

Jassy reportedly said that there is a "significant financial opportunity" in delivering ads via Alexa+. "I think over time, there will be opportunities, you know, as people are engaging in more multi-turn conversations to have advertising play a role — to help people find discovery and also as a lever to drive revenue," Jassy said, as found in the earnings call transcript.

The new Alexa, introduced in February, has "enhanced" conversational abilities that are meant to put it on par with other AI assistants like Google's Gemini or ChatGPT. With improved memory, it's supposed to remember your details and help with managing tasks like booking tables for date night or buying groceries. Currently, Amazon offers a $19.99-a-month subscription for non-Prime members, while Prime members, whose subscription costs $14.99 a month or $139 annually, get Alexa+ for free. Many Amazon devices are Alexa+ capable, though not all. However, Alexa has largely been an ad-free experience in the decade since it was introduced.

Perhaps it's not too surprising that Amazon would consider adding some kind of revenue-procuring element to its AI assistant. The company has struggled for years to upgrade Alexa as it was surpassed by competitors that introduced chatbot assistants to the market. Largely, integrating ads is a revenue ploy to recoup some of the billions Amazon has burned trying to turn its smart assistant around. Both Google and OpenAI have explored putting ads in Gemini and ChatGPT; OpenAI teased the idea in December 2024, though the company stepped back from the idea in the same sentence.

Amazon has not officially made moves to add commercials to Alexa+ conversations, but Jassy framed the idea as helpful: it would supposedly assist you in finding products that you might want to buy. With no set plans, it's worth keeping an eye on Alexa+ to see how, or if, Amazon actually follows through with Jassy's idea and how obtrusive the ads will become. Hopefully they'll be skippable at the very least.
