
Why value investing has worked better outside the US
I was intrigued by this observation. So, I looked at Morningstar's indexes. Sure enough, value investing has prospered beyond American shores—this year, as well as over the past three-, five-, and 10-year periods.
'Magnificent Seven' vs. 'Granolas'
US stocks of all styles have beaten international stocks for years, due to a strengthening dollar, superior returns on invested capital, and expanding price multiples.
Morningstar's US growth index is dominated by the 'Magnificent Seven.' Internationally, the scale is far smaller: there is not a single public company outside the US currently worth $1 trillion.
There was a time when the 'Granolas' stocks (names like GSK, Roche, Nestle, L'Oreal, and AstraZeneca) were being promoted as Europe's answer to the Magnificent Seven. But don't blame yourself if you haven't heard of the Granolas. They never really rivaled the Magnificent Seven from a performance perspective.
So, what has boosted value investing internationally? Largely, the financial-services sector (which was lifted by higher interest rates) and energy stocks.
Looking beyond Europe with style
While value has outperformed growth in Europe over the past five years, Europe now represents less than half of equity market capitalization outside the US.
In fact, value stocks in emerging markets (like China, India, and Brazil) and developed Asia have outperformed growth by an even larger margin than in Europe.
The decline of Chinese internet companies, which were once big growth stocks, helps explain why value investing has triumphed over growth investing in emerging markets in recent years.
On the developed-markets side, Japan has seen rising interest rates and improved economic and investment conditions disproportionately benefit financial-services stocks, which tend to reside on the value side of the market.
Value's underperformance in the US: Is it macro or micro?
While I have mentioned some macroeconomic factors above, I am generally skeptical of attempts to explain style leadership from the top down. Back in 2022, when sticky inflation prompted the largest interest-rate hikes in a generation, US growth stocks fell much further than value. A popular narrative arose: Growth stocks are more sensitive to interest rates.
But then growth bounced back in 2023 despite high rates. The Magnificent Seven and others rode a wave of enthusiasm for AI. Growth stocks' thriving amid higher rates is hardly unprecedented. Between 2015 and 2018, the US Federal Reserve hiked rates several times, yet growth beat value by a wide margin.
Ultimately, I agree with Rasmussen that the triumph of growth over value in the US has more to do with 'historically unique and rare circumstances' than with any single macroeconomic driver.
I've been following markets long enough to know that style leadership can be cyclical. Right now, it's value investing that's being fundamentally questioned.
Value stocks have been called 'structurally challenged,' in 'secular decline,' and 'value traps.' But the value side of the market has always been home to troubled companies. Value investing is about stocks that underpromise and overdeliver.
Perhaps the long-term cycle will turn again in value's favor. AI could be revolutionary but, like the internet before it, ahead of itself from an investment perspective. We could look back on 2025 as a historical inflection point, the start of a new market regime.
Related Articles


Digital Trends
The hottest new ChatGPT trend is disturbingly morbid
The rise of AI has helped us make some huge leaps. From helping with medicine research to spotting cancer, the advances enabled by AI have been pretty remarkable. But at the same time, even the most popular AI tools, such as ChatGPT, have gone haywire in the most astounding fashion. Over the past couple of years, reports have detailed how ChatGPT guided a person about murder, accused a person of killing their children, and nudged them into a conspiracy theory spiral. It seems the next hot thing is using ChatGPT to write obituaries of loved ones. Or even building a business atop the massive demand.
ChatGPT, for the dead among us
According to a report in The Washington Post, funeral homes are using ChatGPT to 'write obituaries all the time without telling their clients.' Of course, they have to do it with a lot of caution, or else ChatGPT will turn the obituaries into unrealistic accounts of how a person passed away among their loved ones or departed the mortal plane peacefully. 'We don't know that it was a peaceful death, though we'd like to imagine it was,' an anonymous employee at a funeral home was quoted as saying.
But it's not just funeral homes and some enterprising tech founders that are using AI to write obituaries, while charging for it. Regular folks are using it, too, and seem quite happy about it.
A Nevada resident, who used ChatGPT to write their mother's obituary, told the outlet that 'she'd be very happy with the end result.' The individual has even more ambitious plans for the future when they might have to write an obituary for their father. 'This time I'm gonna use Deep Research mode. It's gonna be a banger,' the individual was quoted as saying by The Post. Some folks who talked with the reporter argued that it's not easy to articulate their feelings in moments of profound grief, and that AI tools like ChatGPT made it easier to write an obituary.
All is fair with death and business
Interestingly, it seems using AI tools such as ChatGPT is not just a personal choice or a sly act by some funeral homes. It's a booming business, and there are multiple companies out there offering 'AI for obituary' services, for a price. One of those companies is CelebrateAlly, founded by a former Microsoft employee, which charges customers $5 for 100 credits. An obituary usually takes 10 credits, which means you can write a fresh eulogy honoring your departed loved one for just fifty cents each. The company even lets users pick between ChatGPT and Anthropic's Claude AI model to change the tone or contents of the obituary.
But the underlying technology is not without its faults, and if ignored, it can lead to some bizarre scenarios. Here's a segment from the report:
Instructed to write a 'playful' obituary for a spirited, funny and faith-filled fake person, the AI tool said the man had been 'born on a chilly day,' 'lived by the words of the great Groucho Marx,' 'inspired everyone' and died in a 'sunny embrace,' despite being given none of that information. In other prompts, it invented fake nicknames, preferences and life events, even declaring that the man had established a community theater and mentored a 'young comedian … who went on to tour nationally.'
ChatGPT is not the only tool making up stuff. Google's Gemini AI told a person to add glue to their pizza. Microsoft's AI is no different. Recent research says that depending too much on AI tools is leading to cognitive decline and that it hinders real research.
Some experts are also concerned about deep psychological and moral issues. AI companion apps, such as Character AI and Nomi, have given rise to a segment of users who are obsessed with their AI-generated partners, at the cost of real human connections. Some are even getting their AI partners pregnant and staying deeply engrossed in their own digital reality, while paying hundreds of dollars to the AI companies behind the software.
Yahoo
Mag 7 Plans to 'FOMO' Into $650B Tech Investment Despite Trump's U.S. Manufacturing Push
While President Donald Trump's tariff war aims to spark a manufacturing boom at home, corporate America's spending focus remains firmly on "bits" rather than "bricks and mortar." This contrast is evident in the spending patterns of the Magnificent 7 (Mag 7) stocks – a group comprising large-cap tech companies, including Alphabet (parent company of Google), Amazon, Apple, Meta Platforms (parent company of Facebook and Instagram), Microsoft, Nvidia, and Tesla. These firms are expected to cumulatively spend an astonishing $650 billion this year on capital expenditure (capex) and research and development (R&D), according to data tracked by Lloyds Bank. That amount is larger than what the U.K. government spends on public investments in a year, the bank said in a Thursday note.
If that number alone doesn't impress you, consider this: total economy-wide investment spending on IT equipment and software has continued to surge this year, accounting for 6.1% of GDP, while both private fixed investment and fixed non-residential investment, excluding IT, have shrunk for consecutive quarters.
FOMO and AI
According to Lloyds' FX Strategist Nicholas Kennedy, the decline in investment across other sectors of the economy could be due to several reasons, including the fear of missing out (FOMO) on the artificial intelligence (AI) boom. "There might be some explanations other than a crowding out by IT spending and political/trade uncertainties that you could call on; the building boom that was triggered by Biden's CHIPS act, which boosted structures, has faded, for instance. There is also a FOMO effect at work, firms encouraged to divert investment resources from what they traditionally do towards fashionable AI-related projects. So they're just spending elsewhere," Kennedy said in a note to clients.
The chart indicates that U.S. corporate spending on IT equipment and software has increased to $1.45 trillion, representing a 13.6% year-over-year rise. The tally makes up over 40% of total U.S. private fixed investment. The U.S. second-quarter GDP estimate, released by the Bureau of Economic Analysis early this week, showed that private fixed investment in IT increased by 12.4% quarter-on-quarter. Meanwhile, investment in non-IT sectors of the broader economy fell by 4.9%, extending the three-quarter declining trend.
From 'bricks' to 'bits'
This continued dominance of "bits" spending in corporate America should calm the nerves of those worried that the administration's focus on manufacturing may suck capital away from technology markets, including emerging avenues like cryptocurrencies. Bitcoin and NVDA, the bellwether for all things AI, both bottomed out in late November 2022 with the launch of ChatGPT and have since enjoyed incredible bull runs, demonstrating a powerful correlation between technology's rise and the crypto market. "Whether that [AI spending boom] generates a return is another matter, but it does reshape plans towards bits from bricks," Kennedy said.
Moreover, the crypto market has also found a significant tailwind in the form of a favourable regulatory policy under Trump. The administration has demonstrated its pro-crypto bias through the signing of several key pieces of legislation aimed at clarifying regulatory oversight for digital assets and stablecoins, including measures that have garnered bipartisan support. Additionally, the administration has made strategic appointments to financial regulatory bodies.
Yahoo
Is the cloud the wrong place for AI?
The enterprise software playbook seemed clear: everything moves to the cloud eventually. Applications, databases, storage: they all followed the same inevitable arc from on-premises to software-as-a-service. But with the arrival and boom of artificial intelligence, we're seeing a different story play out, one where the cloud is just one chapter rather than the entire book.
AI systems
AI workloads are fundamentally different beasts than the enterprise applications that defined the cloud migration wave. Traditional software scales predictably, processes data in batches, and can tolerate some latency. AI systems are non-deterministic, require massive parallel processing, and often need to respond in real time. These differences reshape the entire economic equation of where and how you run your infrastructure.
Take the challenge of long-running training jobs. Machine learning models don't train on a schedule; they train until they converge. This could take hours, days, or weeks. Cloud providers excel at providing infrastructure at short notice, but GPU capacity at hyperscalers can be hard to get without a one-year reservation. The result is either paying for guaranteed capacity you might not fully use, or risking that your training job gets interrupted when using spot instances to reduce costs.
Then there's the inference challenge. Unlike web applications that might see traffic spikes during Black Friday, AI services often need to scale continuously as customer usage grows. The token-based pricing models that govern large language models make this scaling unpredictable in ways that traditional per-request pricing never was. A single customer query might consume 10 tokens or 10,000, depending on the complexity of the response and the size of the context window (a rough cost illustration appears at the end of this piece).
Hybrid approaches
The most intriguing development involves companies discovering hybrid approaches that acknowledge these unique requirements rather than abandoning the cloud. They're using on-premises infrastructure for baseline, predictable workloads while leveraging cloud resources for genuine bursts of demand. They're co-locating servers closer to users for latency-sensitive applications like conversational AI. They're finding that owning their core infrastructure gives them the stability to experiment more freely with cloud services for specific use cases.
This evolution is being accelerated by regulatory requirements that simply don't fit the cloud-first model. Financial services, healthcare, and government customers often cannot allow data to leave their premises. For these sectors, on-premises or on-device inference represents a compliance requirement rather than a preference. Rather than being a limitation, this constraint is driving innovation in edge computing and specialized hardware that makes local AI deployment increasingly viable.
Infrastructure strategies
The cloud providers aren't standing still, of course. They're developing AI-specific services, improving GPU access, and creating new pricing models. But the fundamental mismatch between AI's resource requirements and traditional cloud economics suggests that the future won't be a simple rerun of the SaaS revolution.
Instead, we're heading toward a more nuanced landscape where different types of AI workloads find their natural homes. Experimentation and rapid prototyping will likely remain cloud-native. Production inference for established products might move closer to owned infrastructure. Training runs might split between cloud spot instances for cost efficiency and dedicated hardware for mission-critical model development.
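To make that kind of split concrete, here is a minimal sketch of how a placement policy might look in code. Everything in it is illustrative: the workload attributes, destination names, and rules are assumptions drawn from the patterns described above, not any particular company's setup.

```python
from dataclasses import dataclass

# Hypothetical destinations for AI workloads, following the hybrid pattern above:
# owned hardware for steady baseline work, cloud for bursts and experimentation,
# colocation near users for latency-sensitive inference.
ON_PREM, CLOUD_ON_DEMAND, CLOUD_SPOT, COLOCATION = (
    "on-prem", "cloud-on-demand", "cloud-spot", "colocation"
)

@dataclass
class Workload:
    kind: str                    # "experiment", "training", or "inference"
    data_must_stay_local: bool   # e.g. healthcare or government data
    latency_sensitive: bool      # e.g. conversational AI
    interruptible: bool          # can the job tolerate spot preemption?

def place(w: Workload, baseline_has_headroom: bool) -> str:
    """Pick a home for a workload under the illustrative rules above."""
    if w.data_must_stay_local:
        return ON_PREM                      # compliance requirement, not a preference
    if w.kind == "experiment":
        return CLOUD_ON_DEMAND              # prototyping stays cloud-native
    if w.kind == "training":
        return CLOUD_SPOT if w.interruptible else ON_PREM
    if w.latency_sensitive:
        return COLOCATION                   # servers close to users
    # Steady production inference: prefer owned baseline capacity, burst to cloud.
    return ON_PREM if baseline_has_headroom else CLOUD_ON_DEMAND

# Example: a latency-sensitive chat-inference workload when baseline capacity is full.
print(place(Workload("inference", False, True, False), baseline_has_headroom=False))
```

The specific rules matter less than the shape of the decision: placement becomes a per-workload choice rather than a cloud-by-default one.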
That kind of workload-by-workload split represents a step toward infrastructure strategies that match the actual needs of AI systems rather than forcing them into patterns designed for different types of computing. The most successful AI companies of the next decade will likely be those that think beyond cloud-first assumptions and build infrastructure strategies as sophisticated as their algorithms.
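Finally, to illustrate the earlier point about token-based pricing, here is a back-of-the-envelope sketch. The per-token price and token counts are assumptions for illustration only, not taken from any vendor's price list; the point is simply that two "identical" requests can differ in cost by orders of magnitude, which flat per-request billing never did.

```python
# Illustrative only: an assumed blended price, not real vendor pricing.
PRICE_PER_1K_TOKENS = 0.002  # dollars per 1,000 tokens (assumption)

def query_cost(tokens: int) -> float:
    """Cost of a single query under token-based pricing."""
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# The same "one request" can consume wildly different token counts,
# e.g. a short answer vs. a long answer over a large context window.
for tokens in (10, 1_000, 10_000):
    print(f"{tokens:>6} tokens -> ${query_cost(tokens):.5f}")

# Under flat per-request pricing every request would cost the same,
# which is why capacity planning for LLM services is harder to predict.
```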