I abandoned my Steam Deck for a year – but Nvidia GeForce Now has made it a Nintendo Switch 2 killer

Tom's Guide | 01-06-2025

Nvidia GeForce Now is now available to download on the Steam Deck. The full native app was quietly announced at CES 2025, and now it's finally here. To say this is one of the biggest steps forward Nvidia has made in cloud gaming would be an understatement.
I've been testing it, and not only has it renewed my dwindling love for the Steam Deck, but I also think this combination could be the Nintendo Switch 2's biggest threat.
Of course, this comes with some conditions, which I'll go into, but for the purpose of just playing gorgeous-looking games on a handheld, getting GeForce Now on the Deck is significant.
So, what does it have to do to beat the Switch 2? As a gamer who isn't necessarily that bothered by the team chat, motion controls or other fun additions to Nintendo's new console, this combo has to do two things well:
Does it hit both these notes? Sort of, as there's still some way to go. But it does so in a way that makes me confident that Nvidia is on the right track here, and it's making me reconsider my past judgements of game streaming. Let's get into it.
This is the model of Steam Deck I used for this testing, and in all honesty, for all the love I have for OLED, this is all you need for a great gaming experience.
For a limited time, you can get 40% off a 6-month Nvidia GeForce Now Performance tier package. This gets you 1440p gaming capabilities at 60 FPS, and gives you an Nvidia RTX gaming PC in the cloud to handle all your gameplay on any device — be it a Steam Deck, smartphone, or even the Meta Quest 3.
With $10 off my favorite cheap docking station for Steam Deck, this is a must-buy for anyone sporting Valve's handheld. Not only do you get 100W power delivery and HDMI 2.0 for 60 FPS gameplay, but there are two USB ports for peripherals and even an Ethernet port for a smoother, faster connection for GeForce Now.
There's a bit of a workaround to getting GeForce Now installed on the Steam Deck, so let's break down the steps:
There is a plan B using the command line if you need it, but of everyone I've spoken to, nobody has had to resort to it. If you're in the minority here, Nvidia's got you covered.
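For the curious, that command-line route boils down to grabbing Nvidia's installer in Desktop Mode and running it yourself. Here's a rough sketch of the general shape of that flow in Python; the URL and file name below are hypothetical placeholders rather than Nvidia's real ones, so check the GeForce Now support pages for the actual installer before trying anything like this.

```python
# A rough sketch of the command-line fallback: download the installer,
# mark it executable, and run it from Desktop Mode. The URL and file name
# are hypothetical placeholders -- use the real installer from Nvidia's
# GeForce Now support page instead.
import os
import stat
import subprocess
import urllib.request

INSTALLER_URL = "https://example.com/GeForceNOW-SteamDeck-installer.sh"  # placeholder
installer_path = os.path.expanduser("~/Downloads/geforcenow-installer.sh")

# Pull the installer down to the Deck's Downloads folder.
urllib.request.urlretrieve(INSTALLER_URL, installer_path)

# Make it executable, then let the installer itself do the rest.
os.chmod(installer_path, os.stat(installer_path).st_mode | stat.S_IXUSR)
subprocess.run([installer_path], check=True)
```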
Let's address the elephant in the room: I loved the Steam Deck when I first got it — I even reviewed it for Laptop Mag and was smitten. So, what changed over the past three years?
Honestly, my use has always come in waves. It was perfect for travel; playing AAA games on a flight is still awesome, even if it's a little rough.
But over the last 12 months, three things happened that pushed it into the dreaded 'man drawer' (you know the one).
But a really good reason to pick it back up has arrived. Let me tell you about GeForce Now, and how it's completely revived my love for the Steam Deck.
Dramatic, I know, but let me explain.
For the uninitiated, GeForce Now is Nvidia's cloud gaming service that links to your existing Steam, Epic, Ubisoft and Xbox libraries, giving you streaming access to the games you already own.
When you open a game, the service picks the best server and takes you straight into it. And if you're on GeForce Now Ultimate, you have an entire RTX 4080 gaming rig to play on.
As you can see from the screenshots, the difference is night and day when you're playing on a good internet connection, for two key reasons.
First, you're no longer relying on the Deck's AMD silicon to power the games. That means no more hardware limitations, so you can play the likes of Cyberpunk 2077 at maxed-out settings for beautiful visuals at a locked 60 FPS (the only constraint being my OG Steam Deck's 60Hz refresh rate).
In fact, if something becomes a little more demanding (like Indiana Jones and The Great Circle), you can make the most of DLSS (provided you're on Ultimate) to make it even smoother. Whether I was on my home network or at the pub, I could play to my heart's content.
And that leads me to the second part: the battery life boost. Currently, the 40Wh cell in my old Steam Deck is showing its age. Firing up a Balatro session can see that battery draining in around 90 minutes, and don't even get me started on the roughly 45 minutes I get in Hitman: World of Assassination.
But by shifting the computational demands to a cloud server rather than spinning up the AMD chip on the device, I've seen battery life go up dramatically.
For context, playing the same Hitman level, I can get roughly 6 hours out of a single sitting. Did I come out of the pub rather tipsy after playing that long? Yes. But it's genuinely revelatory to watch my Steam Deck's power draw drop from 15 watts down to 7 watts on GeForce Now, and to get so much more stamina for it.
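Those wattage figures map neatly onto the runtime gain. Below is a quick back-of-the-envelope calculation using the numbers above (a 40Wh pack, roughly 15 watts of draw playing locally, roughly 7 watts while streaming). It's an idealised estimate that ignores battery wear and everything else sipping power, but it lands close to the six-ish hours of Hitman I saw.

```python
# Back-of-the-envelope battery maths using the figures from this piece:
# a 40Wh pack, ~15W of draw playing locally, ~7W while streaming on
# GeForce Now. Idealised -- real runtimes shift with brightness, Wi-Fi
# and battery wear -- but it shows the rough shape of the gain.

def estimated_runtime_hours(capacity_wh: float, draw_watts: float) -> float:
    """Idealised runtime: battery capacity divided by average draw."""
    return capacity_wh / draw_watts

CAPACITY_WH = 40.0   # original Steam Deck battery
LOCAL_DRAW_W = 15.0  # rough draw running a game on-device
CLOUD_DRAW_W = 7.0   # rough draw while streaming via GeForce Now

print(f"Local play:  ~{estimated_runtime_hours(CAPACITY_WH, LOCAL_DRAW_W):.1f} hours")
print(f"GeForce Now: ~{estimated_runtime_hours(CAPACITY_WH, CLOUD_DRAW_W):.1f} hours")
# Local play:  ~2.7 hours
# GeForce Now: ~5.7 hours -- in the same ballpark as the ~6 hours above.
```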
Also, a shoutout to the low latency on offer here. Cloud gaming has a bit of a reputation for laggy controls, something I still feel in Xbox Cloud Gaming for sure.
But there is one more thing that the Steam Deck with GeForce Now has to do to truly put Nintendo on blast: docked mode. I wired it up to my TV to see what I could squeeze out of it.
And the end result is incredible. It's not perfect, as resolution scaling seems to be limited to the 16:10 aspect ratio of the Steam Deck's display, but because you're streaming from an entire gaming PC, the fidelity of games here is oceans beyond what the Nintendo Switch 2 will be able to do.
Going back to Hitman with everything turned up to Ultra and DLSS set to Balanced, frame rates were smooth, details were beautifully rendered without any of the visual glitches you sometimes see from a struggling connection, and it all scaled well to a big screen.
If Nvidia is reading this (hi): updating the app to better support external screens (maybe giving us full 16:9 4K) would be another significant step forward.
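To put that aspect-ratio limitation into numbers, here's a rough sketch of how much of a 16:9 4K panel a 16:10 stream would actually fill, assuming the image simply scales to match the panel's height while keeping its shape (how the app handles it in practice may differ).

```python
# How much of a 16:9 4K TV a 16:10 stream covers, assuming the picture is
# scaled to fill the panel's height with the aspect ratio preserved. This is
# a simple geometric estimate, not a measurement of what the app does.
TV_WIDTH, TV_HEIGHT = 3840, 2160   # 16:9 4K panel
STREAM_ASPECT = 16 / 10            # Steam Deck's native aspect ratio

stream_width = round(TV_HEIGHT * STREAM_ASPECT)       # widest 16:10 image at 2160p
pillarbox_per_side = (TV_WIDTH - stream_width) // 2   # black bars left and right

print(f"Usable image: {stream_width} x {TV_HEIGHT}")      # 3456 x 2160
print(f"Black bars:   {pillarbox_per_side} px per side")  # 192 px per side
```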
Let's start with the obvious. GeForce Now on Steam Deck has been Nvidia's Jay-Z moment, allowing the company to reintroduce itself as the best player in the cloud streaming game.
Gameplay is near latency-free, connecting is rapid with very short waiting times (provided you go for Ultimate or Performance, which, based on what I've seen of the free tier, are the only real ways to go), and the sheer drop in power demands means I can play all day with no worries.
A little work still needs to be done on transitioning between handheld and docked mode to really make it sing on a TV. But as far as first steps go, this is a Herculean leap.
Of course, there are limitations. The big one is that it requires an internet connection. But most of the time you'll be using your Deck within reach of a Wi-Fi network, so outside of the rarer situations where you have to rely on the device's own chip to play offline, there's no better way to play on Steam Deck right now.
