Latest news with #TeamGreen


Tom's Guide
20-07-2025
- Tom's Guide
Nvidia N1X CPU: Everything we know so far
Nvidia is the undisputed leader of the GPU market (whether you like it or not), with its RTX 50-series graphics cards making waves this year, but Team Green looks to be throwing its hat into the ring of another sector, as a new CPU may be on the horizon. Rumors have been swirling about Nvidia N1X and N1 Arm-based CPUs that would be made for desktops and laptops, respectively.

While Nvidia has already announced a new Arm-based CPU, the N1-series chips are set to be for consumers. Believed to be made in partnership with MediaTek, this not only means Nvidia will have a stake in PCs in a whole new way, but, as reports have pointed out, it could lead to slimmer, more powerful gaming laptops, too. While Nvidia may have the GPU and AI markets in its pocket, its N1X and N1 systems-on-chip (SoCs) may prove to shake up competition with Intel, AMD, Qualcomm and Apple's offerings.

It may be a while before we see Nvidia's N1X and N1 CPUs arrive, and there's still a lot to learn, but the rumor mill has been churning out plenty on these chips. Let's dive into what we know so far.

The rumored launch of Nvidia's N1-series CPUs has been all over the place; not too long ago, many believed the chips would be here by now. However, it's looking like we may have to wait at least a year until we see them arrive. Initially, Nvidia and MediaTek's Arm-based CPU was rumored to be announced at Computex 2025, with the tech giant expected to show off its smaller GB10 Blackwell chip in an Arm SoC coming to laptops. As you can tell, this didn't come to be, as it seems Nvidia wasn't ready to officially announce its chips. Many, including Moore's Law is Dead, believed it would arrive in late 2025 or early 2026, which would be in time for CES 2026, but it may turn out to be later than we thought.

Now, it's been reported that the Nvidia N1X Arm CPU has been delayed until late 2026. As noted by SemiAccurate, Nvidia faced problems that caused a roadblock to the CPU arriving in early 2026. While this was reportedly handled, the new chip is now rumored to be suffering from another hurdle. The report doesn't detail the specific problem with Nvidia's chip, but sources state that it has been hit with issues that require engineers to make design changes to the silicon. Due to this, the SoC is now believed to be coming later in 2026. Given Nvidia's track record of announcements, it could end up being at CES 2027 in January. For now, of course, this is all up in the air. But with rumors indicating delays, it's likely to be a while before we see any mention of a new CPU from Nvidia.

So, what kind of performance can we expect Nvidia's N1-series chips to deliver? According to leaked benchmarks, we could see some big performance gains in ultraportable laptops. We've heard that the N1-series chip will be based on the GB10 Superchip found in Nvidia's announced Project DIGITS AI supercomputer (now known as DGX Spark) for desktops. The laptop version, set to be the N1 SoC, may be a cut-down version of GB10, with some combination of a Blackwell GPU and a MediaTek CPU. That said, there's reason to believe it could use a GB206 model. Either way, it looks to leverage the power of an RTX GPU, with these Blackwell-based GPU dies being the ones used in RTX 5060 Ti and RTX 5060 graphics cards.
But the real kicker here is that this N1 chip will reportedly deliver the same performance as an RTX 4070-equipped laptop, but with far better energy efficiency, according to Taiwanese outlet UDN. A CPU that delivers an integrated GPU with that kind of power, along with improvements to power efficiency (so possibly longer battery life), is already a good sign that Team Green's chip will be worth waiting for.

The rumors continue: the N1 chip is expected to use 65W of power to match the performance of a 120W RTX 4070 gaming laptop, with another source suggesting the chip would offer a TDP (Thermal Design Power) of 80W to 120W. According to ComputerBase, Nvidia and MediaTek's chip may only have 8 or 12 CPU cores instead of 20. Benchmark leaks of Nvidia's GB10 Arm superchip (via Notebookcheck) suggest single-core performance reaching 2,960 and multi-core at 10,682. Due to the delay, it's only guesswork whether these are the benchmarks (or even specs) that will arrive; for now, these Geekbench results put it behind Apple's M4 Max chips.

While it's believed the N1X chip is for desktops and the N1 is for laptops, it's looking likely that the latter will be primed for gaming laptops. Reports even suggest the first gamer-focused notebooks that will be getting them. According to the UDN report, Dell's gaming brand Alienware will be among the first to launch new gaming laptops featuring the Nvidia and MediaTek CPU. That means we could see fresh Alienware notebooks that are slimmer and offer better battery life, if rumors about Nvidia and MediaTek's chip are accurate — not unlike the newly designed Alienware 16 Aurora lineup.

If rumors are accurate, Nvidia's Arm-based SoC is set to bolster ultraportable gaming laptops (and possibly PC gaming handhelds) with better power efficiency, which hopefully translates to greater battery life in gaming notebooks. We've seen Arm chips in action before, with Snapdragon X Elite laptops impressing with their long battery life and fast speeds. We've even tested Snapdragon X Elite PCs for gaming, and while impressive, they aren't quite built for demanding titles. With Nvidia's own chip sporting its GPU tech, however, gaming on machines with this chip could see major performance gains, especially if it uses some form of DLSS 4 and its Multi Frame Generation tech. But there's already some competition heating up, and that's from two heavy hitters in the laptop market. For one, the AMD Strix Halo APU already delivers close to RTX 4060 desktop GPU power, and Qualcomm's Snapdragon X2 Series chip is set to arrive soon.

It's still early days for the Nvidia N1X Arm-based CPU, as it isn't even certain it will be released. We have an idea of what to expect, especially when it comes to the power the N1-series chip for laptops may deliver, but all this could change if it doesn't arrive until next year. Only time will tell when we see Nvidia's N1X Arm-based CPU arrive, and whether it's the consumer CPU we've been expecting. But if it comes from Team Green, we should expect to see a boost in ultraportable laptops, at the very least, along with a touch of AI for greater power-efficiency management.
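To put those wattage rumors in rough perspective, here's a minimal back-of-the-envelope sketch in Python. It simply treats the leaked 65W and 120W figures and the Geekbench scores quoted above as given; nothing in it is a confirmed Nvidia spec.

```python
# Back-of-the-envelope look at the rumored efficiency claim: an N1 laptop chip
# drawing roughly 65W while matching a gaming laptop built around a 120W RTX 4070.
# Every figure here is a leaked or rumored number, not an official spec.

N1_POWER_W = 65            # rumored N1 package power
RTX4070_LAPTOP_W = 120     # power budget of the RTX 4070 laptop config it supposedly matches

# If delivered performance really is equal, relative efficiency reduces to the power ratio.
efficiency_gain = RTX4070_LAPTOP_W / N1_POWER_W
print(f"Roughly {efficiency_gain:.1f}x the performance per watt of the 120W RTX 4070 setup")

# Leaked Geekbench figures for the related GB10 superchip, quoted above for reference.
gb10_geekbench = {"single_core": 2960, "multi_core": 10682}
print(gb10_geekbench)
```

It's a crude ratio, but it captures why the rumor matters: if the performance claim holds, the efficiency gain comes almost entirely from the lower power draw, which is exactly what thin gaming laptops and handhelds need.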


Tom's Guide
17-07-2025
- Tom's Guide
Nvidia wants to make 8GB GPUs great again with AI texture compression — but I'm not convinced
If you're annoyed by getting just 8GB of video memory (VRAM) on your Nvidia RTX 5060 Ti, RTX 5060 or RTX 5050 GPU, there may be a fix coming. And just like a lot of Team Green's work, it's all about AI. In 2025, when plenty of games require more than this from the jump, it's simply not enough (and PC gamers are letting Nvidia and AMD know with their wallets). Which is why Nvidia is looking to neural trickery — its bread and butter with the likes of DLSS 4 and multi-frame gen.

You may already know of Neural Texture Compression (or NTC), which is exactly what it says on the tin: taking those detailed in-game textures and compressing them for more efficient loading and better frame rates. As WCCFTech reports, NTC has seemingly taken another giant step forward by taking advantage of Microsoft's new Cooperative Vector in DirectX Raytracing 1.2 — resulting in one test showing an up-to-90% reduction in VRAM consumption for textures. To someone who always wants to make sure people get the best PC gaming bang for their buck, this sounds amazing. But I'm a little wary for three key reasons.

As you can see in tests run by Osvaldo Pinali Doederlein on X (using a prerelease driver), this update to make the texture-loading pipeline more efficient with AI is significant. Texture size dropped from 79 MB all the way down to just 9 MB — dropping the VRAM consumption by nearly 90%. His results, posted on July 15, 2025 (v-sync disabled, RTX 5080, demo at the startup position), break down as follows:

Default: 2,350 fps / 9.20 MB
No FP8: 2,160 fps / 9.20 MB
No Int8: 2,350 fps / 9.20 MB
DP4A: 1,030 fps / 9.14 MB
Transcoded: 2,600 fps / 79.38 MB

Just like DLSS 4 and other technologies extracting a higher frame rate and better graphical fidelity out of RTX 50-series GPUs, NTC requires developers to code it in. And while Nvidia is one of the better companies in terms of game support for its AI magic (so far over 125 games support DLSS 4), that's still a relatively small number when you think of the many thousands of PC titles that launch every year.

Of course, this is not a burn on Doederlein here. This testing is great! But it is one example that doesn't take into account the broader landscape of challenges that are faced in a game — a test scene of a mask with several different textures isn't the same as rendering an entire level. So while this near-90% number is impressive, when put to a far bigger challenge, I anticipate that number will be much lower on average. But when it comes to 8GB GPUs, every little bit helps!

So yes, on paper, Nvidia's NTC could be the savior of 8GB GPUs, and it could extract more value from your budget graphics card. But let's address the elephant in the room — graphics cards with this low amount of video memory have been around for years, games in 2025 have proven that it's not enough, and neural texture compression looks to me like a sticking plaster. I don't want to ignore the benefits here, though, because any chance to make budget tech even better through software and AI is always going to be a big win for me. But with the ever-increasing demands of developers (especially with Unreal Engine 5 bringing ever-more demanding visual masterpieces like The Witcher 4 to the fore), how far can AI compression really go?
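For anyone curious about the raw arithmetic behind that near-90% figure, here's a small illustrative sketch. The 79.38 MB and 9.20 MB values come from Doederlein's single-scene test above; the 4 GB texture budget is a made-up figure purely to show the scale of the potential saving, not a measurement from any real game.

```python
# Illustrative arithmetic only: how large the saving from the test above would be
# if it held across a whole game's texture budget. The 79.38 MB and 9.20 MB values
# are from Doederlein's single-scene test; the 4 GB budget is a made-up example.

transcoded_mb = 79.38   # conventional transcoded textures in the test scene
ntc_mb = 9.20           # the same textures with neural texture compression

reduction = 1 - ntc_mb / transcoded_mb
print(f"VRAM reduction for textures in the test scene: {reduction:.0%}")  # ~88%

# Hypothetical: a game keeping 4 GB of textures resident on an 8 GB card.
texture_budget_gb = 4.0
compressed_gb = texture_budget_gb * (ntc_mb / transcoded_mb)
print(f"{texture_budget_gb:.1f} GB of textures would shrink to ~{compressed_gb:.1f} GB, "
      f"freeing ~{texture_budget_gb - compressed_gb:.1f} GB of VRAM")
# In practice the saving will be smaller: real levels mix many asset types, and
# NTC only applies to textures a developer has actually opted in.
```

Even if the real-world number lands well below the test-scene figure, the sketch shows why the idea is attractive on cards where every gigabyte of VRAM counts.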


Tom's Guide
14-07-2025
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but there's a catch for living room PC gamers
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50-series GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 fps or one of the best gaming monitors at 240 fps and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

So I have a PC similar to the build our lab tester Matt Murray constructed (he even posted a handy how-to on building a PC) — packing the 5090, an AMD Ryzen 7 9800X3D and 64GB of DDR5 RAM on a Gigabyte X870 Aorus motherboard. In terms of the screens I play on, I have two. For the desk, I've got a Samsung Odyssey G9 OLED with a max 240Hz refresh rate, but most of the time I'll be in living room mode on my LG G3 OLED with its max 120Hz refresh rate.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute-force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months, and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Cyberpunk 2077, Indiana Jones and the Great Circle and Half-Life 2 RTX — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

So I got into the games (turning off V-sync for the best results). For more specific context, these figures were taken from Doom's Forsaken Plain level, Indy's Marshall College section during a particularly challenging path-traced scene, driving around downtown Night City in Cyberpunk, and Gordon's mesmerizing new take on Ravenholm.
All games tested at 4K (Max settings, DLSS Balanced); figures are average frame rate / latency:

Cyberpunk 2077: frame gen off 58 FPS / 36-47 ms; x2 130 FPS / 29-42 ms; x3 195 FPS / 37-52 ms; x4 240 FPS / 41-60 ms
Doom: The Dark Ages: frame gen off 95 FPS / 37-48 ms; x2 160 FPS / 51-58 ms; x3 225 FPS / 54-78 ms; x4 270 FPS / 56-92 ms
Indiana Jones and the Great Circle: frame gen off 85 FPS / 33-40 ms; x2 140 FPS / 35-46 ms; x3 197 FPS / 43-53 ms; x4 243 FPS / 44-57 ms
Half-Life 2 RTX demo: frame gen off 75 FPS / 26-3 ms; x2 130 FPS / 29-42 ms; x3 195 FPS / 37-52 ms; x4 240 FPS / 41-60 ms

These are ludicrous frame rates — limited only by my LG G3 OLED's max 120Hz refresh rate or, in a couple of cases, by the sky-high 240Hz of my Samsung Odyssey G9 OLED. There is a catch, though, which goes back to the ways that I play. Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as I expected.

As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and more specifically its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency. Also known as input lag, latency measures the time (in milliseconds) it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is.

And that situation is compounded on my TV. The high frame rate is glorious on my monitor, but when locked to 120Hz you don't get the perceived smoother motion of those additional frames — creating a disconnect that makes that latency a bit more noticeable. If you own one of the best gaming PCs and want to enjoy a rich ray-traced experience with acceptable input lag at responsive frame rates on your TV, my advice would be to aim for the frame gen level that gets you as close to your maximum refresh rate as possible (I've sketched this rule of thumb at the end of this piece). For all the games I tested, that would be x2. At this level, I find latency hovers around the mid 30s and never exceeds 60 ms, which feels snappy enough in that kind of living room gaming setup. Crank multi frame gen up to the x3 or x4 setting and you get diminishing returns, as the latency becomes more noticeable at the restricted refresh rate, even using one of the best gaming mice. Flip to a 240Hz monitor, however, and the difference is night and day, as the latency remains at a responsive level alongside those AI-injected frames for a buttery smooth experience.

And now we've got to talk about path tracing — it's already blowing minds in Doom: The Dark Ages, and it's prevalent in the likes of Cyberpunk and Doctor Jones' enjoyable romp. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. Given the demands of this tech on your GPU, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x4 or x3 AI frame-generating settings to maintain high frame rates in future implementations. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open world — I was messing with its path-traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.
That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path-traced equivalent looked.

So while the technology matures, I hope Nvidia continues working to reduce latency at these middle-of-the-road frame rates too, so that this AI trickery really hits the spot when maxed out. To be clear for those on the fence about buying an RTX 5090 — just as we've said in our reviews of the RTX 5060 Ti, 5070 and 5070 Ti, if you own a 40-series-equivalent GPU, you should stick with your current card. You may not get that multi-frame gen goodness, but with DLSS 4 running through its veins, you still get the benefits of Nvidia's latest form of supersampling and its new Transformer model — delivering considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.

I don't want to end on a total downer though, so I'll give credit where it's due. If you're on a monitor with a blisteringly fast refresh rate, I'll admit multi frame generation might be a good fit for your setup. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. But for those who hot-switch between the desk and the couch like I do, make sure you tweak those settings to reflect your refresh rate.
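As promised, here's a quick sketch of that refresh-rate rule of thumb. The helper and its name are my own illustration (there's no such function in the Nvidia app), and the frame rates plugged in are the averages from the table earlier in this piece.

```python
# A sketch of the rule of thumb above: pick the lowest frame gen setting whose
# measured frame rate actually reaches your display's refresh rate, since frames
# beyond the cap are never shown but their extra latency is still felt.
# The numbers are the averages from the table; the helper itself is illustrative.

def pick_setting(measured_fps: dict, refresh_hz: int) -> str:
    """Return the lowest frame gen setting that hits the refresh rate,
    falling back to the fastest setting if none of them reach it."""
    for setting in ("off", "x2", "x3", "x4"):
        if measured_fps[setting] >= refresh_hz:
            return setting
    return "x4"

cyberpunk = {"off": 58, "x2": 130, "x3": 195, "x4": 240}
doom = {"off": 95, "x2": 160, "x3": 225, "x4": 270}

print(pick_setting(cyberpunk, 120))  # x2 for the 120Hz LG G3
print(pick_setting(doom, 120))       # x2 again, matching the advice above
print(pick_setting(cyberpunk, 240))  # x4 makes sense on the 240Hz Odyssey G9
```

It's deliberately simple, but it mirrors the logic I ended up using at the couch: stop at the multiplier that fills your panel's refresh rate and spend the leftover headroom on lower latency instead.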


The Hindu
12-07-2025
- General
- The Hindu
Aadi Pattam: a spotlight on volunteers from Chennai who green neighbourhoods
The pandemic had two contrasting effects on newly sprouted initiatives. Some went into a tailspin and never recovered from it. And there were others that got their tail up, the pandemic years, strangely, providing them with a conducive environment for growth. Deepa Lakshmi and her husband K. Mohanasundaram had barely started a hyperlocal greening exercise around their moorings in Mogappair East when the pandemic struck. Now, Deepa Lakshmi is an English teacher at Chennai Higher Secondary School on Subbrayan Street in Shenoy Nagar, and in addition to her teaching role, serves as in-charge headmistress of the school. While latching the doors of the school (and taking classes online), the pandemic threw wide open the door to environment action. Deepa Lakshmi seized that opportunity.

'It all started during COVID. There was so much free time,' recalls Deepa Lakshmi, co-founder of a volunteering force called Team Green. 'We used to water the few plants we had, and then one day, we just planted some saplings. Neighbours noticed, and slowly people started joining us.'

The streets in the neighbourhood saw more green; and then the streets beyond the neighbourhood did. What began as a simple act became a habit not just for this duo and their neighbours in Mogappair East, but also for many eco-conscious people elsewhere in Chennai. Team Green — as this group of sapling-toting volunteers is called — has expanded to Perungudi, Keelkattalai, Madipakkam, gated communities in OMR, Perumbakkam, Tiruvottiyur and Thirumullaivoyal.

Team Green provides saplings of native trees for free to residents and individuals upon request — a huge volume of such requests being honoured on special days such as Environment Day — but not before extracting a promise from them. The recipient has to take a vow never to abandon the saplings. They would be put through a wringer of questions, much like someone adopting a puppy would be before taking the bundle of fur home. A quick run-through of the posers high up on the questionnaire: who will be responsible for the care of the saplings? Has proper soil preparation been done? The interviewee will find themselves being edified about plant care. Team Green does not leave anything to chance. A volunteer would one day invite themselves to the recipient's stomping ground to see how the saplings are coming along. This stringent process weeds out dilettantes, and brings on board only those extremely keen on greening their neighbourhoods and personal spaces.

Residents interested in greening their patches constitute much of the demand. Colleges (through their outreach wings such as NSS units) also seek saplings for their environmental initiatives. Team Green provides saplings for free, except when they do not have the saplings of a specific tree species that has been sought. In such an event, they help procure it from a nursery, with the cost being borne by the one making the request.

Vidiyal: an offshoot

Deepa Lakshmi is part of a force driving an initiative called Vidiyal, which splices women's empowerment with environment action. Vidiyal is designed in a manner that gets groups of three to four women from underprivileged backgrounds to gather every day at a designated school to nurture saplings. They are provided with soil, seeds and used milk packets, these materials having been collected and supplied by Team Green volunteers. The women work from 10 a.m. to 3 p.m., nurturing saplings and starting new ones with the soil and seeds.
In case you are wondering what the milk packets are for: they serve as coverings for the saplings. Currently, around 15 to 20 women are engaged in this work on a full-time basis. The project operates on a community-supported model where volunteers contribute financially to pay the women for their efforts. For instance, a donation of ₹900 is enough to pay three women ₹300 each for a day's work. Those who wish to support just one person can contribute ₹300. Deepa mentions that they never have to actively seek sponsors, as there is always a steady flow of generous volunteers who come forward to pitch in and help sustain the project.

Deepa notes that Team Green is not an NGO, only a scattered but tightly-knit group of individual volunteers. The volunteering group functions on its own, except for collaborations with Exnora from time to time. Deepa finds the most supportive volunteer at her hearth: her husband Mohanasundaram, who has handed over the reins of his earthmoving business to his team, distancing himself from its day-to-day activities and thereby freeing up time for Team Green's activities.

For details about Team Green and its activities, call 9042594891 / 6379072259


Tom's Guide
05-07-2025
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but its big new feature disappoints
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50 GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 fps or one of the best gaming monitors at 240 fps and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute-force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months, and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Hogwarts Legacy, Microsoft Flight Simulator 2024, Cyberpunk 2077 — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

Just how high are we talking? On my RTX 5090, I can comfortably hit a locked 120 fps at 4K with max settings, providing Nvidia DLSS is enabled. That figure is limited by my LG G3 OLED's max 120Hz refresh rate. When I hook my rig up to my 240Hz Samsung Odyssey G9 OLED super ultrawide monitor, some of the games above can be played at over 200 fps.

There is a catch, though. And said stumbling block is as sizable as a certain silver screen ape that clambered to the top of the Empire State Building. That ended well, right? Yes, the scarcely believable frame rates my third-party RTX 5090 is able to achieve are a lot cheerier than the finale of King Kong. Yet that doesn't mean the best graphics card in the world doesn't have to face its own version of pesky biplanes.

Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as you'd expect. It's not unfair to expect 120 fps gameplay to be super slick, and when all your frames are being rendered natively by your GPU, it normally is. Sadly, that's not quite the case with multi frame generation. As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and more specifically its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency.
Also known as input lag, latency measures the time (in milliseconds) it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is. Generally speaking, I find input lag of 70 ms and above pretty hard to stomach (the sketch at the end of this piece sums up how each game measures up against that threshold).

I've mostly been playing around with Team Green's multi frame gen features in Doom: The Dark Ages, Indiana Jones and the Great Circle, Cyberpunk 2077: Phantom Liberty and the recent, extra demanding Half-Life 2 RTX demo. To say the results have been mixed would be akin to describing Godzilla as 'above average height'.

Cyberpunk 2077 actually fares pretty well when it comes to balancing input lag and big frame rate numbers. At the maximum x4 multi frame gen setting, I generally float around 64-78 ms of latency in 4K (3840 x 2160) at 120 FPS with all settings maxed out and full path tracing enabled — more on that shortly. For a game that hardly requires lightning reactions, those latency measurements feel just about acceptable to me. Knock multi frame generation down to x3 and input lag drops to around 55-65 ms while cruising around Night City, still hitting a locked 120 FPS, which feels reasonably responsive. At x2 frame gen, latency of around 50 ms feels even better, albeit with the big caveat that I drop down to 90 FPS. And with frame generation turned off completely? You're looking at 40 ms of lag with a nosedive to 50 FPS. In the case of Cyberpunk, I'd say x3 frame gen hits the sweet spot between responsiveness and in-game smoothness. It's not a fast-paced shooter, so a little added latency is worth sacrificing for a locked 4K/120 FPS experience.

Speaking of games that do require more nimble reactions, Doom: The Dark Ages can produce multi frame generation results that feel downright awful. Despite the game being well optimized overall, and even with Nvidia Reflex low latency mode turned on, controlling the Doom Slayer during his medieval murder quest can feel like wading through a sea of space soup. At the x4 and x3 multi frame gen settings, the action is outright ghastly. With Nvidia's AI tech maxed out, latency never once measures in below an unplayable 110 ms on my rig. Turn frame gen off, though, and a card like the 5090 can still hand in 4K/120 FPS, but with latency dropping to a slick and responsive 20 ms. The higher frame generation presets may look smooth in motion, yet they feel massively heavy with a controller in your hands.

Next up is Indy's latest adventure. The Great Circle might be a breezy, enjoyable action-adventure, but it's definitely not the best poster boy for multi frame generation. At the amusingly stupid 'Very Ultra' preset in 4K with everything maxed out and path tracing cranked up, latency lands at a super sluggish 100 ms and above with x4 frame gen enabled. If you own one of the best gaming PCs and want to enjoy a rich ray-traced experience with acceptable input lag at responsive frame rates, I suggest going for the x2 frame gen setting. At this level, I find latency hovers between the mid 30s and low 40s in ms, which feels as snappy as one of the explorer's legendary whip lashes.

Even though it's over 20 years old, it's Gordon Freeman's path-traced Half-Life 2 RTX demo that produces the worst results on my gaming PC. Movement feels utterly shocking with multi frame gen set to either the x4 or x3 setting. I'm talking '150 ms of latency' levels of shocking.
Even cutting through Headcrabs in the shooter's legendary Ravenholm level at 120 FPS using one of the best gaming mice is horribly sluggish. It's only by turning Nvidia's latest tech off entirely that torpedoing zombies with buzzsaws fired from Gordon's gravity gun feels playable again. With frame gen disabled, my 5090-powered PC was able to achieve a consistent 30 ms of latency as frame rates fluctuated between 60-75 FPS. And if all of the inconsistent frame rates above are making you queasy, I can assure you they never bothered me, thanks to the combination of my display's FPS-smoothing G-Sync and VRR (Variable Refresh Rate) features.

You'd probably think the big takeaway from my multi frame generation experiments would be 'disable multi frame gen' at this point, am I right? In the here and now, most definitely. Yet in the future, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x4 or x3 AI frame-generating settings to maintain high frame rates. That feature is the aforementioned path tracing.

Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. The two best current examples of the technology being deployed to eye-arousing effect I've come across are Cyberpunk and Doctor Jones' enjoyable romp. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open world — I was messing with its path-traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.

That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path-traced equivalent looked.

It kinda pains me to think I'm probably going to have to lean on multi frame generation going forward if I'm to maintain 4K, high-frame-rate experiences in games that support path tracing. As the technology matures, I really hope Nvidia finds ways to reduce latency without massively compromising on the speedy FPS performance its latest AI experiment targets.

Seeing as the launch of the RTX 50 range has gone about as smoothly as a dinner party for chickens organised by The Fantastic Mr Fox, I have no problem stating that if you own a 40-series GPU (especially an RTX 4080 or 4090), you should stick with your current card. Even if you've been hyped for multi frame generation, know that it's nowhere near effective enough at the moment to be worth upgrading your GPU for. The most damning aspect of DLSS 4's multi frame gen performance is that it's actually producing worse in-game experiences than you get with DLSS 3's x2 frame gen setting. Based on my time with the titles I've mentioned, the lowest level of this frame-boosting tech hits the best balance between reasonable latency and stutter-free gameplay. Considering Nvidia first launched the DLSS 3 version back in October 2022, and you can enjoy it on last-gen GPUs, it's not a great advert for DLSS 4 and its latest AI ace in the hole.
The iconic computing company's new artificial intelligence model might be 40% faster than the previous iteration, but that doesn't mean multi frame generation feels satisfying in motion in its current state. I don't want to end on a total downer though, so I'll give DLSS 4 credit where it's due. Multi frame gen undeniably reeks of the Emperor's New Clothes at present and that's disappointing. However, Nvidia's latest form of supersampling and its new Transformer model deliver considerably better anti-aliasing while being less power-hungry than the existing Legacy edition. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. Probably. If you're on the fence about the latest wave of Nvidia GPUs though, don't let multi frame generation sway a potential purchasing decision.
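And for completeness, here's the promised summary of how the games I tested stack up against my rough 70 ms comfort threshold. The single figures below are shorthand for the ranges quoted through this piece (the upper end of each range, or the quoted floor in Doom's case), and the little filter is just a rough way of eyeballing them against the threshold, not a measurement tool.

```python
# Rough summary of the latency figures quoted in this piece against the ~70 ms
# comfort threshold mentioned earlier. Each value is shorthand for a quoted range
# (upper end of the range, or the quoted 110 ms floor for Doom with frame gen on).

COMFORT_MS = 70  # input lag beyond this point is hard for me to stomach

latency_ms = {
    ("Cyberpunk 2077", "x4"): 78,
    ("Cyberpunk 2077", "x3"): 65,
    ("Cyberpunk 2077", "x2"): 50,
    ("Cyberpunk 2077", "off"): 40,
    ("Doom: The Dark Ages", "x3/x4"): 110,
    ("Doom: The Dark Ages", "off"): 20,
    ("The Great Circle", "x4"): 100,
    ("The Great Circle", "x2"): 42,
    ("Half-Life 2 RTX demo", "x3/x4"): 150,
    ("Half-Life 2 RTX demo", "off"): 30,
}

for (game, setting), ms in latency_ms.items():
    verdict = "fine" if ms <= COMFORT_MS else "too sluggish"
    print(f"{game} @ frame gen {setting}: ~{ms} ms -> {verdict}")
# Only the lower frame gen settings (or frame gen off) come in under the threshold,
# which is the whole story of this piece in a handful of lines.
```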