Latest news with #NvidiaGeForceRTX5090


Tom's Guide
9 hours ago
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but there's a catch for living room PC gamers
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50-series GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 FPS or one of the best gaming monitors at 240 FPS and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

So I have a PC similar to the build our lab tester Matt Murray constructed (he even posted a handy how-to on building a PC) — packing the 5090, AMD Ryzen 7 9800X3D, and 64GB DDR5 RAM on a Gigabyte X870 Aorus motherboard. In terms of the screens I play on, I have two. For the desk, I've got a Samsung Odyssey G9 OLED with a max 240Hz refresh rate, but most of the time, I'll be in living room mode with my LG G3 OLED's max 120Hz refresh rate.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Cyberpunk 2077, Indiana Jones and the Great Circle and Half-Life 2 RTX — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

So I got into the games (turn off Vsync for the best results). For more specific context, these figures were taken from Doom's Forsaken Plain level, Indy's Marshall College section during a particularly challenging path traced scene, driving around downtown Night City in Cyberpunk, and Gordon's mesmerizing new take on Ravenholm.
All games tested at 4K (Max settings, DLSS Balanced); figures are average frame rate / latency:

Setting        | Cyberpunk 2077     | Doom: The Dark Ages | Indiana Jones and the Great Circle | Half-Life 2 RTX demo
Frame gen off  | 58 FPS / 36-47 ms  | 95 FPS / 37-48 ms   | 85 FPS / 33-40 ms                  | 75 FPS / 26-3 ms
Frame gen x2   | 130 FPS / 29-42 ms | 160 FPS / 51-58 ms  | 140 FPS / 35-46 ms                 | 130 FPS / 29-42 ms
Frame gen x3   | 195 FPS / 37-52 ms | 225 FPS / 54-78 ms  | 197 FPS / 43-53 ms                 | 195 FPS / 37-52 ms
Frame gen x4   | 240 FPS / 41-60 ms | 270 FPS / 56-92 ms  | 243 FPS / 44-57 ms                 | 240 FPS / 41-60 ms

These are ludicrous frame rates — limited only by my LG G3 OLED's max 120Hz refresh rate or, in a couple of cases, even the sky-high 240Hz of my Samsung Odyssey G9 OLED. There is a catch, though, which goes back to the ways that I play. Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as I expected.

As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and more specifically its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and most crucially for me, latency. Also known as input lag, latency measures the time it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse in milliseconds. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is.

And that situation is compounded on my TV. The high frame rate is glorious on my monitor, but when locked to 120Hz, you don't get the perceived smoother motion of those additional frames — creating a disconnect that makes that latency a bit more noticeable.

If you own one of the best gaming PCs and want to enjoy a rich ray traced experience with acceptable input lag at responsive frame rates on your TV, my advice would be to aim for the frame gen level that gets you as close to your maximum refresh rate as possible. For all the games I tested, that would be x2. At this level, I find latency hovers around the mid 30s and never exceeds 60 ms, which feels snappy enough in that kind of living room gaming setup. Crank multi frame gen up to the x3 or x4 setting, though, and the experience degrades: the latency becomes far more noticeable at the restricted refresh rate, even using one of the best gaming mice. Flip to a 240Hz monitor, however, and the difference is night and day, as the latency remains at a responsive level alongside those AI-injected frames for a buttery smooth experience.

And now, we've got to talk about path tracing — it's already blowing minds in Doom: The Dark Ages, and it's prevalent in the likes of Cyberpunk and Doctor Jones' enjoyable romp. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. Given the demands of this tech on your GPU, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x3 or x4 AI frame-generating settings to maintain high frame rates in future implementations. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open-world — I was messing with its path traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.
That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path traced equivalent looked. So while the technology matures, I hope Nvidia continues to work on reducing latency at these middle-of-the-road frame rates too, so that this AI trickery really hits the spot when maxed out.

To be clear to those on the fence about buying an RTX 5090 — just as we've said in our reviews of the RTX 5060 Ti, 5070 and 5070 Ti, if you own a 40 series-equivalent GPU, you should stick with your current card. You may not get that multi-frame gen goodness, but with DLSS 4 running through its veins, you still get the benefits of Nvidia's latest form of supersampling and its new Transformer model — delivering considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.

I don't want to end on a total downer though, so I'll give credit where it's due. If you're on a monitor with a blisteringly fast refresh rate, I'll admit multi frame generation might be a good fit for your setup. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. But for those who hot switch between the desk and the couch like I do, make sure you tweak those settings to reflect your refresh rate.
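To make that refresh-rate advice concrete, here's a minimal, hypothetical sketch of the arithmetic (my own illustration, not anything from Nvidia's tooling): frame generation multiplies the frames your GPU actually renders, but anything above the panel's refresh rate is never shown, so the sensible multiplier depends on the display you're driving.

```python
# Hypothetical back-of-the-envelope sketch of the article's refresh-rate advice.

def displayed_fps(native_fps: float, multiplier: int, refresh_hz: int) -> float:
    """Frame gen multiplies the on-screen frame count, but the display
    can only show up to its refresh rate."""
    return min(native_fps * multiplier, refresh_hz)

def best_multiplier(native_fps: float, refresh_hz: int, options=(1, 2, 3, 4)) -> int:
    """Pick the multiplier that lands closest to the panel's refresh rate,
    since generated frames beyond that are never displayed."""
    return min(options, key=lambda m: abs(native_fps * m - refresh_hz))

if __name__ == "__main__":
    # Numbers loosely based on the article's Cyberpunk 2077 run (58 FPS native).
    for refresh in (120, 240):
        m = best_multiplier(58, refresh)
        shown = displayed_fps(58, m, refresh)
        print(f"{refresh}Hz display: x{m} frame gen -> ~{shown:.0f} FPS shown")
```

Run against the Cyberpunk figure above, the sketch lands on x2 for a 120Hz TV and x4 for a 240Hz monitor, which lines up with the advice in the article.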


Tom's Guide
05-07-2025
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but its big new feature disappoints
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50 GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 FPS or one of the best gaming monitors at 240 FPS and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Hogwarts Legacy, Microsoft Flight Simulator 2024, Cyberpunk 2077 — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

Just how high are we talking? On my RTX 5090, I can comfortably hit a locked 120 FPS at 4K with max settings, providing Nvidia DLSS is enabled. That figure is limited by my LG G3 OLED's max 120Hz refresh rate. When I hook my rig up to my 240Hz Samsung Odyssey G9 OLED super ultrawide monitor, some of the games above can be played at over 200 FPS.

There is a catch, though. And said stumbling block is as sizable as a certain silver screen ape that clambered to the top of the Empire State Building. That ended well, right? Yes, the scarcely believable frame rates my third-party RTX 5090 is able to achieve are a lot cheerier than the finale of King Kong. Yet that doesn't mean the best graphics card in the world doesn't have to face its own version of pesky biplanes.

Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as you'd expect. It's not unfair to expect 120 FPS gameplay to be super slick, and when all your frames are being rendered natively by your GPU, it normally is. Sadly, that's not quite the case with multi frame generation. As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and more specifically its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and most crucially for me, latency.
Also known as input lag, latency measures the time it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse in milliseconds. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is. Generally speaking, I find input lag of 70 ms and above pretty hard to stomach.

I've mostly been playing around with Team Green's multi frame gen features in Doom: The Dark Ages, Indiana Jones and the Great Circle, Cyberpunk 2077: Phantom Liberty and the recent, extra demanding Half-Life 2 RTX demo. To say the results have been mixed would be akin to describing Godzilla as 'above average height'.

Cyberpunk 2077 actually fares pretty well when it comes to balancing input lag and big frame rate numbers. At the maximum x4 multi frame gen setting, I generally float around 64-78 ms of latency in 4K (3840 x 2160) at 120 FPS with all settings maxed out and full path tracing enabled — more on that shortly. For a game that hardly requires lightning reactions, those latency measurements feel just about acceptable to me. Knock multi frame generation down to x3 and input lag drops to around 55-65 ms cruising around Night City while still hitting a locked 120 FPS, which feels reasonably responsive. At x2 frame gen, latency of around 50 ms feels even better, albeit with the big caveat that I drop down to 90 FPS. And with frame generation turned off completely? You're looking at 40 ms of lag with a nosedive to 50 FPS. In the case of Cyberpunk, I'd say x3 frame gen hits the sweet spot between responsiveness and in-game smoothness. It's not a fast-paced shooter, so a little added latency is worth sacrificing for a locked 4K/120 FPS experience.

Speaking of games that do require more nimble reactions, Doom: The Dark Ages can produce multi frame generation results that feel downright awful. Despite being well optimized overall and even with Nvidia Reflex low latency mode turned on, controlling the Doom Slayer during his medieval murder quest can feel like wading through a sea of space soup. At x4 and x3 multi frame gen settings, the action is outright ghastly. With Nvidia's AI tech maxed out, latency never once dips below an unplayable 110 ms on my rig. Turn frame gen off though, and a card like the 5090 can still hand in 4K/120 FPS but with latency dropping to a slick and responsive 20 ms. The higher frame generation presets may look smooth in motion, yet they feel massively heavy with a controller in your hands.

Next up is Indy's latest adventure. The Great Circle might be a breezy, enjoyable action-adventure, but it's definitely not the best poster boy for multi frame generation. At the amusingly stupid 'Very Ultra' settings in 4K with all settings maxed out and path tracing cranked up, latency lands on a super sluggish 100 ms and above with x4 frame gen enabled. If you own one of the best gaming PCs and want to enjoy a rich ray traced experience with acceptable input lag at responsive frame rates, I suggest going for the x2 frame gen setting. At this level, I find latency hovers between the mid 30s and low 40s of milliseconds, which feels as snappy as one of the explorer's legendary whip lashes.

Even though it's over 20 years old, it's Gordon Freeman's path traced Half-Life 2 RTX demo that produces the worst results on my gaming PC. Movement feels utterly shocking with multi frame gen set to either the x4 or x3 setting. I'm talking '150 ms of latency' levels of shocking.
Even cutting through Headcrabs in the shooter's legendary Ravenholm level at 120 FPS using one of the best gaming mice is horribly sluggish. It's only by turning Nvidia's latest tech off entirely that torpedoing zombies with buzzsaws fired from Gordon's gravity gun feels playable again. With frame gen disabled, my 5090-powered PC was able to achieve just 30 ms of latency consistently as frame rates fluctuated between 60-75 FPS. And if all of the inconsistent frame rates above are making you queasy, I can assure you they never bothered me thanks to the combination of my display's FPS-smoothing G-Sync and VRR (Variable Refresh Rate) features.

You'd probably think the big takeaway from my multi frame generation experiments would be 'disable multi frame gen' at this point, am I right? In the here and now, most definitely. Yet in the future, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x4 or x3 AI frame-generating settings to maintain high frame rates. That feature is the aforementioned path tracing. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. The two best current examples of the technology being deployed to eye-arousing effect I've come across are Cyberpunk and Doctor Jones' enjoyable romp. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open-world — I was messing with its path traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.

That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path traced equivalent looked. It kinda pains me to think I'm probably going to have to lean on multi frame generation going forward if I'm to maintain 4K, high frame rate experiences in games that support path tracing. As the technology matures, I really hope Nvidia finds ways to reduce latency without massively compromising on the speedy FPS performance its latest AI experiment targets.

Seeing as the launch of the RTX 50 range has gone as smoothly as a dinner party for chickens organised by The Fantastic Mr Fox, I have no problem stating that if you own a 40 series GPU (especially an RTX 4080 or 4090), you should stick with your current card. Even if you've been hyped for multi frame generation, know that it's nowhere near effective enough at the moment to be worth upgrading your GPU for. The most damning aspect of DLSS 4's multi frame gen performance is that it's actually producing worse in-game experiences than you get with DLSS 3's x2 frame gen settings. Based on my time with the titles I've mentioned, the lowest level of this frame-boosting tech hits the best balance between reasonable latency and stutter-free gameplay. Considering Nvidia first launched the DLSS 3 version back in October 2022, and you can enjoy it on last-gen GPUs, it's not a great advert for DLSS 4 and its latest AI ace in the hole.
The iconic computing company's new artificial intelligence model might be 40% faster than the previous iteration, but that doesn't mean multi frame generation feels satisfying in motion in its current state. I don't want to end on a total downer though, so I'll give DLSS 4 credit where it's due. Multi frame gen undeniably reeks of the Emperor's New Clothes at present and that's disappointing. However, Nvidia's latest form of supersampling and its new Transformer model deliver considerably better anti-aliasing while being less power-hungry than the existing Legacy edition. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. Probably. If you're on the fence about the latest wave of Nvidia GPUs though, don't let multi frame generation sway a potential purchasing decision.
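As a rough, hypothetical illustration of why the FPS counter and the feel diverge (my own simplified model, not Nvidia's numbers): on a display locked to 120Hz, a higher frame gen multiplier means fewer natively rendered frames, and the game only reacts to input at that native cadence, so latency creeps up even though the counter never moves. The 10 ms buffering overhead below is purely illustrative.

```python
# Hypothetical sketch of the trade-off described above on a 120Hz TV:
# at a locked 120 FPS output, a higher frame gen multiplier means fewer native
# frames, so input latency grows while the FPS counter stays the same.
# The overhead_ms figure is an illustrative assumption, not a measured value.

def latency_at_locked_output(output_fps: float, multiplier: int, overhead_ms: float = 10.0) -> float:
    native_fps = output_fps / multiplier      # frames the GPU actually renders
    native_frame_time = 1000.0 / native_fps   # input is only sampled at this cadence
    return native_frame_time + overhead_ms

if __name__ == "__main__":
    for mult in (1, 2, 3, 4):
        lat = latency_at_locked_output(120, mult)
        print(f"x{mult} frame gen at a locked 120 FPS: ~{lat:.0f} ms of input latency")
```

The exact figures in the articles above are higher than this toy model produces, but the trend is the same: the more of your 120 frames that are generated rather than rendered, the heavier the game feels.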


Tom's Guide
11-06-2025
- Tom's Guide
Exclusive: Maingear just dropped a monster RTX 5090 gaming laptop with up to 192GB RAM
Maingear just revealed its new 18-inch gaming laptop equipped with an Nvidia GeForce RTX 5090 GPU, and it can come with a staggering 192GB of RAM. The Ultima 18 is Maingear's most powerful laptop to date, made in collaboration with manufacturer Clevo. Aimed at power users, this desktop replacement boasts an Intel Core Ultra 9 275HX, an RTX 5090 or RTX 5080 and an 18-inch 4K (3840 x 2400) display with a 200Hz refresh rate. It's clearly a beast, but the real kicker is that it can be outfitted with up to 192GB of DDR5 RAM and 4x M.2 NVMe SSDs. With the power it packs under the hood, the Ultima 18 is set to cruise through load times and crush multitasking for both productivity and gaming.

The Ultima 18 is available at Maingear's official store starting at $3,599, both in pre-configured and customizable models. It's already giving best gaming laptop vibes, so let's dive into the details. We've seen some powerful RTX 5090 gaming rigs so far, including the $6,700 MSI Titan 18 HX, but Maingear's new Ultima 18 gaming laptop takes it up a notch with its customizable specs.

Maingear Ultima 18
Price: From $3,599
Display: 18-inch 4K (3840 x 2400), 200Hz, 16:10 aspect ratio
CPU: Intel Core Ultra 9 275HX
GPU: Up to Nvidia GeForce RTX 5090
RAM: Up to 192GB DDR5 (4x 48GB)
Storage: Up to 4x M.2 NVMe SSDs
Battery: 98Wh (330W charger)
Ports: 2x USB-A (1x with PD), 2x Thunderbolt 5, 2x 2.5Gb LAN, 1x HDMI 2.1, 1x microSD
Connectivity: Wi-Fi 7, Bluetooth 5.4
Weight: 8.8 pounds

On paper, this is one of the most powerful laptops I've seen, and at its highest configuration, it's sure to cost a pretty penny. However, as with the Maingear ML-16, the Ultima 18 can be customized to the user's liking. So, of course, if you don't need anywhere near 192GB of DDR5 RAM, you can shave off some memory. Otherwise, there are a few fixtures that still make it a powerful desktop replacement. That includes the Intel Core Ultra 9 275HX CPU, plenty of useful ports that include dual Thunderbolt 5 slots, a 98Wh battery to give its internals enough juice and even the latest Wi-Fi 7 connectivity.

It also comes equipped with a massive 18-inch 4K display with a 200Hz refresh rate and a response time of less than 7ms. Many newly released gaming laptops offer displays with up to 240Hz and sub-3ms response times, like the Alienware 16 Area-51 and Asus ROG Strix Scar 18, but the Ultima 18's 4K resolution with a 16:10 aspect ratio should present gorgeous visuals. Plus, it also supports Nvidia G-Sync to manage screen tearing.

Judging from its size (to be confirmed) and weight (8.8 pounds), this is a laptop you'll want planted on your desk rather than opened up at a café. But the Ultima 18 is built to be a desktop replacement. 'Ultima 18 isn't just a laptop, it's a no-compromise desktop-class gaming rig that fits in a backpack,' states Wallace Santos, CEO of Maingear. 'We've engineered this notebook to handle the latest AAA games, creative workloads, and AI-driven applications with headroom to spare. From the raw horsepower to the fine details, this system embodies everything our gamers expect from a premium Maingear gaming system.'

The amount of RAM and storage it can pack is what makes this a beast. It supports modern dual-channel DDR5 memory and PCIe Gen 5 SSDs, with the highest configuration allowing for 4x 48GB memory along with 1x Gen 5x4 and 3x Gen 4x4 SSDs.
As for what else it comes equipped with, the Ultima 18 features a customizable per-key RGB keyboard with a dedicated Copilot key. There are also five speakers (two drivers, two tweeters and a subwoofer) powered by a Sound Blaster Studio Pro 2 audio system, along with a 5MP Windows Hello camera. Maingear also states the Ultima 18 is free of bloatware (hoorah!) and ships with its Control Center, where users can swap performance modes, customize per-key RGB lighting, adjust fan profiles and assign macros.

There are some powerful gaming laptops out there, but Maingear's Ultima 18 looks to bring the heat. We'll see how this monster performs once we get our hands on it, but in the meantime, check out how this RTX 5090 Corsair gaming PC measures up.


Gizmodo
05-05-2025
- Gizmodo
MSI Titan 18 HX Review: Titanic Doesn't Even Begin to Describe This Gaming Laptop
Let's get one thing out of the way before we start this review. MSI's Titan 18 HX is a hulking, powerful, and all-around excessive gaming laptop, starting at $5,279 for its lowest configuration. It demands much more from your wallet if you want the best possible specs. It's also a similar price to what you would pay for a full-sized, pre-built desktop with all the fixings. The Titan 18 HX is an ultra-expensive desktop replacement, and when I say that, I mean it in the whole sense of what that term implies—it's so massive in size and cost, and yet it is the closest you'll get to having a desktop tower you can schlep from room to room.

MSI Titan 18 HX
This behemoth-sized laptop offers intense performance in a great chassis with an excellent keyboard and display. It also costs well over $5,000.
Pros
- Performance is great for both gaming and other intensive tasks
- Mechanical keyboard feels incredible
- Nice screen with good brightness
- Sound is solid and loud for this size of device
Cons
- Terrible battery life
- Per-key RGB isn't very bright
- Astronomical (and rising) price

If you can spend the necessary amount of dough on this nearly eight-pound behemoth, the Titan 18 HX can be the mobile PC for whatever you need, so long as you can do it in less than two hours when away from a plug. It feels luxurious to use with its excellent mechanical keyboard, and it looks good with a quality mini-LED, 4K display. However, you should know that—inevitably—you'll be pushing the settings on your favorite games and then find the performance ceiling. Reaching the max of what you can do at 60 fps on the Titan 18 HX is akin to the feeling of driving 80 mph on a highway without a car in sight, then screeching to a halt with bumper-to-bumper traffic. Even with the Nvidia GeForce RTX 5090 GPU this gigantic laptop is packing, there's a limit to what a laptop GPU can accomplish. That's not a point against the Titan 18 HX, but it's something to keep in mind alongside that eye-watering price tag.

With the latest top-end Intel Core Ultra 9 285HX CPU and RTX 5090 GPU, and 64GB of DDR5 RAM, I found the Titan 18 HX can come in just under the benchmarks of a full tower with the latest components. To display those graphics, the laptop sports a 4K mini-LED display with clean visuals and nice, high brightness. It's still prone to reflections in direct light, so keep that in mind if you intend to take this beast outdoors. (It already weighs as much as a small dog, so why not play fetch with it?) At the top end you can spec it with 6TB of storage, enough for—at the very least—a majority chunk of your Steam library. I've never felt so spoiled when using a laptop. For the price you're paying for this device, you better feel like you're living in the lap of luxury.

With the near-top-end specs in my review unit, the MSI Titan 18 HX totals a whopping $6,379. If you really have no care for expenses, you could go for the even more expensive Dragon Edition Norse Myth, with a cover that includes embossed Nordic runes and a killer image of a leering dragon. Unless this machine is something you plan to take to work, which may incite more questions than you're comfortable with. It costs a dragon's hoard, but MSI's prices have gone up, even in the midst of our testing for this review.
MSI's other products, like its Claw 8 AI+, have seen price increases over the last few weeks, and the Titan 18 HX is no exception. MSI did not respond to questions about whether these price hikes have anything to do with Trump tariffs. Either way, it's clear MSI isn't alone in feeling the pressure of the White House's obsession with import taxes. At this price, the Titan 18 HX has to be a perfect specimen if I were to recommend it at all to anybody, even those who can afford it.

With these specs, the device needs a quality thermal system in place, and MSI has managed the heat with surprising grace. You can bet your keister this device spits out some hot air, enough that my mouse hand could get extra sweaty next to the side exhaust. The rest of the airflow blasts out the rear vents. After playing with the laptop for around 15 minutes, the area surrounding the function row keys was burning hot, enough that if you leave your fingers there, it stretches from uncomfortable into painful-to-touch territory. But the Titan 18 HX still feels comfortable when you actually use it. The laptop's thermals, which include a vapor chamber and a dedicated copper heat pipe for the SSD (it has four SSD slots, though one is PCIe Gen 5 and the rest are Gen 4), keep heat off the palmrest and away from the WASD keys where most players will rest their sensitive fingertips.

Playing on this laptop itself is a joy. The Titan 18 HX sports a full mechanical keyboard with Cherry switches. Each key clacks with a subdued clap that's not too stiff but not light enough to accidentally press the wrong key at the wrong time. If I had one small complaint, it's that the per-key RGB lighting isn't all that bright or eye-catching unless you're sitting in a truly dark room. The seamless touchpad uses haptics (motors that simulate the click of a regular mechanical pad) and feels on point, without that too-smooth glassy texture found on other laptops.

MSI claims its device can draw 270W of power to both the CPU and GPU for gaming tasks. This 'Max Boost' setting can also push 200W to the CPU exclusively, which will increase performance in intensive tasks. All that means is that the Titan 18 HX manages to put a good deal of power toward the components that need it most for gaming. Perhaps what surprised me most is just how effective Intel's Core Ultra 9 285HX is as a gaming CPU. In benchmarks, it manages to meet or, in some cases, beat the performance of the Intel Core Ultra 9 285K desktop-level CPU. Intel's latest desktop processors weren't exactly stellar on release last year, especially for gaming purposes, but the fact this laptop CPU is comparable to desktops at all indicates Intel made some good strides with the top-end Arrow Lake series of chips.

Outside of gaming, the Titan 18 HX hit the mark for more intense rendering tasks. It managed to meet sub-1 minute benchmarks for our tests rendering a scene in Blender, which is practically equivalent to what we get on high-end desktop PCs. The Titan 18 HX was similarly fast in our 4K to 1080p video encoding tests. As for gaming, that's where things get slightly more complicated. The in-game benchmark results were stellar in titles like Cyberpunk 2077, where the Titan 18 HX managed to get 60 fps at 4K with ultra settings without any kind of upscaling. It was a similar story with games like Horizon Zero Dawn Remastered and Black Myth: Wukong.
However, there is a performance ceiling, and if you were hoping to max out every single ray tracing setting in very demanding games like Marvel's Spider-Man 2, you will find a wall that will dunk your fps below 50 and into the low 40s in some intensive moments. This is where you could technically make use of Nvidia's oft-touted multi-frame gen capabilities with DLSS 4. This essentially inserts multiple 'generated' frames in between each frame that's actually rendered by the PC. With a host of technical trickery, this boosts your frames per second, so 50 or 60 fps can climb into the low 100s on the x2 frame gen setting, or upwards of 200 fps with x4 settings.

Multi-frame gen is not a panacea for low framerates. You still want closer to 50 or 60 fps before you enable it to avoid graphics artifacts that will spoil the picture. It's especially not ideal if you're planning to mostly focus on multiplayer, as frame gen will necessarily increase latency, which will impact how the PC tracks your mouse movement between frames. That being said, I wouldn't worry about your framerates in most multiplayer shooters, whether that's Marvel Rivals or Call of Duty: Black Ops 6. The laptop version of the RTX 5090 is a top-of-the-line card, and to be frank, there is no other consumer-level card you can buy that is more powerful. The GPU will top out most games, but even then there's a cap to how much you can honestly expect from it.

And still, even if you have the bodybuilder physique to hold this eight-pound behemoth aloft for more than a few seconds, it can't be your everyday carry laptop. MSI can't escape the age-old problem with gaming laptops: the battery. Off a charger, on balanced performance settings through Windows 11 and MSI's Center software, I couldn't even make it two hours without needing to plug it in. It's far worse when gaming, and you'll be lucky to get more than an hour before the battery runs out. The Titan 18 HX is a desktop replacement, after all, and that demands you use it at your desk or at least close enough to some power source.

There's only so much you can expect from a laptop, but the Titan 18 HX represents the top end of what's technically possible with modern hardware. When the only Razer Blade 16 configuration currently for sale costs $4,900 with an RTX 5090 in a thinner chassis, paying upwards of $5,000 for the heftier frame and larger (though non-OLED) display doesn't sound as ludicrous. We can expect the upcoming Razer Blade 18 or other large laptops that have yet to see the light of day (another consequence of Trump's trade war) to be priced at or just below MSI's humongous device. MSI's Titan 18 HX is close enough to a supercharged pre-built desktop that I could even consider it an alternative to a full gaming rig, especially if you need just one device you can plug into your TV or drag to your bedroom for whatever gaming needs. Though as tariff woes continue to increase costs and decrease availability, I wouldn't blame you for looking at the Titan 18 HX as an idol of excess. I'm not the religious type, but idolatry is the kind of sin that may be a worthy price of admission for such a powerful piece of tech.
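A tiny, hypothetical helper makes that rule of thumb explicit: treat roughly 50-60 native FPS as the floor before switching frame gen on, then pick the smallest multiplier that gets near your panel's refresh rate. The thresholds below follow the review's guidance and are not an Nvidia specification; the function name is my own.

```python
# Rule-of-thumb sketch based on the review's advice (not an official tool).

def frame_gen_advice(native_fps: float, target_hz: int = 240) -> str:
    if native_fps < 50:
        # Below ~50 FPS native, generated frames tend to show artifacts and feel heavy.
        return "Raise native FPS first (lower settings/resolution); frame gen from here will look and feel rough."
    # Otherwise, pick the smallest multiplier that reaches the display's refresh rate.
    for mult in (2, 3, 4):
        shown = min(native_fps * mult, target_hz)
        if native_fps * mult >= target_hz:
            return f"Enable x{mult} frame gen (~{shown:.0f} FPS shown on a {target_hz}Hz panel)."
    return f"Even x4 tops out at ~{native_fps * 4:.0f} FPS on a {target_hz}Hz panel; use it if you want the extra smoothness."

if __name__ == "__main__":
    for fps in (40, 55, 90):
        print(f"{fps} FPS native -> {frame_gen_advice(fps)}")
```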
Yahoo
06-04-2025
- Yahoo
Nvidia RTX 5090's 16-pin power connector hits 150C in reviewer's thermal camera shots
Some thermal imagery shared on Twitter/X underlines how toasty-hot the power connectors servicing Nvidia GeForce RTX 5090 graphics cards can get. Veteran hardware reviewer Andreas Schilling, an editor at Germany's Hardware Luxx, took thermal photos of his water-cooled graphics card. While the GPU barely broke a sweat, the power connectors could be seen "cooking at 150+ degrees." That's Celsius, and for those unfamiliar with metric units, 150 degrees Celsius equates to just over 300 degrees Fahrenheit.

Though reviewers at Tom's Hardware haven't experienced dangerously hot 12V-2x6 cable connectors first-hand, reports indicate this issue can affect anyone—from seasoned hardware veterans to budding enthusiasts. In previous generations and with earlier iterations of the 16-pin connector, there was often a nagging doubt about 'user error,' but we seem to be fully past that notion now. We must face up to the problem that RTX 5090 power cables may be doomed to burn.

Schilling provided some background information on the thermal imagery he shared. The graphics card he was testing was a liquid-cooled Inno3D RTX 5090 Frostbite, and one of the images clearly shows the pipe fittings. The PSU in this PC build was a be quiet! Dark Power 13, and Schilling confirmed that the GPU pulled 600W during tests. The Hardware Luxx editor sounds like he has run out of patience with Nvidia's choice of graphics card power connector(s). "The 12V-2x6 cable was cooking at 150+ degrees," he observed, backed up by the thermography. "This is no joke and will forever remain a weak point of this generation(s)."

Prompted by social media interest in the visuals, Schilling added some extra details. He said the cabling looked like it was still fine for now, despite the high temperatures seen. "But you can see that they have been subjected to thermal stress. I have zero trust in that solution of any kind," blasted the reviewer. Later, Schilling recalled that the PSU-to-cabling mating cycles were still very low, "a handful." However, the connector might have been plugged and unplugged "several hundred" times on the GPU side. That second figure seems well beyond the 12V-2x6 connector's "mating cycle life of 30 cycles" mentioned by Corsair. The images show the unexpectedly high temperatures on both sides—the graphics card and PSU connections.

Before we go, it is important to remember that thermal cameras measure surface temperatures, so things could be far hotter inside the plastic connectors. Schilling didn't see actual melting during his later inspections but said there was evidence of some thermal stress. We expect the 12V-2x6 connector to use the same Nylon 66 and LCP housing as per 12VHPWR specs. The former has a melting point of 255 degrees Celsius (491 degrees Fahrenheit), and the latter melts above 335 degrees Celsius (635 degrees Fahrenheit). Thankfully, Schilling's power connectors must not have quite reached these thresholds inside, but extended testing and use of this build – without changes – sounds like it could be hazardous.
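For readers who want to sanity-check the figures in this story, here's a small sketch. The Celsius-to-Fahrenheit conversion is standard arithmetic; the assumption that the 12V-2x6 connector spreads the load across six 12V pins, each rated for roughly 9.5A, comes from published connector specs rather than this article, and is included only to show how little headroom a 600W draw leaves if current isn't shared evenly across pins.

```python
# Sanity-check math for the temperatures and power figures quoted above.
# The six-pin / ~9.5A-per-pin assumption is based on published 12V-2x6 connector
# specs, not on anything measured in this article.

def c_to_f(celsius: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9.0 / 5.0 + 32.0

def per_pin_current(total_watts: float, volts: float = 12.0, power_pins: int = 6) -> float:
    """Current each 12V pin carries if the load is shared perfectly evenly."""
    return (total_watts / volts) / power_pins

if __name__ == "__main__":
    # Reported connector temperature, then Nylon 66 and LCP melting points.
    for c in (150, 255, 335):
        print(f"{c} C = {c_to_f(c):.0f} F")
    amps = per_pin_current(600)
    print(f"600W at 12V over 6 power pins: ~{amps:.1f} A per pin, assuming even sharing")
```

The risk the story describes comes precisely when that sharing isn't even: if a few pins carry much more than their share of the ~50A total, they heat far beyond what the averaged figure suggests.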