Latest news with #GDDR6


Time of India
5 days ago
- Time of India
What is the Rowhammer bit-flip attack that forced Nvidia to issue a security alert
Researchers discovered that Nvidia's A6000 GPUs are prone to Rowhammer attacks, which allow hackers to tamper with user data on shared GPUs. Nvidia has issued an alert advising users to enable Error Correction Code (ECC). Newer GPUs with GDDR7 or HBM3 memory have built-in protection. The vulnerability poses a particular risk in multi-tenant environments.

A team of researchers has revealed that Nvidia's A6000 GPUs using GDDR6 memory are vulnerable to Rowhammer attacks, which can allow hackers to interfere with users' data on a shared GPU in the cloud, such as AI models, even if they don't have direct access to it. In response to the research, Nvidia issued an alert asking users to ensure system-level ECC is enabled across the affected NVIDIA products. 'Specific generations of DRAM devices starting with DDR4, LPDDR5, HBM3, and GDDR7 implement On-Die ECC (OD-ECC) to help with DRAM scaling. OD-ECC indirectly protects Rowhammer bit flips. Note: OD-ECC is not adjustable by users. If OD-ECC is present, it is always enabled,' the company said in its alert. In short, Rowhammer is a security issue where repeated access to memory can silently change data; researchers have now shown that the attack works on Nvidia GPUs, so Nvidia issued an alert and recommends enabling ECC to stay safe.

What is the Rowhammer bit-flip attack?
Rowhammer for GPUs, also called GPUHammer, is a new hardware attack that targets graphics cards, specifically NVIDIA GPUs with GDDR6 memory. Rowhammer itself is a hardware vulnerability found in computer memory chips (DRAM). Normally, data is safely stored in separate rows of memory cells. But if you rapidly and repeatedly access (hammer) one row, it introduces bit flips in adjacent memory rows. This can 'flip' bits of data even if no one directly accessed that data. 'Since 2014, this vulnerability has been widely studied in CPUs and CPU-based memories like DDR3, DDR4, and LPDDR4. However, with critical AI and ML workloads now running on discrete GPUs in the cloud, it is vital to assess the vulnerability of GPU memories to Rowhammer attacks,' the researchers said.

Why is this a problem for Nvidia GPUs?
Rowhammer is a circuit-level DRAM vulnerability that lets attackers flip bits in neighboring memory rows. It had only been shown on CPU DRAM before, but researchers have now demonstrated the first Rowhammer bit-flip attack on GPU DRAM, specifically on the GDDR6 memory used in NVIDIA GPUs like the A6000. Hackers can potentially use Rowhammer to tamper with another user's data on a shared GPU in the cloud, such as AI models, even without direct access to it. The research proved that even a single bit flip could destroy the accuracy of an AI model. The attack works in shared environments where multiple people or programs are using the same GPU at the same time (multi-tenant setups).

How did Nvidia respond?
Nvidia confirmed the Rowhammer risk on some GPUs and advised customers to turn on Error Correction Code (ECC), a memory feature that can catch and fix single-bit errors. This helps stop Rowhammer attacks, though it might slow down some AI workloads by up to 10%. Newer Nvidia GPUs (like those with GDDR7 or HBM3 memory) already have built-in protections (on-die ECC) against this type of attack.
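The single-bit correction that ECC provides can be illustrated with a toy example. The sketch below is a minimal Hamming(7,4) code in Python; real GPU ECC uses much wider SECDED codes over larger data words, so this is only a conceptual illustration of the mechanism, not Nvidia's implementation.

```python
# Toy Hamming(7,4): encode 4 data bits with 3 parity bits, flip one bit to
# simulate a Rowhammer-style disturbance, then use the parity syndrome to
# locate and correct the flipped bit.

def hamming_encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(c):
    """Recompute parity; the syndrome is the 1-based position of a single-bit error."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over codeword positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:                     # nonzero syndrome -> flip that bit back
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]  # recover the 4 data bits

data = [1, 0, 1, 1]
codeword = hamming_encode(data)
codeword[4] ^= 1                     # single-bit "disturbance error"
assert hamming_correct(codeword) == data  # ECC detects and repairs the flip
```

Note that a single-error-correcting code like this is exactly why Rowhammer mitigation via ECC is imperfect: a multi-bit flip in one word can exceed what the code can correct.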


Mint
09-07-2025
- Business
- Mint
RAM prices are spiking in 2025, and it's not just because of AI
PC builders, smartphone makers, and gamers are in for a rough ride. Prices for DDR4, DDR5, GDDR6, and LPDDR4X memory are climbing fast in 2025, and the impact is already showing across product categories. But here's the twist: AI isn't the main reason.

Let's start with the numbers. DDR4, still widely used in desktops and laptops, could jump up to 45% this quarter for some modules. DDR5, used in newer machines and servers, is up 3–8%, while GDDR6, the memory in most current graphics cards, is set for a 28–33% hike. Mobile RAM like LPDDR4X? 23–28% higher and still rising. These aren't minor blips. The increases cut across the board: PC components, smartphones, servers, and GPUs are all going to get more expensive in the coming months.

Contrary to what you might think, this isn't about AI server farms hoarding all the chips. It's about supply, production priorities, and a bit of trade drama.

- End of the line for DDR4 and GDDR6: Big names like Samsung, Micron, and SK hynix are winding down older memory formats to push newer, higher-margin products like DDR5, GDDR7, and HBM (High Bandwidth Memory). As a result, DDR4 and GDDR6 are in short supply, and that's driving up prices.
- Production cuts: After months of oversupply, manufacturers scaled back output to stabilize margins. That, combined with the phaseout of legacy memory types, has left distributors scrambling.
- Geopolitics: With US–China trade tensions back in the spotlight and new tariff threats, tech brands are stockpiling memory. That's making things even tighter for the open market.
- Peak season rush: It's the typical seasonal restocking cycle. Everyone, from phone makers to laptop brands, is trying to lock in inventory before prices go higher.

AI is definitely consuming memory, but mostly HBM and newer formats. What we're seeing here is more about production strategy than raw demand. The biggest hikes are in formats that AI infrastructure isn't even using heavily.
If you're building or upgrading a PC with DDR4 or a GDDR6 GPU, be prepared to pay more. Mid-range phones might quietly get more expensive too, especially those using LPDDR4X. Even entry-level laptops could start feeling overpriced. For now, the outlook is shaky. Prices may stay elevated until either demand eases or manufacturers ramp up production again. Ironically, some suppliers are even considering restarting DDR4 production, just to cash in on the demand. So if you've been sitting on that RAM upgrade or GPU purchase, this might be your last shot at a decent price for a while.
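To put the quoted percentage ranges in concrete terms, here is a quick back-of-the-envelope calculation. The base prices ($60 DDR4 kit, $300 GDDR6 GPU, $25 of LPDDR4X) are hypothetical round numbers chosen for illustration, not figures from the article; the hike percentages are the upper bounds quoted above.

```python
# Apply the article's quoted worst-case hikes to hypothetical base prices.
hikes = {
    "DDR4 kit ($60)":           (60.00, 0.45),  # "up to 45%"
    "GDDR6 GPU ($300)":         (300.00, 0.33), # "28-33% hike", upper bound
    "LPDDR4X phone RAM ($25)":  (25.00, 0.28),  # "23-28%", upper bound
}
for name, (base, pct) in hikes.items():
    print(f"{name}: ${base:.2f} -> ${base * (1 + pct):.2f}")
# DDR4 kit ($60): $60.00 -> $87.00
# GDDR6 GPU ($300): $300.00 -> $399.00
# LPDDR4X phone RAM ($25): $25.00 -> $32.00
```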


Digital Trends
06-05-2025
- Digital Trends
We just got our first glimpse of AMD's RX 9060 XT, but many questions remain
The start of the year was littered with some of the best graphics cards (although, admittedly, they weren't really up for grabs due to the state of the market). Now, it's time for the mainstream GPUs to make an appearance. AMD's RX 9060 XT is one such GPU that many gamers are waiting for, and we just spotted it in a retail listing, indicating the launch might not be too far off.

Spotted by VideoCardz, the GPU broke cover at a Brazilian retailer, Terabyteshop. The Gigabyte AMD Radeon RX 9060 Gaming OC was the only model listed, but it came with an official-looking AMD blurb and a spec sheet, although the most important detail was missing: the price. First, let's dig into what we do know, and then we can discuss what we still need to learn.

The RX 9060 XT comes with 16GB of GDDR6 memory, and although the spec sheet failed to mention it, most leakers expect a narrow 128-bit bus. The maximum resolution is listed as 7,680 x 4,320, with support for up to four displays. The card supports DirectX 12 and OpenGL 4.6, comes with two DisplayPort 2.1a and two HDMI 2.1a connectors, and uses only a single 8-pin power connector. Despite that, the spec sheet still calls for a power requirement of 850 watts, which sounds way overkill for a graphics card of this caliber, so we could be dealing with a mistake in the specifications. Of course, until the card is officially released, we can't take these specs at face value — but this source seems pretty legitimate, so a lot of it might be true.

With that said, we're still dealing with a bunch of question marks. For starters, what's the memory interface like? What's the maximum clock speed? Most of all, how much will this GPU cost? AMD's RX 9070 XT struck a goldmine by offering great value for the money. Unfortunately, once the initial stock of MSRP-priced GPUs sold out, the card drastically increased in price.
If AMD can offer a steady supply of the RX 9060 XT at around $300 or below, it could be a hit, but the 'steady supply' part of it all is what I'm worried about. AMD has yet to officially launch the card. Rumor has it that the RX 9060 XT will be officially announced on May 21 during Computex 2025.
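For context on what that rumored 128-bit bus would mean, here is a rough memory-bandwidth estimate. The 20 Gbps per-pin data rate is an assumption (a typical speed for recent GDDR6 parts), not a figure from the listing, so treat the result as a ballpark only.

```python
# Back-of-the-envelope GDDR6 bandwidth estimate for a 128-bit bus.
bus_width_bits = 128         # rumored interface width
data_rate_gbps = 20          # effective Gbit/s per pin (assumed, typical GDDR6)
bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # divide by 8 bits/byte
print(f"{bandwidth_gbs:.0f} GB/s")  # -> 320 GB/s
```

A narrow bus like this is a common cost-saving choice for mainstream cards; it caps bandwidth well below what wider 192-bit or 256-bit designs reach at the same memory speed.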


Forbes
19-04-2025
- Business
- Forbes
AI Gets Memory With Chips From Micron And Others
Integrated Circuit, Film-Layout of a Printed Circuit Board. (Photo by Mediacolors/Construction Photography/Avalon/Getty Images)

News broke yesterday that Micron Technology is shaking things up with a new focus on a 'cloud memory business unit' that will create HBM chips, or high-bandwidth memory chips. HBM chips are 3-D stacked SDRAM devices traditionally used in high-performance hardware setups. Over in the world of model design, we're seeing LLMs get more memory capacity and more utility out of the context data that they keep in memory. So it makes sense that this hardware revolution would be occurring.

The interesting thing is who the players are. Insiders note that Micron is a top global provider of HBM chips, along with Samsung and a company called SK Hynix. So who's actually making these chips? Take Samsung, for example. Industry news reveals that Samsung is working with its rival foundry partner TSMC to develop the HBM chips. We've seen so many times how TSMC has a dominant position in the market as a foundry: other companies use TSMC for the raw fabrication power, and develop their own plans on top of TSMC's production capability. That in turn has led to everything from a shortage of vehicle chips to, more recently, some troublesome geopolitical problems around production having to do with export controls. It seems like the world would be in a lot better shape if there were, say, a dozen foundry makers around the world.

Anyway, in creating these high-end chips, do Samsung and TSMC compete with Nvidia? Not exactly. Other industry reporting shows that Nvidia was planning to buy the chips from Samsung, but the vendor couldn't meet Nvidia's bar. A March 20 press release shows Nvidia CEO Jensen Huang saying Samsung 'has an important part to play,' but noting that the company hasn't formally ordered Samsung HBM3E chips.

First of all, the HBM chip is a 3-D stacked DRAM type of chip.
The memory unit sits close to a CPU or GPU to conquer latency and provide high bandwidth with low power consumption. I asked ChatGPT more about the specs for these chips, and it came out with this:

- Bandwidth: 819 GB per second, per stack
- Speed: 6.4 Gb per second, per pin
- Capacity: up to 64 GB per stack
- Thermals: better efficiency
- Use cases: AI, HPC, GPUs

(In this context, we're talking mainly about using it for AI applications.) ChatGPT also gave me an interesting comparison of the HBM's build to something called GDDR6, a gaming memory chip that's cheaper and more widely available. You can get more from public resources like this one on how the HBM has been engineered to fit very specific needs.

Let's look briefly at this corner of the tech market, for enterprise context that CEOs (or anyone else) might want to know about. First, we have Nvidia down around 40% from all-time highs within the past year, and drifting back toward $100 per share in recent trading cycles, ostensibly based on U.S. export controls. The assertion of Huang and company that Nvidia is poised to lose $5.5 billion due to new rules has been big news lately. Then there's Micron, at around $70 per share currently, about one half of all-time high values, and down significantly since winter. Samsung, for its part, looks like it's down 8% in a short time frame. Companies like AMD are also down. 'A warning from AI chips champion Nvidia that it will face a $5.5 billion hit from tightened U.S. controls on exports to China marks a new chapter in the escalating tit-for-tat between Washington and Beijing,' AJ Bell investment director Russ Mould said, as quoted by Elsa Ohlen writing for Barron's.

That's a little on some of the great new hardware developments happening now. The context, in terms of LLM news, is the advancement of models with persistent memory. I've talked about using an AI chat companion from Sesame, for example, and how 'Maya' seemed to remember my name, as a return user, on a good day.
Along with chain of thought, memory is a big capability builder for all of those vibrant use cases that we have come to expect from our neural net friends and neighbors.
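As a sanity check on the HBM3 spec list quoted earlier, the per-stack bandwidth figure follows directly from the per-pin speed: an HBM3 stack exposes a 1024-bit interface, so pins times per-pin rate, divided by 8 bits per byte, reproduces the quoted number.

```python
# Verify the quoted HBM3 figures are internally consistent:
# per-stack bandwidth = interface width (bits) * per-pin rate / 8.
pins = 1024                  # HBM3 interface width per stack, in bits
pin_rate_gbps = 6.4          # Gbit/s per pin
bandwidth_gbs = pins * pin_rate_gbps / 8
print(f"{bandwidth_gbs:.1f} GB/s per stack")  # -> 819.2 GB/s per stack
```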