Latest news with #GB200NVL72


Time of India
10-07-2025
- Business
- Time of India
Amazon has found its 'own way' to cool down Nvidia's AI graphics cards
Amazon's cloud division has reportedly developed its own hardware to cool next-generation Nvidia graphics cards for artificial intelligence (AI) workloads. This internal solution addresses the significant energy consumption and heat generation of Nvidia's GPUs, which are vital for these workloads. Dave Brown, Vice President of Compute and Machine Learning Services at Amazon Web Services (AWS), stated in a YouTube video that commercially available cooling equipment was not suitable and that building data centres with widespread liquid cooling would have taken too much time. This led Amazon to develop its own methods to better manage the heat from these power-intensive Nvidia GPUs.

What the AWS VP said about these tools developed for Nvidia GPUs

Talking about the cooling equipment available for AI GPUs in the video, Brown said: 'They would take up too much data centre floor space or increase water usage substantially. And while some of these solutions could work for lower volumes at other providers, there simply wouldn't be enough liquid-cooling capacity to support our scale.' So, instead of relying on conventional solutions, Amazon engineers developed the In-Row Heat Exchanger (IRHX), a cooling system that can be integrated into both existing and future data centres. Traditional air cooling had been sufficient for earlier Nvidia chip generations.

In a blog post titled 'Introducing Amazon EC2 P6e-GB200 UltraServers: Powering Frontier AI at Scale', Brown also confirmed that AWS customers can now access this updated infrastructure through new P6e computing instances. These new offerings support Nvidia's high-density computing architecture, particularly the GB200 NVL72, which consolidates 72 Nvidia Blackwell GPUs into a single rack for training and deploying large AI models. Previously, similar Nvidia GB200 NVL72-based clusters were available via Microsoft and CoreWeave.

AWS, the leading global cloud infrastructure provider, continues to enhance its capabilities. Amazon has a history of developing its own infrastructure hardware, including custom chips for general computing and AI, along with in-house-designed storage servers and networking equipment. This approach reduces reliance on external vendors and can improve profitability. AWS posted its highest operating margin since at least 2014 during the first quarter, contributing significantly to Amazon's overall net income. Microsoft, the second-largest cloud provider, has also moved into custom hardware. In 2023, it introduced a cooling system called Sidekicks, tailored for its Maia AI chips.
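To put the cooling problem in context, here is a back-of-the-envelope estimate of the heat load of a single GB200 NVL72 rack. The per-GPU draw, total rack power, and air-cooled rack budget used below are assumptions based on publicly reported figures, not numbers from the article.

```python
# Rough estimate of the heat load of one GB200 NVL72 rack.
# Assumed figures (publicly reported, not from the article):
#   ~1,200 W per Blackwell GPU at full load, ~120 kW total rack power,
#   and a typical air-cooled rack budget of roughly 20-40 kW.

GPUS_PER_RACK = 72            # GB200 NVL72 consolidates 72 Blackwell GPUs per rack
WATTS_PER_GPU = 1_200         # assumed peak draw per GPU
RACK_POWER_KW = 120           # assumed total rack power incl. CPUs, NVLink switches, fans
AIR_COOLED_LIMIT_KW = 40      # assumed upper bound for a conventional air-cooled rack

gpu_heat_kw = GPUS_PER_RACK * WATTS_PER_GPU / 1_000
print(f"GPU heat alone: ~{gpu_heat_kw:.0f} kW per rack")
print(f"Total rack load: ~{RACK_POWER_KW} kW, "
      f"about {RACK_POWER_KW / AIR_COOLED_LIMIT_KW:.0f}x a typical air-cooled rack budget")
```

Under those assumptions, one NVL72 rack dissipates roughly three times what a conventional air-cooled rack is budgeted for, which is the gap systems like the IRHX are meant to close.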


CNBC
09-07-2025
- Business
- CNBC
Amazon Web Services is building equipment to cool Nvidia GPUs as AI boom accelerates
Amazon said Wednesday that its cloud division has developed hardware to cool down next-generation Nvidia graphics processing units that are used for artificial intelligence workloads. Nvidia's GPUs, which have powered the generative AI boom, require massive amounts of energy. That means companies using the processors need additional equipment to cool them down.

Amazon considered erecting data centers that could accommodate widespread liquid cooling to make the most of these power-hungry Nvidia GPUs. But that process would have taken too long, and commercially available equipment wouldn't have worked, Dave Brown, vice president of compute and machine learning services at Amazon Web Services, said in a video posted to YouTube. "They would take up too much data center floor space or increase water usage substantially," Brown said. "And while some of these solutions could work for lower volumes at other providers, there simply wouldn't be enough liquid-cooling capacity to support our scale."

Rather, Amazon engineers conceived of the In-Row Heat Exchanger, or IRHX, which can be plugged into existing and new data centers. More traditional air cooling was sufficient for previous generations of Nvidia chips.

Customers can now access the AWS service as computing instances that go by the name P6e, Brown wrote in a blog post. The new systems accompany Nvidia's design for dense computing power. Nvidia's GB200 NVL72 packs a single rack with 72 Nvidia Blackwell GPUs that are wired together to train and run large AI models. Computing clusters based on Nvidia's GB200 NVL72 have previously been available through Microsoft or CoreWeave. AWS is the world's largest supplier of cloud infrastructure.

Amazon has rolled out its own infrastructure hardware in the past. The company has custom chips for general-purpose computing and for AI, and has designed its own storage servers and networking routers. In running homegrown hardware, Amazon depends less on third-party suppliers, which can benefit the company's bottom line. In the first quarter, AWS delivered its widest operating margin since at least 2014, and the unit is responsible for most of Amazon's net income.

Microsoft, the second-largest cloud provider, has followed Amazon's lead and made strides in chip development. In 2023, the company designed its own systems called Sidekicks to cool the Maia AI chips it developed.
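For readers curious what "computing instances that go by the name P6e" means in practice, below is a minimal sketch of how one might look for such instance types with boto3, AWS's standard Python SDK. The "p6e" name prefix is an assumption drawn from the article; AWS's actual instance-type identifiers and regional availability may differ.

```python
# Hypothetical sketch: list instance-type offerings in a region and look
# for names starting with "p6e". The prefix is an assumption from the
# article, not a confirmed AWS identifier.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

offerings = []
paginator = ec2.get_paginator("describe_instance_type_offerings")
for page in paginator.paginate(LocationType="region"):
    offerings.extend(page["InstanceTypeOfferings"])

p6e_types = sorted(
    o["InstanceType"] for o in offerings if o["InstanceType"].startswith("p6e")
)
print(p6e_types or "No P6e offerings found in this region")
```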

Yahoo
09-07-2025
- Business
- Yahoo
CoreWeave first to launch Nvidia RTX PRO 6000 Blackwells at scale; stock gains
CoreWeave Inc. (NASDAQ:CRWV) climbed 1.7% on Wednesday after the AI-native cloud provider became the first to launch NVIDIA's powerful new RTX PRO 6000 Blackwell Server Edition at scale. The move positions CoreWeave at the forefront of ultra-high-performance AI infrastructure, reinforcing its aggressive strategy to lead in GPU cloud services. NVIDIA Corporation (NASDAQ:NVDA) shares also rose by 1.4%, underscoring investor enthusiasm for growing AI demand.

The RTX PRO 6000 GPU introduces transformative accelerations, claiming up to 5.6x faster large language model (LLM) inference and 3.5x faster text-to-video generation compared with its predecessor. Built to handle models as large as 70 billion parameters, CoreWeave's instances are designed for both inference and generative workloads across research, fintech, and creative sectors. Their cutting-edge configuration includes 8x RTX PRO 6000 GPUs, 128 Intel (NASDAQ:INTC) Emerald Rapids vCPUs, 1TB of RAM, and 100 Gbps networking throughput. Integrated within CoreWeave's AI-optimized cloud platform, the new instances offer high-performance processing with flexible scalability for enterprise clients.

'The NVIDIA RTX PRO 6000 GPU represents a breakthrough in AI and graphics performance, empowering a variety of industries with advanced, cost-effective solutions,' said Dave Salvator, director of accelerated computing products at NVIDIA.

The announcement also expands CoreWeave's already broad NVIDIA GPU portfolio, which includes pioneering access to the GB200 NVL72 system and the HGX B200 platform. Its history of early adoption, having been the first to bring NVIDIA H200 and GB200 NVL72 offerings to general availability, has made it a sought-after partner for developers building next-generation AI solutions.

Wall Street has taken notice of CoreWeave's momentum as AI infrastructure spending accelerates. As demand for foundation models and generative AI surges, investors appear increasingly bullish on companies with unrivaled access to top-tier GPUs and the engineering talent to deploy them at scale.
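The claim that an 8-GPU instance can handle models of up to roughly 70 billion parameters can be sanity-checked with simple memory arithmetic. The 96 GB per-GPU memory figure below is an assumption based on Nvidia's published RTX PRO 6000 Blackwell specifications, not a number from the article.

```python
# Back-of-the-envelope check: does a 70B-parameter model fit on an
# 8x RTX PRO 6000 instance? The 96 GB per-GPU figure is an assumption
# based on Nvidia's published specs, not stated in the article.

GPUS = 8
GPU_MEMORY_GB = 96                 # assumed memory per RTX PRO 6000 Blackwell
PARAMS_BILLIONS = 70
BYTES_PER_PARAM_FP16 = 2           # 16-bit weights
BYTES_PER_PARAM_FP8 = 1            # 8-bit quantized weights

total_memory_gb = GPUS * GPU_MEMORY_GB
weights_fp16_gb = PARAMS_BILLIONS * BYTES_PER_PARAM_FP16   # ~140 GB
weights_fp8_gb = PARAMS_BILLIONS * BYTES_PER_PARAM_FP8     # ~70 GB

print(f"Aggregate GPU memory: {total_memory_gb} GB")
print(f"70B weights at FP16: ~{weights_fp16_gb} GB, at FP8: ~{weights_fp8_gb} GB")
```

At FP16 the weights of a 70B model occupy roughly 140 GB, well within the assumed 768 GB of aggregate GPU memory, leaving room for the KV cache and activations.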
Yahoo
03-07-2025
- Business
- Yahoo
Why CoreWeave Rallied 46.5% in June
- CoreWeave submitted new benchmarks on its largest Nvidia Blackwell GB200 NVL72 cluster.
- The data release seemed to put CoreWeave ahead of other clouds in deploying the most performant Blackwell chips.
- The massive June rally happened on top of a miraculous 170% gain in May.

Shares of artificial intelligence (AI) neocloud CoreWeave (NASDAQ: CRWV) rocketed 46.5% in June, according to data from S&P Global Market Intelligence. CoreWeave went public in March under a cloud of scrutiny and fears over tariffs. However, it has since become an AI darling, skyrocketing not only in May on the back of an incremental Nvidia (NASDAQ: NVDA) investment, but also in June. June's gains appeared to come from increasing optimism over AI-related growth, with CoreWeave publishing impressive leading benchmarks running Nvidia's latest Blackwell chips.

In early June, CoreWeave submitted MLPerf Training v5.0 benchmarks for its GB200 NVL72 cluster, in collaboration with Nvidia and IBM. CoreWeave's submission used 2,496 Nvidia GPUs running on CoreWeave's AI-optimized infrastructure. That infrastructure includes CoreWeave's proprietary software and middleware, such as SUNK (Slurm on Kubernetes), which lets customers run Slurm-scheduled training jobs and Kubernetes workloads on the same cluster instead of having to choose just one scheduler. CoreWeave's Tensorizer tool also streams model data directly to the GPU, resulting in faster load times.

CoreWeave said its training cluster was 34 times larger than the only other cluster submitted for the same benchmark from a major cloud provider, underscoring CoreWeave's current advantage in deploying huge Nvidia clusters quickly. The company noted its infrastructure completed the 405 billion-parameter Llama 3.1 training benchmark in just 27.3 minutes, more than twice as fast as other submissions.

CoreWeave may be getting a preferred allocation of Nvidia chips ahead of other major clouds, due to Nvidia's investment in CoreWeave as well as all major clouds now pursuing their own AI training and inference ASICs. So CoreWeave appears to have a time-to-market advantage versus others, which may make it attractive to AI labs needing the latest and greatest Nvidia chips as quickly as possible and in large numbers. For instance, OpenAI, thought to be the leading AI lab today, inked an $11.9 billion deal with CoreWeave in March.

That being said, CoreWeave is inherently at the mercy of Nvidia, which is at once a supplier, an investor, and a customer, in a somewhat circular relationship. While the arrangement seems to be working for now, there is always the danger that if Nvidia ever faces a serious competitive threat, things could get complicated for CoreWeave, especially at its current elevated valuation.

Before you buy stock in CoreWeave, consider this: The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now, and CoreWeave wasn't one of them. The 10 stocks that made the cut could produce monster returns in the coming years. Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you'd have $692,914!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you'd have $963,866!* Now, it's worth noting Stock Advisor's total average return is 1,050%, a market-crushing outperformance compared to 179% for the S&P 500. Don't miss out on the latest top 10 list, available when you join Stock Advisor.
See the 10 stocks » *Stock Advisor returns as of June 30, 2025. Billy Duberstein and/or his clients have no position in any of the stocks mentioned. The Motley Fool has positions in and recommends International Business Machines and Nvidia. The Motley Fool has a disclosure policy. "Why CoreWeave Rallied 46.5% in June" was originally published by The Motley Fool.
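As a rough cross-check of the benchmark figures quoted above, the sketch below relates the 2,496-GPU submission to NVL72 rack counts and to the "34 times larger" comparison; it only rearranges numbers already given in the article.

```python
# Cross-check of the MLPerf figures quoted above using only the
# article's own numbers plus the 72-GPUs-per-rack NVL72 layout.

COREWEAVE_GPUS = 2_496          # GPUs in CoreWeave's MLPerf Training v5.0 submission
GPUS_PER_NVL72_RACK = 72
SIZE_RATIO = 34                 # "34 times larger" than the other cloud submission

racks = COREWEAVE_GPUS / GPUS_PER_NVL72_RACK
other_cluster_gpus = COREWEAVE_GPUS / SIZE_RATIO

print(f"CoreWeave cluster: ~{racks:.1f} NVL72 racks")                           # ~34.7 racks
print(f"Implied size of the other submission: ~{other_cluster_gpus:.0f} GPUs")  # ~73 GPUs
```

By that arithmetic, the cluster spans roughly 35 NVL72 racks, and the "34 times larger" comparison implies the other submission was about one rack's worth of GPUs.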
Yahoo
27-06-2025
- Business
- Yahoo
Analyst Who Started Recommending NVIDIA (NVDA) in 2018 Says Stock Still in 'Early Stages' of AI Buildout
Dryden Pence, Pence Capital Management CIO, explained in a recent program on CNBC why NVIDIA Corp (NASDAQ:NVDA) remains his top idea despite the stock's significant growth over the past few years.

'We've had it since 2018 and continue to increase positions in it. Why? Because we're just at the early stages of the buildout of the infrastructure around AI. It's like we're just laying the track of the transcontinental railroad. Nvidia is the absolute choke point to that, and if we're going to grow AI and what it's going to do for labor productivity and every company in the world, Nvidia is going to be a key part of that. And so we think that their chips are absolutely essential to growth. We think their chips are absolutely essential to increased labor productivity, and we think that the demand signal for this is going to only get greater over time.'

Despite a $4.5 billion inventory charge related to US import restrictions on China, Nvidia expects gross margins to reach the mid-70% range by late this year as Blackwell production scales. NVDA bulls believe the company can easily offset China-related losses through new products and market diversification. Saudi Arabia's Humain plans to buy more than 200,000 AI GPUs from Nvidia, potentially generating $15 billion in sales. The UAE reportedly has an agreement for up to 500,000 GPUs. Even without China's involvement for now, Nvidia said nearly 100 AI factories are under construction. For these factories, hyperscalers are deploying 1,000 GB200 NVL72 racks per week, or 72,000 Blackwell GPUs in total, with 72 GPUs per rack.

RiverPark Large Growth Fund stated the following regarding NVIDIA Corporation (NASDAQ:NVDA) in its Q1 2025 investor letter:

'NVIDIA Corporation (NASDAQ:NVDA) was our top detractor in the quarter as investors took profits following its extraordinary performance in 2024. Despite reporting strong quarterly results, the stock pulled back amid concerns that AI-related demand may be plateauing near-term and that capital expenditures by hyperscalers could moderate. Additionally, investor anxiety rose following the announcement of sweeping new tariffs, which sparked fears of supply chain disruptions and rising input costs across the semiconductor industry. We continue to believe that NVIDIA remains one of the most strategically important companies in global computing, with best-in-class GPUs, a dominant software ecosystem, and expanding opportunities in inference, networking, and edge AI. The long-term secular trend toward accelerated computing remains intact, and we believe NVDA is well-positioned to be a key beneficiary.'

While we acknowledge the potential of NVDA as an investment, our conviction lies in the belief that some AI stocks hold greater promise for delivering higher returns and have limited downside risk. If you are looking for an extremely cheap AI stock that is also a major beneficiary of Trump tariffs and onshoring, see our free report on the best short-term AI stock.

READ NEXT: 20 Best AI Stocks To Buy Now and 30 Best Stocks to Buy Now According to Billionaires. Disclosure: None. This article is originally published at Insider Monkey.
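The deployment and deal figures in the article above lend themselves to some simple arithmetic. The sketch below uses only numbers quoted in the article; the per-GPU dollar figure is a derived average across full systems, not a reported price.

```python
# Simple arithmetic on the deployment and deal figures quoted above.
# All inputs come from the article; the implied per-GPU figure is a
# derived average, not a reported price.

RACKS_PER_WEEK = 1_000
GPUS_PER_NVL72_RACK = 72
HUMAIN_GPUS = 200_000
HUMAIN_DEAL_USD = 15e9

gpus_per_week = RACKS_PER_WEEK * GPUS_PER_NVL72_RACK
implied_usd_per_gpu = HUMAIN_DEAL_USD / HUMAIN_GPUS

print(f"Hyperscaler deployments: {gpus_per_week:,} Blackwell GPUs per week")   # 72,000
print(f"Humain deal implies ~${implied_usd_per_gpu:,.0f} per GPU on average")  # ~$75,000
```

A thousand NVL72 racks per week works out to 72,000 Blackwell GPUs, and the Humain order implies an average of roughly $75,000 per GPU across the systems sold.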