
Latest news with #hyperscaler

Everyone's Talking About AI Compute—But It All Starts With Storage

Forbes

a day ago

  • Business
  • Forbes

Dave Friend is the cofounder and CEO of Wasabi.

When you think of AI, many things probably come to mind: how to use it, where it's headed and what powers it. The conversation typically centers on compute, that is, all the CPUs and GPUs you hear about when discussing AI. While compute is critical, a significant aspect of AI is often overlooked: data. Although it may not be widely discussed, the reality is that massive, unstructured and ever-growing data sets are what truly drive global AI growth.

As AI models become larger and more sophisticated, accessing the data needed to train them is becoming a significant challenge. This is due to multiple factors, including the ever-increasing amount of data required for training. To make matters worse, the hyperscaler storage that many rely on is expensive, overly complex and not optimized for the accessibility and performance that AI workflows demand. Additionally, the enterprise data used to train AI systems is becoming a favored target for malicious actors. All of these factors combine to make AI adoption challenging, expensive and time-consuming.

The reality is that most companies aren't struggling with AI compute limitations. They're hitting walls because they can't store, access and manage data quickly, securely and affordably enough to support real-time inference, fine-tuning or long-term retention. If AI needs to run efficiently and cost-effectively, so does the data it learns from. To address these growing problems and fully leverage the benefits AI has to offer, organizations should implement a scalable cloud storage solution that provides cost-effectiveness, security and hybrid capabilities.

Best Practices And What Leaders Should Expect

Not all data storage providers are created equal. The cloud giants that dominate the industry charge exorbitant fees to access data, making it more difficult, expensive and time-consuming for users, and turning both AI training and the storage of the data it produces into a costly undertaking. To address this, organizations should seek out affordable cloud storage providers that don't charge these fees. This lets them access their data seamlessly and cost-effectively, and storage buckets can be scaled up or down depending on need. That elasticity is ideal for AI training, since the storage must hold both the data required to train models and the information those models produce. Being able to scale up and down easily ensures that an organization is adequately prepared for AI workloads and can adjust as needed.

Just as important as where you store the data is ensuring it is stored securely. Cyberattackers are increasingly going after enterprise data because of its vital role in AI operations, so it is crucial for IT leaders to verify that the storage solutions they choose adequately protect their data. When selecting a provider, organizations should look for robust data protection features that keep stored data safe, and should confirm that data remains inaccessible to bad actors in the event of a breach, protecting it against deletion and ransomware. These safeguards are critical for avoiding an attack and protecting critical enterprise data.
Key Features And Approaches To Security

An essential part of a secure data management program is immutable backups, which prevent a malicious actor from modifying or deleting stored data. Immutable backups function much like an air-gapped solution, isolating data from threats such as ransomware or accidental deletion, and IT leaders should consider them to ensure their data cannot be encrypted or deleted by attackers. Additionally, no secure cloud management program would be complete without employee training on cyber protection. By regularly updating cybersecurity best practices and providing training, organizations can more effectively prevent malicious actors from breaching their networks and accessing critical data.

An approach that combines cost-effectiveness and security is hybrid storage, which involves storing copies of data using different methods and in different locations: for example, one copy in the cloud, one on-premises and one on a hard drive. Incorporating cost-effective options like the cloud reduces expenses, while keeping the data in multiple locations ensures it remains readily available in the event of a cyberattack. For AI training, data can be readily accessible in the cloud while also being stored on-premises for added security.

While it is easy to get caught up in the AI boom, organizations must take their time incorporating the emerging technology. Technology decision-makers should prioritize cost-effective and secure ways to store the data necessary for proper AI training. Without that, they may fall behind their competitors in the AI adoption race.
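As a rough illustration of what the immutable-backup approach described above can look like in practice, the sketch below uses Python and boto3 against a generic S3-compatible object storage endpoint. The endpoint URL, bucket name and retention period are illustrative placeholders rather than details from the article; exact object-lock support varies by provider.

```python
# Minimal sketch: creating an object-lock-enabled bucket and writing an
# immutable backup copy to S3-compatible object storage with boto3.
# The endpoint, bucket name and retention period are placeholders.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-object-storage.com",  # hypothetical endpoint
)

BUCKET = "ai-training-backups"  # hypothetical bucket name

# Object lock must be enabled at bucket creation time.
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# Apply a default retention rule so every new object is immutable
# (cannot be modified or deleted) for 30 days.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)

# Upload a backup object; a per-object retain-until date can also be set.
s3.put_object(
    Bucket=BUCKET,
    Key="backups/training-data.tar.gz",
    Body=b"<backup archive bytes>",  # placeholder payload
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
)
```

Under S3-style compliance-mode retention, the object cannot be overwritten or deleted by anyone until the retention period expires, which is the property the article points to when it describes backups that ransomware cannot encrypt or delete.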

Data center spending soared amid rising GPU demand in Q1

Yahoo

20-06-2025

  • Business
  • Yahoo

This story was originally published on CIO Dive. To receive daily news and insights, subscribe to our free daily CIO Dive newsletter.

Data center capital expenditures increased 53% year over year to $134 billion during the first three months of 2025, according to Dell'Oro Group research published Tuesday. The spike was driven by a surge in hyperscaler spending on AI infrastructure, particularly Nvidia Blackwell GPUs and custom accelerators, the research firm said. The four companies with the largest cloud footprints — AWS, Google, Meta and Microsoft — accounted for 44% of Q1 data center capital investments. Enterprise infrastructure spending — the second biggest category — accounted for one-third of the total, according to Dell'Oro.

'Despite some project cancellations by U.S. cloud providers, overall CapEx remains on track, with hyperscalers adjusting capacity rather than cutting investments,' Dell'Oro Group Senior Research Director Baron Fung said in the report. 'Enterprises, facing tighter budgets and tariff-related risks, are more cautious, prompting slight downward revisions to their CapEx forecasts.'

An ongoing data center building boom sparked by generative AI adoption showed no signs of an early-year slowdown as hyperscalers raced to add compute capacity. Amazon reported $24.3 billion in Q1 capital expenditures, primarily to expand AWS' AI cloud infrastructure. Microsoft and Google weren't far behind, reporting $21.4 billion and $17 billion in CapEx for the first three months of the year, respectively.

Dell'Oro Group expects the trend to continue. The firm forecasted a 30% year-over-year bump in data center investments for 2025, despite mixed economic signals triggered by tariff concerns and supply chain challenges. Last year, data center infrastructure capital expenditures grew 51% year over year to $455 billion. 'Tariff-related uncertainties are not expected to materially alter hyperscaler spending plans given their diversified global supply chains,' Fung said in the report.

AWS, Microsoft and Google Cloud plan to invest more than $250 billion in buildouts this year, in part to ease capacity constraints and satisfy growing customer demand for AI processing power. 'The availability of GPU and supporting infrastructure is supply constrained,' Fung said in an email. 'Demand is so strong that the top 4 U.S. cloud service providers have had to turn away smaller customers.'

Amazon signaled two massive U.S. data center construction projects earlier this month — a $20-billion hub in Pennsylvania and a $10-billion project in North Carolina. The company is also planning a nearly $13 billion buildout in Australia, according to a June 14 announcement. 'As fast as we actually put the capacity in, it's being consumed,' Amazon CEO Andy Jassy said during a May earnings call. The executive noted that demand for AI compute services is 'unlike anything we've seen before' and characterized AI as a 'once-in-a-lifetime reinvention of everything we know' in an April letter to shareholders.

AWS and its hyperscaler peers are jockeying to establish early market share, Fung told CIO Dive. Realizing a return on the investments will take time, he said.
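To put the reported growth rates in context, here is a small back-of-the-envelope calculation using only the figures quoted above; the implied prior-year baseline and the implied 2025 full-year total are derived approximations, not numbers from the Dell'Oro report.

```python
# Back-of-the-envelope figures implied by the growth rates quoted in the article.
# Only the inputs come from the article; the derived values are approximations.
q1_2025_capex_bn = 134       # Q1 2025 data center CapEx, $B (reported)
q1_yoy_growth = 0.53         # 53% year-over-year growth (reported)
full_year_2024_bn = 455      # 2024 data center CapEx, $B (reported)
forecast_2025_growth = 0.30  # forecast 30% growth for 2025 (reported)

implied_q1_2024 = q1_2025_capex_bn / (1 + q1_yoy_growth)                  # ~$88B
implied_full_year_2025 = full_year_2024_bn * (1 + forecast_2025_growth)   # ~$592B

print(f"Implied Q1 2024 baseline: ~${implied_q1_2024:.0f}B")
print(f"Implied 2025 full year if the 30% forecast holds: ~${implied_full_year_2025:.0f}B")
```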

Navitas Semiconductor Corporation (NVTS) Launches 12kW PSU, Powers Next-Gen AI Data Centers

Yahoo

23-05-2025

  • Business
  • Yahoo

We recently published a list of AI stocks that are making waves this week. In this article, we are going to take a look at where Navitas Semiconductor Corporation (NASDAQ:NVTS) stands against the other AI stocks on that list.

Navitas Semiconductor Corporation (NASDAQ:NVTS) is a small-cap chip designer whose next-generation power solutions support energy-efficient AI data centers. On May 21, the pure-play, next-generation power semiconductor company announced its latest 12 kW power supply unit (PSU) for hyperscaler AI data centers. The PSU is designed for high-power rack densities of 120 kW and complies with Open Rack v3 (ORv3) specifications and Open Compute Project (OCP) guidelines. It is built for the highest efficiency and performance with the lowest component count. In simple terms, Navitas' new power supply unit is faster, safer and more efficient, making it an ideal choice for powering the next generation of AI-driven data centers.

Commenting on the launch, the company stated: 'The continuation and leadership of Navitas' AI power roadmap has seen a quadrupling in output power – from 2.7 to 12 kW – in just over 24 months. This increase in power delivery is vital for the world's data centers to support the exponential power demanded by the latest GPU architectures. The 'designed for production' PSU enables our customers to quickly implement a highly efficient, simple, and cost-effective solution to address the power delivery challenges for AI and hyperscale data centers.'

Overall, NVTS ranks 9th on our list of AI stocks that are making waves this week. While we acknowledge the potential of NVTS as an investment, our conviction lies in the belief that some AI stocks hold greater promise for delivering higher returns with limited downside risk. If you are looking for an AI stock that is more promising than NVTS and that has 100x upside potential, check out our report about the cheapest AI stock.

Disclosure: None. This article was originally published at Insider Monkey.
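As a rough sanity check on the numbers in the announcement, the sketch below works through the arithmetic they imply: how many 12 kW supplies a 120 kW rack would nominally need, and the actual multiple represented by the jump from 2.7 kW to 12 kW. The N+1 redundancy line is an illustrative assumption, not something stated in the article.

```python
# Rough arithmetic implied by the announcement's figures.
# Inputs come from the article; the N+1 redundancy assumption is illustrative.
import math

rack_power_kw = 120    # target rack density (reported)
psu_output_kw = 12     # new PSU output (reported)
previous_psu_kw = 2.7  # output roughly 24 months earlier (reported)

psus_per_rack = math.ceil(rack_power_kw / psu_output_kw)  # 10 units at nominal load
psus_with_redundancy = psus_per_rack + 1                  # 11 units with assumed N+1 redundancy
power_multiple = psu_output_kw / previous_psu_kw          # ~4.4x in about two years

print(f"PSUs per 120 kW rack (nominal): {psus_per_rack}")
print(f"PSUs with an assumed N+1 redundancy: {psus_with_redundancy}")
print(f"Output growth, 2.7 kW -> 12 kW: ~{power_multiple:.1f}x")
```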

CoreWeave (CRWV) Stock Soars After Nvidia Discloses Stake

Yahoo

18-05-2025

  • Business
  • Yahoo

CoreWeave (NASDAQ:CRWV) shares surged 22% on May 16 and touched $80 after an SEC filing (13G) by Nvidia Corp. (NASDAQ:NVDA) revealed that it held around 24.2 million shares, or a 7% stake, as of March 31. Investor confidence in CoreWeave appears to have risen sharply after the disclosure, as the stake is larger than the 5.2% holding known to the Street since the company filed its IPO prospectus.

CoreWeave Inc. (NASDAQ:CRWV), which went public just two months ago, provides AI developers and enterprises with cloud-based graphics processing unit (GPU) infrastructure, primarily built on Nvidia GPUs. This relationship closely ties CoreWeave's growth to advancements in Nvidia's chips and enables the company to offer customers the most advanced computing power. As a result, greater backing from Nvidia signals significant long-term potential for CoreWeave from AI infrastructure demand.

Just a day before this news, the company released stronger-than-expected Q1 2025 results, with impressive 420% and 550% year-over-year increases in revenue and adjusted operating income, respectively. Encouraged by the solid results in an otherwise volatile earnings season, JP Morgan analyst Mark Murphy raised his share price target by 54% to $66 (from $43) while maintaining his Overweight rating. The company announced some new deals, including one from a hyperscaler customer (speculated to be Google) and another $4 billion expansion from an AI company. The analyst hailed these deals as a welcome diversification of CoreWeave's customer base and as positive near-term growth catalysts. He likes the company's strong revenue backlog and its focus on expanding its enterprise customer base, which support his positive opinion on the company.

While we acknowledge the potential of CRWV as an investment, our conviction lies in the belief that some AI stocks hold greater promise for delivering higher returns with limited downside risk. If you are looking for an AI stock that is more promising than CRWV and that has 100x upside potential, check out our report about the cheapest AI stock.

Disclosure: None.
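For readers who want to see how the percentages above fit together, here is a quick check that only recombines the figures quoted in the story; no outside data is used.

```python
# Quick consistency check on the figures quoted in the story.
old_target, new_target = 43.0, 66.0  # JP Morgan price targets, $ (reported)
target_increase = new_target / old_target - 1
print(f"Price target increase: {target_increase:.1%}")  # ~53.5%, rounded to 54% in the story

prior_stake, disclosed_stake = 0.052, 0.07  # Nvidia stake before and after the 13G (reported)
print(f"Stake vs. the IPO prospectus figure: {disclosed_stake / prior_stake - 1:.0%} larger")
```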
