Latest news with #Trainium3

The Hindu
5 days ago
- Business
Microsoft's next-gen AI chip production delayed to 2026: Report
Microsoft's next-generation Maia AI chip is facing a delay of at least six months, pushing its mass production to 2026 from 2025, The Information reported on Friday, citing three people involved in the effort.

When the chip, code-named Braga, goes into production, it is expected to fall well short of the performance of Nvidia's Blackwell chip that was released late last year, the report said.

Microsoft had hoped to use the Braga chip in its data centers this year, the report said, adding that unanticipated changes to its design, staffing constraints and high turnover were contributing to the delay. Microsoft did not immediately respond to a Reuters request for comment.

Like its Big Tech peers, Microsoft has focused heavily on developing custom processors for artificial intelligence operations and general-purpose applications, a move that would help reduce the tech giant's reliance on pricey Nvidia chips. Cloud rivals Amazon and Alphabet's Google have both raced to develop chips in-house, customized for their specific needs with the goal of improving performance and reducing costs.

Microsoft introduced the Maia chip in November 2023 but has lagged its peers in ramping it up to scale. Google, meanwhile, has seen success with its custom AI chips, called Tensor Processing Units, and in April unveiled its seventh-generation AI chip designed to speed the performance of AI applications. Amazon in December also unveiled its next-generation AI chip, Trainium3, which is set to be released late this year.


Yahoo
18-06-2025
- Business
Marvell Stock Jumps After Unveiling Multi-Billion AI Chip Deals
June 18 - Marvell Technology (NASDAQ:MRVL) rose about 6% on Wednesday morning after Wall Street analysts responded positively to the company's recent custom AI event, citing major design wins and an expanded market outlook.

The chipmaker revealed two new compute-focused XPU projects from hyperscale clients, with additional attach design wins bringing the total to 13, including one with Meta (NASDAQ:META). Evercore ISI's Mark Lipacis called these multi-billion-dollar lifetime revenue opportunities, potentially ramping up between 2026 and 2027. Lipacis maintained an Outperform rating with a $133 price target, noting each XPU attach win could generate several hundred million dollars in long-term revenue per socket.

Morgan Stanley's Joseph Moore described the updates as ambitious but acknowledged the opportunity looked credible. He highlighted Marvell's push to capture 20% of a newly sized $94 billion data center market, with over half of that driven by custom ASICs. Moore kept an Equal-Weight rating with a $73 price target.

Bank of America's Vivek Arya was more bullish, raising his price target to $90 from $80. He said the company's increased 2028 earnings outlook points to potential EPS of $8, roughly 60% above consensus forecasts.

Analysts said Marvell's expanding AI chip pipeline, including ongoing work with Microsoft (NASDAQ:MSFT) on its Maia processor and integration with Amazon's (NASDAQ:AMZN) Trainium3, reinforces its position in the competitive AI infrastructure space.