Latest news with #Trainium


CNBC
21 hours ago
- Business
- CNBC
Amazon to buy AI company Bee that makes wearable listening device
Amazon plans to acquire wearables startup Bee AI, the company confirmed, in the latest example of tech giants doubling down on generative artificial intelligence. Bee, based in San Francisco, makes a $49.99 wristband that resembles a Fitbit smartwatch. The device is equipped with AI and microphones that can listen to and analyze conversations to provide summaries, to-do lists and reminders for everyday tasks.

Bee AI CEO Maria de Lourdes Zollo announced in a LinkedIn post on Tuesday that the company will join Amazon. "When we started Bee, we imagined a world where AI is truly personal, where your life is understood and enhanced by technology that learns with you," Zollo wrote. "What began as a dream with an incredible team and community now finds a new home at Amazon." Amazon spokesperson Alexandra Miller confirmed the company's plans to acquire Bee. The company declined to comment on the terms of the deal.

Amazon has introduced a flurry of AI products, including its own set of Nova models, Trainium chips, a shopping chatbot, and a marketplace for third-party models called Bedrock. The company has also overhauled its Alexa voice assistant, released over a decade ago, with AI capabilities as Amazon looks to chip away at the success of rivals like OpenAI's ChatGPT, Anthropic's Claude and Google's Gemini. Ring, the smart home security company owned by Amazon, has also looked to introduce generative AI in some of its products. Amazon previously experimented in the wearables space with a health- and fitness-focused product called Halo, which it discontinued in 2023 as part of a broader cost-cutting review.

Other tech companies have launched AI-infused consumer hardware with mixed success. There's the Rabbit R1, a small square gadget that costs $199 and uses an OpenAI model to answer questions, as well as the AI Pin developed by Humane, which later sold to HP. Meta's Ray-Ban smart glasses have grown in popularity since the first version was released in 2021.
OpenAI in May acquired Jony Ive's AI devices startup io for roughly $6.4 billion. The company reportedly plans to develop a screen-free device.

Business Insider
11-07-2025
- Business
- Business Insider
Amazon's alliance with Anthropic is paying off handsomely for the tech giant's cloud business
Amazon's partnership with Anthropic has been a big success and is set to drive significant growth in the tech giant's cloud business in coming years, according to new estimates released on Friday. Amazon Web Services stands to generate billions of dollars in extra revenue from the alliance, Morgan Stanley analysts wrote in a note to investors. The investment bank estimated that AWS could generate $1.28 billion in sales in 2025 from Anthropic's use of its cloud services. That number could balloon to almost $3 billion in 2026 and $5.6 billion in 2027 as Anthropic's AI workloads grow, the note said.

The projections highlight the substantial returns AWS could reap from its partnership with Anthropic. Amazon has invested $8 billion in Anthropic so far, making it the largest startup investment in company history. That stake is now worth $13.8 billion, according to Amazon's latest financial statement. Anthropic is also a heavy user of AWS's cloud services, including its Trainium AI chips. AWS is now considering pouring more money into the OpenAI rival, the Financial Times reported this week. Spokespeople for Amazon and Anthropic didn't respond to a request for comment.

In its note, Morgan Stanley forecast that Anthropic's revenue will grow from $4 billion this year to $10 billion in 2026 and $19 billion in 2027. The bank's analysts then assumed Anthropic gross profit margins of 60% and estimated that 75% of the related costs are spent on AWS cloud services. Most of that spend comes from inference, where AWS runs the startup's AI models; any upfront training of future Anthropic models would generate additional revenue for Amazon, the analysts wrote.

More broadly, Morgan Stanley cited the partnership with Anthropic as one of the key factors behind its expectation that AWS revenue growth will accelerate this year. Other reasons include AWS's solid growth trajectory even without Anthropic's contribution.
AWS grew in the range of 16% to 19% a year over the last five quarters, which speaks to the resilience of its core cloud offerings, Morgan Stanley wrote. Microsoft Azure's recent success, fueled by both AI and non-AI workloads, suggests a broader enterprise appetite for generative AI infrastructure, the note added. AWS could benefit from a similar pattern, especially as organizations expand their AI work, it said. Morgan Stanley's latest CIO survey also indicated that Amazon could gain share over Microsoft and Google Cloud in the near term, giving "more confidence" in AWS's strong position, the note said. "In all, our new AWS base model could prove conservative if Anthropic continues to grow and GPU supply constraints ease," Morgan Stanley wrote, "opening the door for faster GPU and non-GPU enabled workload growth."
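The note's arithmetic can be reproduced as a short back-of-envelope sketch (variable names are our own; Morgan Stanley's actual model presumably uses finer-grained inputs, which is why its published 2025 figure is $1.28 billion rather than the $1.2 billion this simplification yields):

```python
# Back-of-envelope version of the Morgan Stanley estimate described above:
# assume a 60% gross margin (so cost of revenue is 40% of revenue) and that
# 75% of that cost is spent on AWS. Revenue forecasts are from the note, in $B.

ANTHROPIC_REVENUE_B = {2025: 4.0, 2026: 10.0, 2027: 19.0}
GROSS_MARGIN = 0.60        # assumed Anthropic gross profit margin
AWS_SHARE_OF_COSTS = 0.75  # assumed share of cost of revenue spent on AWS

def implied_aws_revenue(anthropic_revenue_b: float) -> float:
    """AWS revenue implied by Anthropic's revenue under the note's assumptions."""
    cost_of_revenue = anthropic_revenue_b * (1 - GROSS_MARGIN)
    return cost_of_revenue * AWS_SHARE_OF_COSTS

for year, revenue in ANTHROPIC_REVENUE_B.items():
    print(f"{year}: ${implied_aws_revenue(revenue):.1f}B")
```

Run over the note's forecasts, this gives roughly $1.2B for 2025, $3.0B for 2026 and $5.7B for 2027, in line with the $1.28B, ~$3B and $5.6B figures cited above.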

Business Standard
10-07-2025
- Business
- Business Standard
Amazon mulls another multibillion-dollar investment in Anthropic
Ecommerce giant Amazon is reportedly considering a further multibillion-dollar investment in artificial intelligence firm Anthropic, The Financial Times reported today. This would expand upon the $8 billion already committed as of November last year. In November 2023, Amazon injected $4 billion into Anthropic, a rival to OpenAI, to capitalise on the fast-growing generative AI sector. This move doubled the tech giant's initial investment in the company. The potential new funding would help Amazon maintain its position as one of Anthropic's primary shareholders, outpacing Google, which has thus far invested over $3 billion.

Race for AI supremacy

Amazon is aiming to solidify its standing in AI innovation, following early advances made by competitors such as OpenAI and Google, particularly in the realm of consumer-facing AI models. As competition intensifies, firms are ramping up AI investments and employing novel strategies to secure top-tier AI talent. Meta has extended unusually large compensation packages to recruits for its 'superintelligence' team, including an offer worth over $200 million to Ruoming Pang, the former leader of Apple's AI division.

As part of its strategic expansion, Amazon is constructing what will become its largest data centre complex near New Carlisle, Indiana, according to a report by The New York Times. Comprising approximately 30 facilities, the site will be filled with hundreds of thousands of specialised AI chips. Collectively, these centres are designed to function as a single, powerful machine dedicated exclusively to artificial intelligence workloads. The site is expected to draw 2.2 gigawatts of electricity, enough to power one million homes, and has reportedly been purpose-built for a single client: Anthropic. This data centre cluster is the first of a new breed under Amazon's Project Rainier initiative, named after the prominent mountain near its Seattle headquarters.
Project Rainier represents Amazon's foray into the tech industry's escalating race to build AI data infrastructure on a previously unimaginable scale.

Anthropic to utilise Amazon's custom AI chips

Anthropic intends to develop and run its foundational AI models using Amazon's proprietary Trainium and Inferentia processors. While Nvidia currently dominates the market for AI-specific chips and counts Amazon among its many hyperscale clients, Amazon is actively developing its own hardware through Annapurna Labs. Anthropic said it is working closely with the Annapurna team to assist in chip development.

Amazon's deepening financial commitment to Anthropic highlights the wider trend of soaring investment in AI startups. Since the debut of OpenAI's ChatGPT in late 2022, interest in generative AI has skyrocketed, prompting investors to channel billions into companies aiming to lead the next phase of AI innovation.


Hans India
28-06-2025
- Business
- Hans India
Microsoft's AI Chip 'Braga' Delayed to 2026, Expected to Trail Nvidia's Blackwell: Report
Microsoft's ambitious plans to mass-produce its next-generation AI chip, code-named Braga, have reportedly hit a significant delay, with production now expected in 2026 instead of this year. This development, as reported by The Information on Friday, has been attributed to unexpected design revisions, staffing issues, and a high rate of employee turnover within the project team. Initially slated to power Microsoft's data centers by the end of 2025, the Braga chip is the successor to the Maia AI chip, which was introduced in November 2023.

According to the report, which cites three individuals directly involved in the project, the Braga chip is not only delayed but also expected to significantly underperform Nvidia's Blackwell chip, which launched in late 2024 and currently leads the market in AI chip performance. The delay marks a setback in Microsoft's broader strategy to reduce its dependency on Nvidia's GPUs, currently the dominant force in AI hardware, and establish itself as a serious contender in the custom chip space. Microsoft has not issued an official comment in response to the report, as noted by Reuters.

The push for custom chips has become a defining trend among major cloud providers. Like its tech rivals Amazon and Alphabet (Google), Microsoft has invested heavily in in-house silicon to support the exponential growth in demand for AI computing. These custom processors are crucial not only for boosting performance but also for managing rising operational costs in AI workloads.

Despite introducing the Maia chip in late 2023, Microsoft has struggled to scale production in line with competitors. Google's Tensor Processing Units (TPUs), for example, have been pivotal in powering many of its AI services; the search giant launched its seventh-generation TPU in April 2025, with notable performance upgrades designed to accelerate large-scale AI applications. Meanwhile, Amazon continues to make strides with its Trainium chip line: in December 2024, the company unveiled Trainium3, its next-generation AI processor, scheduled for release later this year and promising improved training speeds and energy efficiency.

Microsoft's delay could give both Amazon and Google further time to cement their positions in the AI chip arena. Moreover, the setback may compel Microsoft to lean more on third-party chipmakers like Nvidia and possibly consider interim solutions to meet its data center demands. In a related development, OpenAI, the AI research lab heavily backed by Microsoft, was recently reported to be testing Google's AI chips to power some of its products. This move hints at a broader industry reality: even AI leaders may be forced to look beyond their preferred partnerships when custom solutions lag in readiness. As the race for AI supremacy intensifies, delays like Braga's could have far-reaching consequences, not just for Microsoft but for the entire ecosystem of AI infrastructure.

Yahoo
24-06-2025
- Business
- Yahoo
Amazon's AWS fires back at Nvidia with Graviton4 and Trainium3
Amazon's AWS is sharpening its AI edge with custom chips: an upgraded Graviton4 CPU and a forthcoming Trainium3 accelerator that could start chipping away at Nvidia's (NASDAQ:NVDA) market stronghold in AI training and inference. CNBC reports AWS will soon launch a Graviton4 update boasting 600 Gbps of network bandwidth, courtesy of its Annapurna Labs design, with availability expected by month's end. Later this year, AWS plans to roll out Trainium3, promising 50% better energy efficiency than Trainium2, which underpins Anthropic's Claude Opus 4 model. While Nvidia's Blackwell GPU retains a raw-performance lead, Trainium2 already offers superior cost-performance ratios, according to AWS Senior Director Gadi Hutt. Developers eyeing Trainium will need to retool workloads away from Nvidia's CUDA ecosystem and validate model-accuracy parity on AWS's frameworks.

Nvidia has dominated AI compute thanks to unmatched throughput and the ubiquity of CUDA in developer toolchains. By delivering strong price-performance and energy gains via Graviton4 and Trainium3, AWS aims to lure hyperscalers and cost-sensitive enterprises that run massive inference fleets or large-scale training jobs. If AWS can minimize migration friction and prove equivalent accuracy, it could open the door for a meaningful shift in AI infrastructure spend.

The real test will come when Graviton4 benchmarks are published and Trainium3 previews reach developers' hands. Watch for cloud-native AI workloads running on non-CUDA stacks and for enterprise case studies highlighting total-cost-of-ownership savings. Those signals will reveal whether AWS can genuinely erode Nvidia's GPU hegemony.

This article first appeared on GuruFocus.