US & China Race for AI Supremacy


Bloomberg, 27 June 2025
The Chinese tech sector has been on a roll since the arrival in January of DeepSeek, the AI startup that stunned the world with a language model that claimed to match or outperform Western rivals, at a fraction of the cost. 'Bloomberg Tech: Asia' anchor Annabelle Droulers reports on how the rapid strides in AI are poised to escalate the tech "cold war" between the US and China. (Source: Bloomberg)


Related Articles

NVIDIA Corporation (NVDA)'s CEO Knows 'What's Coming' Is Smart, Says Jim Cramer

Yahoo · 31 minutes ago

NVIDIA Corporation (NASDAQ:NVDA) is one of the stocks Jim Cramer recently discussed. Its shares are off to a great start in the year's second half: they are up 14.9% year-to-date and have recovered all losses since January's DeepSeek selloff. Unsurprisingly, AI is at the heart of the firm's newfound bullishness. Investors have rewarded NVIDIA after several analysts speculated that GPUs will continue to grow their share of the enterprise computing market over the coming years as the AI market expands. The company has also benefited from its competitiveness, as its GPUs remain the highest-performing and most widely sought AI hardware in the world.

In this appearance, Cramer commented on NVIDIA's CEO and his thoughts about the future:

'[On research reports showing limitations with the reasoning ability of generative AI] Why is NVIDIA bothering to put out new iterations? Well because each one is better and Jensen is about, he will tell you, the one he's thinking about a few years from now, will be, he would say smarter than you. It'd be smarter than you. And you know that will be intimidating. You will say, like you heard Jassy talk about, he went back and forth about the Kentucky Derby winner, well it's entirely possible that this next one will be able to say, well listen, right now if you send that to an upper left corner pitch, Bryce Harper's going to strike out with that. It's going to know these things.'
Later in the day, on Mad Money, Cramer shared some reasons behind the stock's price movement:

'But then at the beginning of April, the stock collapsed on word that the president didn't want NVIDIA to sell any of its AI chips to the Chinese… That eventually caused NVIDIA to take a $4.5 billion write-off as they lost access to a market Jensen said could be worth as much as $50 billion. When that happened, the memesters left the building. Stock bottomed at 86 bucks and change in April. Can you believe it? And that was the time to buy as NVIDIA began its unheralded run all the way to $158, where it closed last night. The amazing thing, this rally was based on nothing more than semiconductor superiority and persistent demand from the hyperscalers, the same things that had the stock roaring all last year. I guess you could say that there was nothing wrong with NVIDIA the whole time… So, what did we discover?… NVIDIA's artificial intelligence chips remain unrivaled…'

While we acknowledge the potential of NVDA as an investment, our conviction lies in the belief that some AI stocks hold greater promise for delivering higher returns with limited downside risk. Disclosure: None. This article was originally published at Insider Monkey.

‘Wrong': Amazon guru's huge bot, AI claim

Yahoo · 2 hours ago

The man responsible for the world's largest mobile industrial robotic fleet is adamant that AI and machines will not replace human common sense. This week, retail and tech giant Amazon announced its one millionth custom-designed and built robot was zipping around warehouses worldwide, and chief technologist Tye Brady announced Amazon Robotics had created a new AI model to power these product-moving bots.

Speaking to a global media contingent in Tokyo this week, Mr Brady fielded multiple questions about the prospect of AI robotics replacing entry-level and even skilled jobs. 'Any job that requires common sense, reasoning, problem solving, thinking at a higher level… those jobs will always be needed,' he explained. 'Those jobs will always be there. This idea that it's "people versus machines" is the wrong mindset.'

Amazon ranks as the fifth largest company in the world, according to Forbes' latest rankings. In Australia the company employs about 7000 people, plus contractors, across 15 business arms. But at the behemoth's heart is logistics and warehousing, and the company's modern warehouses are powered by fleets of robots. The blue bots buzz across the warehouse floor, sliding under stacks of yellow plastic shelves and moving countless products to human workers for sorting, storing and packing. Although just one of Amazon's eight Australian warehouses runs the AI robots, more than 200 million physical consumer products are available on Amazon in Australia.

With the company's global march to automation, questions persist over how many humans will work at Amazon's gigantic warehouses in the coming decades, and whether entry-level jobs will be eliminated. 'We have built the world's largest mobile industrial robotics base,' Mr Brady said. 'They solve practical, everyday problems. These are real-world, applied problems… 99 per cent is not good enough. We ship billions and billions of packages every year,' Mr Brady told media in Tokyo.
A graduate of the Massachusetts Institute of Technology, Mr Brady learned to program computers in the 1970s. He moved around the US to be at the various hubs of computing as the technology advanced, and now leads Amazon's wholly in-house robotics division.

'I want to reframe your mindset with machines… I see a future where smart, physical AI systems help the elderly. I see systems where caretakers can use lifts to help people get out of bed. I see systems where people can stay at home longer. I see robotics systems that enable people to be more human, robotics that extends and amplifies human potential.'

*Amazon paid for NewsWire's travel and accommodation in Japan

Elon Musk confirms xAI is buying an overseas power plant and shipping the whole thing to the U.S. to power its new data center — 1 million AI GPUs and up to 2 Gigawatts of power under one roof, equivalent to powering 1.9 million homes

Yahoo · 2 hours ago

Elon Musk's next xAI data centers are expected to house millions of AI chips and consume so much power that Musk has reportedly bought a power plant overseas and intends to ship it to the U.S., according to Dylan Patel of SemiAnalysis, who outlined xAI's recent progress in a podcast. Interestingly, Musk confirmed the claim in a subsequent tweet.

xAI's current Colossus AI supercomputer is already one of the most powerful and power-hungry machines on the planet, housing some 200,000 Nvidia Hopper GPUs and consuming an astounding 300 MW of power, and xAI has faced significant headwinds in supplying it with enough electricity. The challenges only intensify as the company moves forward: Musk faces a monumental task in powering his next AI data center, one predicted to house one million AI GPUs and thus potentially consume as much power as 1.9 million households. Here's how the data center could consume that much power, and how Musk plans to deliver it.

xAI has assembled vast computing resources and a team of talented researchers to advance the company's Grok AI models, Patel said. However, even bigger challenges lie ahead. It is no secret that Musk has already run into trouble powering his existing xAI data center. Currently, the company's main data center, Colossus, which houses 200,000 Nvidia Hopper GPUs, is located near Memphis, Tennessee. To power this machine, xAI installed 35 gas turbines that can produce 420 MW of power, and deployed Tesla Megapack systems to smooth out the power draw. However, things are going to get much more serious going forward. Beyond the Colossus buildout, xAI is rapidly acquiring and developing new facilities.
The company has purchased a factory in Memphis that is being converted into additional data center space, big enough to house around 125,000 eight-way GPU servers along with all supporting hardware, including networking, storage, and cooling.

A million Nvidia Blackwell GPUs will consume between 1,000 MW (1 GW) and 1,400 MW (1.4 GW), depending on which accelerator models (B200, GB200, B300, GB300) are used and how they are configured. The GPUs are not the only load on the power system, however: you must also account for the power consumption of CPUs, DDR5 memory, storage, networking gear, cooling, air conditioning, power supply inefficiency, and other factors such as lighting. In large AI clusters, a useful approximation is that this overhead adds another 30% to 50% on top of the GPU power draw, a figure typically expressed as PUE (power usage effectiveness). Depending on which Blackwell accelerators xAI uses, then, a million-GPU data center will consume between 1,400 MW and 1,960 MW (given a PUE of 1.4).

What could possibly power a data center with a million high-performance GPUs for AI training and inference is a big question, as the undertaking is comparable to powering roughly 1.9 million homes. A large-scale solar plant alone is not viable for a 24/7 compute load of this magnitude: one would need several gigawatts of panels plus massive battery storage, which is prohibitively expensive and land-intensive. The most practical and commonly used option is building natural gas combined-cycle gas turbine (CCGT) plants, each capable of producing 500 MW to 1,500 MW. This approach is relatively fast to deploy (several years), scalable in phases, and easier to integrate with existing electrical grids. Perhaps this is what xAI plans to import to the U.S.
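The arithmetic above can be checked with a quick back-of-the-envelope sketch. The per-GPU draw (1.0 to 1.4 kW) and the PUE of 1.4 come from the figures in the article; the 1.03 kW average continuous household draw is an assumption chosen here to reproduce the 1.9-million-homes comparison, not a number from the source.

```python
def cluster_power_mw(num_gpus: int, watts_per_gpu: float, pue: float = 1.4) -> float:
    """Total facility draw in megawatts, including PUE overhead.

    PUE (power usage effectiveness) multiplies the IT load to account
    for cooling, power conversion losses, networking, and so on.
    """
    return num_gpus * watts_per_gpu * pue / 1e6  # watts -> megawatts


# Low end: B200-class parts at ~1,000 W each; high end: ~1,400 W each.
low = cluster_power_mw(1_000_000, 1_000)
high = cluster_power_mw(1_000_000, 1_400)
print(round(low), round(high))  # 1400 1960

# Assumed average continuous draw per home (kW); picked to match
# the article's "1.9 million homes" comparison.
HOUSEHOLD_KW = 1.03
homes_millions = high * 1_000 / HOUSEHOLD_KW / 1e6
print(round(homes_millions, 1))  # 1.9
```

The sketch confirms the article's 1,400 to 1,960 MW range: the overhead factor is applied multiplicatively on top of the raw GPU load, so the choice of accelerator shifts the total by the same 40% at both ends.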
Alternatives like nuclear reactors could technically meet the load with fewer units (each producing around 1,000 MW) and no direct carbon emissions, but nuclear plants take much longer to design, permit, and build (up to 10 years). It is therefore unlikely that Musk has managed to buy a nuclear power plant overseas with plans to ship it to the U.S.

In practice, any organization attempting a 1.4 to 1.96 GW deployment, as xAI is, will effectively become a major industrial energy buyer. For now, xAI's Colossus produces power onsite and purchases power from the grid, so it is likely that the company's next data center will follow suit and combine a dedicated onsite plant with grid interconnections. Apparently, because acquiring a power plant in the U.S. can take too long, xAI is reportedly buying one overseas and shipping it in, which highlights how AI development now hinges not only on compute hardware and software but also on securing massive energy supplies quickly.

Without a doubt, a data center housing a million AI accelerators with a dedicated power plant is an extreme measure. However, Patel points out that most leading AI companies are converging on similar strategies: concentrating enormous compute clusters, hiring top-tier researchers, and training ever-larger AI models. If xAI plans to stay ahead of the competition, it needs to build even more advanced and power-hungry data centers.
