
Latest news with #errorcorrection

IBM plans to build first-of-its-kind quantum computer by 2029 after 'solving key bottleneck'

Yahoo

5 days ago

  • Science
  • Yahoo


IBM scientists say they have solved the biggest bottleneck in quantum computing and plan to launch the world's first large-scale, fault-tolerant machine by 2029. The new research demonstrates error-correction techniques that the scientists say will lead to a system 20,000 times more powerful than any quantum computer in existence today.

In two studies uploaded June 2 and June 3 to the preprint arXiv server, the researchers describe error-mitigation and error-correction techniques that handle qubit errors well enough to let hardware scale nine times more efficiently than was previously possible. The new system, called "Starling," will use 200 logical qubits made up of roughly 10,000 physical qubits. It will be followed in 2033 by a machine called "Blue Jay," which will use 2,000 logical qubits.

The research, which has not yet been peer-reviewed, describes IBM's quantum low-density parity-check (LDPC) codes, a fault-tolerance approach that the researchers say will allow quantum hardware to scale beyond previous limits. "The science has been solved" for expanded fault-tolerant quantum computing, Jay Gambetta, IBM vice president of quantum operations, told Live Science. Scaling up quantum computers is now an engineering challenge rather than a scientific hurdle, Gambetta added.

While quantum computers exist today, they can outpace classical (binary) computers only on bespoke problems designed to test their potential. One of the largest hurdles to quantum advantage has been scaling up quantum processing units (QPUs): as scientists add more qubits to processors, errors in QPU calculations accumulate, because qubits are inherently "noisy" and err more often than classical bits. For this reason, research in the field has largely centered on quantum error correction (QEC).

Error correction is a foundational challenge for all computing systems. In classical computers, bits can accidentally flip from a one to a zero and vice versa; these errors can compound and leave calculations incomplete or cause them to fail entirely. Qubits are far more susceptible to errors than classical bits because of the added complexity of quantum mechanics: unlike binary bits, qubits carry extra "phase information." That property lets them process quantum information, but it also makes error correction much harder.

Until now, scientists were unsure how to scale quantum computers from the few hundred qubits in today's machines to the hundreds of millions theoretically needed to make them generally useful. The development of LDPC codes and their successful application across existing systems is the catalyst for change, Gambetta said. LDPC codes use a set of checks to detect and correct errors, with each qubit involved in fewer checks and each check involving fewer qubits than in previous schemes.
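The "checks" idea can be illustrated with the simplest classical example. The sketch below, a toy three-bit repetition code with pairwise parity checks, is purely illustrative and is not IBM's quantum LDPC construction; it only shows how a small set of checks can locate and correct a single flipped bit.

```python
# Toy illustration of error detection via checks: a classical 3-bit
# repetition code. This is NOT IBM's quantum LDPC scheme, just the
# simplest example of parity checks catching and correcting a bit flip.

def encode(bit):
    """Encode one logical bit as three identical physical bits."""
    return [bit, bit, bit]

def parity_checks(bits):
    """Two checks, each comparing a pair of bits (0 = agree, 1 = disagree)."""
    return [bits[0] ^ bits[1], bits[1] ^ bits[2]]

def decode(bits):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)             # [1, 1, 1]
codeword[0] ^= 1                 # a noise event flips the first bit -> [0, 1, 1]
print(parity_checks(codeword))   # [1, 0] -> the check pattern localizes the error
print(decode(codeword))          # 1 -> logical bit recovered
```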
The key advantage of this approach is a significantly improved "encoding rate," the ratio of logical qubits to the physical qubits needed to protect them. By using LDPC codes, IBM aims to dramatically reduce the number of physical qubits required to scale up its systems. According to IBM's research, the new method also performs error mitigation about 90% faster than previous techniques. IBM will incorporate the technology into its Loon QPU architecture, the successor to the Heron architecture used in its current quantum computers.

Starling is expected to be capable of 100 million quantum operations using 200 logical qubits, which IBM representatives said corresponds to roughly 10,000 physical qubits. Blue Jay should be capable of 1 billion quantum operations using its 2,000 logical qubits. Current models run about 5,000 gates (analogous to 5,000 quantum operations) on 156 qubits. The leap from 5,000 operations to 100 million will only be possible through technologies like LDPC codes, IBM representatives said in a statement; other approaches, including those used by companies such as Google, will not scale to the sizes needed to reach fault tolerance, they added.

To take full advantage of Starling in 2029 and Blue Jay in 2033, IBM needs algorithms and programs built for quantum computers, Gambetta said. To help researchers prepare for future systems, IBM recently launched Qiskit 2.0, an open-source development kit for running quantum circuits on IBM hardware. "The goal is to move from error mitigation to error correction," Blake Johnson, IBM's quantum engine lead, told Live Science, adding that "quantum computing has grown from a field where researchers are exploring a playground of quantum hardware to a place where we have these utility-scale quantum computing tools available."
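For a rough sense of what the quoted figures imply about the encoding rate, here is a back-of-the-envelope calculation using only the numbers reported above (200 logical qubits from roughly 10,000 physical qubits, and scaling "nine times more efficiently" than before); the implied earlier overhead is an extrapolation from those figures, not an IBM specification.

```python
# Back-of-the-envelope encoding-rate arithmetic using figures quoted in the article.
logical_qubits = 200
physical_qubits = 10_000

encoding_rate = logical_qubits / physical_qubits        # logical qubits per physical qubit
physical_per_logical = physical_qubits / logical_qubits

print(f"encoding rate: {encoding_rate:.3f} "
      f"(1 logical qubit per {physical_per_logical:.0f} physical qubits)")

# The article also cites scaling "nine times more efficiently than previously
# possible"; taken at face value, that would imply a prior overhead of roughly:
prior_physical_per_logical = 9 * physical_per_logical
print(f"implied earlier overhead: ~{prior_physical_per_logical:.0f} "
      f"physical qubits per logical qubit")
```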

God's play? Chinese scientists catch cosmic rays meddling in quantum computer operation

South China Morning Post

23-06-2025

  • Science
  • South China Morning Post


Researchers in China say they have found the first evidence that subatomic particles from cosmic rays may be affecting the efficiency of the widely used error-correction techniques that are an essential element of fault-tolerant quantum computing. The scientists monitored superconducting quantum chips while tracking muons (fundamental subatomic particles produced by cosmic rays) as well as gamma-ray-induced disturbances known as quasiparticle bursts. "We directly observed quasiparticle bursts leading to correlated errors that are induced solely by muons and separated the contributions of muons and gamma rays," they said in a paper published last month in the peer-reviewed journal Nature Communications.

The findings could be significant for the scaling of quantum processors and the design of fault-tolerant quantum computing systems, which must function properly even when faults or errors occur, the scientists said. According to the team, from the Chinese Academy of Sciences, the Beijing Academy of Quantum Information Sciences and Nanjing Normal University, the proposed detection method could also be applied to cosmic-ray and dark-matter particle detection.

Unlike a traditional computer's unit of information, which is either 0 or 1, a quantum computer relies on quantum bits, or qubits, which can exist in a superposition of states, making more advanced and secure tasks possible. However, errors can occur simultaneously in multiple qubits. On a small scale, these correlated multiqubit errors can be reduced with optimised error-correction methods, but the efficacy of those strategies diminishes as systems grow larger.
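To see why correlated, burst-like errors are harder on error correction than independent ones, here is a toy Monte Carlo sketch in Python. It compares a three-qubit repetition code under independent bit flips versus occasional bursts that flip all three qubits at once (a crude stand-in for a muon or quasiparticle-burst event); the error probabilities are illustrative assumptions, not values from the Nature Communications paper.

```python
# Toy Monte Carlo: a 3-qubit repetition code copes with rare *independent*
# bit flips, but fails when a single event flips several qubits at once.
# All rates below are made-up illustrative parameters.
import random

def logical_failure_rate(trials, p_flip, p_burst):
    failures = 0
    for _ in range(trials):
        bits = [0, 0, 0]                      # encoded logical 0
        for i in range(3):                    # independent single-qubit flips
            if random.random() < p_flip:
                bits[i] ^= 1
        if random.random() < p_burst:         # one correlated burst flips all three
            bits = [b ^ 1 for b in bits]
        if sum(bits) >= 2:                    # majority vote then decodes the wrong value
            failures += 1
    return failures / trials

random.seed(0)
print("independent errors only:", logical_failure_rate(100_000, p_flip=0.01, p_burst=0.0))
print("with rare burst events: ", logical_failure_rate(100_000, p_flip=0.01, p_burst=0.001))
```

Even a burst rate far below the single-qubit flip rate dominates the logical failure rate, which is the qualitative point the researchers make about large-scale systems.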

Why IBM Is the Best Quantum Computing Stock to Buy Right Now

Globe and Mail

19-06-2025

  • Business
  • Globe and Mail


A future quantum computer could potentially solve problems that are essentially impossible for even the most powerful supercomputer. The magic comes from the nature of quantum physics: while traditional computers operate on bits that can be in only one of two states, a qubit is probabilistic, occupying some combination of those two states. This property opens the door to exponentially faster computations for certain problems.

Today's quantum computers generally aren't capable of solving real-world problems faster than traditional computers. They can perform some types of computations faster, but those computations are more toy problems than anything else. When Alphabet's Google unveiled its Willow quantum chip last year, it claimed that Willow could complete a particular benchmark in five minutes that would take a supercomputer 10 septillion years. Unfortunately, that benchmark has no known real-world applications.

Another problem is error correction. Qubits are fragile, and errors are inevitably introduced over the course of a computation. Those errors must be prevented, corrected, or otherwise mitigated for long enough for a computation to be completed. Microsoft made some noise on this front earlier this year with its Majorana 1 quantum chip, which uses exotic particles to create more robust qubits. However, the company is in the early stages of scaling this technology, and it could be many years before anything useful comes out of it.

International Business Machines (NYSE: IBM), a quantum computing pioneer, now sees a path to full-scale quantum error correction by 2029 and true quantum advantage by the end of 2026. The company has a clear roadmap, and if it can deliver, quantum computing could turn into a major business for the century-old tech giant.

The path to fault-tolerant quantum computers

IBM is taking a modular approach on its path to the holy grail of quantum computing. This year, IBM will release Nighthawk, its new quantum processor with 120 qubits and 5,000 quantum gates. Over the next few years, successive versions of Nighthawk will increase the number of gates, culminating in 2028 with a 15,000-gate version that can be linked together in groups of nine. IBM believes Nighthawk will be able to achieve true quantum advantage.

Nighthawk is a stepping stone toward Starling, the fault-tolerant quantum computer planned for 2028. To build Starling, IBM will release three iterations of quantum chips over the next few years that include the necessary technology to make Starling a reality. IBM Quantum Loon comes this year, featuring greater connectivity than the company's current quantum chips. IBM Quantum Kookaburra comes in 2026, bringing the ability to store quantum information and process it with an attached processing unit. And IBM Quantum Cockatoo is set for 2027, allowing entanglement between modules. Starling, which will feature 200 logical qubits and 100 million quantum gates, will be built in 2028 and deliver fault tolerance by 2029, according to IBM's roadmap.

A quantum computing leader

Plenty of companies are racing toward viable quantum computing, but IBM has two things that make it unique: a decades-long track record of researching and building quantum computers, and a clear roadmap to reach fault tolerance and true quantum advantage.
While it's impossible to predict how large an opportunity quantum computing could be for IBM, one estimate puts the economic value generated by quantum computing at $850 billion by 2040, with the market for quantum hardware and software potentially worth $170 billion. If IBM can truly pull ahead of its rivals and deliver real-world results with its quantum computers by the end of the decade, it will be in a great position to reap the rewards of the quantum computing revolution.

IBM's valuation today looks reasonable considering the enormous potential of quantum computing. Based on the company's outlook for 2025, IBM stock trades for roughly 19 times free cash flow. While the stock isn't as cheap as it was a few years ago, IBM still looks like a solid buy. The company's hybrid cloud and artificial intelligence (AI) businesses are driving growth today, and quantum computing has the potential to drive growth in the 2030s and beyond.

IBM Starling: 20,000X Faster Than Today's Quantum Computers

Forbes

10-06-2025

  • Business
  • Forbes


Today IBM released a roadmap to Starling, a quantum computer with 20,000 times the processing power of today's quantum computers. Starling won't be built until 2029, but IBM says it has cracked the toughest problems on the path and that the roadmap is trustworthy. A key breakthrough is 14X better error correction, which addresses one of the most challenging problems in quantum computing: quantum decoherence.

IBM will have a fully fault-tolerant large-scale quantum computer by 2029, IBM fellow and director of quantum systems Jerry Chow told me on the TechFirst podcast. "We really have a path to make this viable in this timescale."

IBM is aiming high. Until today, the company says, no one had published a clear path to building a large-scale fault-tolerant quantum computer without unrealistic engineering overhead. Starling will be such a computer, and Blue Jay, the next quantum computer on IBM's roadmap, will have 2,000 logical qubits and could run a billion quantum operations effectively instantly. "Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business," says Arvind Krishna, chairman and CEO of IBM.

One of the key innovations is better error correction. Today's quantum computers can require 10,000 physical qubits to form a single logical qubit that is fault-tolerant enough to run meaningful operations. IBM's new error-correction scheme, first unveiled in Nature in 2024, reduces the number of physical qubits required by about 90%. As a result, instead of needing millions of physical qubits for a useful quantum computer, IBM is aiming for a ratio of hundreds or perhaps thousands of physical qubits per logical qubit. That's a massive improvement: even a quantum leap, if the company can pull it off. IBM's path to Starling is iterative and grounded in hardware milestones.

Another innovation is a novel way of interlinking qubits in a three-dimensional matrix, which Chow likened to a physical neural network in which qubits are connected like neurons in a brain, enabling richer and more scalable interactions. IBM's new quantum computers will use a much more complex 3D lattice connection that increases the number of connections between qubits.

In addition, Chow told me IBM is getting less and less bespoke in its processes for building quantum computers. That means the company is taking steps toward mass manufacturing, another key step to making quantum computers less of a science experiment and more of a standard engineering and production problem. "I've been working with superconducting qubits since 2005, and it's always been a rather bespoke process of design," he says. "You simulate, you design, and you lay out by hand, and simulate ... but then over time we've really developed a lot of the toolboxes that you'd need for advanced manufacturing methods, advanced simulation methods to really get parameters close to first-time-right."

As projected, Starling will be so massively capable that representing its computational state would require the memory of more than a quindecillion of the world's most powerful supercomputers, IBM says. A quindecillion is a 1 followed by 48 zeros. To make that make sense, think of counting to one quindecillion: at one number per second, it would take longer than the age of the universe.
It gets worse: if you had a quindecillion dollars and spent a trillion dollars every second, you'd still be spending for longer than the universe has existed. In other words, this is quantum supremacy: the point where a quantum computer can perform a calculation that is practically impossible for any classical computer to complete in a reasonable amount of time. That means Starling would enable transformative progress in domains like:

  • Drug discovery
  • Advanced materials
  • Battery chemistry
  • Optimization problems

Of course, to make that a reality, there are at least four long years of hard work ahead. Time will tell if IBM Quantum can deliver on this promised roadmap.
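As a quick sanity check on the quindecillion analogies above, the arithmetic takes only a few lines of Python; the only outside assumption is an age of the universe of roughly 13.8 billion years.

```python
# Scale of a quindecillion (10^48) versus the age of the universe
# (assumed here to be roughly 13.8 billion years).
QUINDECILLION = 10 ** 48
SECONDS_PER_YEAR = 365.25 * 24 * 3600
AGE_OF_UNIVERSE_SECONDS = 13.8e9 * SECONDS_PER_YEAR     # ~4.4e17 seconds

# Counting one number per second:
print(f"counting:  {QUINDECILLION / AGE_OF_UNIVERSE_SECONDS:.1e} times the age of the universe")

# Spending a trillion dollars every second:
seconds_needed = QUINDECILLION / 1e12
print(f"spending:  {seconds_needed / AGE_OF_UNIVERSE_SECONDS:.1e} times the age of the universe")
```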

Canada's Xanadu achieves worldwide first with error-resistant quantum chip

Globe and Mail

04-06-2025

  • Business
  • Globe and Mail


Toronto startup Xanadu Quantum Technologies Inc. is reporting a new milestone in the effort to develop a form of light-based quantum computing that can operate at commercial scale. For the first time anywhere, Xanadu researchers have created a single chip that embodies a powerful type of error-detection code in a pulse of laser light. If a number of such chips could be harnessed together, it would open the door to a quantum computer that can deliver reliable results with practical value.

"This is something that's been on our roadmap for a long time," Zachary Vernon, Xanadu's chief technology officer for hardware, told The Globe and Mail. A technical description of the chip was published Wednesday in the journal Nature.

The development is significant "because the chip platform is supposed to be scalable," said Daniel Soh, an associate professor of optical science at the University of Arizona in Tucson, who is not affiliated with Xanadu. "In the future, we will need millions or billions of this kind of devices on a chip. This result is a massive step towards that goal."

Christian Weedbrook, Xanadu's founder and chief executive officer, said the development means it is possible to envision a quantum-computing system operating at the scale of a data centre, with some 5,000 servers fitting into a facility less than 10,000 square metres in size. "We're also thinking ahead to how we can add more density in there, so that'll change," he said. Earlier this year, Xanadu published a result showing how its form of quantum computing could be easily modularized. This latest step is aimed at making a machine large enough to solve relevant problems but not so large that it becomes impractical for commercial purposes.

It is the latest example of a shift in the focus and tempo of advancements in the quantum computing world. Overall, the goal remains to create a computer that runs on qubits (interconnected physical elements that exhibit quantum behaviour) instead of the standard bits of a conventional digital system. Where a bit can represent a one or a zero in a mathematical calculation, a qubit can be a mixture of both. This dual nature, when combined with many other qubits, is what allows a quantum computer, in principle, to vastly outperform a conventional computer at certain kinds of calculations that are important for data security and other applications.

While various companies, including Google, IBM and Microsoft, have experimented with different types of qubits, all of them face the same challenge: quantum systems are sensitive to disturbance and difficult to isolate from the rest of the world, which makes quantum computers especially error-prone. To counter this, qubits can be linked to check each other for signs of failure during a calculation. But the price of such redundancy is that many more qubits are needed to build a reliable computer powerful enough to solve real-world problems.

More recently, teams have sought to exploit various mathematical codes, which are ways of tying qubits together, to make error correction more robust. Of particular interest are Gottesman-Kitaev-Preskill (GKP) codes. First proposed in 2001, they are challenging to implement but especially well suited to quantum computer builders such as Xanadu, whose machines use qubits made of light moving through a fibre-optic network.
Xanadu's new chip corrals incoming particles of light, called photons, into a quantum state that allows them to work together to form a GKP qubit. The chip has four outputs, three of which are connected to detectors that can reveal whether the fourth is in a state that would allow it to be useful for a quantum calculation. In a working quantum computer, such chips would provide an initial layer of error detection that would then be further augmented by other error-correction techniques when chips are combined.

Similar strategies are being explored by other companies. Last week, Nord Quantique, based in Sherbrooke, Que., demonstrated that it had successfully encoded microwave photons bouncing around inside a metal cavity with a GKP code.

Meanwhile, Xanadu still has more obstacles to overcome. Chief among them is signal loss, which occurs when photons are absorbed by the materials they are moving through. In addition to making its light-based technology work, Xanadu and direct competitors such as PsiQuantum Corp. of Palo Alto, Calif., are racing against big tech companies developing computers with qubits that rely on special superconducting materials kept at extremely cold temperatures. Light-based systems offer a different set of advantages, including the fact that they can operate at room temperature. While no system has yet emerged as a clear winner, Dr. Soh says light-based quantum computers may end up inching ahead because once the key technical challenges are solved, they will be easier to scale up.
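The heralding scheme described above, in which three monitored outputs indicate whether the fourth carries a usable state, can be sketched abstractly. The Python below is a hypothetical post-selection toy, not a simulation of Xanadu's chip or of GKP states; the detector pattern and probabilities are made-up placeholders.

```python
# Abstract sketch of heralded state preparation: three detectors are read out,
# and the fourth output is kept only when the detector pattern signals success.
# All patterns and probabilities are hypothetical placeholders.
import random

ACCEPT_PATTERN = (1, 1, 0)            # assumed "success" detector outcome

def run_chip():
    detectors = tuple(random.choice([0, 1]) for _ in range(3))
    # The herald is correlated with (not identical to) output quality:
    p_good = 0.95 if detectors == ACCEPT_PATTERN else 0.10
    output_is_good = random.random() < p_good
    return detectors, output_is_good

random.seed(1)
trials, kept, kept_good = 100_000, 0, 0
for _ in range(trials):
    detectors, good = run_chip()
    if detectors == ACCEPT_PATTERN:   # post-select on the heralding detectors
        kept += 1
        kept_good += good

print(f"acceptance rate: {kept / trials:.3f}")
print(f"quality of kept outputs: {kept_good / kept:.3f}")
```

The point of the sketch is only that discarding unheralded runs trades throughput for much higher confidence in the states that are kept, which is the role the detector-monitored outputs play on the chip.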
