Do Wobbling Muons Point the Way to New Physics?

The Standard Model of particle physics—the best, most thoroughly vetted description of reality scientists have ever devised—appears to have fended off yet another threat to its reign.
At least, that's one interpretation of a long-awaited experimental result announced on June 3 by physicists at the Fermi National Accelerator Laboratory, or Fermilab, in Batavia, Ill. An alternative take would be that the result—the most precise measurement ever made of the magnetic wobble of a strange subatomic particle called the muon—remains the most significant challenge to the Standard Model's supremacy. The results have been posted on the preprint server arXiv.org and submitted to the journal Physical Review Letters.
The muon is the electron's less stable, 200-times-heavier cousin. And like the electron and all other charged particles, it possesses an internal magnetism. When the muon's inherent magnetism clashes with an external magnetic field, the particle precesses, torquing to and fro like a wobbling, spinning top. Physicists describe the speed of this precession using a number, g, which almost a century ago was theoretically calculated to be exactly 2. Reality, however, prefers a slightly different value, arising from the wobbling muon being jostled by a surrounding sea of 'virtual' particles flitting in and out of existence in the quantum vacuum. The Standard Model can be used to calculate the size of this deviation, known as g−2, by accounting for all the influences of the various known particles. But because g−2 should be sensitive to undiscovered particles and forces as well, a mismatch between a calculated deviation and an actual measurement could be a sign of new physics beyond the vaunted Standard Model's limits.
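A quick note on bookkeeping, which popular accounts (this one included) tend to gloss over: the quantity physicists actually tabulate and compare is the anomalous magnetic moment,

\[ a_\mu \equiv \frac{g - 2}{2}, \]

so 'g−2' serves as shorthand for the deviation, and the headline number quoted below is a_\mu, with g itself sitting just a sliver above 2.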
That's the hope, anyway. The trouble is that physicists have found two different ways to calculate g−2, and one of those methods, per a separate preprint paper released on May 27, now gives an answer that closely matches the measurement of the muon anomalous magnetic moment, the final result from the Muon g−2 Experiment hosted at Fermilab. So a cloud of uncertainty still hangs overhead: Has the most significant experimental deviation in particle physics been killed off by theoretical tweaks just when its best-yet measurement has arrived, or is the muon g−2 anomaly still alive and well? Vexingly, the case can't yet be conclusively closed.
The Latest Word—But Not the Last
The Muon g−2 Collaboration announced the results on Tuesday in a packed auditorium at Fermilab, offering the audience (which included more than 1,000 people watching via livestream) a brief history of the project and an overview of its final outcome. The heart of the experiment is a giant 50-foot-diameter magnet, which acts as a racetrack for wobbling muons. In 2001, while operating at Brookhaven National Laboratory on Long Island, this ring revealed the first sign of a tantalizing deviation. In 2013 physicists painstakingly moved the ring by truck and barge from Brookhaven to Fermilab, where it could take advantage of a more powerful muon source. The revamped experiment began taking data in 2017. And in 2021 it released a first result that strengthened the earlier hints of an apparent anomaly, which was bolstered further by additional results announced in 2023. This latest result is a capstone to those earlier measurements: the collaboration's final measurement pins the muon's anomalous magnetic moment at 0.001165920705, consistent with previous results but with a remarkable precision of 127 parts per billion. That's roughly equivalent, it was noted during the June 3 announcement, to measuring the weight of a bison to the precision of a single sunflower seed.
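To make that precision concrete: a relative uncertainty of 127 parts per billion on the value above works out to an absolute uncertainty of roughly

\[ \delta a_\mu \approx 0.001165920705 \times 127 \times 10^{-9} \approx 1.5 \times 10^{-10}, \]

that is, the measurement pins the number down to about its tenth decimal place.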
Despite that impressive feat of measurement, interpretation of this result remains an entirely different matter. The task of calculating Standard Model predictions for g−2 is so gargantuan that it brought together more than 100 theorists for a supplemental project called the Muon g−2 Theory Initiative.
'It is a community effort with the task to come up with a consensus value based on the entire available information at the time,' says Hartmut Wittig, a professor at the University of Mainz in Germany and a member of the theory initiative's steering committee. 'The answer to whether there is new physics may depend on which theory prediction you compare against. The consensus value should put an end to this ambiguity.'
In 2020 the group published a theoretical calculation of g−2 that appeared to confirm the discrepancy with the measurements. The May preprint, however, brought a significant change. The difference between theory and experiment is now less than one part per billion, a gap both minuscule and much smaller than the accompanying uncertainties, which has led to the theory initiative's consensus declaration that there is 'no tension' between the Standard Model's predictions and the measured result.
Virtual (Particle) Insanity
To understand what brought about this shift in the predictions, one has to look at one category of the virtual particles that cross the muons' path.
'[Excepting gravity] three out of the four known fundamental forces contribute to g−2: electromagnetism, the weak interaction and the strong interaction,' Wittig explains. The influence of virtual photons (particles of light that are also carriers of the electromagnetic force) on muons, for instance, is relatively straightforward (albeit still laborious) to calculate. In contrast, precisely determining the effects of the strong force (which usually holds the nuclei of atoms together) is much harder and remains the least theoretically constrained part of the g−2 calculation.
Instead of dealing with virtual photons, those calculations grapple with virtual hadrons, which are clumps of fundamental particles called quarks glued together by other particles called (you might have guessed) gluons. Hadrons can interact with themselves to create tangled, precision-scuttling messes that physicists refer to as 'hadronic blobs,' enormously complicating calculations of their contributions to the wobbling of muons. Up to the 2020 result, researchers indirectly estimated this so-called hadronic vacuum polarization (HVP) contribution to the muon's wobble from experimental data on electron-positron collisions that produce hadrons.
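For orientation, the Standard Model prediction is assembled from additive pieces, a breakdown that follows the theory initiative's white papers rather than anything spelled out in the announcement:

\[ a_\mu^{\mathrm{SM}} = a_\mu^{\mathrm{QED}} + a_\mu^{\mathrm{EW}} + a_\mu^{\mathrm{HVP}} + a_\mu^{\mathrm{HLbL}}, \]

where the last term, hadronic light-by-light scattering, is the other, smaller 'hadronic blob' contribution; HVP dominates the theoretical uncertainty.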
One year later, though, a new way of calculating HVP, based on lattice quantum chromodynamics (lattice QCD), a computationally intensive methodology, was introduced and quickly caught on.
Gilberto Colangelo, a professor at the University of Bern in Switzerland and a member of the theory initiative's steering committee, points out that, currently, 'on the lattice QCD side, there is a coherent picture emerging from different approaches. The fact that they agree on the result is a very good indication that they are doing the right thing.'
As the multiple flavors of lattice QCD computation improved and their results converged, though, the electron-positron collision measurements used to estimate HVP went the opposite way. Among seven experiments seeking to constrain HVP and tighten predictive precision, only one agreed with the lattice QCD results, and the experiments also deviated from one another.
'This is a puzzling situation for everyone,' Colangelo notes. 'People have made checks against each other. The [experiments] have been scrutinized in detail; we had sessions which lasted five hours.... Nothing wrong was found.'
Eventually, the theory initiative decided to use only the lattice QCD results for the HVP factor in this year's white paper while work to understand the experimental results continues. The choice moved the total predicted value for g−2 much closer to Fermilab's measurement.
The Standard Model Still Stands Tall
The Standard Model has seen all of its predictions experimentally tested to high precision, earning it the title of the most successful theory in history. Despite this, it is sometimes cast as incomplete or even a failure because it does not address major open questions, such as the nature of the dark matter hiding in galaxies.
When it comes to hard experimental deviations from its predictions, this century has seen many false alarms rise and fall.
If the muon g−2 anomaly goes away, however, it will also take down some associated contenders for new, paradigm-shifting physics; the absence of novel types of particles in the quantum vacuum will put strong constraints on 'beyond the Standard Model' theories. This is particularly true for the theory of supersymmetry, a favorite among theorists, some of whom have tailored a plethora of models that explain the muon g−2 anomaly as the handiwork of as-yet-unseen supersymmetric particles.
Kim Siang Khaw, an associate professor at Shanghai Jiao Tong University in China and a member of the Muon g−2 Collaboration, offers a perspective on what will follow. 'The theory initiative is still a work in progress,' he says. 'They may have to wait several more years to finalize. [But] every physics study is a work in progress.' Khaw also mentions that Fermilab is currently looking into repurposing the experiment's muon storage ring and magnet, exploring further ideas that could be studied with it.
Finally, on the theory front, he muses: 'I think the beauty of [the g−2 measurement] and the comparison with the theoretical calculation is that no matter if there is an anomaly or no anomaly, we learn something new about nature. Of course, the best scenario would be that we have an anomaly, and then we know where to look for this new physics. [But] if there is nothing here, then we can look somewhere else for a higher chance of discovering new physics.'