Why ‘Evolving' Dark Energy Worries Some Physicists

Yahoo | 02-05-2025
In 2024 a shockwave rippled through the astronomical world, shaking it to the core. The disturbance didn't come from some astral disaster at the solar system's doorstep, however. Rather it arrived via the careful analysis of many far-distant galaxies, which revealed new details of the universe's evolution across eons of cosmic history. Against most experts' expectations, the result suggested that dark energy—the mysterious force driving the universe's accelerating expansion—was not an unwavering constant but rather a more fickle beast that was weakening over time.
The shocking claim's source was the Dark Energy Spectroscopic Instrument (DESI), run by an international collaboration at Kitt Peak National Observatory in Arizona. And it was so surprising because cosmologists' best explanations for the universe's observed large-scale structure have long assumed that dark energy is a simple, steady thing. But as Joshua Frieman, a physicist at the University of Chicago, says: 'We tend to stick with the simplest theory that works—until it doesn't.' Heady with delight and confusion, theorists began scrambling to explain DESI's findings and resurfaced old, more complex ideas shelved decades ago.
In March 2025 even more evidence accrued in favor of dark energy's dynamic nature in DESI's latest data release—this time from a much larger, multimillion-galaxy sample. Dark energy's implied fading, it seemed, was refusing to fade away.
Soon afterward, however, Daniel Green, a physicist at the University of California, San Diego, took to social media to argue against the DESI team's preferred interpretation of the data.
'I'm particularly skeptical of DESI's press release,' Green says. 'The tendency should be to say, "Hey, why don't we explore all the possible interpretations?" DESI didn't do that many analyses.' The situation, Green says, is akin to looking for a lost set of car keys in a dark parking lot—but only where the light is bright: 'When all you look under is one lamppost, you only see what you find there.'
Other explanations exist for DESI's measurements, Green says, and not all of them require the cosmos-quaking prospect of an evolving dark energy. His preferred model instead invokes the putative decay of another mysterious aspect of cosmology, dark matter—thought to be a substance that gravitationally binds galaxies together but otherwise scarcely interacts with the rest of the universe at all. Yet his and other alternative proposals, too, have drawbacks, and the resulting scientific debate has only just begun.
The standard cosmological model at the heart of all this is known as 'LCDM.' The 'CDM' component stands for 'cold dark matter,' and the 'L' stands for the Greek letter 'lambda,' which denotes a constant dark energy. CDM is the type of dark matter that best accounts for observations of how galaxies form and grow, and—until DESI's proclamation suggested otherwise, that is—a constant dark energy has been the best fit for explaining the distributions of galaxies and other patterns glimpsed in large-scale cosmic structures. 'Once they had this constant, everything snapped into place,' Green says. 'All of the issues that had been around for 20 years that we'd been hoping were just small mistakes were really resolved by this one thing.'
But dark energy's constancy has always been more of a clever inference than an ironclad certainty. DESI is an effort to clarify exactly what dark energy really is by closely monitoring how it has influenced the universe's growth. Since 2021 the project has been meticulously measuring the motions and distributions of galaxies across some 11 billion years of cosmic time.
DESI's data on galactic motions come from measurements of redshift, the stretching out of galaxies' emitted light to the red end of the spectrum by the universe's expansion. And its tracing of spatial distributions emerges from spying enormous bubblelike arrangements of galaxies thought to have formed from more primordial templates, called baryon acoustic oscillations (BAOs). BAOs are essentially ripples from giant sound waves that coursed through the hot plasma that filled the early universe, which astronomers can glimpse in the earliest light they can see, the big bang's all-sky afterglow known as the cosmic microwave background (CMB). The waves' matter-dense crests sowed the seeds of future galaxies and galaxy clusters, while galaxy-sparse voids emerged from the matter-poor troughs. Combined with CMB data as well as distance-pegging observations of supernovae, DESI's measurements offer a reckoning of the universe's historic growth rate—and thus the action of dark energy.
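The redshift measurement described above reduces to a simple ratio of wavelengths. A minimal sketch (the wavelengths below are illustrative numbers chosen for clarity, not DESI data):

```python
def redshift(lambda_observed, lambda_emitted):
    """Cosmological redshift: z = (observed - emitted) / emitted."""
    return (lambda_observed - lambda_emitted) / lambda_emitted

# Illustrative numbers: a hydrogen-alpha line emitted at 656.3 nm
# that arrives stretched to twice its wavelength has redshift z = 1,
# meaning the universe doubled in size while the light was in transit.
z = redshift(1312.6, 656.3)
print(f"z = {z:.2f}")
```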
DESI co-spokesperson Nathalie Palanque-Delabrouille, a physicist at Lawrence Berkeley National Laboratory, recalls the private December 2023 meeting where she and the rest of the DESI team first learned of the project's early results. Up until then, the researchers had worked on blinded data, meaning the true values were slightly but systematically altered so as to ensure that no one could deliberately or inadvertently bias the ongoing analysis to reach some artificially preordained result. These blinded data showed a huge divergence from LCDM. But when the real data were unveiled, 'we saw all the points came very close to LCDM, and that was initially a huge relief,' she recalls. That alignment suggested 'we did things right.'
Those feelings quickly changed when the group noticed a small, persistent deviation in DESI's estimate for the value of lambda. Still, there was a considerable chance that the results were a statistical fluke. But in DESI's latest results, which were posted to the preprint server arXiv.org last March and incorporated much larger and richer data sets, the statistical robustness of the unexpected lambda value soared, and most talk of flukes dwindled.
Theorists could scarcely contain their excitement—or their profound puzzlement. The results rekindled preexisting ideas about dynamic dark energy first formulated decades ago, not long after dark energy's discovery itself in 1998. One popular theory posits a fifth fundamental force in addition to the known four (electromagnetism, gravity, and the strong and weak nuclear forces), emerging from some as-yet-undiscovered dark matter particle that can influence dark energy. Frieman says the data from DESI is so precise that if this particle is the correct explanation, physicists already know its crucial parameters.
Constrained by the DESI data, Frieman says, the best-fitting model that would support this 'fifth force' hypothesis 'tells us that this [hypothetical] particle has a mass of about 10⁻³³ electron volts.' To put that into perspective, this means such a particle would be 38 orders of magnitude lighter than an electron—which, Frieman notes, is 'by far the lightest stable particle we know of that doesn't have zero mass.'
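Frieman's comparison checks out with back-of-the-envelope arithmetic, using the electron's rest mass-energy of roughly 511,000 electron volts:

```python
import math

m_electron_eV = 5.11e5   # electron rest mass-energy, ~511 keV
m_particle_eV = 1e-33    # mass quoted for the hypothetical fifth-force particle

orders = math.log10(m_electron_eV / m_particle_eV)
print(f"~{orders:.1f} orders of magnitude lighter")
# Gives roughly 38.7, consistent with the "38 orders of magnitude" in the text.
```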
But while some theorists used DESI's data to revive and sharpen intriguing theories of yesteryear, Green and others issued a warning. The problem: an evolving dark energy would seem to defy well-founded physical principles in other cosmic domains.
The first major point of controversy involves something called the null energy condition, under which—among other things—energy can't propagate faster than light. If circumstances were otherwise, then perilous paradoxes could emerge: time machines could violate causality, matter could repel rather than attract, and even spacetime itself could be destabilized. Theorists have mathematically proven the condition's apparent necessity in numerous circumscribed scenarios within quantum and relativistic domains—but not for the universe at large. Appealing to this sort of theoretical incompleteness, however, 'is like a lawyer saying there's a loophole,' Green says. 'Most physicists would say that's totally crazy.'
A discovery that something in the universe violates the null energy condition would be groundbreaking, to say the least: a more impolitic term would be 'nonsensical.' This astounding violation is exactly what Green and others say most of DESI's analyses are showing, however. On this point, several theorists push back. The controversy goes all the way down to the foundations of modern cosmology, centering on a parameter unceremoniously known as w(z).
In 1917 Albert Einstein first introduced lambda as a way to ensure that a static universe would pop out of his equations. But after work led by Edwin Hubble proved the universe was expanding, Einstein abandoned his fudge factor (even calling it his 'greatest blunder'). It wasn't until the late 1990s, when astronomers found that the universe's expansion wasn't constant but in fact accelerating, that lambda once again returned to theoretical prominence. This time theorists interpreted it to represent the magnitude of the universe's dark energy density, a constant that doesn't change with time.
But if there's one thing modern cosmology has shown, it's that little, if anything, about the universe is ever so neat and tidy. So, despite a lack of evidence, theorists of the time reimagined LCDM as w(z)CDM, where w(z) is a time-varying term representing the ratio of dark energy's pressure to its energy density. When w(z) has a value of exactly –1, w(z)CDM is equivalent to LCDM. For w(z) greater than –1, the universe's dark energy dilutes over time, consistent with DESI's findings. On the other hand, w(z) less than –1 leads to devastating consequences: dark energy's pressure overpowers its density, ultimately causing everything from galaxies all the way down to atoms to be ripped apart—a 'big rip' that violates the null energy condition and would seemingly doom the universe to a violent death.
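The role of w can be made concrete with the standard scaling relation for a constant equation of state, in which the dark energy density goes as a^(−3(1+w)) with the cosmic scale factor a. A minimal illustration of the three regimes described above (not DESI's actual fitting machinery):

```python
def dark_energy_density(a, w, rho_0=1.0):
    """Dark energy density relative to today (a = 1) for a constant
    equation-of-state parameter w: rho(a) = rho_0 * a**(-3 * (1 + w))."""
    return rho_0 * a ** (-3 * (1 + w))

# By the time the universe has doubled in size (a = 2):
for w in (-1.0, -0.9, -1.1):
    rho = dark_energy_density(2.0, w)
    print(f"w = {w:+.1f}: density is {rho:.3f} x today's")
# w = -1.0 -> unchanged: the cosmological constant, lambda
# w > -1   -> dark energy dilutes over time, as DESI's data suggest
# w < -1   -> dark energy grows over time: the 'big rip' scenario
```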
The DESI collaboration's March preprint includes a graph that shows w(z) with values below –1 for later epochs in the universe's history, seemingly validating the criticisms of Green and others. But all is not as it seems. Such criticisms 'draw the wrong conclusions,' says Paul Steinhardt, a cosmologist at Princeton University.
That's because in a second graph in the DESI paper, w(z) never crosses the critical –1 line. The difference: despite DESI's curved data, the first chart uses a simple line fit for w(z). Steinhardt and Frieman both say that because of the poor fit, the linear w(z) isn't physically meaningful. Researchers merely find it convenient for comparing different dark energy models and experiments.
The second graph shows a curved fit for w(z) that more closely matches the data. It rolls down to, but never crosses, the critical –1 value, consistent with a weakening dark energy that would avoid the universe ending in a big rip.
But Gabriel Lynch, a Ph.D. student at the University of California, Davis, who has an alternative explanation for the DESI data, says that even if any of DESI's w(z) estimates are physical, coaxing out a theory to support them leads to incredibly fraught circumstances. 'This is saying something weird,' Lynch says. 'It's not impossible, but maybe it would be good to look into some alternatives.'
Whether or not DESI's results would violate the null energy condition, everyone agrees on another problem. Models that accommodate a changing dark energy inevitably conclude that a class of tiny fundamental particles known as neutrinos have a negative mass. Yet multiple generations of experiments have shown beyond dispute that neutrinos have mass, and a positive one at that. Frieman suggests that something else, perhaps an unknown particle, might be mimicking a negative-mass neutrino.
But a new approach by Lynch and his thesis advisor Lloyd Knox, detailed in a preprint that was posted to arXiv.org in March, sidesteps this 'negative neutrino' problem altogether. If some of the mass in the universe somehow disappeared over time, its influence on DESI's data would be the same as a weakening dark energy—without necessitating a negative mass for neutrinos. Although physicists have good reasons to believe that certain seemingly stable subatomic particles could contribute to this notional effect by decaying over time, this process is thought to be far too slow to account for DESI's observations. For instance, experiments have shown the proton to be so stable that its half-life must be at least a hundred trillion trillion times the age of the universe. But no one knows what the half-life of putative particles of dark matter would be. So, Lynch asks, what if dark matter has a half-life of roughly a billion years? Fast forward about 14 billion years to today, and some would have decayed into dark radiation, erasing the heavy matter signal.
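Lynch's rough figure is easy to put into numbers with the standard exponential-decay law. A back-of-the-envelope sketch (the half-life and cosmic age here are the article's round numbers, not fitted values from the preprint):

```python
half_life_gyr = 1.0   # the 'roughly a billion years' floated in the text
age_gyr = 14.0        # approximate age of the universe, in billions of years

# Fraction of the original dark matter surviving after n half-lives: 0.5**n
surviving_fraction = 0.5 ** (age_gyr / half_life_gyr)
decayed_fraction = 1.0 - surviving_fraction
print(f"surviving: {surviving_fraction:.1e}, decayed: {decayed_fraction:.4f}")
# With these round numbers, only about 6e-5 of the original dark matter
# would remain today, the rest having decayed into dark radiation.
```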
If the idea holds up, DESI's data could help pin down the exact values of the neutrino masses, and perhaps the properties of dark matter particles as well, which would be a big deal. 'That is a breakdown of LCDM that we totally expected,' Green says. 'And we were just waiting to detect it.'
Owing to dynamic dark energy's paradoxes, 'you really need to explore every alternative explanation [for the results], because evolving dark energy is the absolute last one that I would be willing to believe,' Green says.
Despite such strong words, all parties caution that this debate is still in its early days. 'This is only the first round of the fight,' Steinhardt says, and no model currently explains all of DESI's results. More data are needed, especially from even bigger and better cosmic surveys by planned next-generation telescopes. And, naturally, more analyses are needed, too, before the community can reach any consensus. Whether a resolution comes from dynamic dark energy, dark matter decay or something entirely different, the LCDM model has seemingly been stretched to its breaking point. Every reasonable explanation for DESI's data involves new, scarcely explored physics. 'They are all exotic models. We're beyond LCDM both ways,' Palanque-Delabrouille says. 'We just want to know the truth.'
Orange background

Try Our AI Features

Explore what Daily8 AI can do for you:

Comments

No comments yet...

Related Articles

'Heed our warnings': Nobel laureates plea for diplomacy to prevent nuclear war
'Heed our warnings': Nobel laureates plea for diplomacy to prevent nuclear war

USA Today

timean hour ago

  • USA Today

'Heed our warnings': Nobel laureates plea for diplomacy to prevent nuclear war

Top nuclear experts gathered in Chicago to offer world leaders a playbook for reducing the risk of nuclear war. CHICAGO − In the fall of 2022, U.S. spies said the chances of Russia using tactical nuclear weapons against Ukraine were 50% − a coin flip. Nearly three years later, the risk of nuclear war has only increased, top experts say. The Bulletin of the Atomic Scientists' famed "Doomsday Clock" is the closest it has ever been to midnight. Humanity is 'heading in the wrong direction' on the one threat that 'could end civilization in an afternoon,' warned an assembly of Nobel laureates, nuclear experts, and diplomats gathered at the University of Chicago to mark the 80th anniversary of the planet's first nuclear explosion in 1945 when the U.S. conducted the Trinity test in New Mexico. Although Russia didn't nuke its neighbor, the brutal war of attrition continues in Ukraine. Two nuclear-armed countries, India and Pakistan, attacked each other in May. The U.S. and Israel, which both have nuclear weapons, bombed Iran in June to destroy its nuclear program. Popular support for building nuclear weapons grows in countries like Japan and South Korea. Against this backdrop, more than a dozen Nobel Prize winners and numerous nuclear experts signed a 'Declaration for the Prevention of Nuclear War' on July 16 with recommendations for world leaders to reduce the increasing risk of nuclear conflict. More: 80 years later, victims of 'first atom bomb' will soon be eligible for reparations 'Despite having avoided nuclear catastrophes in the past, time and the law of probability are not on our side,' the declaration says. 'Without clear and sustained efforts from world leaders to prevent nuclear war, there can be no doubt that our luck will finally run out.' The declaration emerged from days of discussion and debate, said assembly leader David Gross, a University of California, Santa Barbara, physicist and 2004 Nobel Prize winner. 
'We are calling on our leaders in the world to consider our suggestions and heed our warnings,' Gross said. Longtime Vatican diplomat and nuclear advisor Cardinal Silvano Maria Tomasi argued that faith leaders should embrace a role in providing world leaders with independent moral and ethical assessments of nuclear policy and technology. International agreements key to reducing risk The declaration and speakers at its unveiling spoke extensively of the crucial role diplomacy and treaties played in building trust between countries with nuclear weapons and shrinking their arsenals after the Cold War. Clock ticks on nuke treaties But a key treaty remains unenforced, and the last remaining arms control agreement between the U.S. and Russia expires in February 2026. The Comprehensive Nuclear Test Ban Treaty, or CTBT, is a 1996 international agreement that aims to ban explosive nuclear tests. Although the CTBT Organization, headquartered in Vienna, Austria, successfully detects even underground nuclear tests (and identifies when suspicious seismic events aren't test explosions), the treaty is not in force. Nine more countries, including the U.S. and Russia (which de-ratified the CTBT in 2023), must formally approve the treaty before it becomes binding international law. At the assembly, CTBTO leader and former Australian diplomat Robert Floyd joined the Nobel winners in calling the international community to formally approve the testing ban. Floyd argued that if countries with nuclear weapons resumed testing to build more destructive nukes, it could lead 'other states to develop nuclear weapons and … a renewed global nuclear arms race.' The declaration also highlighted the need for the U.S., Russia, and China to enter arms control discussions. The 2010 New START treaty, which limits American and Russian nuclear weapons deployments and enables the rivals to verify the other's cooperation, expires in February 2026. 
AI and the atom bomb Artificial intelligence and its role in nuclear weapons matters also weighed heavily. The declaration emphasized the 'unprecedented and serious risks posed by artificial intelligence' and implored 'all nuclear armed states to ensure meaningful and enhanced human control and oversight over nuclear command and control.' Tomasi, the Vatican's representative, said scientists, disarmament experts and faith leaders need to study 'the ethical implications of emerging technologies,' such as AI, on 'nuclear stability.' World leaders, including former President Joe Biden and Chinese President Xi Jinping, generally agree that humans − and not AI algorithms − should control nuclear launch buttons. But debate rages over the ideal, and safe, extent of integrating AI into other nuclear functions such as early warning, targeting, and communications. A February 2025 report from the Center for a New American Security think tank on AI nuclear risk warned that 'overreliance on untested, unreliable, or biased AI systems for decision support during a crisis' could potentially lead decision-makers down an escalatory path during a nuclear crisis. Ultimately, argued Nobel winner Gross, progress in reducing the risks of nuclear weapons hinges on popular pressure on world leaders. 'The main motivation for the advances in reducing the risk of Armageddon was the fear of many … people throughout the world who demanded (action) from their leaders,' Gross said. Davis Winkie's role covering nuclear threats and national security at USA TODAY is supported by a partnership with Outrider Foundation and Journalism Funding Partners. Funders do not provide editorial input.

Why Nanomotion Leads the Way in Motion Systems Innovation
Why Nanomotion Leads the Way in Motion Systems Innovation

Time Business News

time3 hours ago

  • Time Business News

Why Nanomotion Leads the Way in Motion Systems Innovation

When it comes to precision movement in demanding environments, Nanomotion has firmly established itself as a trailblazer in motion systems innovation. With a product line that includes advanced motion solutions, sub-system modules, and highly specialized piezo motor/drive components, Nanomotion caters to a wide spectrum of industries—from medical devices and semiconductor fabrication to cutting-edge optronics. Leveraging proprietary technology rooted in the Piezoelectric Effect, Nanomotion's solutions are helping redefine what's possible in precision positioning and motion control. At the heart of Nanomotion's innovation lies its patented Ultrasonic Standing Wave Motor technology. Unlike traditional electromagnetic motors, these advanced motion systems offer unlimited linear and rotary movement with ultra-quiet operation, making them ideal for high-performance applications in noise-sensitive environments. The unique nature of ultrasonic motion allows Nanomotion motors to perform reliably under vacuum, in cleanrooms, and in harsh environments, all while achieving unparalleled precision. Whether it's micro-machining in semiconductor equipment or targeting accuracy in defense optronics, Nanomotion's motors are built to meet the highest standards. Their technology supports multiple motor sizes and power levels, offering flexibility for low-power applications—such as optronic sensors—and high-performance industrial automation systems. These motors operate within a closed-loop servo system, ensuring consistent, repeatable motion with nanometer-level resolution. So what makes Nanomotion's Piezoelectric Motors so powerful and precise? The answer lies in the Piezoelectric Effect, a phenomenon where certain materials generate an electrical charge when mechanical stress is applied—and conversely, deform when subjected to an electric field. 
Derived from the Greek word piezein meaning 'to squeeze or press,' this effect is what allows Nanomotion's piezoelectric elements to achieve sub-micron movements with high force. First discovered by Pierre and Jacques Curie in 1880, this principle has been refined and applied to numerous modern technologies—from ultrasound machines and microphones to atomic-level microscopes. Nanomotion uses this effect not only for its high precision but also for its responsiveness, scalability, and ability to operate silently. These properties are especially important in applications that demand both accuracy and compact design—two characteristics where Nanomotion's solutions excel. Nanomotion has leveraged the unique properties of piezo crystals to engineer a family of motors that offer unmatched accuracy, flexibility, and reliability. These Piezoelectric Motors are capable of generating precise motion by applying directional force through ultrasonic vibration. The motor consists of ceramic elements that, when energized, create waves which then drive a ceramic strip or platform forward. One of the standout features of these motors is their ability to maintain a static position without consuming power—providing inherent braking and eliminating servo dither. With motor configurations ranging from a single piezo element (producing 0.4Kg of force) to an eight-element version (producing 3.2Kg of force), Nanomotion motors can support a wide range of applications. 
Key features include: Linear and Rotary Motion : One motor design for multiple movement types : One motor design for multiple movement types Wide Dynamic Range : Speed from microns/second to 250mm/sec : Speed from microns/second to 250mm/sec Compact Form Factor : Ideal for embedded systems and small-scale assemblies : Ideal for embedded systems and small-scale assemblies High Holding Force: Enables inherent braking without power draw These characteristics make Nanomotion's Piezoelectric Motors ideal for industries like medical imaging, semiconductor lithography, defense targeting systems, and industrial automation. The piezoelectric materials that power Nanomotion's motors come in both natural and synthetic forms. Natural materials like quartz and tourmaline were the first to be used, but modern applications typically rely on synthetic materials like barium titanate and lead zirconate titanate (PZT) due to their higher piezoelectric constants. In response to environmental and regulatory demands (such as the EU's RoHS directive), there's an ongoing push toward lead-free alternatives. Nanomotion continues to explore these greener materials to meet sustainability goals without compromising performance. Nanomotion's motion systems are not generic solutions—they're precision-crafted for specific high-performance environments. Their motors and motion platforms are often integrated into: Semiconductor Manufacturing : Where vibration-free, cleanroom-compatible components are a must. : Where vibration-free, cleanroom-compatible components are a must. Medical Imaging and Surgical Tools : Where silent operation and pinpoint accuracy are critical. : Where silent operation and pinpoint accuracy are critical. Aerospace and Defense Optronics: Where real-time targeting and sensor alignment can't afford a single misstep. 
What sets Nanomotion apart in these fields is their ability to maintain performance under extreme constraints—whether it's limited space, high-vacuum conditions, or the need for complete electromagnetic silence. There are many reasons why engineers and system designers across the globe choose Nanomotion as their motion systems provider. Here are a few of the standout benefits: Silent Operation : Perfect for sensitive environments like labs and medical facilities. : Perfect for sensitive environments like labs and medical facilities. Nanometer-Level Precision : Enabling ultra-fine positioning for demanding applications. : Enabling ultra-fine positioning for demanding applications. Modular Design : Making it easy to integrate into existing systems. : Making it easy to integrate into existing systems. Low Power Consumption : Especially important for mobile and battery-operated systems. : Especially important for mobile and battery-operated systems. Proven Reliability: Trusted across mission-critical industries like defense and semiconductor manufacturing. Nanomotion's commitment to innovation is evident not just in their patented motor technology but in their deep understanding of motion system integration—from individual piezo components to complete motion platforms. With the industry pushing for more compact, efficient, and intelligent devices, the role of high-precision motion systems has never been more critical. Nanomotion is at the forefront of this revolution, combining deep technical expertise with real-world application knowledge to deliver solutions that push boundaries. Whether you're building the next breakthrough in semiconductor tech or developing advanced surgical instruments, Nanomotion offers the tools and support needed to make your system smarter, quieter, and more precise. From the early research of the Curie brothers to today's advanced Piezoelectric Motors, the journey of piezoelectric technology has been long and transformative. 
And in this journey, Nanomotion leads the way—not just in theory, but in every application, module, and movement. If you're ready to integrate state-of-the-art motion systems into your next innovation, it's time to explore what Nanomotion can do for you. Learn more about Nanomotion's products and how they're redefining motion control at the micro and macro scale. TIME BUSINESS NEWS

Can U.S. Math Research Survive NSF Funding Cuts?
Can U.S. Math Research Survive NSF Funding Cuts?

Yahoo

timea day ago

  • Yahoo

Can U.S. Math Research Survive NSF Funding Cuts?

A 72 percent reduction in federal funding is devastating to math research. The American Mathematical Society is offering $1 million in backstop grants—but it's likely not enough. Mathematics research typically requires few materials. To explore the secrets of prime numbers, investigate unimaginable shapes or elucidate other fundamental mysteries of our universe, mathematicians don't usually need special labs and equipment or to pay participants in clinical trials. Instead funding for mathematicians goes toward meetings of the mind—conferences, workshops and institutes where they gather for intensive sessions to work out math's knottiest problems. Funding also supports the stipends of research fellows, postdoctoral scholars and promising early-career mathematicians. But under the Trump administration's National Science Foundation, much of this funding is being revoked or cut—which, according to experts, could be catastrophic for the present and future of the field. In one recent example, the NSF canceled funding for the Association for Women in Mathematics' research symposium in Wisconsin just four business days before the event was set to begin in May. The threat to this event catalyzed the American Mathematical Society to offer $1 million in backstop grants to support programs whose federal funding has been cut or remains in limbo. These grants are meant to provide a financial safety net that will temporarily allow math programs, researchers and departments to continue operating—but it's not a permanent solution. (Disclosure: The author of this article currently has a AAAS Mass Media Fellowship at Scientific American that is sponsored by the American Mathematical Society.) 'The funding cut is severe, and all of mathematics will be impacted,' says Raegan Higgins, president of the Association for Women in Mathematics and a mathematician at Texas Tech University. 
[Sign up for Today in Science, a free daily newsletter] Movies and television shows often portray mathematicians scribbling on chalkboards in seclusion, but that picture is often far from accurate. 'None of us work in isolation,' Higgins says. In fact, mathematicians rely heavily on their ability to gather and discuss ideas with their peers—perhaps even more than researchers in other fields do. For mathematicians, conferences, workshops and research talks are not just opportunities to share research and network but also crucial moments to work out tough problems together with colleagues, pose field-propelling questions and generate new ideas. 'It's a thinking science, [and] it's a communication science, so we rely on being together to share ideas and to move the needle forward,' says Darla Kremer, executive director of the Association for Women in Mathematics. According to John Meier, CEO of the American Mathematical Society, 'the ability of mathematicians to gather and talk with each other is absolutely central to the vitality of the field.' Federal dollars, largely through the NSF, are responsible for a significant portion of math funding. But a lot of that funding is disappearing under the Trump administration. In April NSF staff members were instructed to 'stop awarding all funding actions until further notice.' Over the past 10 years, on average, the NSF has awarded $113 million in grants to mathematics by May 21 of each year. This year the NSF has awarded only $32 million, representing a 72 percent reduction. By this metric, mathematics is one of the most deeply affected subjects, second only to physics, which has seen an 85 percent reduction. The administration is also canceling and freezing funding that it had previously promised to researchers. More than $14 million of funding already promised to mathematics programs was revoked earlier this year, according to an analysis by Scientific American. 
In response to a request for comment, the National Science Foundation told Scientific American that 'the agency has determined that termination of certain awards is necessary because they are not in alignment with current NSF priorities and/or programmatic goals.' This withdrawal of grants is eroding trust and seeding uncertainty, experts say, and it comes with long-term consequences. Even if funding gets renewed again later, it can be very difficult for halted programs to recover. 'If you have to shut down a lab and mothball it, that actually takes time and effort,' Meier says. 'You can't just walk in two weeks later, flip a switch and have everything running again. You've got to rebuild it.' Even in mathematics, that process of rebuilding is time-intensive and not always possible if the space has been reallocated or the people have moved on. American Mathematical Society leadership fears these cuts will hurt young mathematicians the most. Like in the sciences, the funding cuts are eliminating research experiences and supportive programming for undergraduates, fellowships for graduate students and positions for postdoctoral researchers. Travel funding for conferences is also disappearing, which leaves young researchers to choose between shelling out for airfare and lodging they can't really afford and forgoing major career and research building opportunities. As these opportunities disappear, young mathematicians are beginning to look elsewhere—either to more lucrative jobs in the private sector or to more supportive countries. 'We worry about diminishing opportunities in the United States and people early in their career deciding that maybe there's a more profitable venue for them to pursue mathematics in another country,' Meier says. 'We love good mathematics wherever it arises, but we'd really like to see a lot of it arising in the United States. We think that's very, very important.' 
The $1 million in backstop grants can't fill the hole left by the more than $14 million in promised funding that has been denied or the more than $80 million in reduced funding so far this year. But it might be enough to keep many projects afloat simply by offering guaranteed access to funds in a turbulent time. 'I think one of the great difficulties that we're dealing with right now is the high level of uncertainty,' Meier says. Some mathematicians, for example, simply don't know whether their projects are still being funded. In some applications for the backstop grants, researchers 'basically talk about being ghosted,' Meier explains. 'They say, 'I can't actually verify that we no longer have funding. I can only tell you my program officer [at the NSF] isn't replying to my request for information.'' Meier hopes the grants can provide some backup for programs that aren't sure where they stand with the NSF.

Without that backup, researchers, universities and independent organizations may find themselves facing impossible situations. Do they pay their research assistants, run their conferences and continue to fund travel out of pocket, assuming all the financial risk themselves and hoping the grants come through? Or do they halt their projects, losing valuable momentum and perhaps leaving important stakeholders unpaid for their work?

Still, the backstop grants are a one-time offering—not a sustainable source of funding for an imperiled field. 'I really view them as trying to take a little bit of the sharp edges off of the sudden loss of funding, as opposed to anything that could sustain the field long-term,' Meier explains. The effects of the Trump administration's cuts to mathematics research—unlike research on, say, Alzheimer's disease, vaccines or climate change—may not be the most immediately concerning to human health and safety. But experts like Meier say that ignoring the role mathematics plays in those fields' development is shortsighted.
As a spokesperson for the NSF itself put it in response to an inquiry about the organization's changing priorities (and as the agency has said on its website), 'Mathematical sciences are crucial to everyday society and play an essential role in the innovation engine that drives the U.S. economy, strengthens national security and enhances quality of life.' And the search for answers to math's biggest mysteries also seeds development in physics, earth science, biology, technology and more. Any progress we make on these questions in the future, Meier says, is 'based entirely [on what] we are doing in research mathematics right now.'