
Teeth Hurt? It Could Be Because Of A 500-Million-Year-Old Fish
The exact origin of teeth -- and what they were for -- has long proved elusive to scientists.
Their evolutionary precursors are thought to be hard structures called odontodes which first appeared not in mouths but on the external armour of the earliest fish around 500 million years ago.
Even today, sharks, stingrays and catfish are covered in microscopic teeth that make their skin rough like sandpaper.
There are several theories for why these odontodes first appeared, including that they protected against predators, helped with movement through the water or stored minerals.
But the new study published in the journal Nature supports the hypothesis that they were originally used as sensory organs which transmitted sensations to nerves.
At first, the study's lead author Yara Haridy was not even trying to hunt down the origins of teeth.
Instead the postdoctoral researcher at the University of Chicago was probing another major question puzzling the field of palaeontology: what is the oldest fossil of an animal with a backbone?
Haridy asked museums across the United States to send her hundreds of vertebrate specimens -- some so small they could fit on the tip of a toothpick -- so she could analyse them using a CT scanner.
She began focusing on dentine, the inner layer of teeth that sends sensory information to nerves in the pulp.
THINGS GET FISHY
A fossil from the Cambrian period called Anatolepis seemed to be the answer she was looking for. Its exoskeleton has pores underneath the odontodes called tubules that could indicate they once contained dentine.
This had previously led palaeontologists to believe that Anatolepis was the first known fish in history.
But when Haridy compared it to the other specimens she had scanned, she found that the tubules looked much more like sensory organs called sensilla of arthropods, a group of animals that includes crustaceans and insects.
The mighty Anatolepis was therefore demoted to the rank of an invertebrate.
For modern arthropods such as crabs, scorpions and spiders, sensilla are used to perceive temperature, vibration and even smell.
How little these features have changed over time suggests they have been serving these same functions for half a billion years.
The researchers said they found "striking" similarities between these features in Anatolepis and vertebrate fish from around 465 million years ago -- as well as some better-known fish.
"We performed experiments on modern fish that confirmed the presence of nerves in the outside teeth of catfish, sharks and skates," Haridy told AFP.
This shows that "tooth tissues of odontodes outside the mouth can be sensitive -- and perhaps the very first odontodes were as well," she added.
"Arthropods and early vertebrates independently evolved similar sensory solutions to the same biological and ecological problem."
Senior study author Neil Shubin, also from the University of Chicago, said that these primitive animals evolved in "a pretty intense predatory environment".
"Being able to sense the properties of the water around them would have been very important," Shubin said in a statement.
Haridy explained that over time, fish evolved jaws and "it became advantageous to have pointy structures" near their mouth.
"Little by little some fish with jaws had pointy odontodes at the edge of the mouth and then eventually some were directly in the mouth," she said.
"A toothache is actually an ancient sensory feature that may have helped our fishy ancestors survive!"

Related Articles


The Hindu
Gold superheated far beyond its melting point can stay solid
When gold is heated extremely quickly, it can remain solid at temperatures far beyond the point where it should have become liquid, a new study in Nature has found.

The term for a solid staying solid at or beyond its melting point is superheating. Most materials can be superheated only within a short range past that point before they promptly melt. Scientists used to think this range was fixed because of a limit called the entropy catastrophe. Entropy is a measure of disorder in a system; when you heat a substance, its entropy increases (among other changes). Previously, scientists thought that if you heated a crystal to about three times its melting temperature, it couldn't stay solid any longer: it would have to melt because its atoms would have become too disordered.

Entropy catastrophe

In 1948, an American chemist named Walter Kauzmann flipped this. He found that when he continuously cooled a liquid below its melting point while preventing it from crystallising, the entropy of the liquid would, beyond a specific temperature, be less than that of a crystal of the same material, which shouldn't be possible. This came to be called the Kauzmann paradox.

Four decades later, Hans-Jörg Fecht from Germany and William Johnson from the US flipped this once more. They reported that a solid superheated to around three times its melting point would eventually, beyond a particular temperature, possess more entropy than its liquid form, which is another impossibility. This temperature was called T_EC, where EC stood for 'entropy catastrophe'.

Both these outcomes are 'catastrophic' because of the second law of thermodynamics, which states that in an isolated system evolving spontaneously, the entropy can't decrease over time. For two phases at the same temperature and pressure, the phase with higher entropy is (loosely speaking) the more disordered one.
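The entropy argument behind both paradoxes can be written down in textbook thermodynamic terms. The following is a standard sketch, not taken from the study, assuming constant pressure and a liquid that is prevented from crystallising:

```latex
% Entropy difference between liquid and solid at temperature T, measured
% from the melting point T_m (\Delta S_{fus} is the entropy of fusion):
\Delta S(T) = S_{\text{liq}}(T) - S_{\text{sol}}(T)
            = \Delta S_{\text{fus}}
              + \int_{T_m}^{T} \frac{C_p^{\text{liq}}(T') - C_p^{\text{sol}}(T')}{T'}\,\mathrm{d}T'

% Kauzmann paradox: on cooling (T < T_m), \Delta S(T) extrapolates to zero at
% some T_K < T_m, below which the liquid would be more ordered than the crystal.
% Fecht-Johnson: on heating (T > T_m), \Delta S(T) extrapolates to zero at
% T_{EC} > T_m, above which the solid would hold more entropy than the liquid.
```

In both directions, the temperature where the extrapolated entropy difference crosses zero marks the 'catastrophe' the article describes.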
As the German physicist Rudolf Clausius interpreted this law, the entropy of an isolated system can't spontaneously decrease, yet that is what the entropy of a solid being higher than that of a liquid implies. The 'catastrophe' is thus a warning that extrapolating to those problematic temperatures in the Kauzmann and Fecht-Johnson experiments doesn't enjoy thermodynamic legitimacy. It's also a sign that something happens before those temperatures to prevent the impossible outcomes.

Heat it quickly

For example, Kauzmann found that the liquid would either crystallise first or turn into a glass well before it reached the 'catastrophe' temperature. This avoidance is why every ordinary piece of glass you come across, like the one on your windows, forms around a glass-transition temperature that's noticeably higher than the problem temperature. Similarly, a crystal melts, or simply vaporises, long before its 'catastrophe' temperature.

The new study with gold explores what happens to these expectations when the metal is heated very quickly. Understanding the limit of how much heat a solid can absorb without changing its phase (i.e. turning from solid to liquid) is important for engineers designing materials that work in extreme environments, such as on planets with brutal atmospheres or in facilities that manufacture them using punishing physical conditions.

As with a lot of research of this type, the new study used a simple process but wasn't possible to conduct until now because the required technologies have only just become accessible. The researchers, from Germany, Italy, the UK, and the US, used powerful laser pulses to heat gold films about 50 nm thick. They used lasers in order to heat the gold rapidly, without giving it time to disintegrate or liquefy. Each pulse lasted only 45 femtoseconds and had a wavelength of just 400 nanometres.
Then the team used a technique called high-resolution inelastic X-ray scattering to determine the gold atoms' temperature. A device emitted streaks of X-ray radiation that struck the gold atoms, only a few picoseconds after the atoms had been heated, and scattered off them. By measuring the changes in the energies of those X-rays and the directions in which they emerged from the nanofilms, the team could deduce how fast the atoms were moving, and from that figure out the temperature. (The temperature of a material reflects the average kinetic energy of its constituent particles.)

Older results stay

Thus the team found that solid gold superheated to 14 times its melting point, leagues beyond the three-times limit, remains solid for a few trillionths of a second, which is a significantly long time in the microscopic realm. The X-ray diffraction patterns revealed that the atoms were still arranged in the ordered pattern typical of solid crystals.

According to the researchers, the rapid heating could outrun the effects that come with heating more slowly. This isn't a gimmick so much as a signal that if a material is heated rapidly enough, there may not actually be an 'entropy catastrophe'. The ultrashort laser pulses ensured the gold atoms didn't have time to 'relax' before the X-ray instrument came on, revealing the nanofilm to have been solid even at a temperature where melting was expected to be unavoidable. In fact, when the researchers calculated the nanofilms' entropy in conditions where the films lacked the time to expand due to heating, they found that the films could never reach the classical catastrophe temperature.

The findings challenge materials scientists' core assumptions about how matter behaves in extreme conditions. At the same time, they don't invalidate the work of Kauzmann or of Fecht and Johnson: the latter two assumed the material they worked with could expand when heated, whereas the new study didn't allow for that possibility.
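The last step, converting a measured average kinetic energy per atom into a temperature, follows the textbook relation ⟨E_k⟩ = (3/2)·k_B·T. A minimal sketch, not code from the study, with the 14× figure used purely as an illustrative input:

```python
# Back-of-the-envelope sketch: recovering a temperature from the mean
# kinetic energy of atoms, via <E_k> = (3/2) * k_B * T.

K_B = 1.380649e-23           # Boltzmann constant, J/K
GOLD_MELTING_POINT_K = 1337.0  # melting point of gold at ambient pressure

def temperature_from_mean_ke(mean_ke_joules: float) -> float:
    """Temperature implied by the average kinetic energy per atom."""
    return 2.0 * mean_ke_joules / (3.0 * K_B)

# Mean kinetic energy per atom corresponding to 14x gold's melting point:
t_superheated = 14 * GOLD_MELTING_POINT_K          # ~18,700 K
mean_ke = 1.5 * K_B * t_superheated                # invert the relation
print(round(temperature_from_mean_ke(mean_ke)))    # recovers 18718
```

The actual experiment infers the atoms' velocities from the energy shifts of scattered X-rays; this sketch only shows the final kinetic-energy-to-temperature conversion.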
Nonetheless, the implications could go beyond the earth. For example, certain substances may be able to survive in the cores of planets or on stars in a particular phase for longer than what models have predicted. Such details may come to light when scientists apply the technique in this experiment to more materials.


Time of India
Why are non-smokers getting Lung Cancer? Scientists cite possible risk factors
Lung cancer, a type of cancer that originates in the lungs, has long been synonymous with smoking. It occurs when cells in the lungs grow uncontrollably and form tumors, potentially interfering with the lungs' ability to function properly. These tumors can also spread to other parts of the body. Lung cancer is a leading cause of cancer-related deaths worldwide. Yet in the recent past, an alarming number of non-smokers have been diagnosed with the disease. In recent years, oncologists and researchers have observed a growing trend: individuals with no history of smoking, sometimes even young, healthy adults, are developing lung cancer. What else, other than smoking, could be the culprit behind one of the deadliest cancers? Scientists are probing the question, and here are the factors that have been narrowed down so far.

The changing face of lung cancer

Traditionally viewed as a smoker's disease, lung cancer has undergone a startling demographic shift. According to statistics from the American Cancer Society, up to 20% of lung cancer cases in the United States now occur in people who have never smoked. In some regions of Asia, this figure rises to nearly 50%, especially among women. This growing incidence among non-smokers has not only perplexed researchers but also reshaped the scientific approach to lung cancer research.

Risk factor #1: Air pollution (and fine particulate matter, PM2.5)

One of the most significant factors behind lung cancer is air pollution, especially fine particulate matter known as PM2.5. These microscopic particles, less than 2.5 micrometers in diameter, can penetrate deep into lung tissues and even enter the bloodstream. Studies have found that long-term exposure to PM2.5 is strongly linked to the development of lung cancer, particularly in non-smokers.
A 2022 study published in Nature by researchers at the Francis Crick Institute found that air pollution triggers pre-existing mutations in lung cells, acting as a catalyst for cancer development. Unlike smoking, which causes new mutations, air pollution may awaken mutations that are already present, essentially turning a dormant risk into an active disease.

Risk factor #2: Radon gas exposure

Radon is an odorless, colorless radioactive gas that occurs naturally from the decay of uranium in soil and rocks. It can seep into homes through cracks in foundations or walls, particularly in poorly ventilated basements. According to the US Environmental Protection Agency (EPA), radon is the second leading cause of lung cancer overall, and the leading cause among non-smokers in the United States. Prolonged exposure to radon increases the risk of lung cancer significantly, especially when combined with other environmental factors. Unfortunately, many people remain unaware of radon exposure because it's undetectable without specialized equipment.

Risk factor #3: Secondhand smoke

This is a factor we keep overlooking. Even if a person has never smoked, secondhand smoke can pose a serious health threat. Inhaling smoke from someone else's cigarettes exposes non-smokers to the same carcinogens as active smokers. The US Centers for Disease Control and Prevention (CDC) reports that secondhand smoke causes approximately 7,300 lung cancer deaths annually in non-smoking adults in the US alone. Children and spouses of smokers are at the highest risk due to prolonged indoor exposure, making it a persistent, yet preventable, risk factor.

Risk factor #4: Genetic susceptibility and mutations

Genetic predisposition also plays a critical role. Certain mutations in the EGFR (epidermal growth factor receptor) gene are far more common in non-smokers diagnosed with lung cancer.
These mutations are particularly prevalent among Asian women and younger patients, suggesting a strong hereditary or ethnic component. Unlike smoking-related lung cancers, which often involve KRAS mutations, non-smoking-related lung cancers tend to respond differently to treatments. Fortunately, the presence of EGFR mutations has led to the development of targeted therapies that improve survival rates for affected individuals.

Risk factor #5: Indoor pollution from cooking fumes

In many developing countries, especially in parts of Asia and Africa, cooking with biomass fuels (like wood, charcoal, or dung) in poorly ventilated kitchens exposes individuals to harmful smoke and toxins. A 2020 study in The Lancet Planetary Health found that indoor air pollution is a major contributor to lung cancer risk, particularly among women who spend more time cooking at home. Oil fumes from high-temperature frying in non-ventilated kitchens have also been linked to increased cancer risk: polycyclic aromatic hydrocarbons (PAHs), produced during frying, are well-known carcinogens.

Risk factor #6: Viruses and infections

Though still under investigation, researchers have begun to explore the role of viruses in triggering lung cancer among non-smokers. Certain strains of human papillomavirus (HPV) and Epstein-Barr virus (EBV) have been detected in lung tumor samples, particularly in patients with no history of smoking. The exact mechanism remains unclear, but it's hypothesized that these viruses may induce cellular changes that lead to tumor formation over time.

The importance of awareness

The rise in lung cancer among non-smokers calls for a broader public health approach. According to the Lung Cancer Research Foundation, lung cancer remains a significant global health concern in 2025, projected to be the leading cause of cancer death worldwide. Approximately 227,000 people are expected to be diagnosed with lung cancer in the US this year, while an estimated 125,000 lives will be lost.
Because lung cancer is often diagnosed in later stages, especially in those not deemed high-risk (i.e., non-smokers), early screening and detection are critical. Several countries are now reconsidering screening guidelines to include long-term exposure to pollution or genetic risk as criteria. Additionally, raising awareness about environmental risks, promoting indoor air quality testing (for radon and other pollutants), and encouraging the use of clean cooking technologies can go a long way in reducing preventable cases.


Time of India
IIT study challenges urban pollution assumptions
Bhubaneswar: Cities are normally regarded as pollution hotspots, with elevated pollution levels compared to surrounding non-urban regions, a pattern commonly known as the 'urban pollution dome' or 'urban pollution island' effect. However, a recent study by IIT Bhubaneswar found that this pattern does not hold true in many northern Indian cities.

"Instead of a concentrated urban pollution dome, these cities display a 'clean island' effect, or what the researchers describe as a 'punctured pollution dome', where the city centres are, unexpectedly, relatively cleaner than the heavily polluted surrounding areas," stated the study, published in the scientific journal Nature.

Researchers V Vinoj, associate professor at the school of earth, ocean and climate sciences, IIT Bhubaneswar, and research scholar Soumya Satyakanta Sethi attributed this unexpected pattern to an 'invisible barrier' formed by a city's tall buildings and uneven structures, which slows down wind. "This limits pollutant dispersion, causing pollution to accumulate within the city and form a typical urban pollution dome. However, this same barrier can also prevent polluted air from outside the city from entering. As a result, in some cases, pollution builds up in areas surrounding the city, making the city centre appear relatively cleaner," said Vinoj.

Based on two decades of high-resolution aerosol data across 141 Indian cities, the study found that southern cities, which are less affected by pollution transported from afar, exhibit classic domes with more pollution inside. In contrast, cities in northern and northwestern India, particularly those in the Indo-Gangetic plain, experience heavy regional and long-range pollution, such as dust. There, the city's barrier blocks incoming pollutants, causing them to accumulate in surrounding non-urban areas and forming what the researchers describe as "clean air domes".

The study stated that around 57% of cities exhibited urban aerosol pollution islands, while the remaining 43% showed urban aerosol clean islands. "Delhi and Mumbai are pollution islands normally, but they become clean islands whenever high dust is transported to the surrounding areas of these cities due to various reasons," said Vinoj. Bhubaneswar, on the other hand, remains a pollution island even when high dust is transported to surrounding areas, he added.

These findings challenge long-standing assumptions about urban air pollution, particularly the notion that transported aerosols simply add up over cities and uniformly degrade air quality. The study also highlights that monitoring air pollution solely at city boundaries may provide an incomplete picture, as the actual dynamics involve a complex interplay of local emissions, regional transport, microclimatic effects, and atmospheric processes.

"Uncovering these hidden atmospheric dynamics is only the beginning. Achieving truly sustainable and climate-resilient cities requires a deeper, integrated understanding of how urban environments interact with atmospheric processes," the researchers concluded.