
Scientists reconstruct 10,500-year-old woman's face using DNA
A team led by scientists from Ghent University found that the woman would have had blue eyes and slightly lighter skin than most other people from the Mesolithic period in Western Europe who have been analyzed to date, according to a statement from the university on Tuesday.
Isabelle De Groote, an archaeologist at Ghent University who leads the research project on Mesolithic Belgium, told CNN that the woman came from the same population group as the Cheddar Man, who lived in what is now the United Kingdom at around the same time, but had lighter skin.
The findings challenge previous assumptions that European hunter-gatherers shared the same genetic makeup, and demonstrate that there was already considerable variation in skin color among different populations, said De Groote.
'From the skull we could also tell that she was somewhere between 35 and 60 years old,' De Groote told CNN on Wednesday.
'She also had a nose with a high nasal bridge, which is similar to Cheddar Man,' De Groote added. 'She also has strong brow ridges despite being a female.'
The woman's remains were found in the Margaux cave in Dinant during an archaeological dig in 1988-1989 alongside the bodies of eight other women, said De Groote.
This was 'an unusual finding' as most Mesolithic burial sites contain a mixture of men, women and children, she added.
'Many of the skeletons were sprinkled with ochre, a practice associated with ritual or symbolic behavior,' said De Groote.
Most of the bodies were carefully covered with stone fragments, while one individual had cut marks on her skull that were made after her death, she added.
'Also interesting is that this burial cave was used over a period of several hundred years, so these were places of memory that people would return to despite their mobile hunter-gatherer lifestyle,' said De Groote.
'These findings point to complex burial customs and raise intriguing questions about the social structure and cultural practices of this early hunter-gatherer community,' she added.
Philippe Crombé, an archaeologist at the university who is part of the project team, said that the ancient woman's skin color was 'a bit of a surprise,' but there's a limited pool of Mesolithic people with whom to compare.
'All individuals so far analyzed on ancient DNA in Western Europe have belonged to the same genetic group,' he said.
'So it's a bit of a surprise, but on the other hand, it is to be expected that in the wide area of Western Europe there's some variability, as there is today.'
When the remains were recovered there was no way to conduct research into ancient DNA, said Crombé.
'Techniques have developed since the excavation,' he told CNN on Wednesday, adding that the interdisciplinary project is 'a re-analysis of old excavations using state of the art methods.'
Crombé detailed how 'quite good quality' DNA was taken from the woman's skull, allowing for the creation of 'a very detailed reconstruction.'
Her skin color, hair color and eye color are all based on ancient DNA, while other elements such as her jewelry and tattoos are based on archaeological data obtained from other excavations in the River Meuse basin, which also allowed the team to build a picture of her daily life.
At one excavation – a former campsite on the banks of the river – scientists found stone tools, bones from wild game and fish remains, said Crombé, providing evidence that these people would have been nomadic.
'They're still moving around because they are entirely dependent on natural resources: wild game, wild plants, fish,' he said. 'So that forced them to move through the landscape and to move their settlements.'
Many questions remain about these Mesolithic communities, which were the last hunter-gatherers in Western Europe, said Crombé.
Now the team are analyzing the remains to piece together the relationships between people who were buried together, and also plan to study the extent to which they would have eaten fish, he added.
Related Articles
Yahoo · 37 minutes ago
New blood test detects cancers 3 years before typical diagnosis, study hints
Blood plasma can harbor DNA changes that could flag cancer years before existing diagnostic tests, an early study hints. The recent study, published May 22 in the journal Cancer Discovery, found traces of free-floating DNA from dead precancerous or cancerous cells in plasma that had been donated three years before a diagnosis.

"It's an important step toward preclinical cancer detection, which could potentially revolutionize cancer screening," said Catherine Alix-Panabières, a cancer researcher at the University of Montpellier in France who was not involved with the work. "Earlier detection typically correlates with better outcomes across many cancer types due to earlier intervention," she told Live Science in an email.

The prognosis for cancer patients generally grows worse the later their disease is caught, especially once it has grown and spread to other tissues. Yet the gene changes, or mutations, that give rise to tumors tend to appear decades beforehand.

Consultant oncologist Dr. Yuxuan Wang at Johns Hopkins University and her colleagues wanted to see if they could detect tumor DNA in plasma long before cancer manifests. They examined plasma, the liquid that blood cells are suspended within, that was collected from patients roughly 40 years ago for an unrelated study. They focused on 26 participants who had developed cancer within six months of donating blood, as well as 26 controls who did not develop cancer for at least 17 years post-donation.

Wang's group found between one and three common cancerous mutations in seven of the plasma samples, all of which were taken from participants who developed cancer within four months of donating blood. Six of these patients had also donated blood between 3.1 and 3.5 years beforehand, so Wang's team turned back the clock further and assessed those earlier samples for the same mutations. Two of the early samples contained the same DNA errors, confirming that these warning signs were detectable years before the tumors appeared, at least in some people.

Since they found only a few common mutations in two of the six plasma samples taken three years before diagnosis, the researchers then sequenced the plasma DNA to find additional mutations that were unique to each patient. Using the genomes of the patients' white blood cells (a type of immune cell) as a reference, they found between four and 90 unique mutations in the plasma DNA from three patients. All told, they found hints of cancer in three of the five early samples they examined.

The patients in this study had a variety of cancers, including breast, colon, liver, lung, pancreatic, and rectal cancer. However, it's not clear whether the testing method works equally well for all tumor types. "Some organs will shed tumor DNA more than others," Wang told Live Science, noting that the blood-brain barrier, a protective membrane, may prevent brain cancer DNA from crossing out of the organ and into the bloodstream.

In addition, the new research didn't find any cancer DNA in 18 of the 26 participants who developed tumors in the months after their samples were collected. That's not ideal for a clinical test, Wang said. But she suggested that detection could potentially improve if doctors took larger volumes of plasma from each patient.

Since the test could potentially detect cancer years before symptoms first appear, it could one day be useful for screening patients preemptively. However, further experiments are needed to ensure this diagnostic doesn't lead to false positive results, which could unnecessarily alarm patients and possibly lead to unnecessary treatments or invasive diagnostic procedures, like biopsies.

"Ethically, implementing such tests in routine screening would require clear guidelines on how to handle incidental findings," Alix-Panabières said. And because the study included plasma samples from only 52 people, larger investigations involving hundreds or thousands of participants would be needed to validate the test before doctors could use it with confidence. "Realistically, widespread clinical adoption may take another 5–10 years," Alix-Panabières predicted.

Finding personalized mutations requires sequencing the patient's DNA, which can cost several hundred to several thousand dollars, Wang said. So even if such a test can be validated in larger trials, it's "probably not going to be something we can provide for everyone who we want to screen," and the test may need to be reserved for at-risk groups, such as people whose families have known histories of cancer.

The recent study consisted mostly of Black and white men and women between the ages of 45 and 64 from four U.S. states. Future investigations could explore the efficiency of the test in people from other, genetically diverse backgrounds. This article is for informational purposes only and is not meant to offer medical advice.
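The detection rates discussed above can be made concrete with a little arithmetic, using only the counts reported in the study summary (26 cancer cases, no signal in 18 of them; personalized sequencing flagging 3 of the 5 early samples examined). This is a back-of-the-envelope sketch, not an analysis from the paper itself:

```python
# Back-of-the-envelope arithmetic on the detection figures reported above.
# All counts come from the article text; nothing here is new data.

cancer_group = 26                       # participants who developed cancer within ~6 months
missed = 18                             # cases with no cancer DNA found near diagnosis
detected_near_diagnosis = cancer_group - missed   # 8 cases detected
sensitivity = detected_near_diagnosis / cancer_group
print(f"Detected near diagnosis: {detected_near_diagnosis} of {cancer_group} "
      f"({sensitivity:.1%})")           # 8 of 26 (30.8%)

# In the earlier samples (3.1-3.5 years pre-diagnosis), adding
# personalized mutation sequencing found signals in 3 of the 5 examined.
early_detected, early_tested = 3, 5
print(f"Early samples flagged: {early_detected / early_tested:.0%}")  # 60%
```

The roughly 31% near-diagnosis sensitivity is what Wang characterizes as "not ideal for a clinical test," which is why the article flags larger plasma volumes and bigger cohorts as next steps.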
Yahoo · 8 hours ago
What a new study suggests about pregnancy diet and type 1 diabetes—and why it's not about being perfect
It's not about being perfect; it's about having the tools to make informed choices.

When you're pregnant, it can feel like everything comes with a warning label. And now, a new study adds another layer to the conversation, this time linking a mom's diet during pregnancy to the risk of type 1 diabetes in her child. But before this sparks anxiety over your last bite of pizza or bowl of pasta, let's take a breath and break down what this research actually means for you and your baby. Because while the study's findings are significant, they aren't meant to shame; they're meant to empower.

Researchers behind a large-scale Danish study, published in the Journal of Epidemiology & Community Health, analyzed data from more than 67,000 mother-child pairs over a 17-year period. They discovered that when pregnant women ate diets higher in inflammation-promoting foods, like processed meats, sugary drinks, and refined carbs, their children were observed to have a 16% higher risk of developing type 1 diabetes for every one-point increase in the diet's inflammatory score. This does not mean the diet caused diabetes, only that a pattern was observed. This dietary score, called the EDII (Empirical Dietary Inflammatory Index), was calculated using food frequency questionnaires filled out around 25 weeks into pregnancy.

Type 1 diabetes is an autoimmune condition, often diagnosed in childhood, in which the immune system mistakenly attacks the body's insulin-producing cells. While genetics play a role, the rising number of cases in developed countries suggests that environmental factors, including prenatal exposures, may also be at play. The study also found that high gluten intake and maternal smoking during mid-pregnancy were independently associated with increased diabetes risk in children, pointing to this stage of pregnancy as a potentially critical window for fetal immune development.
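For readers curious how the reported figure scales, a 16% higher risk per one-point EDII increase is usually read as a multiplicative ratio, so larger score differences compound rather than add. That compounding assumption is the standard interpretation of such ratios, not something the article states explicitly:

```python
# Sketch: how a 16% higher risk per one-point EDII increase compounds.
# Assumes the reported figure scales multiplicatively across points,
# the usual reading of risk ratios, though not stated in the article.

RISK_PER_POINT = 1.16  # 16% higher observed risk per one-point EDII increase

def relative_risk(score_difference: float) -> float:
    """Observed relative risk for a given difference in EDII score."""
    return RISK_PER_POINT ** score_difference

print(relative_risk(1))            # 1.16, i.e. 16% higher
print(round(relative_risk(2), 2))  # a 2-point difference -> about 1.35x
```

So a two-point gap in the inflammatory score corresponds to roughly a 35% higher observed risk, not 32%. Again, this describes an association in the data, not a causal effect.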
'Inflammatory' doesn't just mean sugary foods. In this study, higher EDII scores were associated with frequent intake of:

- Processed or red meats
- Refined grains (like white bread and pastries)
- Fried foods
- Sugary beverages
- Foods containing trans fats

In contrast, lower EDII scores, indicating a more anti-inflammatory diet, were linked to greater consumption of:

- Leafy greens and cruciferous vegetables
- Garlic and tomatoes
- Fruits and whole grains
- Coffee and tea

These food patterns closely resemble the Mediterranean diet, long celebrated for its role in supporting heart health and reducing chronic inflammation.

It's easy to read a study like this and feel an onslaught of food guilt. But here's the reality: this isn't about moral judgment; it's about informed awareness. It's also important to remember that many parents whose children develop type 1 diabetes followed healthy or typical diets. Autoimmune conditions are complex, and no one decision during pregnancy can guarantee or prevent an outcome.

Importantly, the researchers emphasized that their findings are observational, meaning they show associations, not direct cause-and-effect. Still, the patterns were strong enough to suggest that diet during mid-pregnancy may have a meaningful influence on the developing immune system.

So what can you do with this information? You don't need to toss your cravings out the window or obsess over every ingredient. Instead, you can take small, sustainable steps that feel good to you and your body. If you're pregnant (or planning to be), consider these realistic, non-restrictive shifts:

- Add before you subtract. Focus on including more fiber-rich, whole foods before worrying about cutting things out.
- Make swaps where it feels easy. Choose brown rice or quinoa instead of white rice; opt for olive oil instead of butter when you can.
- Don't stress over every meal. It's what you do most of the time, not all of the time, that matters.
- Small shifts count. If access to fresh produce or high-quality ingredients is limited, know that every small shift still counts. Frozen vegetables, canned beans, or even simple substitutions like whole-grain bread are powerful steps.
- Ask for support. A registered dietitian or your OB-GYN can help you personalize your approach based on your cravings, health needs, and energy levels.

This study doesn't mean that every food decision during pregnancy needs to be scrutinized or optimized. It means that we're learning more, and that knowledge can be powerful, especially when it comes from a place of support, not shame. By understanding how inflammation works and how certain foods may influence a child's risk of developing autoimmune conditions like type 1 diabetes, moms can feel more confident making the choices that are right for them. If you're learning this after your pregnancy, or after a diagnosis, it's never too late to apply that knowledge in ways that support your child or future pregnancies. Growth is a sign of strength, not regret. Because you deserve to feel informed, supported, and never judged for doing the best you can with the knowledge you have.

Source: Journal of Epidemiology & Community Health. 2025. 'Association between a pro-inflammatory dietary pattern during pregnancy and type 1 diabetes risk in offspring: prospective cohort study'


Fox News · 10 hours ago
Neanderthals extracted animal fat in advanced food prep process 125,000 years ago: report
Neanderthals living 125,000 years ago in what is now modern-day Germany may have extracted and eaten fat from animal bones through an organized food preparation process that scientists describe as a 'fat factory.'

While excavating the site of a former lake landscape called Neumark-Nord, archaeologists discovered thousands of bones from at least 172 large mammals, along with flint artifacts. The bones, which date back to an interglacial period in which Neanderthals lived, were from animals like red deer and horses, according to a study published on July 2 in Science Advances. While many of the bones that contained less bone marrow were spread out across the archaeological site, researchers observed that many of the marrow-rich bones were located in clusters, sites they call 'fat factories.'

Researchers believe these extinct human relatives used tools to smash the bones into small fragments and then boiled them for hours. The grease, which floated to the surface of the water, could then be skimmed off the top and eaten, providing a calorie-dense food source for the archaic people. Prior to this, evidence of the practice had only dated back to 28,000 years ago, according to the research.

"Neanderthals were clearly managing resources with precision — planning hunts, transporting carcasses, and rendering fat in a task-specific area," Dr. Lutz Kindler, the study's first author, said. "They understood both the nutritional value of fat and how to access it efficiently — most likely involving caching carcass parts at places in the landscape for later transport to and use at the grease rendering site."

Fat was a "life-sustaining" resource for Neanderthals, especially during the winter and spring seasons when carbohydrates were scarce. Their diets consisted largely of animal protein, and consuming lots of protein without other nutrients could lead to a sometimes deadly condition called protein poisoning, the research noted.

"The sheer size and extraordinary preservation of the Neumark-Nord site complex gives us a unique chance to study how Neanderthals impacted their environment, both animal and plant life," Dr. Fulco Scherjon, data manager and computer scientist on the project, said. "That's incredibly rare for a site this old — and it opens exciting new possibilities for future research."

In recent years, scientists have also discovered that Neanderthals went diving for seashells that they could chip with stone hammers into thin and sharp cutting edges. Similarly, another study suggested Neanderthals may have buried their dead with flowers. Researchers Lutz Kindler and Wil Roebroeks did not immediately respond to Fox News Digital's request for comment.