Red Meat Tied to High RA Risk; Fruits and Veggies Protective

Medscape | July 16, 2025
TOPLINE:
The risk for developing rheumatoid arthritis (RA) rose with high intake of red and processed meat, whereas high intake of fruits and vegetables was protective, in a nested case-control study.
METHODOLOGY:
The risk of developing RA based on dietary habits was explored in a Swedish nested case-control study within a prospective study conducted between January 1991 and September 1996 (n = 28,098).
Diet was assessed at baseline using a 7-day menu book, a 168-item questionnaire, and an interview lasting 45-60 minutes, and each RA patient was matched to four control individuals for analysis.
Compliance with the 2015 Swedish Dietary Guidelines for intake of fiber, fruits and vegetables, fish/shellfish, added sugars, and red or processed meat was scored and categorized as low, moderate, or high.
Data from registries and validated electronic medical records identified RA incidence through December 2016; data on rheumatoid factor and anticyclic citrullinated peptide antibody were also retrieved.
TAKEAWAY:
A total of 305 incident cases of RA were identified (76.1% women; 66.9% seropositive), with a mean age of 56.8 years at baseline and an average 12-year interval until diagnosis.
Consuming < 500 g/wk of red and processed meat (adjusted odds ratio [aOR], 0.60; 95% CI, 0.38-0.97) and ≥ 400 g/d of vegetables and fruits (aOR, 0.64; 95% CI, 0.43-0.94) significantly reduced the odds of developing RA.
Lower odds of developing RA were linked to fruits and vegetables (aOR per SD, 0.70; 95% CI, 0.57-0.87) and fiber (aOR per SD, 0.80; 95% CI, 0.64-0.99) vs higher odds with red and processed meat (aOR per SD, 1.31; 95% CI, 1.07-1.59).
Adhering to intake guidelines for all food types was linked to a lower risk for seropositive RA; conversely, higher consumption of red and processed meat was linked to higher odds of seropositive RA (aOR, 3.43; 95% CI, 1.69-6.96).
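For readers less familiar with the statistics above, here is a minimal Python sketch of how an odds ratio and its 95% confidence interval are computed from a simple 2x2 table. The counts are hypothetical and chosen only for illustration; the study's adjusted ORs come from conditional logistic regression with matching and covariate adjustment, not from a raw table like this.

```python
import math

# Hypothetical 2x2 counts, NOT the study's data: cases and controls split by
# an "exposure" (e.g., low intake of red/processed meat).
exposed_cases, exposed_controls = 40, 220
unexposed_cases, unexposed_controls = 265, 1000

# Crude odds ratio from the 2x2 table
odds_ratio = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Wald 95% CI, computed on the log-odds scale
se_log_or = math.sqrt(1 / exposed_cases + 1 / exposed_controls +
                      1 / unexposed_cases + 1 / unexposed_controls)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```

With these hypothetical counts, the sketch prints an OR below 1 with a confidence interval that just excludes 1, mirroring the pattern of the protective associations reported above.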
IN PRACTICE:
'[The study] findings suggest a dose-response relationship between red/processed meat intake and the risk of seropositive, but not seronegative, RA. This study helps to improve our understanding of the effects of dietary components in RA development, particularly for red and processed meats, but the exact mechanisms behind our findings need further investigation,' the authors concluded.
SOURCE:
This study was led by Rebecka Bäcklund, Lund University, Malmö, Sweden. It was published online on July 10, 2025, in Annals of the Rheumatic Diseases.
LIMITATIONS:
The diet was assessed only at baseline. Data were retrieved from a study that examined the association between diet and cancer, which may have introduced selection bias. Dietary guidelines, especially regarding meat intake, may have changed over time.
DISCLOSURES:
This study received support from the Swedish Rheumatism Association, the Gustav V 80-year fund, the Greta and Johan Kock foundation, and others. The authors declared having no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.

Related Articles

Physicists Create First-Ever Antimatter Qubit, Making the Quantum World Even Weirder

Gizmodo

Readers following our existential physics coverage may remember a recent breakthrough from CERN concerning matter's evil twin, antimatter. An outstanding mystery in physics is that our universe contains more matter than antimatter, contradicting most theoretical predictions. Scientists, therefore, understandably want to explain why and how this is the case. CERN announced yet another significant leap for studying antimatter—and this time, the achievement creeps into the realm of quantum computing.

In a Nature paper published on July 23, CERN's Baryon Antibaryon Symmetry Experiment (BASE) collaboration announced the first-ever demonstration of an antimatter quantum bit, or qubit—the smallest unit of information for quantum computers. The qubit in question is an antiproton, a proton's antimatter counterpart, caught in a curious quantum swing—arcing back and forth between 'up' and 'down' spin states in perfect rhythm. The oscillation lasted for 50 seconds. The technical prowess that enabled this result represents a significant leap forward in our understanding of antimatter, the researchers claim.

For the experiment, the team applied a technique called coherent quantum transition spectroscopy, which measures—with chilling precision—a particle's magnetic moment, or its behavior inside magnetic fields. First, the team brought in some antiprotons from CERN's antimatter factory, trapping the particles in an electromagnetic Penning trap—a combination of static electric and magnetic fields. Next, they set up a second multi-trap system inside the same magnet, extracting individual antiprotons to measure and tweak the particles' spin states in the process.

Quantum states are fragile and easily disturbed by outside distractions. The wrong push can immediately send them spiraling down the drain toward decoherence—at which point the system loses the valuable information physicists hope to find. This fundamental limitation of quantum systems was a major concern for the BASE collaboration, which in 2017 used a similar setup to the new experiment to confirm that protons and antiprotons have practically identical magnetic moments. The team made substantial revisions to its technology, paying special attention to developing the mechanisms needed to suppress and eliminate decoherence.

This hard work paid off; the antiproton performed a stable quantum swing for 50 seconds—a motion akin to how qubits exist in superpositions of states, which theoretically could allow them to store exponential loads of information. Additionally, it marked the first time physicists observed this phenomenon in a single free nuclear magnetic moment, whereas previous experiments had only seen it in large groups of particles. 'This represents the first antimatter qubit and opens up the prospect of applying the entire set of coherent spectroscopy methods to single matter and antimatter systems in precision experiments,' BASE spokesperson Stefan Ulmer said in a statement.

That said, the team doesn't believe the new results will introduce antimatter qubits to quantum computing, at least not anytime soon. 'It does not make sense to use [the antimatter qubit] at the moment for quantum computers, because, simply speaking, engineering related to production and storage of antimatter is much more difficult than for normal matter,' explained BASE collaboration member Barbara Latacz, adding that since matter and antimatter are known to share fundamental properties, opting for the latter wouldn't make practical sense. 
'However, if in the future [we find] that antimatter behaves differently than matter, then it may be interesting to consider this.' There are additional improvements the team hopes to make, which will happen sometime very soon, Latacz said. The upgrades to BASE—termed BASE-STEP—will greatly improve our capacity to study antiprotons with higher precision and allow us to 'improve the measurement of the magnetic moment of the antiproton by at least a factor of 10, and in a longer perspective even a factor of 100,' she said. The new breakthrough could contribute to engineering advances in quantum computing, atomic clocks, and other areas. But as the researchers emphasize, such technological applications aren't anything we should expect any time soon. Nevertheless, the result itself presents some fascinating lessons for fundamental physics—questions that may take years to answer, but to quote physicist Sean Carroll from the other recent CERN finding, 'Well, it's a small part of a much bigger puzzle—but you know, every part matters.'
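To picture what that 'quantum swing' between spin states looks like, here is a purely illustrative Python sketch of a resonantly driven two-level spin, whose probability of being measured 'up' oscillates in time. The 10-second period is an arbitrary assumption for the sake of the plot of numbers, not a parameter from the BASE experiment.

```python
import numpy as np

# Purely illustrative: a resonantly driven two-level spin (a qubit) flips
# coherently between "up" and "down". The 10-second period is an arbitrary
# assumption, NOT a value from the BASE experiment.
rabi_period_s = 10.0                      # assumed time for one full up-down-up cycle
omega = 2 * np.pi / rabi_period_s         # corresponding Rabi angular frequency

t = np.linspace(0.0, 50.0, 11)            # sample the 50-second coherent window
p_up = np.cos(omega * t / 2) ** 2         # probability of measuring spin "up"

for ti, pi in zip(t, p_up):
    print(f"t = {ti:4.0f} s   P(up) = {pi:.2f}")
```

In the experiment, the hard part was not the oscillation itself but keeping it coherent—free of the decoherence described above—for the full 50 seconds.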

Jul 25 2025 This Week in Cardiology

Medscape

Please note that the text below is not a full transcript and has not been copyedited. For more insight and commentary on these stories, subscribe to the This Week in Cardiology podcast, download the Medscape app, or subscribe on Apple Podcasts, Spotify, or your preferred podcast provider. This podcast is intended for healthcare professionals only.

In This Week's Podcast: For the week ending July 25, 2025, John Mandrola, MD, comments on the following topics.

The group at the University of Leeds in the UK has published an interesting paper on endurance athletes, cardiac fibrosis, and ventricular arrhythmia. They cheekily named it VENTOUX to correspond to a recent Tour de France stage finishing on Mont Ventoux. I've climbed Ventoux. It took me 1 hour 42 minutes. Tadej and Jonas did it this week in 54 minutes—a record by a minute. Imagine liking, no, loving, the notion of climbing a 10% gradient for well more than an hour and you will understand the mindset of older endurance athletes. Thinking this is fun is a sort of disease. People who do this day in and day out for decades have an affliction. I know because I do. We are like rodents who, given a wheel in our cage, will run on it, for no purpose other than to run on it.

Before I say anything else, I want to start with the fact that exercise is one of the pillars of health. It's in every expert consensus document, and oodles of observational studies correlate exercise with longevity. But 60-year-old people who cycle or run more than 10 hours per week for decades are not normal exercisers. For instance, if you do an hour ride 5 days per week, to get to that level of exercise, you have to do a 5-hour ride on the weekend or two 3-hour rides on Saturday and Sunday. That's a lot.

I co-wrote a book called The Haywire Heart in which my chapters were basically narrative reviews of the cardiac effects of all this exercise. The one, single observation that got me interested in this topic is that endurance athletes have a heightened relative risk of getting atrial fibrillation (AF), and, electrically and structurally, their atrial remodeling resembles that of a person with obesity and hypertension. Isn't that wild? But AF is not the topic of this paper.

Endurance athletes can also develop both myocardial fibrosis (scar) and ventricular arrhythmias (VA). The VA story is less fleshed out than the AF story, perhaps because it is less common. But again, you have the paradox: endurance exercise correlates with scar of the sort seen in healed myocarditis and other cardiomyopathies. Something amazing for health—regular exercise—perhaps can be detrimental in huge doses. And we all know that myocardial scar can predispose to ventricular arrhythmias, which can, in rare cases, lead to sudden cardiac death.

Now to the VENTOUX study: The Leeds group recruited about 100 endurance athletes and 27 controls with the purpose of studying them with cardiac MRI (CMR) and monitoring for VA with implantable loop recorders (ILRs) over 2 years. All participants had to agree to an ILR implant, which is likely an important factor in the study's interpretation. This study is a great effort. I congratulate the authors.

The average age was about 60. All were men. None could have any symptoms or heart disease. The athletes were pretty strong: the functional threshold power, what can be held for an hour, was around 244 watts. That's not close to elite, but it's not Mickey Mouse either. Another factor was that the Leeds group is well known and experienced in imaging. 
So the MRI imaging can be trusted, and an expert in imaging told me these are likely real findings. And here are the topline results:

- Nearly half (47%) had myocardial fibrosis on MRI; in the vast majority, the scar was in an inferolateral location. The main way the authors display the results is with two groups: those with scar and those without scar.
- About 1 in 5 athletes overall had some ventricular arrhythmia on the ILR during 2 years of follow-up.
- Those with fibrosis had a 4.7-fold higher relative risk of VA (but crucially, the 95% CI was wide, from an 80% higher risk to a 13-fold higher risk).
- Athletes with fibrosis were slightly older than athletes without fibrosis (62 years vs 57 years).
- Athletes with fibrosis exhibited a greater prevalence of premature ventricular contractions (PVCs) during exercise testing than athletes without fibrosis (71% vs 42%; P = .003).
- All three of the athletes with VT longer than 30 seconds, which we call sustained VT, had fibrosis on cardiac MRI. All three were symptomatic and developed an episode of nonsustained VT before the onset of sustained VT. One athlete received an ICD due to presyncope, one was scheduled for an EP study (the results of which were not known), and the third was advised to cease competing due to recurrent VT during exercise but declined further investigation.
- Of those who experienced a ventricular arrhythmia, 78% had myocardial fibrosis on CMR, compared with 22% of athletes who did not (P < .001).
- Two other predictors of having VT were left ventricular (LV) dilation and exercise-induced PVCs.
- Late gadolinium enhancement (LGE) at the right ventricular (RV) insertion point was super common. This is also well known in athletes, but it had no statistically significant correlation with VA. Confidence intervals (CI) were wide again.
- Both groups had normal left ventricular ejection fraction (LVEF). Both had similar and normal RV function.
- There were no significant differences in T2 times between athletes with and without ventricular arrhythmia. No athlete had T2 values indicative of acute myocardial edema, and no athlete fulfilled Lake Louise criteria for acute myocarditis.

The authors concluded that fibrosis incidence was high and associated with VA, that RV insertion point LGE was not associated with VA, and…sit down for this conclusion: 'Further studies are needed to establish whether myocardial fibrosis itself is arrhythmogenic or in the case of athletes indicative of a myopathic process.'

I like this conclusion, and it's different from many of the posts I have seen on social media—which go too far in scaring people about exercise. There is much to say about these observations, but, sadly, most are questions rather than answers.

First: Why was the prevalence of fibrosis so high? Almost half of the people. I think the issue here is a systemic bias in this study: a selection bias, or, more specifically, self-selection. A researcher who does a lot of normal-volunteer studies told me that, after the fact, many asymptomatic volunteers admit to symptoms they wanted checked. Now…recall this study was done in the UK, where getting checked out may not be so easy. So you enroll 100 people who have to agree to get ILRs. To me this is a special population, who are probably more than just curious; they are probably concerned. Endurance athletes tend to be obsessive and read a lot about the heart issues in athletes. I would almost consider getting an ILR a collider bias. 
You are not looking at a general population of endurance exercisers but rather those endurance exercisers interested in monitoring for some reason.

Second issue: ILR monitoring. These things are always on. They, like pacemakers, pick up everything. They are far more sensitive than a 2-week ECG or even an Apple Watch. So you have a population enhanced by concern wearing a monitor that misses nothing. In my device clinic, I probably get three nonsustained VT alerts per day—99% of which I say, just follow. I've become increasingly convinced that short episodes of atrial and ventricular tachycardia are probably normal in older adults. Indeed, in VENTOUX, only 3 of the 100 individuals had symptoms that required clinical action—an ICD, an EP study (results unknown), and an exercise restriction. All three were symptomatic, so clinically, I agree 100% with the authors that there is nothing in these data that would suggest screening with MRIs and ILRs. Let me repeat: VENTOUX does not support screening athletes with MRI. You can and should wait for symptoms.

The third matter is the degree of LGE. The scar burden was only 2%. These are tiny scars, and most serious VT we take care of in EP comes from much larger scars. But while it is true that any scar is abnormal, we don't know if it is from exercise or healed myocarditis. And we don't know the benefit/harm ratio of lifelong exercise. Lifelong exercise is protective against diabetes, hypertension, and coronary artery disease (CAD), but it may cause small LGEs in a very small proportion of patients.

Notice that all these comments are questions. This study is interesting. Intensely interesting. It's a great effort, but to offer more actionable results, beyond "don't ignore symptoms," which is an easy thing to tell people, we would need larger samples that were, crucially, more random in sampling. Though they tried, VENTOUX is not at all a random sample of heavy exercisers. Imagine a study wherein you went to an endurance race, signed up one out of every five 30-year-olds, and followed them serially, as they did in Framingham, with CMR every 5 years for two or three decades. Then we might know more. But such a study is vast and super expensive.

For now, I recommend regular exercise as if it were a heart pill or AF pill. You take it daily. Every single day. The vast majority of people I see don't exercise enough. What bike racers like to do is not at all for health. It is for fun, or for sanity; whether it is a net harm remains to be seen. But even if it were, I am not sure most of us would stop doing it. Because rodents must run on the wheel.

An Australian group of researchers has published a massive systematic review and meta-analysis of daily steps and health. Lancet Public Health published the study, which was covered in 176 news outlets as of this morning. Most of the news stories say something like the 10,000-step myth has been busted and 7000 is sufficient. The news stories are wrong. And I will briefly cover this study because it bookends the extreme exercise study quite well.

The research team had 57 studies from 35 cohorts included in the systematic review and 31 studies from 24 cohorts included in the meta-analyses. They looked at dose response (in steps) and many outcomes, including all-cause mortality, cardiovascular disease incidence, dementia, falls, and type 2 diabetes. The comparator or control group was 2000 steps per day.

Compared with that reference, those who reported (or had documented by a step counter) 7000 steps per day had a:

- 47% reduction in all-cause death (HR, 0.53 [95% CI, 0.46-0.60]; I² = 36.3%; 14 studies)
- 25% reduction in CV disease incidence (HR, 0.75 [0.67-0.85]; I² = 38.3%; 6 studies)
- 47% reduction in CV mortality (HR, 0.53 [0.37-0.77]; I² = 78.2%; 3 studies)
- 37% reduction in cancer mortality (HR, 0.63 [0.55-0.72]; I² = 64.5%; 3 studies)
- 14% reduction in type 2 diabetes (HR, 0.86 [0.74-0.99]; I² = 48.5%; 4 studies)
- 38% reduction in dementia (HR, 0.62 [0.53-0.73]; I² = 0%; 2 studies)
- 28% reduction in falls (HR, 0.72 [0.65-0.81]; I² = 47.5%; 4 studies)

My issue with this study is that when you go to the main figure (the plots with hazard ratio [HR] on the y-axis and step counts on the x-axis), you see a clear dose response between steps and specific outcomes. The 2000-step comparator is where the HR is 1.0. Below 2000 steps, the hazards are actually higher, and as the step count increases, the HR drops. The authors pick 7000, I assume, because that is where the slope of benefit seems to plateau, but when you look at the curves, the HR keeps dropping with more steps.

The authors quantify the added benefit of more than 7000 steps per day in Table 8 of the supplement. For all-cause death, there is an added 10% lower HR with 10,000 vs 7000. Same for cancer mortality and depressive symptoms—an added 10% lower relative risk.

So I don't think any myths were busted. 7000 is fine; 7000 steps per day is associated with lower rates of bad health outcomes. But for many outcomes, including all-cause death, 10,000 is better, and the added benefits reached statistical significance. So if a patient asks, the number is still 10,000. Though 7000 is also good. And 5000 is better than 4000, which is better than 3000.
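As a rough illustration of how those relative hazards combine, here is a short Python sketch. It assumes that "an added 10% lower HR" means an HR of about 0.90 for 10,000 vs 7000 steps; that is one possible reading of the Table 8 figures, not a number reported by the authors.

```python
# Rough, illustrative arithmetic only. The 0.90 figure is an assumed reading of
# "an added 10% lower HR" for 10,000 vs 7000 steps, not an author-reported value.
hr_7000_vs_2000 = 0.53            # all-cause death, from the meta-analysis
hr_10000_vs_7000 = 0.90           # assumed added benefit of 10,000 vs 7000 steps

# Under a proportional-hazards view, relative hazards multiply:
hr_10000_vs_2000 = hr_7000_vs_2000 * hr_10000_vs_7000
print(f"Implied HR, 10,000 vs 2000 steps: {hr_10000_vs_2000:.2f} "
      f"(about a {1 - hr_10000_vs_2000:.0%} reduction)")
```

The point is only that the dose response keeps going past 7000 steps; the exact combined number depends on how the supplemental estimate is defined.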
After reviewing this study, a thought about European vs American life popped into my head. We have probably 2000 people working at my hospital. Fewer than 5 of them walk to work; fewer than 10 of them ride their bike. When I visited the team in Basel, Switzerland, last fall, it looked like more than 75% walked or cycled to work. Very few American cities are set up for walking. That's sad. So Americans have to make an effort to be active. I think it's worth it. And I recommend it in the clinic. For my height, 7000 steps is about 3 miles; 10,000 steps is nearly 5 miles. The optimal dose is the longer one. But some is better than none.

Everyone Deserves a Shot at the American Dream: Sinus Rhythm

Let me say a few words here about rate vs rhythm control, because this may actually be the number one issue in all of electrophysiology. The stimulus for writing such a review piece, I think, comes from the PRAGUE 25 trial of lifestyle modification vs AF ablation. I have opined on that in my July 11 podcast. In sum, AF ablation led to less AF than risk factor (RF) modification alone, though 35% (or 1 in 3 patients) in the risk factor modification group had sinus rhythm (SR) without ablation. And RF modification led to more weight loss, better glycemic control, and better fitness as measured by VO2 max. PRAGUE 25 also found no statistically significant differences in AF burden or quality-of-life measures. It sits in the literature as a 'positive' ablation trial, but I actually think, healthwise, it is a 'positive' trial for RF modification.

The Medscape article cites a 'hybrid' approach wherein all patients who pursue rhythm control also get risk factor modification, which I totally agree with and which, I have to say, is underused, at least in my zip code. 
The absolute wrong thing to do is ablate the AF and not help the patient lose weight and improve cardiometabolic health. Because if you do this, you have merely reduced a surrogate marker—AF episodes. Health is not improved if obesity, hypertension, diabetes, and poor exercise tolerance remain. You succeed as a proceduralist but fail as a doctor in this scenario.

The Medscape article goes on to celebrate the benefits of SR over AF. The next logical step is to laud rhythm control over rate control. And here I have a problem, and I somewhat disagree with friend and colleague Eric Prystowsky, MD. Eric is well known for his criticism of AFFIRM and how that trial set EP back years. But I had the pleasure of speaking in Calgary and met the late Dr. George Wyse, the principal investigator of AFFIRM. AFFIRM is one of those landmark trials that deserves your attention.

Published in 2002, AFFIRM randomized a total of 4060 patients with AF to rate or rhythm control. Mortality was the primary endpoint. Patients in AFFIRM were like those we see every day: 70 years old, most with hypertension, a third with CAD. At 5 years, 23.8% in the rhythm-control arm had died vs 21.3% in the rate-control arm. The HR was 1.15, or 15% worse for rhythm control. The 95% CI was 0.99-1.34, so the P value was just outside .05, but the upper bound, or worst case, was a 34% higher rate of death in the rhythm-control arm. More patients in the rhythm-control group than in the rate-control group were hospitalized, and there were more adverse drug effects in the rhythm-control group as well. AFFIRM was largely interpreted as showing no differences between the two strategies. But, really, there was a strong trend toward worse outcomes in the rhythm-control arm.

One of the major changes in knowledge that came from AFFIRM is the importance of maintaining oral anticoagulation (AC). In AFFIRM, patients in the rhythm-control arm who were maintaining SR could stop their oral AC. This led to a difference in AC use: 85% vs 70% for rate vs rhythm control. In fact, AFFIRM largely established the view that patients with AF should remain on oral AC with either strategy. The Sherman et al substudy in JAMA Internal Medicine found that patients who remained on warfarin were 68% less likely to have a stroke. A large proportion of ischemic strokes (113 of 157) occurred in patients in whom anticoagulation had been stopped (on the basis of re-established normal sinus rhythm) and who had a subtherapeutic international normalized ratio.

Eric's point about AFFIRM is that it led to too many patients with AF being told there was no reason to try to get into SR; and if you don't try, and you leave patients in AF, it becomes impossible to restore SR after a year or so. The other problems (or criticisms) of AFFIRM: patients had to be able to tolerate rate control, so highly symptomatic patients were excluded. AFFIRM should never have been applied to these patients. Many of these symptomatic patients were younger, and it is a serious error to just leave a symptomatic younger person with AF forever in AF. Another criticism of AFFIRM was that it included only antiarrhythmic drugs (AADs), and amiodarone was the most common one used. AADs were all that was available at the time. We now know that AF ablation is far more effective at rhythm control than drugs. So there is a sort of bridge to SR early on, and many patients can be put into SR with rhythm control. 
Proponents of aggressive rhythm control also cite the EAST-AF trial, a rhythm vs rate control trial, which strongly favored rhythm control. But EAST-AF suffered from serious performance-bias issues wherein the rhythm-control arm got oodles more interactions with the health system.

Here is my take on the decision: AFFIRM still applies. If you have an older person with minimal to no symptoms from AF, rate control is not only fine but maybe preferred. But if there are a) symptoms and b) clues that rhythm control is possible (e.g., the LA size is not ginormous, the patient can cooperate, and maybe the AF has not been persistent for more than 2 years), I try rhythm control.

But I tell patients that while there is benefit from SR (in terms of quality of life), rhythm control is hard. It costs a lot, not only in money but also in their time and effort. Patients have to know that RF modification is crucial and that they will also have to spend a few days in the hospital (for cardioversions, maybe drug initiation, or ablation). Remember, when you are getting cardioversions and AADs and ablations, you are not at work or on a bike. You are being a patient. It's fine. It's an investment, but patients need to know that rhythm control is unlike a gallbladder operation or an appendectomy. Rhythm control is a process that requires a friendship with a cardiologist. It's not one and done.

There are also risks to rhythm control. Drug side effects and ablation complications do occur. My friends, be careful flying close to the sun with rhythm control. One of the biggest mistakes I see in general cardiology is leaping to cardioversion without a plan. Cardioversion (CV) of AF is fine, but you have to have a plan for what will happen in a week or a month when the patient is back in AF. CV doesn't modify the problem of AF. It just resets the heart.

In the end: EP is here to help. Get us involved. Especially when there are symptoms. And doubly especially when there is heart failure. But don't dismiss AFFIRM. It is an important trial that shows rate control is not a terrible strategy in selected patients.

GLP-1 RAs Protective Against Stroke, Neurodegeneration?

A GLP-1 study purports to show benefit in cerebrovascular health. It actually shows how observational studies can mislead you. The title of the study is 'Neurodegeneration and Stroke After Semaglutide and Tirzepatide in Patients With Diabetes and Obesity.' It was published in JAMA Network Open.

The goal of the Taiwan group was to evaluate the association of semaglutide and tirzepatide with the incidence of dementia, Parkinson disease, ischemic stroke, intracerebral hemorrhage, and all-cause mortality compared with other antidiabetic drugs in adults with type 2 diabetes and obesity. It's an important question. The best way would be to randomize, but that would be hard and costly. So this was a retrospective cohort study using electronic health records from the TriNetX US network, 2017-2024.

The two groups in this study were those on a GLP-1 RA, either semaglutide or tirzepatide, vs those on any other diabetes medicines, such as metformin, sulfonylureas, DPP-4 inhibitors, SGLT2 inhibitors, and others. This was a large study: about 30,000 in each of the two groups. The groups were not randomized; a doctor chose which of the two groups of drugs to use. So, since it's not randomized, the authors did propensity matching. Mean age was 57 years, half were female, BMI was on average 40, and 70% had hypertension. 
Here were the main results: During a 7-year follow-up, GLP-1 RA users had a lower risk of dementia (HR, 0.63; 95% CI, 0.50-0.81), a lower risk of stroke (HR, 0.81; 95% CI, 0.70-0.93), and lower all-cause mortality (HR, 0.70; 95% CI, 0.63-0.78), with no significant differences in the risk of Parkinson disease or intracerebral hemorrhage.

The authors concluded: 'These findings suggest potential neuroprotective and cerebrovascular benefits of GLP-1 receptor agonists beyond glycemic control, warranting further trials to confirm these outcomes.'

Maybe these drugs are beneficial for cerebrovascular health, especially in younger people (here, age 57 with diabetes and a BMI of 40). But this is a hopelessly confounded study in which healthier patients got the newer, pricier drug. How do I know that? There are two clues.

First, the Kaplan-Meier curves diverge immediately and continue in parallel. That's what you expect when healthier patients get one treatment: immediately better outcomes. If the GLP-1 drug were truly better than the other drugs, you'd see gradually increasing benefit.

Second, the mortality benefit is huge: a 30% reduction in death. In the SELECT trial of semaglutide vs placebo in patients with heart disease and obesity, semaglutide reduced CV mortality by only 15%. In the SUSTAIN-6 trial of semaglutide vs placebo, semaglutide had no significant reduction in death or CV death and required a composite endpoint to drive positive results.

My overall take, therefore, is that GLP-1 drugs induce weight loss. They do modify disease in patients with obesity and diabetes and in patients with obesity and atherosclerotic disease. But whether they reduce important cerebrovascular outcomes like dementia cannot be answered by these confounded observational studies. I am not sure it's worth doing these studies, because the only value is to show readers the signs of bias in nonrandomized, retrospective comparison studies.
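To make the "immediate vs gradual divergence" point concrete, here is a small, purely illustrative Python sketch using made-up exponential survival curves; the hazards and the 2-year delay are assumptions, not values from the JAMA Network Open study.

```python
import numpy as np

# Purely illustrative, with made-up hazards (NOT data from the study):
# curves that separate from day one at a roughly constant ratio suggest baseline
# differences between groups; a real drug effect that takes time to develop
# leaves the curves overlapping early and diverging later.
years = np.linspace(0, 7, 8)

# Scenario 1: healthier patients happen to get the newer drug (confounding).
h_healthier, h_sicker = 0.03, 0.045                  # assumed constant annual hazards
surv_healthier = np.exp(-h_healthier * years)
surv_sicker = np.exp(-h_sicker * years)

# Scenario 2: a true treatment effect that only kicks in after ~2 years.
h_control = 0.045
h_treated = np.where(years < 2, h_control, 0.03)     # hazard drops after year 2
cum_hazard = np.concatenate(([0.0], np.cumsum(h_treated[:-1] * np.diff(years))))
surv_treated = np.exp(-cum_hazard)
surv_control = np.exp(-h_control * years)

for t, a, b, c, d in zip(years, surv_healthier, surv_sicker, surv_treated, surv_control):
    print(f"year {t:.0f}: confounded {a:.3f} vs {b:.3f} | delayed effect {c:.3f} vs {d:.3f}")
```

Printed side by side, the confounded pair is already separated at year 1, while the delayed-effect pair stays identical through year 2 and only then diverges.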

Our Neanderthal Cousins Were Big Maggot Eaters, Scientists Argue

Gizmodo

Modern humanity's most famous cousins, the Neanderthals, may have had a clever, if unappealing, dietary trick for survival: maggots. Research out today posits these creepy crawly fly larvae provided Neanderthals an ample source of essential nitrogen and fat.

Scientists at Purdue University, the University of Michigan, and others conducted the study, published Friday in Science Advances. Using both experimental and historical data, they showed that maggot-infused meat is rich in fat and nitrogen and that similar human populations have commonly included such foods in their diets. The team argues that maggots are the most reasonable explanation for why Neanderthals had very high levels of nitrogen in their system.

'Fly larvae are a fat-rich, nutrient dense, ubiquitous, and easily procured insect resource, and both Neanderthals and anatomically modern humans, much like recent foragers, would benefit from taking full advantage of them,' lead author Melanie Beasley, a paleoanthropologist at Purdue, told Gizmodo.

Nitrogen is a much-needed nutrient; among other things, it's used to help create amino acids, the building blocks of proteins. Speaking of protein, dietary nitrogen is most abundantly found in animal meat (though certain leafy vegetables and legumes are also high in it). The excavated remains of Neanderthals are known to have high levels of nitrogen isotopes, indicating they had plenty of nitrogen in their diets.

According to Beasley, most researchers have assumed this meant Neanderthals were hypercarnivores—predators at the top of the food chain that ate lots of freshly killed large animals, mammoths included. But in 2017, co-author John Speth put forth a different hypothesis: that Neanderthals were actually eating lots of stored and putrid meat filled with maggots. Both then and now, researchers note that some Indigenous groups in the Northern Hemisphere have regularly and intentionally eaten maggot-rich food—practically as a delicacy. In 1931, for instance, Knud Rasmussen, a polar explorer and anthropologist, wrote this anecdote about himself and some members of an Inuit community coming across a cache of meat: 'The meat was green with age, and when we made a cut in it, it was like the bursting of a boil, so full of great white maggots was it. To my horror my companions scooped out handfuls of the crawling things and ate them with evident relish.'

Beasley heard about Speth's argument and said she could help him test it out experimentally. At the time, she was doing postdoctoral research that involved studying muscle tissue decomposition in deceased people. This work also meant Beasley would spend much of her time around the maggots that feed on decaying tissue. Beasley and her colleagues documented the changing nitrogen levels in these samples of decaying tissue along with three different species of fly maggots. As the tissue decayed, levels of nitrogen inside changed modestly. The maggots themselves, however, were chock-full of nitrogen.

Given the conditions back then, it would have been impossible for Neanderthals to avoid some maggots ending up in any animal meat they tried to store. Rather than a hindrance, though, these hominids probably made the most of the situation, using the maggots to turn their lean meat into a 'fat-rich, more complete food resource,' Beasley said. 
The researchers are still collecting more evidence to shore up their argument for maggot-eating Neanderthals, and they're also working to understand how the nutritional benefits of maggot-rich food change over time (exactly when is rotten meat too rotten, in other words?). However Neanderthals ate their meat, though, there are many people today still using insects and maggots to spice up their diet, the researchers point out. In Europe, for instance, there's casu marzu, a Sardinian sheep's milk cheese that's intentionally laced with cheese fly (Piophila casei) maggots. Much love to my Neanderthal brethren and casu marzu fans, but I think I'll still just stick to some classic sharp cheddar for my next cheese plate.
