Flogging Molly's Dave King Suffered Brain Hemorrhage, Spent Two Weeks in Coma


Yahoo, March 17, 2025
Flogging Molly frontman Dave King suffered a brain hemorrhage and spent two weeks in a coma earlier this year, according to a new statement from his wife and bandmate, Bridget Regan.
In a post shared on Flogging Molly's social media pages, Regan offered new details about her husband's recent health crisis, which forced the band to put their performance plans on hold indefinitely. At the time, Flogging Molly said only that King was 'battling a very serious health condition' and that they would share details when they could.
In her new note, Regan said King suffered a brain hemorrhage on Jan. 24 and 'underwent two subsequent surgeries to save his life.' He then spent two weeks in a coma, she added, 'followed by varying stages of treatment and recovery.'
Most recently, on Feb. 28, King 'underwent yet another surgery,' with Regan saying she now felt 'confident we are on the other side of this.' She continued: 'He is now entering into the next phase of his recovery and wants nothing more than to play music again. The road ahead is uncertain but we, as ever, will roll with the punches and hope to see you all in the near future.'
Regan went on to thank King's neurosurgeon and all the medical professionals who've looked after him over the past few months. 'Your extraordinary level of care was also integral to him being where he is today,' she said.
Ending her note, Regan said: 'To friends and family whose support was unwavering throughout this ordeal, a heartfelt thank you. And a special thanks to everyone who sent well wishes and messages of support. Please look after each other and tell your people you love them. Life can change in an instant.'
King's hemorrhage forced Flogging Molly to miss their four-day Salty Dog Cruise festival, which set sail last month (the 2026 iteration is already on the books for next October). The band's headlining tour, which was set to begin on Feb. 24 in St. Petersburg, Florida, was also canceled.
Related Articles

'I'm Sick of the Superhero Title, Colon, Other-Name Thing' — James Gunn Explains Why Supergirl: Woman of Tomorrow Is Now Just Called Supergirl

Yahoo, 11 hours ago

Supergirl: Woman of Tomorrow is now officially called Supergirl after DC Universe chief James Gunn confirmed the change. Speaking to Rolling Stone, Gunn explained the decision, which follows a similar move for this July's Superman: Legacy to be renamed Superman. He revealed that the creative group behind his films does what's called a 'premortem,' where they try to pre-empt issues that might cause a film to flop before it starts shooting. The name Superman: Legacy was mentioned at that point.

'I'm always cutting,' he said. 'Legacy was really — we do something called a premortem. A premortem is you get together with your group that's doing the project. It's usually about a couple months before shooting, and you go, hypothetically, 'If it's an epic disaster, what are the things that we're doing today that are going to cause it to be an epic disaster? Everyone here can speak freely.' The things you find on other productions are the things that people are whispering. 'Oh, God, I don't know why they cast that actor — he doesn't fit the role.' Or, 'The production designer's never on time.'

'One of the things I brought up was, it was called Superman: Legacy. Even though I was the one that gave it that title, I just wasn't sure. First of all, I'm sick of the superhero title, colon, other-name thing. And then also it seemed to be looking back when we're looking forward, even though it does have to do with legacy in the movie itself. And everybody was like, 'Oh, yeah, no, change it.''

Supergirl, directed by Craig Gillespie and written by Ana Nogueira, stars Milly Alcock in the title role. We know next to nothing about it (Alcock has remained quiet in recent interviews), but A Minecraft Movie star Jason Momoa has been loose-lipped, teasing his highly anticipated Lobo costume in the forthcoming film. Gunn subsequently had a laugh about the whole thing.

Supergirl — not Supergirl: Woman of Tomorrow — is due out on June 26, 2026.

Wesley is Director, News at IGN.

John Lennon Admitted He Was 'Scared' of This Rock Legend Despite Wanting to Work With Him

Yahoo, 11 hours ago

Despite being one of the most popular musicians of all time, John Lennon still had his own insecurities when it came to other stars. In a resurfaced interview, Lennon once admitted that he was 'scared' of Elvis Presley.

'[Bob] Dylan would be interesting because I think he made a great album in Blood on the Tracks, but I'm still not keen on the backings. I think I could produce him great. And Presley. I'd like to resurrect Elvis,' Lennon told Rolling Stone during a 1975 interview, when asked which musicians he'd most like to work with. 'But I'd be so scared of him I don't know whether I could do it. But I'd like to do it. Dylan, I could do, but Presley would make me nervous.'

He continued, 'But Dylan or Presley, somebody up there. I know what I'd do with Presley. Make a rock 'n' roll album. Dylan doesn't need material. I'd just make him some good backings. So if you're reading this Bob, you know….'

However, Lennon and Presley had previously crossed paths, and the 'Kentucky Rain' singer allegedly wasn't a fan of the Beatle due to his political stances. 'His dislike of the pacifist Beatle was born from the night I took the Fab Four to his house for their first — and last — meeting,' journalist Chris Hutchins, who introduced the two men, told the Daily Mail in 2011, per Express. 'John had annoyed Presley by making his anti-war feelings known the moment he stepped into the massive lounge and spotted the table lamps — model wagons engraved with the message: 'All the way with LBJ.''

He continued, 'Presley allied himself with the FBI director Edgar Hoover and encouraged him to have Lennon thrown out of the US.' Hutchins added that Lennon was vocal about his 'hatred' of President Lyndon B. Johnson because he 'raised the stakes in the Vietnam War.' Presley, on the other hand, was a supporter of Johnson, and Lennon's opinion rubbed him the wrong way.
This story was originally reported by Parade on Jul 14, 2025.

What Is Up With These Tech Billionaires? This Astrophysicist Has Answers

Yahoo, 17 hours ago

Fresh off a Ph.D. in astrophysics, science journalist Adam Becker moved to Silicon Valley with an academic's acclimation to hearing the word 'no.' 'In academic science, you need to doubt yourself,' he says. 'That's essential to the process.' So it was strange to find himself suddenly surrounded by a culture that branded itself as data-oriented and scientific but where, he soon came to realize, the ideas were more grounded in science fiction than in actual science and the grip on reality was tenuous at best. 'What this sort of crystallized for me,' says Becker, 'was that these tech guys — who people think of as knowing a lot about science — actually don't really know anything about science at all.'

In More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity, published this spring, Becker subjects Silicon Valley's ideology to some much-needed critical scrutiny, poking holes in — and a decent amount of fun at — the outlandish ideas that so many tech billionaires take as gospel. In so doing, he champions reality while also exposing the dangers of letting the tech billionaires push us toward a future that could never actually exist. 'The title of the book is More Everything Forever,' says Becker. 'But the secret title of the book, like, in my heart is These Fucking People.'

Over several Zooms, Rolling Stone recently chatted with Becker about these fucking people, their magical thinking, and what the rest of us can do to fight for a reality that works for us.

A lot of people who move to Silicon Valley get swept up in its vibe. How did you avoid it?

I did sort of see the glittering temptation of Silicon Valley, but there's a toxic positivity to the culture.
The startup ethos out here runs on positive emotion, and especially hype. It needs hype. It can't function without it. It's not enough that your startup could be widely adopted. It needs to change the world. It has to be something that's going to make everything better. So this ends up becoming an exercise in meaning-making, and then people start talking about these startups — their own or other people's — in semi-religious or explicitly religious terms. And it was just a shock to see all of these people talking this way. It all feels plastic and fake. I thought, Oh wow, this is awful. I want to watch these people and see what the hell they're up to. I want to understand what is happening here, because this is bad.

And what were they up to, as far as you could tell?

Underpinning a lot of that toxic positivity was this idea that if you just make more tech, eventually tech will improve itself and become super-intelligent and godlike. [The technocrats] subscribe to a kind of ideology of technological salvation — and I use that word 'salvation' very deliberately in the Christian sense. They believe that technology is going to bring about the end of this world and usher in a new perfect world, a kind of transhumanist, algorithmically guaranteed utopia, where every problem in the world gets reduced to a problem that can be solved with technology. And this will allow for perpetual growth, which allows for perpetual wealth creation and resource extraction. These are deeply unoriginal ideas about the future. They're from science fiction, and I didn't know how seriously people were taking them. And then I started seeing people take them very, very seriously indeed. So, I was like, 'OK, let me go talk to actual experts in the areas these people are talking about.' I talked to the experts, and: Yeah, it's all nonsense.

What exactly is nonsensical about it?

It's a story that is based on a lot of ideas that have no evidence for them and a great deal of evidence against them.
It's based on a lot of wrong ideas. For example, I think the public perception of AI has been driven by narratives that have no foundation in reality. What does it mean to say a machine is as intelligent as a human? What does 'intelligence' mean? What does it mean to say that an intelligent machine could design an even more intelligent one? Intelligence is not this monolithic thing that is measured by IQ tests, and the history of humans trying to think about intelligence as a monolithic thing is a deeply troubling and problematic history that usually gets tied to eugenics and racism, because that's what those tests were invented for. And so, unsurprisingly, there's a fair amount of eugenics and racism thrown around in the communities that discuss these ideas really seriously.

There's also no particular reason to believe that the kinds of machines that we are building now and calling 'AI' are sufficiently similar to the human brain to be able to do what humans do. Calling the systems that we have now 'AI' is a kind of marketing tool. You can see that if you think about the deflation in the term that's occurred just in the last 30 years. When I was a kid, calling something 'AI' meant Commander Data from Star Trek, something that can do what humans do. Now, AI is, like, really good autocomplete. That's not to say that it would never be possible to build an artificial machine that does what humans do, but there's no reason to think that these can and a lot of reason to think that they can't. And the self-improvement thing is kind of silly, right? It's like saying, 'Oh, you can become an infinitely good brain surgeon by doing brain surgery on the brain surgery part of your brain.'

Can you explain the difference between the systems we have now, which we call 'AI,' and the systems that would qualify as AGI? How big is the gulf, and what are the major impediments to bridging it?
So one of the problems here is that 'AGI' is ill-defined, and the vagueness is strategically useful for the people who talk about this stuff. But put that aside and just take a look at what a large language model like ChatGPT does. It's a text generation engine. I feel like that's a much better way of talking about it than calling it 'AI.' ChatGPT only cares about one thing: generating the next word based on what words have already been generated and produced in the conversation so far. And to do that, ChatGPT consumes roughly the entire internet. It was trained on the entire internet to pull out statistical patterns in the language usage. It's like this smeared-out average voice of the internet, and when you ask it a question, all it cares about is answering that question in that voice. It doesn't care about things like answering the question correctly. That only happens accidentally, as a result of trying to sound like the text it was trained on. And so when these machines, quote-unquote, 'hallucinate,' when they make things up and get things wrong, they're not doing anything differently than they're doing when they get the right answer, because they only know how to do one thing. They're constantly hallucinating. That's all they do.

So what we're calling 'artificial intelligence' is really just kind of like an advanced version of spellcheck?

Yeah, in a way. I mean, this is not even the first time in the history of AI that people have been having conversations with these machines and thinking, 'Oh wow, there's actually something in there that's intelligent and helping me.' Back in the 1960s, there was this program called Eliza that basically acted like a very simple version of a therapist that just reflects everything that you say back to you. So you say, 'Hey Eliza, I had a really bad day today,' and Eliza says, 'Oh, I'm really sorry to hear that. Why did you have a really bad day today?'
And then you say, 'I got in a fight with my partner,' and it says, 'Oh, I'm really sorry to hear that. Why did you get in a fight with your partner?' I mean, it's a little bit more complicated than that, but not a lot more complicated than that. It just kind of fills in the blanks. These are stock responses — something that's very clearly not thinking. And people would say, 'Oh, Eliza really helped me. I feel like Eliza really understands who I am.'

The human impulse for connection is powerful.

Precisely. It's the human impulse for connection — and the impulse to attribute human-like characteristics to things that are not humans, which we do constantly. We do it with our pets. We do it with random patterns that we find in nature. We'll see an arrangement of rocks and think, 'Oh, that's a smiley face.' That's called 'pareidolia.' And that's what this is.

So current AI is not even close to being human, but these tech titans think it could be godlike?

Sam Altman gave a talk two or three years ago, and he was asked a question about global warming, and he said something like, 'Oh, global warming is a really serious problem, but if we have a super-intelligent AI, then we can ask it, 'Hey, how do you build a lot of renewable energy? And hey, how do you build a lot of carbon capture systems? And hey, how do we build them at scale cheaply and quickly?' And then it would solve global warming.' What Sam Altman is saying is that his plan for solving global warming is to build a machine that nobody knows how to build and can't even define and then ask it for three wishes. But they really believe that this is coming. Altman said earlier this year that he thinks that AGI is coming in the next four years. If a godlike AI is coming, then global warming doesn't matter. All that matters is making sure that the godlike AI is good and comes soon and is friendly and helpful to us.
And so, suddenly, you have a way of solving all of the problems in the world with this one weird trick, and that one weird trick is the tech that these companies are building. It offers the possibility of control, it offers the possibility of transcendence of all boundaries, and it offers the possibility of tremendous amounts of money.

If you have an understanding of what the technology is doing right now — versus some magical idea of what it could be doing — it sounds like it would be hard to trust it with the future of humanity. Is it just complete delusion?

There's a lot of delusional thinking at work, and it's really, really easy to believe stuff that makes you rich. But there's also a lot of groupthink. If everybody around you believes this, then that makes it more likely that you're going to believe it, too. And then if all of the most powerful people and the wealthiest people and the most successful people and the most intelligent-seeming people around you all believe this, it's going to make it harder for you not to believe it. And the arguments that they give sound pretty good at first blush. You have to really drill down to find what's wrong with them. If you were raised on a lot of science fiction, especially, these ideas are very familiar to you — and I say this as a huge science fiction fan. And so when you start looking at ideas like super-intelligent AI or going to space, these ideas carry a lot of cultural power.

The point is, it's very easy for them to believe these things, because it goes along with this picture of the future that they already had, and it offers to make them a lot of money and give them a lot of power and control. It gives them the possibility of ignoring inconvenient problems, problems that often they themselves are contributing to through their work. And it also gives them a sense of moral absolution and meaning by providing this grand vision and project that they're working toward. They want to save humanity.
[Elon] Musk talks about this all the time. [Jeff] Bezos talks about this. Altman talks about this. They all talk about this. And I think that's a pretty powerful drug. Then throw in, for the billionaires, the fact that when you're a billionaire, you get insulated from the world and from criticism because you're surrounded by sycophants who want your money, and it becomes very hard to change your mind about anything.

Your reality testing gets pretty messed up.

Yeah, exactly. Also, a lot of these ideas just sound ridiculous, and so there hasn't been as much trenchant criticism as there should have been for the past decades. And now, suddenly, these guys have lots of money, and they're saying what the future is, and people are just believing that.

So what you're telling me is that I'm not gonna get to live on Mars.

Yeah, that's right. You're not going to. But you shouldn't be disappointed, because Mars sucks. Mars fucking sucks. Just to name a few of the problems: gravity is too low, the radiation is too high, there's no air, and the dirt is made of poison.

Sounds fun.

Also, you're going to freeze even if you solve all of those problems. I mean, there are some spots where you wouldn't freeze if you really bundled up, but Elton John was right: Mars isn't the place to raise your kid.

It's really terrifying to see the most powerful people in the world — and some of the loudest voices in the world — confuse these beliefs with reality. You talk in the book about how this is a sort of messianic belief, but also about how technological utopia won't be available to everyone — which is a pretty common view in apocalyptic narratives, right? There's a chosen group that will get to enjoy the utopia, but not everyone will.

Look, inequality is a fundamental feature of the world, and I think nobody knows that better than these billionaires. I don't mean 'fundamental' in the sense that it's unalterable.
I just mean it's fundamental to how we've structured our society, and billionaires are beneficiaries of that. But I think that in the version of these utopias that are promoted by these tech billionaires, there are definitely unseen and unquestioned forms of inequality that would lead to some people having a lot more control and a lot more of that utopia than other people would get. A lot of this is in the form of questions that, surprisingly, people don't tend to ask these tech billionaires.

Jeff Bezos says that he wants humanity living in giant space stations that have millions of people, and he wants millions of these space stations, so there'll be one trillion people in space generations from now. And that leads to questions like, 'OK, buddy, who's gonna own that?' One of the nice things about living on Earth is that we have these shared natural resources. If you go out into space into an artificial environment that, say, Blue Origin is going to be building, doesn't that mean that Blue Origin or some successor company is going to own those space stations and all of the air and water and whatnot inside? And doesn't that mean that there's somebody who's going to be effectively king of the space station? And if everybody lives in these space stations, isn't that going to be not just a company town but a company civilization? Musk talks about a city with a million people on Mars. The air won't even be free, right? You'll have to pay Musk just to stay alive. That's not my vision of utopia, and I think not many other people's either.

It seems pretty unlikely that these guys are going to get this utopia of which they dream, so how concerned should we even be about their delusions?

They have so much power and so much money that the choices that they make about how to exercise that power and spend that money unavoidably affect the rest of us. This is a real danger that we are seeing and experiencing right now.
Musk thinks that his mission to go to Mars and beyond is the salvation of humanity — he has said as much in as many words — and he believes that, therefore, nothing should be allowed to stand in his way, not even law. So, therefore, he supported a lawless candidate for President of the United States, a literal felon, and said that it was important for the future of humanity that that felon win. This is a billionaire interfering with the democratic process and trying to erode the democratic fabric of this country — and succeeding — in order to pursue his own personal vision of utopia that will never happen. That's a fucking problem. And that makes it everybody's business.

I suppose it's also a question of who gets to decide which problems are humanity's biggest.

Which is what a lot of this comes down to, right? Part of the problem with trying to solve issues in the world through billionaire philanthropy is that it's fundamentally undemocratic who gets to make the decision: the billionaire gets to make the decision. Who elected the billionaire? Nobody. And so billionaire philanthropy is an exercise of power and deserves skepticism rather than gratitude. But I think a lot of these billionaires see wealth as proof of someone's value and intelligence, and since they're the wealthiest people who have ever lived, that makes them the smartest people who have ever lived, and so they are the ones who should be leading us into this new utopia. And if the rest of us can't see it or think that it doesn't work, well, that's because we're not as smart as they are. And if experts tell them that it can't work, well, then the experts are wrong, because, you know, if they are smart, why are they so poor? It's like [these technocrats] are constantly high on a drug called billions of dollars, and the human brain was not built to deal with that. It insulates them from criticism and makes it harder for them to think critically.

What can we do about all this? Are we all just basically fucked?

Well, look, the billionaires have an enormous amount of power and money, but there's a lot more of us than there are of them. Also, we can think critically, and so I think there's a few different things that we can do. In the short term, we need to organize. One of the things that these guys are completely terrified by — and it's one of the reasons they love AI — is the idea of labor organization. They don't want workers rising up. They don't want to have to deal with workers at all, and so I think labor organizing is really important. I think political organizing is really important. We need to build political power structures that can counterbalance the massively outsized power of this really very small community of individuals who just have massive amounts of wealth. And I know that that sounds kind of facile, but I really do think it's what we have to do, and historically it is how [people] have always combated the very wealthy and their fantasies of power.

We can also point out when they're wrong. Say, 'The emperor has no clothes, we are not going to Mars, and that is ridiculous.' Public ridicule of these ideas — informed, factually accurate public ridicule — is part of what I'm trying to do, and I think it's a really important and powerful tool.

And then in the longer term — hopefully not that far away, if we get to a place where we have political power to balance these guys out — I think we've got to tax their wealth away. They did not earn that money alone. They needed the infrastructure and community that the rest of us provide, and they also, frankly, needed a lot of government investment. They are the biggest welfare queens in existence, right? Silicon Valley got enormous amounts of government spending to benefit it over the years, both on infrastructure and in buying products and whatnot. The government built the internet. The government was the biggest client of Silicon Valley back when it was first starting up, through buying computer chips for the space program. The government built the space program, without which you wouldn't be able to have something like SpaceX. So I think it's time to stop giving them handouts and start saying, 'What we invested, the bill has come due.'
