Latest news with #SAG-AFTRA


NZ Herald
an hour ago
- Entertainment
- NZ Herald
‘Disturbing': Mrs Doubtfire star's wild plan to revive Robin Williams with AI
The idea came to Matthew Lawrence after watching one of Williams' old commercials. The actor, who died by suicide in 2014 at age 63, had done a computerised voiceover that Lawrence found eerily ahead of its time.

'It's kinda like this very contemporary, modern, almost sort of foreshadowing of what's going on commercial that he did, where he did this computerised voiceover,' he said. 'And it always stuck with me. And then, during his passing, with the AI coming out, I'm like, "Man, he's gotta be the voice of AI. He's gotta be the voice in something." So yeah, I would love to do that.'

There's no denying Williams' voice is instantly recognisable. Hearing it again would likely spark nostalgia and joy for many. Still, using AI in a project like this raises serious concerns.

Lawrence, clearly enthusiastic, suggested plenty of possibilities for Robin's voice in tech – even using it for driving navigation. 'It would be Robin!' Lawrence said. 'It would be so cool. I'm telling you.' Yelling at Siri or Alexa when you miss a turn is normal these days. But yelling at the voice of Robin Williams might come with some follow-up guilt.

Williams' daughter, Zelda Williams, has publicly slammed AI recreations of her father. During the SAG-AFTRA strike in October 2023, she posted a statement on social media expressing her discomfort with the technology.

'I am not an impartial voice in SAG's fight against AI,' Zelda, 35, wrote. 'I've witnessed for YEARS how many people want to train these models to create/recreate actors who cannot consent, like Dad.'

'I've already heard AI used to get his "voice" to say whatever people want and while I find it personally disturbing, the ramifications go far beyond my own feelings,' she continued. 'Living actors deserve a chance to create characters with their choices, to voice cartoons, to put their HUMAN effort and time into the pursuit of performance.'
Lawrence was just a pre-teen when he played Williams' son in 1993's Mrs Doubtfire, and their relationship clearly made a lasting impression. At the first annual '90s Con in 2022, Lawrence shared how Williams – who struggled with substance abuse – warned him about the dangers of drugs during filming.

'[Williams] was very serious. He was like, "You know when you come to my trailer and you see me like that?"' Lawrence shared. 'He's like, "That's the reason why. And now I'm fighting for the rest of my life because I spent 10 years doing something very stupid every day. Do not do it." I stayed away from it because of him.'

In an earlier interview with Entertainment Weekly, Lawrence reflected on two key lessons Williams taught him: the importance of compassion, and not judging others without walking in their shoes. 'He really quantified what it was to be a real artist for me,' Lawrence said. 'I worked with some great people, and he was definitely the most brilliant artist I've ever worked with.'


Los Angeles Times
5 days ago
- Entertainment
- Los Angeles Times
De-aged stars, cloned voices, resuscitated dead icons: AI is changing the art and business of acting
For filmmaker Scott Mann, three dozen F-bombs had the makings of a million-dollar headache. When Mann wrapped 'Fall,' a 2022 thriller about two women stranded atop a 2,000-foot radio tower, he figured the hard part was over. Shot in the Mojave Desert on a $3-million budget, the film didn't have money to burn and seemed on course. But Lionsgate wanted a PG-13 rating and, with 35 expletives, 'Fall' was headed for an R. Reshoots would cost more than $1 million — far beyond what the production could afford.

In the past, a director might have taken out a second mortgage or thrown themselves at the mercy of the ratings board. Mann instead turned to AI. A few years earlier, he had been dismayed by how a German dub of his 2015 thriller 'Heist' flattened the performances, including a key scene with Robert De Niro, to match stiff, mistranslated dialogue. That frustration led Mann to co-found Flawless, an AI startup aimed at preserving the integrity of an actor's performance across languages. As a proof of concept, he used the company's tech to subtly reshape De Niro's mouth movements and restore the emotional nuance of the original scene.

On 'Fall,' Mann applied that same technology to clean up the profanity without reshoots, digitally modifying the actors' mouths to match PG-13-friendly lines like 'freaking' — at a fraction of the cost.

As AI stirs both hype and anxiety in Hollywood, Mann understands why even such subtle digital tweaks can feel like a violation. That tension came to a head during the 2023 SAG-AFTRA strike, in which AI became the defining flash point in the fight over acting's future.

'Ours is a rights-based industry,' says Mann, 45, who helped develop a digital rights management platform at Flawless to ensure performers approve any changes to their work. 'It's built on protecting human creativity, the contributions of actors, directors, editors, and if those rights aren't protected, that value gets lost.'
Still, Mann doesn't see AI as a threat so much as a misunderstood tool — one that, used carefully, can support the artists it's accused of replacing. Flawless' DeepEditor, for example, lets directors transfer facial expressions from one take to another, even when the camera angle or lighting changes, helping actors preserve their strongest moments without breaking continuity.

'Plenty of actors I've worked with have had that moment where they see what's possible and realize, "Oh my God, this is so much better,"' Mann says. 'It frees them up, takes off the pressure and helps them do a better job. Shutting AI out is naive and a way to end up on the wrong side of history. Done right, this will make the industry grow and thrive.'

AI isn't hovering at the edges of acting anymore — it's already on soundstages and in editing bays. Studios have used digital tools to de-age Harrison Ford in 'Indiana Jones and the Dial of Destiny,' resurrect Peter Cushing's Grand Moff Tarkin in 'Rogue One' and clone Val Kilmer's voice in 'Top Gun: Maverick' after throat cancer left him unable to speak. The technology has reshaped faces, smoothed dialogue and fast-tracked everything from dubbing to reshoots. And its reach is growing: Studios can now revive long-dead stars, conjure stunt doubles who never get hurt and rewrite performances long after wrap. But should they?

As the tools grow more sophisticated, the threat to actors goes beyond creative disruption. In an industry where steady work is already elusive and the middle class of working actors is vanishing, AI raises the prospect of fewer jobs, lower pay and, in a dystopian twist, a future in which your disembodied face and voice might get work without you. Background actors were among the first to sound the alarm during the 2023 strike, protesting studio proposals to scan them once and reuse their likenesses indefinitely.
That scenario is already beginning to unfold: In China, a state-backed initiative will use AI to reimagine 100 kung fu classics, including films starring Jackie Chan and Bruce Lee, through animation and other digital enhancements. Lee's estate said it was unaware of the project, raising questions about how these actors' likenesses might be used, decades after filming. If the soul of acting is a human presence, what remains when even that can be simulated?

'You want to feel breath — you want to feel life,' said actor and director Ethan Hawke during a panel at 2023's Telluride Film Festival, where strike-era unease over AI was palpable. 'When we see a great painting, we feel a human being's blood, sweat and tears. That's what we're all looking for, that connection with the present moment. And AI can't do that.'

Justine Bateman may seem like an unlikely crusader in Hollywood's fight against AI. Launched to fame as Mallory Keaton on the 1980s sitcom 'Family Ties,' she later became a filmmaker and earned a computer science degree from UCLA. Now, as founder of the advocacy group CREDO23, Bateman has become one of the industry's fiercest voices urging filmmakers to reject AI-generated content and defend the integrity of human-made work. Loosely modeled on Dogme 95, CREDO23 offers a certification of films made without AI, using minimal VFX and union crews. It's a pledge backed by a council including 'Mad Men' creator Matthew Weiner, 'The Handmaid's Tale' director Reed Morano and actor Juliette Lewis.

The 2023 SAG-AFTRA contract set new guardrails: Studios must get actors' consent to create or use digital replicas of their likenesses, and those replicas can't generate new performances without a separate deal. Actors must also be compensated and credited when their digital likeness is used.
But to Bateman, a former SAG-AFTRA board member and negotiating committee rep, those protections are little more than sandbags against an inevitable AI flood: hard-won but already straining to keep the technology at bay. 'The allowances in the contract are pretty astounding,' Bateman says by phone, her voice tight with exasperation. 'If you can picture the Teamsters allowing self-driving trucks in their contract — that's on par with what SAG did. If you're not making sure human roles are played by human actors, I'm not sure what the union is for.'

To Bateman, the idea that AI expands access to filmmaking — a central tenet of its utopian sales pitch — is a dangerous myth, one that obscures deeper questions about authorship and the value of creative labor. 'Anyone can make a film — my last two, I shot on an iPhone,' Bateman says. 'The idea that AI is "democratizing film" doesn't even make sense. What it really does is remove the barrier of skill. It lets people pretend they're filmmakers when they're not, by prompting software that wouldn't even function without having stolen a hundred years of film and TV production made by real filmmakers.'

Bateman's opposition to AI is rooted in a deep distrust of Silicon Valley's expanding influence over the creative process and a belief that filmmaking should be driven by artists, not algorithms. 'The tech bro business completely jumped the shark with generative AI,' she says. 'Is it solving plastics in the ocean? Homelessness? L.A. traffic? Not that I'm aware of.' She scoffs at the supposed efficiencies AI brings to the filmmaking process: 'It's like saying, whatever somebody enjoys — sex or an ice cream sundae — "Hey, now you can do it in a quarter of the time." OK, but then what do you think life is for?'

To Bateman, an actor's voice, face, movements or even their choice of costume is not raw material to be reshaped but an expression of authorship. AI, in her view, erases those choices and the intent behind them.
'I'm deeply against changing what the actor did,' she says. 'It's not right to have the actor doing things or saying things they didn't do — or to alter their hair, makeup or clothes in postproduction using AI. The actor knows what they did.'

While Bateman has been public and unwavering in her stance, many actors remain unsure whether to raise their voices. In the wake of the strikes, much of the conversation around AI has moved behind closed doors, leaving those who do speak out feeling at times exposed and alone.

Scarlett Johansson, who lent her smoky, hypnotic voice to the fictional AI in Spike Jonze's Oscar-winning 2013 film 'Her,' now finds herself in a uniquely uncomfortable position: She's both a symbol of our collective fascination with artificial performance and a real-world example of what's at stake when that line is crossed. Last year, she accused OpenAI of using a chatbot voice that sounded 'eerily similar' to hers, months after she declined to license it. OpenAI denied the claim and pulled the voice, but the incident reignited concern over consent and control. Johansson has long spoken out against the unauthorized use of her image, including her appearance in deepfake pornography, and has pushed for stronger safeguards against digital impersonation.

To date, though, she is one of the few major stars to publicly push back against the creeping mimicry enabled by AI — and she's frustrated that more haven't joined her. 'There has to be some agreed-upon set of boundaries in order for [AI] to not be detrimental,' she told Vanity Fair in May. 'I wish more people in the public eye would support and speak out about that. I don't know why that's not the case.'

Ed Ulbrich, 60, a pioneering visual effects producer and co-founder of Digital Domain, has spent his career helping actors do the impossible, one pixel at a time.
In 2008's 'The Curious Case of Benjamin Button,' he led a team of more than 150 artists in building a fully digital version of Brad Pitt's face so the actor could convincingly age in reverse — a two-year effort that earned Ulbrich and three colleagues an Oscar for visual effects and set a new benchmark for digital performance. (Nearly two decades later, the achievement is still impressive, although some scenes, especially those with Pitt's aged face composited on a child's body, now show their digital seams.) For 2010's 'Tron: Legacy,' Ulbrich helped digitally transform Jeff Bridges into his 1982 self using motion capture and CGI.

Working on last year's 'Here' — Robert Zemeckis' technically daring drama starring Tom Hanks and Robin Wright as a couple whose lives play out across decades in a single New Jersey living room — showed Ulbrich just how far things have come. For someone who jokes he has 'real estate in the uncanny valley,' it wasn't just the AI-enabled realism that floored him. It was the immediacy. On set, AI wasn't enhancing footage after the fact; it was visually reshaping the performance in real time.

'You look up and see 67-year-old Tom Hanks. You look down at the monitor — he's 20, and it looks better than the best CGI,' Ulbrich says. 'In my world, the human face is the holy grail. That is the most complicated thing you can do. And now it's getting done in near real time before your eyes. The actor can come back and look at the monitor and get new ideas, because they're seeing a different version of themselves: younger, older, as an alien or whatever.'

This kind of seamless AI-driven alteration marks a new frontier in postproduction. Modern AI systems can now 'beautify' actors' faces, as one might with an Instagram or Zoom filter: smooth out wrinkles, alter skin tone, sharpen jawlines, subtly nudge eye position to better match a desired gaze.
What once required painstaking VFX can now be handled by fast, flexible AI tools, often with results invisible to audiences. Once limited to big-budget sci-fi and fantasy productions, this digital touch-up capability is expanding into rom-coms, prestige dramas, high-end TV and even some indie films. Dialogue can be rewritten and re-lipped in post. Facial expressions can be smoothed or swapped without reshoots. More and more, viewers may have no way of knowing what's real and what's been subtly adjusted.

'Here' was largely rejected by both audiences and critics, with some deeming its digitally de-aged performances more unsettling than moving. But Ulbrich says digitally enhanced performance is already well underway. Talent agency CAA has built a vault of client scans, a kind of biometric asset library for future productions. Some stars now negotiate contracts that reduce their time on set, skipping hours in the makeup chair or performance-capture gear, knowing AI can fill in the gaps.

'Robert Downey, Brad Pitt, Will Smith — they've all been scanned many times,' says Ulbrich, who recently joined the AI-driven media company Moonvalley, which pitches itself as a more ethical, artist-centered player in the space. 'If you've done a studio tentpole, you've been scanned.'

'There is a lot of fear around AI and it's founded,' he adds. 'Unless you do something about it, you can just get run over. But there are people out there that are harnessing this. At this point, fighting AI is like fighting against electricity.'

While many in Hollywood wrestle with what AI means for the oldest component of moviemaking, others take a more pragmatic view, treating it as a tool to solve problems and keep productions on track. Jerry Bruckheimer, the powerhouse producer behind 'Top Gun,' 'Pirates of the Caribbean' and this summer's 'F1,' is among those embracing its utility.
'AI is not going anywhere and it's only going to get more useful for people in our business,' he said in a recent interview with The Times. He recalled one such moment during postproduction on his new Brad Pitt–led Formula One drama, a logistical feat filmed during actual Formula One races across Europe and the Middle East, with a budget north of $200 million.

'Brad was in the wilds of New Zealand, and we had test screenings coming up,' Bruckheimer says. 'We couldn't get his voice to do some looping, so we used an app that could mimic Brad Pitt. I'm sure the union will come after me if you write that, but it wasn't used in the movie because he became available.'

While he's skeptical of AI's ability to generate truly original ideas — 'We're always going to need writers,' he says — Bruckheimer, whose films have grossed more than $16 billion worldwide, sees AI as a powerful tool for global reach. 'They can take Brad's voice from the movie and turn it into other languages so it's actually his voice, rather than another actor,' he says. 'If it's not available yet, it will be.'

The debate over AI in performance flared earlier this year with 'The Brutalist,' Brady Corbet's award-winning drama about a Hungarian architect. After the film's editor, Dávid Jancsó, revealed that AI voice-cloning software had been used to subtly modify the Hungarian accents of stars Adrien Brody and Felicity Jones, the backlash followed swiftly. Some critics accused the film of using AI to smooth over performances while presenting itself as handcrafted, a move one viral post derided as trying to 'cheap out without soul.' Corbet later clarified that AI was used sparingly, only to adjust vowel sounds, but the decision left some viewers uneasy — even as Brody went on to win the Oscar for lead actor.

If the controversy over 'The Brutalist' struck some as a moral crisis, David Cronenberg found the whole thing overblown.
Few filmmakers have probed the entanglement of flesh, identity and technology as relentlessly as the director of 'Videodrome,' 'The Fly' and last year's 'The Shrouds,' so he's not particularly rattled by the rise of AI-assisted performances. 'All directors have always messed around with actors' performances — that's what editing is,' Cronenberg told The Times in April. 'Filmmaking isn't theater. It's not sacred. We've been using versions of this for years. It's another tool in the toolbox. And it's not controlling you — you can choose not to use it.'

Long before digital tools, Cronenberg recalls adjusting actor John Lone's vocal pitch in his 1993 film 'M. Butterfly,' in which Lone played a Chinese opera singer and spy who presents as a woman to seduce a French diplomat. The director raised the pitch when the character appeared as a woman and lowered it when he didn't — a subtle manipulation to reinforce the illusion.

Far from alarmed, Cronenberg is intrigued by AI's creative potential as a way of reshaping authorship itself. With new platforms like OpenAI's Sora and Google's Veo 3 now capable of generating increasingly photorealistic clips from simple text prompts, an entire performance could conceivably be conjured from a writer's keyboard. 'Suddenly you can write a scene — a woman is walking down the street, she looks like this, she's wearing that, it's raining, whatever — and AI can create a video for you,' Cronenberg says. 'To me, this is all exciting. It absolutely can threaten all kinds of jobs and that has to be dealt with, but every technological advance has done that and we just have to adapt and figure it out.'

In the Hollywood of the late 1970s, there was no AI to tweak an actor's face. So when 'Star Wars' star Mark Hamill fractured his nose and left cheekbone in a serious car crash between shooting the first and second films, the solution was to tweak the story.
The 1980 sequel 'The Empire Strikes Back' opened with Luke Skywalker being attacked by a nine-foot-tall snow beast called a wampa on the ice planet Hoth, partly to account for the change in his appearance. Decades later, when Hamill was invited to return as a younger version of himself in the 2020 Season 2 finale of 'The Mandalorian,' the chance to show Luke 'at the height of his powers' was irresistible, he says. But the reality left him feeling oddly detached from the character that made him famous.

Hamill shared the role with a younger body double, and digital de-aging tools recreated his face from decades earlier. The character's voice, meanwhile, was synthesized using Respeecher, a neural network trained on old recordings of Hamill to mimic his speech from the original trilogy era.

'I didn't have that much dialogue: "Are you Luke Skywalker?" "I am,"' Hamill recalled in an interview with The Times earlier this year. 'I don't know what they do when they take it away, in terms of tweaking it and making your voice go up in pitch or whatever.'

When fans speculated online that he hadn't participated at all, Hamill declined to correct the record. 'My agent said, "Do you want me to put out a statement or something?"' Hamill recalls. 'I said, "Eh, people are going to say what they want to say." Maybe if you deny it, they say, "See? That proves it — he's denying it."'

When Luke returned again in a 2022 episode of 'The Book of Boba Fett,' the process was even more synthetic: Hamill was minimally involved on camera, and the character was built almost entirely from digital parts, with a de-aged face mapped onto a body double and an AI-generated voice delivering his lines. Hamill was credited and compensated, though the exact terms of the arrangement haven't been made public.
The visual effect was notably improved from earlier efforts, thanks in part to a viral deepfake artist known as Shamook, whose YouTube video improving the VFX in 'The Mandalorian' finale had racked up millions of views. He was soon hired by Industrial Light & Magic — a rare case of fan-made tech critique turning into a studio job. 'In essence, yes, I did participate,' Hamill says.

It's one thing to be digitally altered while you're still alive. It's another to keep performing after you're gone. Before his death last year, James Earl Jones — whose resonant baritone helped define Darth Vader for generations — gave Lucasfilm permission to recreate his voice using AI. In a recent collaboration with Disney, Epic Games deployed that digital voice in Fortnite, allowing players to team up with Vader and hear new lines delivered in Jones' unmistakable tones, scripted by Google's Gemini AI. In May, SAG-AFTRA filed a labor charge, saying the use of Jones' voice hadn't been cleared with the union.

Last year's 'Alien: Romulus' sparked similar backlash over the digital resurrection of Ian Holm's android character Ash, four years after Holm's death in 2020. Reconstructed using a blend of AI and archival footage, the scenes were slammed by some fans as a form of 'digital necromancy.' For the film's home video release, director Fede Álvarez quietly issued an alternate cut that relied more heavily on practical effects, including an animatronic head modeled from a preexisting cast of Holm's face.

For Hollywood, AI allows nostalgia to become a renewable resource, endlessly reprocessed and resold. Familiar faces can be altered, repurposed and inserted into entirely new stories. The audience never has to say goodbye, and the industry never has to take the risk of introducing someone new. Hamill, for his part, seems ready to let go of Luke. After his final arc in 2017's 'The Last Jedi,' he says he feels a sense of closure.
'I don't know the full impact AI will have but I find it very ominous,' he says. 'I'm fine. I had my time. Now the spotlight should be on the current and future actors and I hope they enjoy it as much as I did.'

Actor Tye Sheridan knows how dark an AI future could get. After all, he starred in Steven Spielberg's 2018 'Ready Player One,' a sci-fi thriller set inside a corporate-controlled world of digital avatars. But Sheridan isn't trying to escape into that world — he's trying to shape the one ahead. With VFX supervisor Nikola Todorovic, Sheridan co-founded Wonder Dynamics in 2017 to explore how AI can expand what's possible on screen. Their platform uses AI to insert digital characters into live-action scenes without green screens or motion-capture suits, making high-end VFX more accessible to low-budget filmmakers. Backed by Spielberg and 'Avengers' co-director Joe Russo, Wonder Dynamics was acquired last year by Autodesk, the software firm behind many animation and design tools.

'Since the advent of the camera, technology has been pushing this industry forward,' Sheridan, 28, says on a video call. 'AI is just another part of that path. It can make filmmaking more accessible, help discover new voices. Maybe the next James Cameron will find their way into the industry through some AI avenue. I think that's really exciting.'

With production costs spiraling, Todorovic sees AI as a way to lower the barrier to entry and make riskier, more ambitious projects possible. 'We really see AI going in that direction, where you can get those A24-grounded stories with Marvel visuals,' he says. 'That's what younger audiences are hungry for.' The shift, Todorovic argues, could lead to more films overall and more opportunities for actors. 'Maybe instead of 10,000 people making five movies, it'll be 1,000 people making 50,' he says. Still, Todorovic sees a threshold approaching, one where synthetic actors could, in theory, carry a film.
'I do think technically it is going to get solved,' Todorovic says. 'But the question remains — is that what we really want? Do we really want the top five movies of the year to star humans who don't exist? I sure hope not.' For him, the boundary isn't just about realism. It's about human truth. 'You can't prompt a performance,' he says. 'You can't explain certain movements of the body and it's very hard to describe emotions. Acting is all about reacting. That's why when you make a movie, you do five takes — or 40. Because it's hard to communicate.'

Sheridan, who has appeared in the 'X-Men' franchise as well as smaller dramas like 'The Card Counter' and 'The Tender Bar,' understands that instinctively and personally. 'I started acting in films when I was 11 years old,' he says. 'I wouldn't ever want to build something that put me out of a job. That's the fun part — performing, exploring, discovering the nuances. That's why we fall in love with certain artists: their unique sensibility, the way they do what no one else can.'

He knows that may sound contradictory coming from the co-founder of an AI company. That's exactly why he believes it's critical that artists, not Silicon Valley CEOs, are the ones shaping how the technology is used. 'We should be skeptical of AI and its bad uses,' he says. 'It's a tool that can be used for good or bad. How are we going to apply it to create more access and opportunity in this industry and have more voices heard? We're focused on keeping the artist as an essential part of the process, not replacing them.'

For now, Sheridan lives inside that paradox, navigating a technology that could both elevate and imperil the stories he cares most about. His next acting gig? 'The Housewife,' a psychological drama co-starring Naomi Watts and Michael Imperioli, in which he plays a 1960s New York Times reporter investigating a suspected Nazi hiding in Queens. No AI. No doubles. Just people pretending to be other people the old way, while it lasts.


Scottish Sun
22-07-2025
- Entertainment
- Scottish Sun
Netflix admits it used AI to make ‘amazing' scene in hit TV show – but did YOU spot it?
NETFLIX has admitted to using generative AI to create visual effects in a new original TV show - are you able to spot it?

The streaming giant confirmed the move in its latest earnings call, with co-CEO Ted Sarandos saying they traded in traditional VFX for generative AI in one scene. The show in question, The Eternaut, began airing on Netflix in April.

Doing so was not only faster, but much cheaper than outsourcing the shot to a traditional VFX house, Engadget first reported. Creators of the sci-fi Netflix original wanted a collapsing building sequence to anchor a key moment in the story. But a sequence of that scale would have required VFX work that was apparently out of budget for the Argentine post-apocalyptic drama.

"Using AI-powered tools, they were able to achieve an amazing result with remarkable speed," Sarandos said. "In fact, that VFX sequence was completed 10 times faster than it could have been completed with... traditional VFX tools and workflows." Sarandos added that the shot "just wouldn't have been feasible for a show on that budget."

Generative AI, or Gen AI, is a type of artificial intelligence that can create text, images, music, and videos from prompts given to it by humans. This content can come in all kinds of styles - cartoonish, or even hyper-realistic and therefore difficult to distinguish from real life.

Netflix reportedly has plans to roll out AI-generated adverts for ad-tier subscribers in 2026. The company is also testing a new search feature powered by OpenAI models, according to Bloomberg.
But The Eternaut marks a milestone, becoming "the very first Gen AI final footage to appear on screen in a Netflix original series or film," Sarandos said.

The shift towards generative AI is already happening within Hollywood. Films like 10-time Oscar nominee The Brutalist and Late Night with the Devil faced backlash for even light AI involvement. The issue is already on the radar of SAG-AFTRA, the union whose members went on strike against AI use in video games last summer.

"The video game industry generates billions of dollars in profit annually. The driving force behind that success is the creative people who design and create those games," SAG-AFTRA president Fran Drescher said at the time. "That includes the SAG-AFTRA members who bring memorable and beloved game characters to life, and they deserve and demand the same fundamental protections as performers in film, television, streaming and music: fair compensation and the right of informed consent for the AI use of their faces, voices and bodies."


Buzz Feed
21-07-2025
- Entertainment
- Buzz Feed
Ex-Euphoria Actor's New Job Sparks Fan Outrage
Euphoria fans just received a huge reality check after seeing what it's really like being a struggling actor in Hollywood. Nika King is widely known for playing Leslie Bennett (aka Rue and Gia's mom) on the hit HBO series for the first two seasons.

Three years after Season 2 debuted in January 2022, it was announced that Season 3 had finally begun filming. The announcement even included a photo of Zendaya on set for additional proof. Delays in production were due to a slew of reasons, ranging from the SAG-AFTRA and WGA strikes in 2023 to alleged behind-the-scenes drama, with claims of a "toxic" work environment and grueling "18-hour" workdays.

During this delay, Euphoria producer Kevin Turen died in November 2023 from multiple heart issues, and Angus Cloud, who starred as the beloved Fez, tragically died of an accidental overdose at the age of 25 in July 2023. On top of all that, two cast members revealed they had no plans to return to the show: Barbie Ferreira announced the shocking news shortly after Season 2 ended, while Storm Reid revealed she would be focusing on other things, like graduating from USC and working on projects through her production company.

The delays also caused strain and stress for some of the cast members, including Nika. In March 2024, during one of her stand-up comedy shows posted on her TikTok, the actor and comedian admitted she was having trouble paying her rent because filming for Euphoria was taking so long to commence. 'People are like, "We need Season 3." I'm like, "Bitch, I need Season 3. I haven't paid my rent in six months. I thought my career was on the rise after Euphoria. I thought I was good. It don't work that way. I called Taraji [P. Henson] and she was like, 'Bitch, get used to it.'"'

While she recently starred in the American drama film Sound of Hope: The Story of Possum Trot, we should note that filming for it began back in 2022.
Fast forward to February 2025: Nika announced her "character is not coming back to the show" after she received tons of DMs and comments asking about Season 3. "Unfortunately, I'm not [returning]," she said in a video posted online. "My character is not coming back to the show, but I am forever grateful to HBO, Zendaya, and Sam Levinson for giving me the opportunity to come on set. Without Euphoria, I was not able to step into who I am as an actor."

Fans showered her with words of encouragement on her future endeavors, while also expressing sadness that they won't see her in future episodes. But concern for Nika grew when she recently shared a video of herself cleaning what appeared to be a restaurant kitchen. She captioned the clip, "When ppl ask me if I'm filming S3 of Euphoria?" while the caption for the post read, "A job is a job." Reactions ranged from saying how "fucked up" it is that her promising acting career led her to a non-Hollywood position, to applause for her vulnerability in sharing the realities of being a struggling actor in the industry.

At first glance, I had similar thoughts, but I quickly found out there was more to her post than meets the eye. After noticing she tagged an account on the post, I checked the page out and discovered Nika not only works in that kitchen, but she co-owns the establishment it's located in. The account is linked to Blue Tree Cafe, a vegan soul food restaurant run by Nika and her mother, Sharon Allen. The page was a pleasant surprise because the food looks delicious, and their playful relationship shines in their videos. Others soon caught on as well.

But owning a restaurant is no easy feat either. Nika hopped on a popular social media trend where people show how they mask their true emotions while dealing with a crisis; for Nika, that crisis was the current status of the restaurant. Nika and her mother also set up a GoFundMe to help keep the establishment from closing.
I truly appreciate the transparency, even if it's delivered through laughter. I don't know about you, but I'm wishing Nika success in all of her passions: acting, comedy, and running her restaurant.

Engadget
18-07-2025
- Entertainment
- Engadget
Netflix is already using generative AI in its original shows
Netflix admitted during its earnings call on Thursday that it used generative AI to create VFX in The Eternaut, a Netflix original from Argentina that was released in April 2025. The company's co-CEO Ted Sarandos said that generative AI was specifically used for a VFX shot in the post-apocalyptic drama, but the move is one of several ways Netflix is embracing AI.

According to Sarandos, the creators of The Eternaut wanted to include a shot of a building collapsing in Buenos Aires, and rather than contract a studio of visual effects artists to create the footage, Netflix used generative AI to create it. "Using AI powered tools, they were able to achieve an amazing result with remarkable speed," Sarandos shared during the earnings call. "In fact, that VFX sequence was completed 10 times faster than it could have been completed with... traditional VFX tools and workflows." The shot "just wouldn't have been feasible for a show on that budget," Sarandos says, as someone with some input on the show's budget.

The executive says that The Eternaut features "the very first Gen AI final footage to appear on screen in a Netflix original series or film." Clearly, the show is also a prototype for how Netflix can avoid costs it doesn't want to swallow in the future.

Workers in the entertainment industry have not taken kindly to the use of generative AI. Labor strikes, including the recently resolved SAG-AFTRA video game strike, have made securing protections against AI a central issue. The Oscar-nominated film The Brutalist came under fire in 2024 for using AI tools during production. Beyond that, whether generative AI models were illegally trained on copyrighted material is still an open question. Netflix plans to use generative AI to create ads for its ad-supported subscription tier, and the company is reportedly testing a new search feature powered by OpenAI models.
Using generative AI in production might seem par for the course for a company that's already invested in the technology, but it could help normalize a tool that many creatives remain actively against.