Latest news with #FairlyTrained
Yahoo
11-06-2025
- Entertainment
- Yahoo
How the Disney-Midjourney Suit Could Reshape AI Copyright Law
On Wednesday, the long-simmering dispute between Hollywood and the AI industry escalated dramatically when Disney and Universal sued Midjourney, one of the most prominent AI image generators, for copyright infringement. The two Hollywood heavyweight studios argue that Midjourney allows its users to 'blatantly incorporate and copy Disney's and Universal's famous characters,' such as Shrek and Spider-Man. 'Piracy is piracy, and the fact that it's done by an AI company does not make it any less infringing,' Horacio Gutierrez, Disney's chief legal officer, said in a general statement.

The lawsuit challenges one of the AI industry's fundamental assumptions: that it should be allowed to train upon copyrighted materials under the principle of fair use. How the case gets resolved could have major implications for both AI and Hollywood going forward. 'I really think the only thing that can stop AI companies doing what they're doing is the law,' says Ed Newton-Rex, the CEO of nonprofit organization Fairly Trained, which provides certifications for AI models trained on licensed data. 'If these lawsuits are successful, that is what will hopefully stop AI companies from exploiting people's life's work.'

AI companies train their models upon vast amounts of data scoured from across the web. While most of these companies have resisted admitting that they scrape copyrighted material, there are already dozens of AI copyright-related lawsuits in the U.S. alone alleging otherwise. Midjourney, which allows its millions of registered users to generate images from prompts, faces a class-action suit led by artists including Kelly McKernan, who found that users were inputting the artist's name as a keyword in Midjourney to spit out eerily similar artworks. 'These companies are profiting wildly off our unpaid labor,' they told TIME in 2023.

For the last few years, Hollywood has refrained from entering the fray, while sending mixed messages about AI. During contract negotiations in 2023, AI was a major source of contention between unions like SAG-AFTRA and producers, who advanced a 'groundbreaking AI proposal' involving the use of 'digital replicas' to fill out the backgrounds of film scenes. But while some in Hollywood hope AI will make filmmaking more efficient and less expensive, many more have grown concerned about the AI industry's usage of copyrighted material.

This concern has come to a head with the Disney-Universal lawsuit, which is the first major lawsuit brought by Hollywood studios against an AI company. The lawsuit seeks damages and an injunction that would immediately stop Midjourney's operations—and casts generative AI theft as a problem that 'threatens to upend the bedrock incentives of U.S. copyright law.' Midjourney did not immediately respond to a request for comment.

'We are bringing this action today to protect the hard work of all the artists whose work entertains and inspires us and the significant investment we make in our content,' said Kim Harris, executive vice president and general counsel of NBCU. Newton-Rex believes that this lawsuit is particularly significant because of the size, influence and resources of Disney and Universal. 'The more that these mainstays of the American economy weigh into this fight, the harder it is to ignore the simple truth here,' he says.

In February, a Delaware judge dealt a blow to the AI industry's 'fair use' argument, ruling that a legal research firm was not allowed to copy the content of Thomson Reuters to build a competing AI-based legal platform. If the Disney-Universal lawsuit is similarly successful, that would have major implications for both AI and Hollywood, says Naeem Talukdar, the CEO of the AI video startup Moonvalley. Many AI companies might have to retrain their visual models from the ground up with licensed content. And Hollywood, if given legal clarity, might actually accelerate its usage of AI models built upon licensed content, like ones built by Natasha Lyonne's and Bryn Mooser's Asteria Film Co. 'Nobody wants to touch these models with a 10-foot pole, because there's a sense that you'll just get sued on the outputs later,' Talukdar says. 'I would expect that if this judgment falls a certain way, you'll see a lull, and then you'll have a new class of models emerge that pays the creators. And then you'll see this avalanche of studios that can now actually start using these models much more freely.'

Unsurprisingly, AI companies are fighting back in court. They're also working on another path forward to retain their ability to train their models as they see fit: through governmental policy. In January, OpenAI sent a memo to the White House arguing their ability to train on copyrighted material should be 'preserved.' They then relaxed several rules around copyright in the name of 'creative freedom,' which triggered a flood of Studio Ghibli-style images on social media. In the U.K., the government announced plans to give AI companies access to any copyrighted work that rights holders hadn't explicitly opted out of, which drew a huge backlash from stars like Paul McCartney and Dua Lipa. Last week, the House of Lords rejected the legislation for a fourth time.

Newton-Rex says that this dispute over AI and copyright will not be resolved any time soon. 'Billion-dollar AI companies have staked their entire businesses on the idea that they are allowed to take people's life's work and build on it to compete with them. I don't think they're easily going to give that up because of one lawsuit,' he says. Nevertheless, he says that the announcement of this lawsuit is 'really good for creators everywhere.'


Time Magazine
11-06-2025
- Entertainment
- Time Magazine
How the Disney-Midjourney Lawsuit Could Reshape the Battle Over AI and Copyright
On Wednesday, the long-simmering dispute between Hollywood and the AI industry escalated dramatically when Disney and Universal sued Midjourney, one of the most prominent AI image generators, for copyright infringement. The two Hollywood heavyweight studios argue that Midjourney allows its users to 'blatantly incorporate and copy Disney's and Universal's famous characters,' such as Shrek and Spider-Man. 'Piracy is piracy, and the fact that it's done by an AI company does not make it any less infringing,' Horacio Gutierrez, Disney's chief legal officer, said in a general statement.

The lawsuit challenges one of the AI industry's fundamental assumptions: that it should be allowed to train upon copyrighted materials under the principle of fair use. How the case gets resolved could have major implications for both AI and Hollywood going forward. 'I really think the only thing that can stop AI companies doing what they're doing is the law,' says Ed Newton-Rex, the CEO of nonprofit organization Fairly Trained, which provides certifications for AI models trained on licensed data. 'If these lawsuits are successful, that is what will hopefully stop AI companies from exploiting people's life's work.'

A growing backlash against AI training norms

AI companies train their models upon vast amounts of data scoured from across the web. While most of these companies have resisted admitting that they scrape copyrighted material, there are already dozens of AI copyright-related lawsuits in the U.S. alone alleging otherwise. Midjourney, which allows its millions of registered users to generate images from prompts, faces a class-action suit led by artists including Kelly McKernan, who found that users were inputting the artist's name as a keyword in Midjourney to spit out eerily similar artworks. 'These companies are profiting wildly off our unpaid labor,' they told TIME in 2023.

For the last few years, Hollywood has refrained from entering the fray, while sending mixed messages about AI. During contract negotiations in 2023, AI was a major source of contention between unions like SAG-AFTRA and producers, who advanced a 'groundbreaking AI proposal' involving the use of 'digital replicas' to fill out the backgrounds of film scenes. But while some in Hollywood hope AI will make filmmaking more efficient and less expensive, many more have grown concerned about the AI industry's usage of copyrighted material.

This concern has come to a head with the Disney-Universal lawsuit, which is the first major lawsuit brought by Hollywood studios against an AI company. The lawsuit seeks damages and an injunction that would immediately stop Midjourney's operations—and casts generative AI theft as a problem that 'threatens to upend the bedrock incentives of U.S. copyright law.' Midjourney did not immediately respond to a request for comment.

'We are bringing this action today to protect the hard work of all the artists whose work entertains and inspires us and the significant investment we make in our content,' said Kim Harris, executive vice president and general counsel of NBCU. Newton-Rex believes that this lawsuit is particularly significant because of the size, influence and resources of Disney and Universal. 'The more that these mainstays of the American economy weigh into this fight, the harder it is to ignore the simple truth here,' he says.

In February, a Delaware judge dealt a blow to the AI industry's 'fair use' argument, ruling that a legal research firm was not allowed to copy the content of Thomson Reuters to build a competing AI-based legal platform. If the Disney-Universal lawsuit is similarly successful, that would have major implications for both AI and Hollywood, says Naeem Talukdar, the CEO of the AI video startup Moonvalley. Many AI companies might have to retrain their visual models from the ground up with licensed content. And Hollywood, if given legal clarity, might actually accelerate its usage of AI models built upon licensed content, like ones built by Natasha Lyonne's and Bryn Mooser's Asteria Film Co. 'Nobody wants to touch these models with a 10-foot pole, because there's a sense that you'll just get sued on the outputs later,' Talukdar says. 'I would expect that if this judgment falls a certain way, you'll see a lull, and then you'll have a new class of models emerge that pays the creators. And then you'll see this avalanche of studios that can now actually start using these models much more freely.'

A governmental loophole?

Unsurprisingly, AI companies are fighting back in court. They're also working on another path forward to retain their ability to train their models as they see fit: through governmental policy. In January, OpenAI sent a memo to the White House arguing their ability to train on copyrighted material should be 'preserved.' They then relaxed several rules around copyright in the name of 'creative freedom,' which triggered a flood of Studio Ghibli-style images on social media. In the U.K., the government announced plans to give AI companies access to any copyrighted work that rights holders hadn't explicitly opted out of, which drew a huge backlash from stars like Paul McCartney and Dua Lipa. Last week, the House of Lords rejected the legislation for a fourth time.

Newton-Rex says that this dispute over AI and copyright will not be resolved any time soon. 'Billion-dollar AI companies have staked their entire businesses on the idea that they are allowed to take people's life's work and build on it to compete with them. I don't think they're easily going to give that up because of one lawsuit,' he says. Nevertheless, he says that the announcement of this lawsuit is 'really good for creators everywhere.'


New York Times
25-02-2025
- Entertainment
- New York Times
Their Album Is Wordless. Will Their Protest Against A.I. Resound?
Sometimes, silence speaks louder than song. That's the hope, at least, for more than 1,000 musicians who released a lyric-less album on Tuesday to protest the British government's proposal to expand the ways that developers can use copyright-protected works to train artificial intelligence models.

The album, which was created by artists including Annie Lennox, Billy Ocean, Hans Zimmer and Kate Bush, is not exactly silent: It features recordings of empty studios, which the artists say represent 'the impact we expect the government's proposals would have on musicians' livelihoods.' There are footsteps and rustles — is that a door closing? a page turning? a fly? — but only the most out-there contemporary composers would refer to the sounds as songs.

'Doesn't that silence say it all?' Kate Bush, who contributed to the album, said in a statement, adding, 'If these changes go ahead, the life's work of all the country's musicians will be handed over to A.I. companies for free.'

Under the government's proposals, artists would have to opt out, or 'reserve their rights,' to keep their works from being used to train A.I. The window for public comments on the proposal, which is part of a broader government consultation on copyright and artificial intelligence, was set to close Tuesday night.

'Opt-out shifts the burden of controlling your works onto the rights holder,' said Ed Newton-Rex, who organized the album and is the chief executive of Fairly Trained, a nonprofit that certifies generative A.I. companies for the training data they use. 'Basically,' he said, of the current government proposal, 'it flips copyright on its head.'

Even as some artists experiment with artificial intelligence, many fear that developers are inappropriately using their work without compensating them. (Publishers and journalists are also concerned: The New York Times has sued OpenAI and Microsoft for copyright infringement of news content related to A.I. systems. OpenAI and Microsoft have denied those claims.)

The album — titled 'Is This What We Want?' — has 12 songs, each with a one-word title; together, the titles spell out the sentence: 'The British government must not legalize music theft to benefit A.I. companies.' Only some of the artists who were part of the album project directly contributed to the audio, Mr. Newton-Rex said, although he said that all shared in the credits.

Mr. Newton-Rex and other critics fear that artists may not even know if their work is being used to train the A.I. models. He said that he had previously run opt-out schemes at generative A.I. companies, which he called an 'illusion,' in part because copyrighted work can spread so quickly online that creators can lose control of it.

Powerful A.I. developers have repeatedly shown that they are willing to skirt copyright law to train systems. And Britain, desperate to revive its sluggish economy, is aggressively trying to court A.I. developers. Prime Minister Keir Starmer recently said he plans to push Britain to be 'the world leader.' The country has already signaled its willingness to break with the European Union and some of its other allies, like Australia and Canada, in its attitude to the technology. At a recent A.I. summit in Paris, Britain sided with the United States in declining to sign a communiqué calling for A.I. to be 'inclusive and sustainable.'

Now, Britain is arguing that a 'competitive copyright regime' is part of what is needed to 'build cutting-edge, secure and sustainable A.I. infrastructure.' The proposals, which were announced late last year, call the current system unclear and say that it is hampering innovation for both A.I. developers and artists. Britain argues that the proposed changes are meant to give artists more control over the way their work is used and more opportunities for payment.

In response to a request for comment, the Department for Science, Innovation and Technology said that Britain's current copyright structure is holding both artists and A.I. companies back from full innovation. But it also noted that no decisions had been finalized and that it would consider the responses it received before setting out next steps. Britain's consultation process, in which the government asks for public input at the early stages of policy proposals, is designed to take in feedback and often leads to revisions.

As the consultation period ended on Tuesday, British artists and publishers released a series of protests. Several newspapers featured identical campaign images across their front pages that read: 'Make it fair: The government wants to change the U.K.'s laws to favor big tech platforms so they can use British creative content.' The musicians Paul McCartney, Elton John and Dua Lipa, the novelist Kazuo Ishiguro and the actor Stephen Fry were among the artists who signed a letter in protest that was published in The Times of London. 'There is no moral or economic argument for stealing our copyright,' the artists wrote. 'Taking it away will devastate the industry and steal the future of the next generation.'


The Independent
25-02-2025
- Business
- The Independent
UK creative industries launch campaign against AI tech firms' content use
The UK's creative industries have launched a new campaign to fight back against their content being used for free by global tech AI firms. Campaigners have warned that the arts face an 'existential threat' from AI models which scrape creative content from the internet without permission or payment. It comes at the end of a government consultation which will determine whether to let tech companies use content without permission unless the creators specifically say 'no'.

Those affected could include artists, authors, journalists, illustrators, photographers, filmmakers, scriptwriters, singers and songwriters, who argue that they will now have to police their work. The campaign has stressed that if the government legitimises this use of content, the impact will be devastating on an industry which collectively brings in £120bn per year to the UK economy.

Throughout the next week, media outlets will run the 'Make It Fair' campaign with the message: 'The government wants to change the UK's laws to favour big tech platforms so they can use British creative content to power their AI models without our permission or payment. Let's protect the creative industries – it's only fair.'

Launching the campaign today, Owen Meredith, CEO of the News Media Association, said: 'We already have gold-standard copyright laws in the UK. They have underpinned growth and job creation in the creative economy across the UK – supporting some of the world's greatest creators – artists, authors, journalists, scriptwriters, singers and songwriters to name but a few.

'And for a healthy democratic society, copyright is fundamental to publishers' ability to invest in trusted quality journalism.

'The only thing which needs affirming is that these laws also apply to AI, and transparency requirements should be introduced to allow creators to understand when their content is being used. Instead, the government proposes to weaken the law and essentially make it legal to steal content.

'There will be no AI innovation without the high-quality content that is the essential fuel for AI models. We're appealing to the great British public to get behind our 'Make It Fair' campaign and call on the government to guarantee creatives are able to secure proper financial reward from AI firms to ensure a sustainable future for both AI and the creative industries.'

Launching a music industry campaign to coincide with 'Make It Fair', choral composer Ed Newton-Rex, founder of Fairly Trained, a non-profit that certifies generative AI companies for training data practices that respect creators' rights, said: 'One thousand UK musicians released a joint album today, recordings of empty studios, calling on the government to change course or risk empty studios becoming the norm.

'The government's proposals would hand the life's work of the UK's talented creators – its musicians, its writers, its artists – to AI companies, for free. The government must change course and make it fair.'


Voice of America
25-02-2025
- Entertainment
- Voice of America
Musicians release silent album to protest UK's AI copyright changes
More than 1,000 musicians including Kate Bush and Cat Stevens on Tuesday released a silent album to protest proposed changes to Britain's copyright laws which could allow tech firms to train artificial intelligence models using their work.

Creative industries globally are grappling with the legal and ethical implications of AI models that can produce their own output after being trained on popular works without necessarily paying the creators of the original content. Britain, which Prime Minister Keir Starmer wants to become an AI superpower, has proposed relaxing laws that currently give creators of literary, dramatic, musical and artistic works the right to control the ways their material may be used.

The proposed changes would allow AI developers to train their models on any material to which they have lawful access, and would require creators to proactively opt out to stop their work being used. The changes have been heavily criticized by many artists, who say they would reverse the principle of copyright law, which grants exclusive control to creators for their work.

"In the music of the future, will our voices go unheard?" said Bush, whose 1985 hit "Running Up That Hill" enjoyed a resurgence in 2022 thanks to Netflix show "Stranger Things."

The co-written album titled "Is This What We Want?" features recordings of empty studios and performance spaces to represent what organizers say is the potential impact on artists' livelihoods should the changes go ahead. A public consultation on the legal changes closes later on Tuesday.

Responding to the album, a government spokesperson said the current copyright and AI regime was holding back the creative industries, media and AI sector from "realizing their full potential." "We have engaged extensively with these sectors throughout and will continue to do so. No decisions have been taken," the spokesperson said, adding that the government's proposals will be set out in due course.

Annie Lennox, Billy Ocean, Hans Zimmer, Tori Amos and The Clash are among the musicians urging the government to review its plans. "The government's proposal would hand the life's work of the country's musicians to AI companies, for free, letting those companies exploit musicians' work to outcompete them," said organizer Ed Newton-Rex, the founder of Fairly Trained, a non-profit that certifies generative AI companies for fairer training data practices. "The UK can be leaders in AI without throwing our world-leading creative industries under the bus."