
Latest news with #SidneyStein

The New York Times wants your private ChatGPT history — even the parts you've deleted

The Hill

06-07-2025

Millions of Americans share private details with ChatGPT. Some ask medical questions or share painful relationship problems. Others even use ChatGPT as a makeshift therapist, sharing their deepest mental health struggles. Users trust ChatGPT with these confessions because OpenAI promised them that the company would permanently delete their data upon request.

But last week, in a Manhattan courtroom, a federal judge ruled that OpenAI must preserve nearly every exchange its users have ever had with ChatGPT — even conversations the users had deleted. As it stands now, billions of user chats will be preserved as evidence in The New York Times's copyright lawsuit against OpenAI. Soon, lawyers for the Times will start combing through private ChatGPT conversations, shattering the privacy expectations of over 70 million ChatGPT users who never imagined their deleted conversations could be retained for a corporate lawsuit.

In January, The New York Times demanded — and a federal magistrate judge granted — an order forcing OpenAI to preserve 'all output log data that would otherwise be deleted' while the litigation was pending. In other words, thanks to the Times, ChatGPT was ordered to keep all user data indefinitely — even conversations that users specifically deleted. Privacy within ChatGPT is no longer an option for all but a handful of enterprise users.

Last week, U.S. District Judge Sidney Stein upheld this order. His reasoning? It was a 'permissible inference' that some ChatGPT users were deleting their chats out of fear of being caught infringing the Times's copyrights. Stein also said that the preservation order didn't force OpenAI to violate its privacy policy, which states that chats may be preserved 'to comply with legal obligations.'

This is more than a discovery dispute. It's a mass privacy violation dressed up as routine litigation. And its implications are staggering. If courts accept that any plaintiff can freeze millions of uninvolved users' data, where does it end? Could Apple preserve every photo taken with an iPhone over one copyright lawsuit? Could Google save a log of every American's searches over a single business dispute? The Times is opening Pandora's box, threatening to normalize mass surveillance as another routine tool of litigation. And the chilling effects may be severe; when people realize their AI conversations can be exploited in lawsuits that they're not part of, they'll self-censor — or abandon these tools entirely.

Worst of all, the people most affected by this decision — the users — were given no notice, no voice, and no chance to object. When one user tried to intervene and stop this order, the magistrate judge dismissed him as not 'timely,' apparently expecting 70 million Americans to refresh court dockets daily and maintain litigation calendars like full-time paralegals. And last Thursday, Stein heard only from advocates for OpenAI and the Times, not from advocates for ordinary people who use ChatGPT. Affected users should have been allowed to intervene before their privacy became collateral damage.

The justification for the unprecedented preservation order was paper-thin. The Times argued that people who delete their ChatGPT conversations are more likely to have committed copyright infringement. And as Stein put it in the hearing, it's simple 'logic' that '[i]f you think you're doing something wrong, you're going to want that to be deleted.' This fundamentally misapprehends how people use generative AI.

The idea that users are systematically stealing the Times's intellectual property through ChatGPT, then cleverly covering their tracks, ignores the thousand legitimate reasons people delete chats. Users share intimate details about their lives with ChatGPT; of course they clear their conversations.

This precedent is terrifying. Now, Americans' private data could be frozen when a corporate plaintiff simply claims — without proof — that Americans' deleted content might add marginal value to their case. Today it's ChatGPT. Tomorrow it could be your cleared browser history or your location data. All they need to do is argue that Americans who delete things must have something to hide.

We hope the Times will back away from its stunning position. This is the newspaper that won a Pulitzer for exposing domestic wiretapping in the Bush era. The paper that built its brand in part by exposing mass surveillance. Yet here it is, demanding the biggest surveillance database in recorded history — a database that the National Security Agency could only dream of — all to win a copyright case. Now, in the next step of this litigation, the Times's lawyers will start sifting through users' private chats — all without users' knowledge or consent.

To be clear, the question of whether OpenAI infringed the Times's copyrights is for the courts to decide. But the resolution of that dispute should not cost 70 million Americans their privacy. What the Times calls 'evidence,' millions of Americans call 'secrets.' Maybe you have asked ChatGPT how to handle crippling debt. Maybe you have confessed why you can't sleep at night. Maybe you've typed thoughts you've never said out loud. Delete should mean delete. The New York Times knows better — it just doesn't care.

Jay Edelson has been recognized by Forbes as one of America's top 200 lawyers and by Fortune as one of the most creative people in business. His privacy cases have recovered over $1.5 billion for consumers nationwide.

OpenAI appeals data preservation order in NYT copyright case

Indian Express

08-06-2025

OpenAI is appealing an order in a copyright case brought by the New York Times that requires it to preserve ChatGPT output data indefinitely, arguing that the order conflicts with privacy commitments it has made with users. Last month, a court said OpenAI had to preserve and segregate all output log data after the Times asked for the data to be preserved.

'We will fight any demand that compromises our users' privacy; this is a core principle,' OpenAI CEO Sam Altman said in a post on X on Thursday. 'We think this (The Times demand) was an inappropriate request that sets a bad precedent.'

U.S. District Judge Sidney Stein was asked to vacate the May data preservation order on June 3, a court filing showed. The New York Times declined to comment.

The newspaper sued OpenAI and Microsoft in 2023, accusing them of using millions of its articles without permission to train the large language model behind its popular chatbot. Stein said in an April court opinion that the Times had made a case that OpenAI and Microsoft were responsible for inducing users to infringe its copyrights. The opinion explained an earlier order that rejected parts of an OpenAI and Microsoft motion to dismiss, saying that the Times' 'numerous' and 'widely publicized' examples of ChatGPT producing material from its articles justified allowing the claims to continue.

OpenAI to appeal New York Times suit demand asking it not to delete any user chats

Time of India

06-06-2025

Highlights
  • OpenAI is appealing a court order requiring it to preserve ChatGPT output data indefinitely, citing conflicts with user privacy commitments.
  • OpenAI Chief Executive Officer Sam Altman stated that the demand from The New York Times sets a bad precedent and compromises user privacy.
  • The New York Times has sued OpenAI and Microsoft for allegedly using its articles without permission to train the language model behind ChatGPT.

OpenAI is appealing an order in a copyright case brought by the New York Times that requires it to preserve ChatGPT output data indefinitely, arguing that the order conflicts with privacy commitments it has made with users. Last month, a court said OpenAI had to preserve and segregate all output log data after the Times asked for the data to be preserved.

"We will fight any demand that compromises our users' privacy; this is a core principle," OpenAI CEO Sam Altman said in a post on X on Thursday. "We think this (The Times demand) was an inappropriate request that sets a bad precedent."

U.S. District Judge Sidney Stein was asked to vacate the May data preservation order on June 3, a court filing showed. The New York Times did not immediately respond to a request for comment outside regular business hours.

The newspaper sued OpenAI and Microsoft in 2023, accusing them of using millions of its articles without permission to train the large language model behind its popular chatbot. Stein said in an April court opinion that the Times had made a case that OpenAI and Microsoft were responsible for inducing users to infringe its copyrights. The opinion explained an earlier order that rejected parts of an OpenAI and Microsoft motion to dismiss, saying that the Times' "numerous" and "widely publicised" examples of ChatGPT producing material from its articles justified allowing the claims to continue.

OpenAI appeals data preservation order in NYT copyright case

Business Recorder

06-06-2025

OpenAI is appealing an order in a copyright case brought by the New York Times that requires it to preserve ChatGPT output data indefinitely, arguing that the order conflicts with privacy commitments it has made with users. Last month, a court said OpenAI had to preserve and segregate all output log data after the Times asked for the data to be preserved.

'We will fight any demand that compromises our users' privacy; this is a core principle,' OpenAI CEO Sam Altman said in a post on X on Thursday. 'We think this (The Times demand) was an inappropriate request that sets a bad precedent.'

U.S. District Judge Sidney Stein was asked to vacate the May data preservation order on June 3, a court filing showed. The New York Times did not immediately respond to a request for comment outside regular business hours.

The newspaper sued OpenAI and Microsoft in 2023, accusing them of using millions of its articles without permission to train the large language model behind its popular chatbot. Stein said in an April court opinion that the Times had made a case that OpenAI and Microsoft were responsible for inducing users to infringe its copyrights. The opinion explained an earlier order that rejected parts of an OpenAI and Microsoft motion to dismiss, saying that the Times' 'numerous' and 'widely publicized' examples of ChatGPT producing material from its articles justified allowing the claims to continue.

OpenAI appeals data preservation order in NYT copyright case

The Hindu

06-06-2025

OpenAI is appealing an order in a copyright case brought by the New York Times that requires it to preserve ChatGPT output data indefinitely, arguing that the order conflicts with privacy commitments it has made with users. Last month, a court said OpenAI had to preserve and segregate all output log data after the Times asked for the data to be preserved.

"We will fight any demand that compromises our users' privacy; this is a core principle," OpenAI CEO Sam Altman said in a post on X on Thursday. "We think this (The Times demand) was an inappropriate request that sets a bad precedent."

U.S. District Judge Sidney Stein was asked to vacate the May data preservation order on June 3, a court filing showed. The New York Times did not immediately respond to a request for comment outside regular business hours.

The newspaper sued OpenAI and Microsoft in 2023, accusing them of using millions of its articles without permission to train the large language model behind its popular chatbot. Stein said in an April court opinion that the Times had made a case that OpenAI and Microsoft were responsible for inducing users to infringe its copyrights. The opinion explained an earlier order that rejected parts of an OpenAI and Microsoft motion to dismiss, saying that the Times' "numerous" and "widely publicized" examples of ChatGPT producing material from its articles justified allowing the claims to continue.
