Latest news with #StaceyWales

Justice at Stake as Generative AI Enters the Courtroom

Asharq Al-Awsat

04-07-2025

Generative artificial intelligence (GenAI) is making its way into courts despite early stumbles, raising questions about how it will influence the legal system and justice itself. Judges use the technology for research, lawyers utilize it for appeals and parties involved in cases have relied on GenAI to help express themselves in court.

"It's probably used more than people expect," said Daniel Linna, a professor at the Northwestern Pritzker School of Law, about GenAI in the US legal system. "Judges don't necessarily raise their hand and talk about this to a whole room of judges, but I have people who come to me afterward and say they are experimenting with it."

In one prominent instance, GenAI enabled murder victim Chris Pelkey to address an Arizona courtroom -- in the form of a video avatar -- at the sentencing of the man convicted of shooting him dead in 2021 during a clash between motorists. "I believe in forgiveness," said a digital proxy of Pelkey created by his sister, Stacey Wales. The judge voiced appreciation for the avatar, saying it seemed authentic. "I knew it would be powerful," Wales said, "that it would humanize Chris in the eyes of the judge." The AI testimony, a first of its kind, ended the sentencing hearing at which Wales and other members of the slain man's family spoke about the impact of the loss.

Since the hearing, examples of GenAI being used in US legal cases have multiplied. "It is a helpful tool and it is time-saving, as long as the accuracy is confirmed," said attorney Stephen Schwartz, who practices in the northeastern state of Maine. "Overall, it's a positive development in jurisprudence." Schwartz described using ChatGPT as well as GenAI legal assistants, such as LexisNexis Protege and CoCounsel from Thomson Reuters, for researching case law and other tasks. "You can't completely rely on it," Schwartz cautioned, recommending that cases proffered by GenAI be read to ensure accuracy. "We are all aware of a horror story where AI comes up with mixed-up case things."

The technology has been the culprit behind false legal citations, far-fetched case precedents, and flat-out fabrications. In early May, a federal judge in Los Angeles imposed $31,100 in fines and damages on two law firms for an error-riddled petition drafted with the help of GenAI, blasting it as a "collective debacle." The tech is also being relied on by some who skip lawyers and represent themselves in court, often causing legal errors. And as GenAI makes it easier and cheaper to draft legal complaints, courts already overburdened by caseloads could see them climb higher, said Shay Cleary of the National Center for State Courts. "Courts need to be prepared to handle that," Cleary said.

Transformation

Law professor Linna sees the potential for GenAI to be part of the solution, though, giving more people the ability to seek justice in courts made more efficient. "We have a huge number of people who don't have access to legal services," Linna said. "These tools can be transformative; of course we need to be thoughtful about how we integrate them." Federal judges in the US capital have written decisions noting their use of ChatGPT in laying out their opinions. "Judges need to be technologically up-to-date and trained in AI," Linna said. GenAI assistants already have the potential to influence the outcome of cases the same way a human law clerk might, reasoned the professor.
Facts or case law pointed out by GenAI might sway a judge's decision and could differ from what a human law clerk would have come up with. But if GenAI lives up to its potential and excels at finding the best information for judges to consider, that could make for well-grounded rulings less likely to be overturned on appeal, according to Linna.

Late Arizona man forgives killer during trial in AI-generated video

Yahoo

11-05-2025

(NewsNation) — Arizona man Christopher Pelkey's voice was taken from him forever when a man fatally shot him during a road rage incident in November 2021. But with the help of artificial intelligence and his family, the Army veteran left parting words for his killer.

'In another life we probably could have been friends,' a replica of Pelkey said in the AI-generated video played earlier this month in a Phoenix courtroom. This victim recreation is believed to be the first time AI has been used for a victim impact statement during a trial. The statements are a chance for victims and families to say their piece, but sometimes the victims aren't alive to do so.

'To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,' the video said. 'I believe in forgiveness and in God, who forgives. I always have and I still do.'

The victim's sister, Stacey Wales, said she struggled to find the right words to say in a victim impact statement, so she turned to AI. 'I turned to my husband one night and I asked him, I said, "Tim, I want you to help me have Chris make his own impact statement at sentencing next week. I know you can do it. I've seen your work,"' Wales told NewsNation. 'And he says, "Stacey, do you know what you're asking me? This is my best friend."'

The couple works in tech, and they had just a few days to produce the four-and-a-half-minute video using photos and voice recordings, with a script of what they believed Pelkey would have said. The result was realistic. The man charged with shooting Pelkey to death could be seen wiping away tears at the hearing. Horcasitas, 54, was convicted of manslaughter and sentenced to more than 10 years in prison.

'I love that AI. Thank you for that,' Judge Todd Lang said at the hearing. 'And as angry as you are, and justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas could appreciate it, but so did I.'

Within hours of the hearing, the defense filed a notice of appeal, pointing to the impact AI may have had on the judge's sentencing decision. Although AI has been used before in legal research and case preparation, using it to deliver a deceased person's victim impact statement is unprecedented.

From AI avatars to virtual reality crime scenes, courts are grappling with AI in the justice system

The Independent

09-05-2025

Stacey Wales gripped the lectern, choking back tears as she asked the judge to give the man who shot and killed her brother the maximum possible sentence for manslaughter. What appeared next stunned those in the Phoenix courtroom last week: An AI-generated video with a likeness of her brother, Christopher Pelkey, told the shooter he was forgiven.

The judge said he loved and appreciated the video, then sentenced the shooter to 10.5 years in prison — the maximum sentence and more than what prosecutors sought. Within hours of the hearing on May 1, the defendant's lawyer filed a notice of appeal. Defense attorney Jason Lamm won't be handling the appeal, but said a higher court will likely be asked to weigh in on whether the judge improperly relied on the AI-generated video when sentencing his client.

Courts across the country have been grappling with how to best handle the increasing presence of artificial intelligence in the courtroom. Even before Pelkey's family used AI to give him a voice for the victim impact portion — believed to be a first in U.S. courts — the Arizona Supreme Court created a committee that researches best AI practices. In Florida, a judge recently donned a virtual reality headset meant to show the point of view of a defendant who said he was acting in self-defense when he waved a loaded gun at wedding guests. The judge rejected his claim. And in New York, a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen wasn't real.

Experts say using AI in courtrooms raises legal and ethical concerns, especially if it's used effectively to sway a judge or jury. And they argue it could have a disproportionate impact on marginalized communities facing prosecution. 'I imagine that will be a contested form of evidence, in part because it could be something that advantages parties that have more resources over parties that don't,' said David Evan Harris, an expert on AI deep fakes at UC Berkeley's business school. AI can be very persuasive, Harris said, and scholars are studying the intersection of the technology and manipulation tactics.

Cynthia Godsoe, a law professor at Brooklyn Law School and a former public defender, said as this technology continues to push the boundaries of traditional legal practices, courts will have to confront questions they have never before had to weigh: Does this AI photograph really match the witness's testimony? Does this video exaggerate the suspect's height, weight, or skin color? 'It's definitely a disturbing trend,' she said, 'because it could veer even more into fake evidence that maybe people don't figure out is false.'

In the Arizona case, the victim's sister told The Associated Press that she did consider the 'ethics and morals' of writing a script and using her brother's likeness to give him a voice during the sentencing hearing. 'It was important to us to approach this with ethics and morals and to not use it to say things that Chris wouldn't say or believe,' Stacey Wales said. Victims can give their impact statements in any digital format in Arizona, said victims' rights attorney Jessica Gattuso, who represented the family. When the video played in the courtroom, Wales said, only she and her husband knew about it. 'The goal was to humanize Chris and to reach the judge,' Wales said.

After viewing it, Maricopa County Superior Court Judge Todd Lang said he 'loved the beauty in what Christopher' said in the AI video. 'It also says something about the family,' he said. 'Because you told me how angry you were, and you demanded the maximum sentence, and even though that's what you wanted, you allowed Chris to speak from his heart as you saw it.' On appeal, the defendant's lawyer said, the judge's comments could be a factor for the sentence to be overturned.

___

Associated Press reporters Sarah Parvini in Los Angeles, Sejal Govindarao in Phoenix and Kate Payne in Tallahassee, Florida, contributed to this report.
