In a landmark moment for the American justice system, the family of Christopher Pelkey used artificial intelligence to allow him to “speak” posthumously at the sentencing of the man convicted of killing him during a 2021 road rage incident in Arizona.
Pelkey, a 37-year-old US Army veteran who had served three tours in Iraq and Afghanistan, was fatally shot by Gabriel Paul Horcasitas while both were stopped at a red light in Chandler, Arizona.
According to CBS News, Pelkey was walking toward Horcasitas’ car when he was shot in the chest.
Last week, Horcasitas was sentenced to 10.5 years in prison for manslaughter.
During the sentencing, a nearly four-minute AI-generated video was played in court, showing a digital recreation of Pelkey delivering a victim impact statement. The avatar, created using a single photograph and audio from a YouTube video in which Pelkey discussed PTSD, greeted the court with a disclaimer: “I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile.”
“It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends,” said the AI Pelkey, according to The New York Times. “I believe in forgiveness and in God, who forgives. I always have and I still do,” the AI avatar added.
The video, written by Pelkey’s sister Stacey Wales, aimed to reflect his forgiving nature.
She was quoted by the BBC saying, “We approached this with ethics and morals because this is a powerful tool... like a hammer, it can build or destroy. We used it to build.”
The AI rendering struck a chord with Judge Todd Lang of the Maricopa County Superior Court.
“I loved that AI,” he said, as quoted by BBC News. “And as angry as you are, and justifiably angry as the family is, I heard the forgiveness. And I know Mr. Horcasitas appreciated it, but so did I,” Lang added.
However, the use of AI in such a personal and emotional court proceeding has raised questions among legal experts. Gary Marchant, a law professor and member of Arizona’s AI committee, was quoted by CBS News as saying that there’s concern that “deepfake evidence” might influence judges and juries. “It’s easy to create, and anyone can do it on a phone,” he warned.
Despite these concerns, the AI video was allowed because Arizona law permits victim impact statements in any digital format, explained victims’ rights attorney Jessica Gattuso, as per news agency AP. The video was also supported by nearly 50 letters submitted by family and friends that echoed its message.
Horcasitas’ attorney, Jason Lamm, has filed an appeal, suggesting that the judge may have improperly relied on the AI video in sentencing. “However, this may be a situation where they just took it too far,” Lamm was quoted as saying by The New York Times.
While the AI avatar was used only at sentencing and not during either of the two trials (a disclosure error in the first forced a retrial), the incident has prompted broader debate about AI’s place in the courtroom.
Cynthia Godsoe, a Brooklyn Law School professor, was quoted by The Times as saying that such technology can “inflame emotions more than pictures,” warning courts to tread carefully.
But others see potential. As Maura R. Grossman of the American Bar Association’s AI task force noted, “There’s no jury that can be unduly influenced,” and therefore, she did not find it “ethically or legally troubling.”