Dead Victim’s AI Deepfake Brought In To Speak At Sentencing… Oh HELL No!

Somehow this is NOT a law professor's most insane hypo.


This week in rejected Black Mirror plots, we had a deceased manslaughter victim reanimated as an AI-generated video to speak at his killer’s sentencing. In a victim impact statement hitherto unprecedented because, you know, THE LAWS OF NATURE, Christopher Pelkey’s likeness and voice presented a script in the first person that his family prepared.

Nope. Nope. Absolutely not! Delete this timeline.

Pelkey, killed in a 2021 road rage incident, said, “I believe in forgiveness, and a God who forgives. I always have, and I still do.” Or at least his deepfake said that, inventing an entirely new category of hearsay to frustrate future law students. Welcome to a world where witness statements are not only made out-of-court, but never actually made by the witness at all.

We’re not there yet. This impact statement wasn’t presented to an Arizona jury charged with determining guilt, and judges would still draw the line at ventriloquism. Presumably! But sentencing isn’t exactly trivial. It’s the part where someone gets locked up, and it deserves to be just as free from any whiff of prejudice as the rest of the trial.

Judges give themselves a lot more credit for being immune to prejudicial testimony than they probably deserve. After all, they’re human and not robots… like the witnesses, apparently. Having watched this presentation, the judge imposed a sentence one year longer than the prosecutors asked for. Was that entirely because the judge watched a dead man deliver an emotional appeal from beyond the grave? Maybe not, but it certainly looks that way. And, honestly, any judge who greenlights something like this has to know that the eventual sentence will be forever viewed through the lens of bringing an AI ghost directly into the proceedings.

We just watched a court drop the hammer on an effort to pipe oral argument through an AI-generated lawyer. And that was an appellate argument! If an appellate panel felt unsettled by the prospect of listening to a latter-day Max Headroom explain reinsurance, a trial judge should be fully freaked out at the prospect of watching a dead man speak at sentencing.

I understand the emotional appeal: the family’s interest in symbolically giving their loved one the last word and channeling their grief into what they imagine Pelkey himself would’ve said. But the courtroom should not be a séance, and it sure as hell should not be open-mic night for deepfakes. Victim impact statements are already fraught territory: deeply emotional, sometimes inflammatory, often pushing the limits of what should influence a sentence. But at least hearing from the family, speaking from their perspective as the family, is grounded in reality. Filtering the family’s words through an ersatz victim is less a slippery slope than a greased luge.

Which is where judges should step in and gracefully tell the family no. That’s hard to do after the loss they’ve suffered, but that’s why we have judges: to make the uncomfortable but necessary calls.

The legal system doesn’t need more spectacle dressed up in emotional pixels.

AI of dead Arizona road rage victim addresses killer in court [The Guardian]
