Looking at the downvotes, remember upvoting an article ≠ an endorsement of the shitty technology being discussed in the article.
We shit on the technology in the comments, and upvote it so more of us can read about it and shit on it.
If I am murdered please don’t do this. I do not care if you feel like it will help you process the events
That should never be allowed in court. What a crock of shit.
It was a victim impact statement, not subject to the rules of evidence. The shooter had already been found guilty, and this was an impact statement from the victim’s sister, meant to sway how the shooter should be sentenced. The victims’ bill of rights says that victims should be allowed to choose the method by which they make an impact statement, and his sister chose the AI video.
I agree that it shouldn’t be admissible as evidence. But that’s not really what’s being discussed here, because it wasn’t being used as evidence. The shooter was already found guilty.
It sounds like it was played after the sentence was handed down? Would be kind of sketchy if not.
This was played before sentencing. It doesn’t say it here, but the article I read earlier today stated that because of this video, the judge issued a sentence greater than the maximum recommended by the State. If true, then it really calls into question the sentence itself and how impartial the judge was.
It appears this was a victim impact statement.
A victim impact statement is a written or oral statement made as part of the judicial legal process, which allows crime victims the opportunity to speak during the sentencing of the convicted person or at subsequent parole hearings.
From the article (emphasis mine):
But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey’s case.
"Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited," she told NPR via email.
Demon technology. Did we learn nothing from Doom 2016?
- Woman’s brother was killed in a road rage incident
- In preparing her victim impact statement for the court, she struggled to find a way to properly represent her brother’s voice
- Her husband works with AI and helped her generate a video of her brother for the victim impact statement
- The video was very well received and apparently true to her brother’s personality. Though she didn’t forgive the killer, she knew her brother would. So, in the AI video, “he” did.
- After all the real people made their statements to the judge, the video was played
- The judge loved it and thanked the woman
In preparing her victim impact statement for the court, she struggled to find a way to properly represent her brother’s voice
Should clarify that the woman wrote the script. The AI just generated the voice and image. The AI read the script the woman wrote, which she composed in her brother’s tone, setting aside her own feelings.
Appreciated – my apologies that I wasn’t clear. I was curious about the connection to “did we learn nothing from Doom 2016” that the OP referenced.
For this? The guy who was brought back through AI was killed in a road rage incident, then they brought the AI version of him to court to give a statement from beyond the grave, of sorts. I think it’s immoral as fuck, but I’m sure I’ll get told why it’s actually not.
I was wondering what happened in “Doom 2016”. And now I can’t tell if you’re summarizing the article or what happened in Doom 2016.
Technology isn’t inherently good or evil. It entirely depends on the person using it. In this case, it had a very positive impact on everybody involved.
To me this is the equivalent of taxidermying a person then using them as a puppet. Sure it might have a positive impact on some people but it’s immoral at best.
This. I don’t see how it’s any different from making an ‘AI video’ of a murder victim thanking his murderer for easing his pain, in order to ‘make people feel better’ after a rich perpetrator games the system and is acquitted via dubious means. It’s blatant manipulation.
What makes it immoral? Nobody was hurt in any way: physically, emotionally, or financially. They disclosed the use of AI before showing the video. It even helped the perpetrator get a smaller sentence (IMO prison as a concept is inhumane, so less prison time is morally right).