Back in the 1970s, a television commercial featured jazz singer Ella Fitzgerald, a wine glass, a recording studio, and a recordable audio cassette made by a company called Memorex. The pitch was that a recording of Ella’s voice could break the wine glass, just as her live voice could. The tagline was, “Is it live, or is it Memorex?”
Courts and the Deep Fake Problem
Courts may soon face a similar question when it comes to audio recordings, photos, videos, and other forms of digital evidence: is it real, or is it fake?
Indeed, amid the excitement about generative AI and its potential benefits for legal professionals, a critical risk to our justice system is often overlooked: the threat posed by deep fakes. Deep fakes are artificial replications of images, voices, and other media that appear genuine but are not. Although false, they pose a significant problem.
Consider voice cloning. One example is the recent fraudulent robocall that used a cloned voice resembling Joe Biden’s to ask people not to vote in the New Hampshire primary. The voice sounded just like the president’s, but it wasn’t. Such voice cloning presents serious authentication risks in courtrooms.
Similarly, the realm of fake photos and videos is evolving. Altering photos isn’t new, but today’s technology is both more accessible and more sophisticated. It’s now easy to remove a person from a photo or add someone into a scene they were never in. Spotting fake videos is increasingly challenging, as I have discussed before.
And it goes beyond that. Recall the controversy at the Kyle Rittenhouse murder trial in Wisconsin. In that case, photographic evidence, enlarged using pinch-to-zoom tools, was deemed inadmissible. The media went on a feeding frenzy with the notion