The owner of NotJordanPeterson.com, a website that used AI to generate convincing clips of Jordan Peterson saying whatever you want, shut down their creation this week after the real Peterson announced his displeasure and raised the possibility of legal action.

While the site was up, a 21-second recording greeted visitors, saying in Peterson's voice, "This is not Jordan Peterson. In fact, I'm a neural network designed to sound like Dr. Peterson." The clip implored the visitor to type some text into a box; that text was fed into a neural network trained on hours of Peterson's actual voice and turned into audio that sounded a lot like the real thing.
Several media outlets tested the program and published the results, making him recite feminist texts and vulgarities. Aside from the outrageous content, the results sounded a lot like the real thing.

It turns out that Peterson, a controversial Canadian professor known for his lectures defending the patriarchy and denying the existence of white privilege while decrying "postmodern neo-Marxists," did not find NotJordanPeterson.com flattering.

"Something very strange and disturbing happened to me this week," Peterson wrote on his website. "If it was just relevant to me, it wouldn't be that important (except perhaps to me), and I wouldn't be writing this column about it. But it's something that is likely more important and more ominous than we can even imagine."

He then goes on to spend over 1,300 words decrying deepfakes (algorithmically generated face-swapped videos, not fake audio, though they are sometimes combined with fake voices) as a threat to politics, personal privacy, and the veracity of evidence, and ends with a vague allusion to making fake audio and video illegal, or possibly suing creators.
"Wake up. The sanctity of your voice, and your image, is at serious risk," he wrote. "It’s hard to imagine a more serious challenge to the sense of shared, reliable reality that keeps us linked together in relative peace. The Deep Fake artists need to be stopped, using whatever legal means are necessary, as soon as possible."
Peterson mentions Rep. Yvette Clarke's proposed DEEPFAKES Accountability Act as a potential solution to his embarrassment, and to what he sees as the dangers of deepfakes as a whole. The Electronic Frontier Foundation notes that in that bill, "while there is an exception for parodies, satires, and entertainment—so long as a reasonable person would not mistake the 'falsified material activity' as authentic—the bill fails to specify who has the burden of proof, which could lead to a chilling effect for creators."

As a big fan of free speech, Peterson of all people should be wary of suggesting we sue the pants off anyone who makes an unflattering mimicry of us online. If he really wants to combat the real dangers of deepfakes, he could start by advocating for stronger legislation to help victims of revenge porn and non-consensual nudes. Those are the people who are really impacted by harassment and intimidation online.