The Age of Deepfakes


Sometime around the turn of the millennium, the world changed forever. No, I don't mean NSYNC dropping "No Strings Attached" or the launch of Myspace. What I'm referring to is the proliferation of Photoshop and convincing fake images. While it's been possible to create realistic visual fabrications for much longer than that, it used to be an expensive, labor-intensive process undertaken only for serious business like moon landings and disappearing folks who'd fallen out of favor with the Kremlin. The change in era was marked by a change in vocabulary: what used to get "doctored" now gets "Photoshopped" instead. Since then, we've seen another shift, and now we live in the age of the "deepfake".

This week, a controversial new app made headlines and was quickly taken offline by its creators. DeepNude promised to turn any picture of a clothed female into one of a naked female, and, apparently, it worked pretty well. The team behind DeepNude stopped distributing the software shortly after it went mainstream, citing that they had underestimated both its popularity and its potential for abuse. Hmmm, let's see, an app that plays to all of the worst impulses of perverts, predators, and angry exes... what could possibly go wrong? Needless to say, the bros who created DeepNude are in need of some serious mansplaining about the already troubled gender dynamics that make releasing such software a terrible idea, but the issue goes much deeper than made-to-order porn pics and public humiliation.

So, what exactly are deepfakes? The word, freshly coined on Reddit in 2017, is a portmanteau of "deep learning" and "fake," referring to the use of machine learning to create doctored images and videos. What it means is that creating convincing altered media is easier than ever. And it's being used for all sorts of fun, like pasting Nic Cage's face onto, well, just about everything, and showing Mark Zuckerberg telling the truth about his evil empire and its connections to Spectre. Laughs aside, the technology is also creating a kind of existential crisis around how we determine the truthiness of what we're looking at. In other words, we're now firmly planted in a world where seeing is, well, maybe not all it was once cracked up to be.
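
For the curious, here's a rough sketch of the trick behind the Reddit-era face swaps: a single shared encoder learns a generic "face code," and a separate decoder is trained for each person; swapping is just encoding person A and decoding with person B's decoder. This is only an illustrative toy, not DeepNude's or any real project's code, and every layer size and name below is a placeholder.

```python
# Toy version of the classic face-swap deepfake architecture:
# one shared encoder, one decoder per identity. All sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # shared "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()   # would be trained to reconstruct person A's face crops
decoder_b = Decoder()   # would be trained to reconstruct person B's face crops

# After training (not shown), the "swap" is just mixing the pieces:
# encode a photo of person A, then decode it with person B's decoder.
face_of_a = torch.rand(1, 3, 64, 64)     # stand-in for an aligned 64x64 face crop
swapped = decoder_b(encoder(face_of_a))  # A's pose and expression, B's face
print(swapped.shape)                     # torch.Size([1, 3, 64, 64])
```

Because both decoders learn to reconstruct faces from the same shared code, the code ends up capturing pose, lighting, and expression rather than identity, which is what makes the swap look eerily natural.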

As some commentators have pointed out, better algorithms for detecting fake media are not going to save us. Experts expect detection to only get harder from here, and it may eventually become impossible. Regardless, the best tools for disarming deepfakes have been around for a long time and probably aren't going anywhere: a critical perspective and thinking for yourself. Perhaps the deepfake scandals are actually helping humanity learn an important lesson: no single source or piece of media is an absolute, uncorrupted image of the truth. In fact, our stories and our history have been subject to adjustment since the beginning of time. Was the original deepfake a cave painting depicting a scandalous prehistoric love affair that never happened? While new technology can sometimes set the record straight, this is just the latest reminder that media, no matter how advanced, tells a story, but not the whole story.