We all take for granted what Photoshop can do: transform people and scenes into impossible beauty. We have even become skeptical of perfection, dismissing images as Photoshopped. Some Photoshop use is blatant, and we can afford to roll our eyes; marketing has always been a bit fake anyway. But a new generation of reality doctors can use simple as well as complex tools to alter video, and video is a format we are much quicker to believe.
Anyone can modify a photo to change the composition or visual elements. Look at Instagram’s filters. We use our creative license to intensify moments as we record them, and this behavior has become expected, not just widespread.
Doctoring a photo to show something that isn’t there is harder. Inserting elements – or particularly people – into scenes requires a higher level of mastery of digital art tools. Because convincing fake photos are hard to make, we are more inclined to believe the photos we see. A sunset that looks particularly intense – easy to spot the filter. A group of people sitting around a table featuring a celebrity who was never there – much harder to spot the fake.
Higher still is video. Realistically doctoring video requires an even rarer skillset than doctoring photos. When it’s well done, it’s nearly impossible for anyone to realize that the video depicts something that never happened. Combine that with the overload of information and the split seconds we spend watching videos, and there is near-unlimited opportunity to shift what other people believe to be real.
Doctoring Politicians to Fit a Narrative
In the era of fake news, this is a dangerous situation. Politicians have been sound-biting each other in political attack ads for decades, taking phrases out of context to make opponents appear less vote-worthy to certain audiences. But until recently, if you wanted to falsify a video of a politician, you had to be a special effects expert.
Today, you no longer need Adobe After Effects to create falsified videos.
Would you believe me if I told you that Nancy Pelosi, the American Speaker of the House, gave a speech while visibly drunk? For anyone who didn’t vote for Trump, this concept is unfathomable. There is simply no way she would ever do something like that.
Yet this video depicts her slurring her speech and grinning like an idiot. And millions of Americans have watched it and believe that the third in line to the President – and the top Democrat – is a no-good drunk.
Is the video made by Marvel Studios? Is it drawn from Google’s AI bot that animates portraits to make them talk?
No, some dude just slowed the video down a little bit.
And millions of people who hate Democrats believe that she is really that type of person.
This is called a shallowfake: a slightly doctored image, video, or sound bite that subtly distorts reality.
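To appreciate how low the bar is, consider that the slowed-speech trick takes a single command with an off-the-shelf tool. A minimal sketch, assuming ffmpeg is installed and `input.mp4` is a placeholder clip with one video and one audio stream:

```shell
# Slow both picture and sound to 75% speed.
# setpts=PTS/0.75 stretches the video timestamps;
# atempo=0.75 slows the audio to match, keeping the pitch natural.
ffmpeg -i input.mp4 -filter_complex \
  "[0:v]setpts=PTS/0.75[v];[0:a]atempo=0.75[a]" \
  -map "[v]" -map "[a]" slowed.mp4
```

That is the entire edit behind this class of fake: no AI, no effects suite, which is exactly why shallowfakes scale so easily.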
Fake news means things like articles claiming that Hillary Clinton sprinkles fetuses on her kale and quinoa salad for lunch. Shallowfakes are much subtler and harder to detect because they distort a reality that does exist. And in today’s information overload, no one has the time to check each and every piece of content they come across for veracity.
Sure, someone can come out and show the real video. That would disprove the fake version. But the scary thing is that the people who believe Pelosi is a drunk don’t want to know the truth. They want information that fits the way they want the world to be. As shallowfakes pile up, they will continue to cite these falsifications as sources of truth.
And if Trump is any indication, he will soon have another shield to deflect the reality he is trying to hide. Anything that comes out that doesn’t go his way: it’s not just fake news, it’s fake content.
The onus of proof now rests upon the masses, and what’s clear is that the masses do not have the means to make balanced decisions and dig further to find truth. Hear an unpopular opinion on Facebook? Unfollow that person! See something that catches your fancy? Go see what else that person has posted.
Social media was meant to connect people, and what it’s doing now is connecting similar types of people while creating deeper divisions between others. With moderation at this scale all but impossible, there is no easy way to confront this problem.
Are politics screwed? I tend to think not. In the absence of a major war or disaster to unite a nation, people become more and more sensitive to smaller and smaller things. This is good overall, since it lets us address more and more issues, yet it makes it harder to see the common values that underpin a nation when all we see is our differences. You could argue that Trump’s election was won on false information, but it was really won by complacency: had every millennial in the US voted, the map would arguably have looked very different. Maybe fake news is the thing that can motivate everyone to vote.
But what about you?
Ah yes, politics is one thing. But could you fall victim to fake content? What gets truly frightening are deepfakes.
Imagine someone takes a picture of you from the Internet, like your profile picture from LinkedIn – which is widely available to everyone in the world. They take that photo and animate it into a video.
Sound like science fiction? Nope, that technology already exists.
Then take that animated video and apply it to any sort of staged footage. They could even use voice-alteration software, trained on a video you posted on Facebook wishing your mom a Happy Mother’s Day, to make you say anything they like. The scenarios start to get very scary.
Imagine they make it look like you are talking to coworkers in the breakroom, in footage framed as captured from a hidden cellphone (so the quality will be lower and the fake harder to spot).
Now imagine what they can do with it: maybe the video is set up to look like you are pinching a female coworker’s ass. Or imagine them faking your voice talking about how hard you would bang your intern if you had the chance.
This video ends up with HR, who are jumpy right now in the wake of the #MeToo movement. They can’t afford to take any chances with a manager like you. How fast do you think you’ll be fired? How well could you argue against video proof?
It’s already hard enough to prove things in those types of situations. Since companies have a moral duty to protect their employees from any form of sexual harassment, a video of sexual harassment should be grounds for immediate dismissal. But if that video is fake, how well can an innocent person protect themselves?
Women have had to deal with this already. Revenge porn is when shitheads post sex videos with their exes online. It is one of the worst possible things anyone can do to anyone else. But what do you do if you’re a shithead and you don’t have a sex tape with your ex but still want to ruin their life? Why not make one? Find a porn video with someone else and add in your ex’s head. Watch as their entire life falls apart.
Believe it or not, we’re still only at the tip of the iceberg. With facial recognition software, people could start to doctor your face into different places to get you onto federal watchlists. They could add your image to crime scenes, then watch the alarms go off when it turns out you were present at a number of terrorist attacks. Hopefully you could defend yourself in court, but not before the SWAT team has a few things to say to you.
For women who are victims of fake revenge porn, there is no recourse in the court of public opinion. Deepfakes present a challenge to our society unlike anything we’ve ever seen before.
The point I’m trying to make is that the ability to alter content is becoming more and more mainstream, and so the possibility of misuse is multiplying. We are living in a time of unprecedented technological acceleration, and it is opening deep rifts in the fabric of society. Every action has an equal and opposite reaction, but since we are moving so fast, we don’t always see the reaction until it’s too late.
We will need to learn to disconnect what we see online from reality, or we will succumb to whatever narratives others want us to believe.