How to Spot Deepfakes and What to Do if You Become a Victim


Depending on how they are used, deepfakes can provide a lot of laughter and joy to audiences worldwide. But more often than not, they are a tool for misinformation, fake news, sexual harassment, and even terror. Here is your ultimate guide to deepfakes: what they are, how you can spot them, and what you can do if you find yourself a victim.

What are deepfakes?

Deepfakes are edited audio recordings or videos that sound and look real. The word “deepfake” is a portmanteau of “deep learning” and “fake,” and the technique amounts to a more sophisticated form of Photoshopping: existing media, such as a video or an image, is edited to use another person’s image or likeness, usually without that person’s consent. Photo manipulation dates back to the 19th century and has only grown in sophistication since, eventually becoming the technology we see today. In the past, computer graphics of this kind were used only by experts working on films, TV shows, and other entertainment media, but now anyone with access to the internet can create their own deepfakes.

Why are they harmful?

At best, they can be a way for people to find laughter and joy during mundane days. Meme culture has become part of the zeitgeist in recent years, giving people a way to smile and forget about the devastating headlines we see daily. At worst, deepfakes can be a weapon for war, a powerful tool for fake news and misinformation, and another way to further objectify, degrade, and sexually harass women (and men).

Being able to spot a deepfake is important because otherwise we might reach a point where we no longer know what’s real and what’s not, making us more vulnerable to gaslighting both as individuals and as a collective. Without the right tools and technology to catch deepfakes, we as a society may end up distrusting everything we see and hear in the media.


Tips for spotting a deepfake video

Detecting deepfakes can be a challenge, especially since they continue to grow more refined over time. But there are telltale signs; you only need to look closely and arm yourself with the right tools and information. Here are some tips for spotting a deepfake video:

  • A 2018 study found that faces in deepfake videos don’t blink normally. That makes sense: people usually have their eyes open in photographs, so the algorithms trained on those photos don’t learn natural blinking. If a person isn’t blinking normally in a video, there’s a chance it could be a deepfake (see the sketch after this list for one way to check this automatically).
  • If a deepfake is poor in quality, it’s much easier to spot. Check for patchy skin tones and flickering around the edges of the superimposed face, and look closely at the details. Fine details such as hair are difficult to render well for anyone who isn’t an expert animator; badly rendered teeth and jewelry are other small giveaways to watch for.
  • Inconsistent lighting can also be a dead giveaway. As with Photoshopped images, if the lighting in the source footage doesn’t match the lighting in the target video, the deepfake will struggle to look realistic and seamless.
  • In late 2019, a project called the Deepfake Detection Challenge was launched to encourage research teams, corporations, and individuals across the globe to build software that can accurately detect a deepfake. The project was backed by tech giants Microsoft, Facebook, and Amazon. If the best experts in the world can work together on a foolproof way to detect these videos, it will be a win for humanity as a whole.
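
To make the blink-rate tip above concrete, here is a minimal sketch of how blink frequency could be estimated automatically. It is not any official detection tool: it assumes the `opencv-python` and `mediapipe` packages, uses a simple eye-aspect-ratio heuristic, and the landmark indices, the 0.18 threshold, and the file name `clip.mp4` are illustrative assumptions that would need tuning on real footage.

```python
# A rough sketch, not a production detector: estimates blinks per minute in a clip
# using MediaPipe Face Mesh and an eye-aspect-ratio heuristic. The landmark indices
# and the 0.18 threshold are illustrative assumptions and may need tuning.
import cv2
import mediapipe as mp

# Upper/lower eyelid and eye-corner landmark indices for one eye in MediaPipe Face Mesh.
EYE = {"top": 159, "bottom": 145, "left": 33, "right": 133}

def eye_aspect_ratio(landmarks, width, height):
    """Eye height divided by eye width, in pixels; small values mean a closed eye."""
    top, bottom = landmarks[EYE["top"]], landmarks[EYE["bottom"]]
    left, right = landmarks[EYE["left"]], landmarks[EYE["right"]]
    eye_height = abs(top.y - bottom.y) * height
    eye_width = abs(left.x - right.x) * width
    return eye_height / eye_width if eye_width else 0.0

def blinks_per_minute(video_path, closed_threshold=0.18):
    """Count open-to-closed eye transitions in a video and convert to a per-minute rate."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    blinks, eye_open, frames = 0, True, 0
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frames += 1
            h, w = frame.shape[:2]
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if not result.multi_face_landmarks:
                continue  # no face detected in this frame
            ear = eye_aspect_ratio(result.multi_face_landmarks[0].landmark, w, h)
            if ear < closed_threshold and eye_open:
                blinks += 1       # the eye just closed: count one blink
                eye_open = False
            elif ear >= closed_threshold:
                eye_open = True
    cap.release()
    minutes = frames / fps / 60
    return blinks / minutes if minutes else 0.0

if __name__ == "__main__":
    # Adults typically blink roughly 15-20 times per minute on camera;
    # a dramatically lower rate is one red flag worth a closer look.
    print(f"Estimated blink rate: {blinks_per_minute('clip.mp4'):.1f} per minute")
```

Keep in mind that blink rate is only one signal among many; newer deepfakes can blink quite convincingly, so treat an odd result as a prompt to look closer, not as proof.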

What to do if someone uses your likeness without your informed consent

So what are your options if and when you find a video on the internet superimposing your image and likeness onto somebody else? Fortunately, the Malicious Deep Fake Prohibition Act was proposed in 2018, and the DEEPFAKES Accountability Act followed the next year. These proposed acts would give victims of explicit deepfake content a cause of action against the creator of the content, giving them a way to act on what was done to them.

Consult a reputable lawyer about your options, because there are several. Depending on the circumstances, you may be able to have the video taken down on copyright or defamation grounds, among other legal theories. Talk to your lawyer about what you can do, and hope that the law catches up with the speed at which deepfake technology continues to advance.
