Tips for Catching Deepfakes in Evidence
Deepfakes use “deep learning,” a complex type of machine learning, to create fake images, videos, and audio. If you haven’t seen the eerie deepfake video that morphs Bill Hader’s face into Tom Cruise’s and Seth Rogen’s as he imitates them, check it out.
Many who create deepfakes just do it for fun, but manipulated videos and audio have made their way into litigation. So how do we keep fakes from being admitted as evidence?
Below are a few things to watch for when reviewing audio and video evidence that seems too good (or bad) to be true.
Inconsistent Lighting
Pay close attention to lighting and shadows in videos. Is the person’s shadow where you’d expect it to be based on the light source? Does the shadow or light source move at times in ways that don’t make sense?
Unusual Eye and Body Movements
Computer programs have a hard time imitating natural blinking and eye movement, so you might notice that a person in a deepfake video seems to be staring without blinking, or their eyes don’t follow the person they’re talking to.
When a person turns their head or body, watch for distortions or choppy video quality. If one person’s head has been placed on another’s body, you might notice awkward posture or body shapes.
Unnatural Facial Features
This one’s a little weird: Pay close attention to noses. In a bad deepfake, you can often see right away that the person’s mouth doesn’t match the words they’re saying. A subtler giveaway is a nose that points in a slightly different direction than the rest of the face.
Here’s where the experts come in. In addition to the inconsistencies you might see or hear, a file’s metadata, the background data attached to a digital recording, can reveal whether it has been manipulated.
When you load an audio file into an editing program like Audacity, for example, the recording’s metadata will look different from that of the raw file recorded on your phone. These differences can even indicate what software was used. Attorneys in a 2019 custody case in the U.K. were able to prove that a damning piece of audio was faked by examining the recording’s metadata.
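For the technically inclined (or an expert working with you), here is a minimal sketch of that kind of side-by-side comparison, assuming Python 3 with the open-source mutagen library installed. The file names are hypothetical examples, not files from any actual case.

```python
# A minimal sketch, assuming Python 3 and the third-party "mutagen" library
# (pip install mutagen). File names below are hypothetical examples.
from mutagen import File

def dump_metadata(path: str) -> None:
    """Print the stream information and tag fields mutagen can read from an audio file."""
    audio = File(path)  # returns None if mutagen can't identify the format
    if audio is None:
        print(f"{path}: format not recognized")
        return
    print(f"--- {path} ---")
    print(audio.pprint())  # codec, length, bitrate, and tag key=value pairs

# Compare a raw phone recording with the suspect exhibit. A tag naming
# editing software (for example, an encoder field mentioning Audacity)
# on a file that was supposedly recorded straight from a phone is a
# red flag worth handing to a forensics expert.
dump_metadata("raw_phone_recording.m4a")
dump_metadata("suspect_exhibit.m4a")
```

A sketch like this only surfaces the metadata; interpreting it, and establishing that it hasn’t itself been altered, is work for a qualified digital forensics expert.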
Digital forensics experts can examine the data hiding behind those audio and video files to help you determine what’s real.