UTAH VALLEY UNIVERSITY, Utah — Three Utah Valley University student organizations set out to research how deepfakes affect viewers, whether viewers can identify deepfakes and how they engage with them.
The research comes after several deepfakes tied to political candidates or campaigns circulated online in the run-up to Election Day, including one of Gov. Spencer Cox that surfaced earlier this year.
A deepfake is a video in which a person's face or body has been digitally altered with artificial intelligence to make them appear to be someone else.
The school’s Neuromarketing SMARTLab tracked the micro-expressions of 40 participants using technology called iMotions.
Those participants were tested in front of a computer outfitted with eye-tracking and facial emotion analysis hardware and software.
The lab results showed that participants' micro-expressions registered higher levels of engagement and confusion when they were exposed to deepfake content, but they did not report those feelings in post-test interviews.
Real video and audio prompted more traditional emotional responses.
Another 200 subjects engaged with an online study that assessed their ability to recognize deepfakes in video and audio formats.
At the beginning of the test, participants were divided into groups, unaware they were being shown A.I.-generated content.
Subjects then evaluated the speaker of the video or audio based on factors like credibility and trustworthiness.
Only then were participants informed that the study’s purpose was to measure the impact of deepfakes and that some of the content may have been A.I. generated.
They were then asked to assess whether they believed the media was real or fake and to rate their confidence in that judgment.
But even after being told they might have encountered a deepfake, they struggled to consistently identify the doctored content.
Across all of the deepfake and authentic video and audio, at least 50 percent of participants believed the media was “probably real.”
And 57 percent or more were confident in their assessments, suggesting that deepfake detection was essentially a coin flip.
“If you just need their image, if you just need their voice…everyone’s image and voice is online and freely available,” says Hope Fager, a national security studies student who helped lead the study. “It’s a little bit scary how easy it is to find the information that you need in order to pretend to be another person.”
That’s why some campaigns have taken extra steps to limit voters’ exposure to false media.
Michael Kaiser is the president and CEO of Defending Digital Campaigns, based in Washington, D.C.
“We talk to campaigns a lot about the issue of deepfakes and impostering, and that’s another thing that happens in the space where someone imposters a campaign or a candidate and we work with them,” says Kaiser. “We have some tools that we provide to federal campaigns to help them monitor what’s going on on the internet to identify cases of deepfakes, to identify places where people are trying to fake being the actual campaign and we help them take it down.”
Because deepfake technology has become increasingly sophisticated, is there a way to tell fact from fake?
“I think it’s really hard,” says Kaiser. “I think there might be some technical things that people can see some blurring here or there, maybe the background doesn’t look correct, but that’s really hard to teach people to see that stuff, so I think you have to sort of trust your own instincts.”
Some social media sites know this and will flag content that might seem suspicious.
“The verification checks that this might be misinformation and things like that can go a long way,” says Fager. “Just the idea that something might be a deepfake is enough to kind of get people over the edge and think critically. No one really thinks about this on their own, but if we can put it in front of more people and say ‘Hey, just be aware what you’re looking at’. Trust your gut.”
Until more safeguards are created to weed out deepfakes, the buck stops with voters as they view content in the lead-up to Election Day.
Keep your guard up and be on the lookout for what’s not real.
“The problem is now, this is happening today. This isn’t a far-off idea anymore and we need to be prepared,” says Fager. “We need to be aware of what we’re viewing and we need to be thinking critically, which is a little bit more difficult of a circumstance when we’re scrolling through social media and everyone’s brain is turned off.”