HARRISONBURG, Va. (WHSV) - With internet access and social media, information is easier to find than ever, but it is also becoming harder to tell what is real and what is not.
“So now, with all of this fake information they are trying to spread, they can put a politician, whoever they want, saying whatever they want,” said Hala Nelson, author and James Madison University mathematics professor.
Nelson said she believes there will be more deepfakes online as the November election approaches.
“You can, like, make the same politician say one thing to an audience who wants to hear a certain side and then say a different thing to a different audience who wants to hear a different side,” Nelson said. “You can use it to get more voters, you can use it to hurt someone really badly.”
She said that when looking at anything on the internet, people should ask themselves: “Is this real?”
“I would advise that people go and see right now what a deepfake video looks and sounds like, and see it over and over again so they have an eye for it,” Nelson said.
While the technology is always improving, Nelson said people should still be able to spot errors in fake videos or details that do not look right.
“There is always, like, something. The movements can be repetitive in a synthetic video, or, in audio, things can start repeating a little,” she said.
Fake videos have been around for years, but A.I. technology removes the manual labor that used to go into creating them.
“You just give the input. The input is the person you want to fake: a picture, the voice of the person you want to fake, just any voice of me on the internet, take a clip of that, and the new text you want them to say,” Nelson said. “And then the machine will do all of that for you. You are not going to sit there and manually cut and paste faces on faces or anything like that. It is easier, right?”
Nelson said the technology is expensive and not everyone has access to it. Still, she said, people with the motivation to create election deepfakes will find a way to get ahold of it.
Copyright 2024 WHSV. All rights reserved.