Deep Fakes

futures2015

Deep Fakes: A Looming Crisis for National Security, Democracy and Privacy?
Fake news sucks, and as those eerily accurate videos of a lip-synced Barack Obama demonstrated last year, it’s soon going to get a hell of a lot worse. As a newly revealed video-manipulation system shows, super-realistic fake videos are improving faster than some of us thought possible.
I predict that deep fake videos will make dramatic appearances in the 2020 elections. Those who thought the Russians interfered in the 2016 election haven't seen anything yet.

Gizmodo: Deepfake Videos Are Getting Impossibly Good
The new system was developed by Michael Zollhöfer, a visiting assistant professor at Stanford University, and his colleagues at Technical University of Munich, the University of Bath, Technicolor, and other institutions. Zollhöfer’s new approach uses input video to create photorealistic re-animations of portrait videos. These input videos are created by a source actor, the data from which is used to manipulate the portrait video of a target actor. So for example, anyone can serve as the source actor and have their facial expressions transferred to video of, say, Barack Obama or Vladimir Putin.

But it’s more than just facial expressions. The new technique allows for an array of movements, including full 3D head positions, head rotation, eye gaze, and eye blinking. The new system uses AI in the form of generative neural networks to do the trick, taking data from the signal models and calculating, or predicting, the photorealistic frames for the given target actor. Impressively, the animators don’t have to alter the graphics for existing body hair, the target actor body, or the background.

Secondary algorithms are used to correct glitches and other artifacts, giving the videos a slick, super-realistic look. They’re not perfect, but holy crap they’re impressive.
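Out of curiosity, here's roughly what the core of such a system looks like in code. This is a purely illustrative PyTorch sketch, not Zollhöfer's actual code or architecture (the class name and layer sizes are invented): a generative network maps a synthetic conditioning rendering of the target, with the desired head pose, expression, and gaze already baked in, to a photorealistic output frame. The real systems add adversarial losses, temporal smoothing, and far larger models.

Code:
import torch
import torch.nn as nn

# Hypothetical minimal frame generator: conditioning rendering in, RGB frame out.
class FrameGenerator(nn.Module):
    def __init__(self, in_channels=3, out_channels=3, base=32):
        super().__init__()
        # Encoder: downsample the conditioning rendering of the target.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, base, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to a full-resolution photorealistic frame.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, out_channels, 4, stride=2, padding=1),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, conditioning_frame):
        return self.decoder(self.encoder(conditioning_frame))

if __name__ == "__main__":
    gen = FrameGenerator()
    dummy = torch.randn(1, 3, 256, 256)    # one fake 256x256 conditioning rendering
    print(gen(dummy).shape)                # torch.Size([1, 3, 256, 256])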
 

JG

Eventually you may be able to completely fake a video of someone committing a heinous crime. Then what?

A cottage industry will spring up: experts in video manipulation testifying as to whether pictures or videos are genuine.
 

futures2015

There are people working on AIs to detect fake videos; however, the money is in creating them (e.g., create a fake video of someone famous doing something nefarious or illegal, post it on YouTube, and collect ad revenue from thousands or millions of views).
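The detection side usually starts with nothing fancier than a per-frame classifier. Another rough, hypothetical sketch (invented names, nowhere near a production detector, which would work on face crops and temporal features):

Code:
import torch
import torch.nn as nn

# Hypothetical per-frame "real vs. manipulated" classifier.
class FakeFrameDetector(nn.Module):
    def __init__(self, base=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, base, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),   # global average pooling over the frame
        )
        self.classifier = nn.Linear(base * 2, 1)  # one logit: "how fake is this frame"

    def forward(self, frames):
        x = self.features(frames).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    detector = FakeFrameDetector()
    batch = torch.randn(4, 3, 224, 224)         # 4 video frames
    fake_prob = torch.sigmoid(detector(batch))  # per-frame probability of manipulation
    print(fake_prob.shape)                      # torch.Size([4, 1])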

Here's a partial list of potential (mis)uses:
  • Fake videos could feature public officials taking bribes, uttering racial epithets, or engaging in adultery.
  • Politicians and other government officials could appear in locations where they were not, saying or doing horrific things that they did not.
  • Fake videos could place them in meetings with spies or criminals, launching public outrage, criminal investigations, or both.
  • Soldiers could be shown murdering innocent civilians in a war zone, precipitating waves of violence and even strategic harms to a war effort.
  • A deep fake might falsely depict a white police officer shooting an unarmed black man while shouting racial epithets.
  • A fake audio clip might “reveal” criminal behavior by a candidate on the eve of an election.
  • A fake video might portray an Israeli official doing or saying something so inflammatory as to cause riots in neighboring countries, potentially disrupting diplomatic ties or even motivating a wave of violence.
  • False audio might convincingly depict U.S. officials privately “admitting” a plan to commit this or that outrage overseas, exquisitely timed to disrupt an important diplomatic initiative.
  • A fake video might depict emergency officials “announcing” an impending missile strike on Los Angeles or an emergent pandemic in New York, provoking panic and worse.
I expect some variation of one or more of these will affect the 2020 election.

It's becoming increasingly difficult to isolate oneself from the locoweeds.

EDIT:
The tools to fix the problem aren't available yet. Deepfake experts say that legitimate content could eventually be authenticated to fight deepfakes, but such tools might not be available in time for next year's election cycle.

“A deepfake timed just right, the night before the election or morning of, is going to be primed to be believed,” said Danielle Keats Citron, a professor at the University of Maryland Francis King Carey School of Law who is scheduled to testify Thursday. “And state and non-state actors know that.”
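On the "authenticating legitimate content" point above, here's a deliberately simplified sketch of the idea: the publisher computes a keyed digest of the original file and publishes it, and anyone holding the key can later check whether a copy has been altered. Real provenance schemes would use public-key signatures and signed metadata rather than a shared secret; the file name and key below are made up.

Code:
import hashlib
import hmac

def sign_video(path: str, key: bytes) -> str:
    """Return a hex tag binding the key holder to this exact file."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_video(path: str, key: bytes, expected_tag: str) -> bool:
    """True only if the file is bit-for-bit identical to the one that was signed."""
    return hmac.compare_digest(sign_video(path, key), expected_tag)

if __name__ == "__main__":
    key = b"publisher-secret-key"                   # illustrative only
    # tag = sign_video("speech.mp4", key)           # hypothetical file name
    # print(verify_video("speech.mp4", key, tag))   # True if the file is untouched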
 

padrehorn11

You can't even believe your lyin' eyes anymore.
 

calvin farquhar

More ammo to continue dividing the country, in this case using fake videos or media. That's troubling.