Seeing the Round Corners

August 5, 2019

PART VI – FAKE NEWS AND THE SYNTHETIC MEDIA

Today we continue with the Wall Street Journal’s overview of Fake News and what it means to the world.

Finding older versions of footage
Journalists and media organizations now have reverse image search engines – TinEye or Google Image Search – that can be used to find possible older versions of a suspect video and determine whether any portion of it has been manipulated. As the Journal notes, “Deep fakes are often based on footage that is already online.”
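The matching behind such searches can be illustrated with a perceptual "average hash": each frame is reduced to a compact fingerprint so that a re-encoded copy of older footage still matches, while unrelated footage does not. The sketch below is a minimal pure-Python illustration of that idea, not any search engine's actual algorithm, and the tiny 4x4 "frames" are made-up data.

```python
# Minimal average-hash sketch (illustrative only; real reverse image
# search engines use far more robust matching).

def average_hash(pixels):
    """Return a bit string: 1 where a pixel exceeds the mean brightness."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical grayscale frames (flattened 4x4 pixel grids, 0-255).
original  = [10, 12, 200, 198, 11, 13, 201, 199, 9, 14, 197, 202, 12, 10, 203, 196]
reencoded = [12, 11, 198, 200, 10, 15, 199, 197, 11, 12, 196, 204, 13, 9, 201, 195]
unrelated = [200, 10, 12, 198, 199, 13, 11, 201, 202, 9, 14, 197, 196, 12, 10, 203]

h_orig = average_hash(original)
print(hamming(h_orig, average_hash(reencoded)))  # small distance: likely same scene
print(hamming(h_orig, average_hash(unrelated)))  # large distance: different content
```

Because the hash keys on coarse brightness structure rather than exact pixel values, compression noise from a re-upload barely changes it – which is why older copies of recycled footage can still be found.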

Examining the footage
Advancing technology also gives journalists and media organizations editing programs for examining suspect footage closely. Final Cut, for example, lets journalists slow footage down, zoom in on the image, and step through it frame by frame or pause repeatedly. Such an editing program reveals obvious glitches: “glimmering and fuzziness around the mouth or face, unnatural lighting or movements and differences between skin tones – all are telltale signs of a deep fake.”
Here are some additional glitches to look for: 

  • small edits in the foreground or background of the footage;
  • signs that an object was inserted into or deleted from a scene in a way that might change the context of the video;
  • glimmering, fuzziness and unnatural light, which can be indicators of faked footage; and
  • audio with unnatural intonation, irregular breathing, metallic-sounding voices and obvious edits – all hints that the audio may have been generated by artificial intelligence.
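One of the checks above – an abrupt, unnatural change between consecutive frames – can be automated in a simple way: compare each frame's mean brightness to the previous frame's and flag sudden jumps as candidate edit points. The sketch below is a hypothetical illustration using made-up frame data, not the Journal's actual tooling.

```python
# Minimal frame-jump detector (illustrative sketch on synthetic data).

def mean_brightness(frame):
    """Average pixel value of a flattened grayscale frame."""
    return sum(frame) / len(frame)

def flag_glitches(frames, threshold=30):
    """Return indices of frames whose brightness jumps past the threshold."""
    flagged = []
    for i in range(1, len(frames)):
        if abs(mean_brightness(frames[i]) - mean_brightness(frames[i - 1])) > threshold:
            flagged.append(i)
    return flagged

# Hypothetical grayscale frames (flattened pixel grids, 0-255); frame 3
# simulates a splice where inserted content brightens the scene.
frames = [
    [100, 102, 98, 101],
    [101, 103, 99, 100],
    [99, 100, 102, 101],
    [180, 185, 178, 182],  # abrupt jump: candidate edit point
    [181, 183, 179, 180],
]
print(flag_glitches(frames))  # frame 3 stands out
```

A real workflow would apply similar comparisons to color, texture and audio features, but the principle is the same: deep fakes tend to leave discontinuities that natural footage does not.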

The Wall Street Journal Committee states, “It is important to note that image artifacts, glitches and imperfections can also be introduced by video compression.”

The democratization of deep fake creation adds to the challenge
There always seems to be a downside to advancing technologies, as social media has learned the hard way. The Wall Street Journal Committee found that technologies are being created by a number of companies that “could eventually end up being used to create deep fakes.”

Project Cloak by Adobe

The reader's comments or questions are always welcome. E-mail me at doris@dorisbeaver.com.