Seeing the Round Corners

August 19, 2019

PART VIII – FAKE NEWS AND THE SYNTHETIC MEDIA

There was once a time at the Colorado legislature – down under the Gold Dome, as many refer to it – when legislators wanted to require members of the press to wear state-issued badges identifying them as such, because legislators were afraid of being eavesdropped on. Yeah! Readers can only imagine how that went over, but that was several years back. Since politicians are often easy targets of fake news and deep fakes, hopefully that idea is long forgotten.

Life as a professional journalist is one of assignment after assignment, and tougher than anyone outside the profession actually realizes. The world may find that the exposure of fake news and deep fakes depends on the work of these journalists, as a close reading of Parts I through VII of Seeing the Round Corners described.

The Wall Street Journal’s Media Forensics Committee summarized its work with these points: 

  • deep fakes are not going away anytime soon;
  • such elaborate forgeries will make verifying media harder; and
  • the challenge could become more difficult over time.

Hany Farid, a photo forensics expert now on the faculty at the University of California, Berkeley, opined in 2018: “The next 18 months will be critical – I do think that the issues are coming to a head,” then added that he expects researchers to have made advances before the 2020 election cycle.

Considering how “Russian interference in the 2016 election” has dominated the news since the last election, along with Professor Farid’s comments and the public’s distrust of journalists, the WSJ Committee’s final summation is this: “Despite the current uncertainty, newsrooms can and should follow the evolution of this threat by partnering with academic institutions and by training their journalists how to leverage new tools.” The tendency of the ordinary citizen to be less and less informed makes this advice even more important.

Now that readers have an orientation on fake news and deep fakes, consider the vast number of purposes such technology can serve – some antisocial, some prosocial. There are beneficial uses of deep-fake technology, with the most obvious possibilities in the areas of education, art, and autonomy, and many more sure to emerge as the technology evolves.

Education may take advantage of deep fakes to produce videos of historical figures speaking directly to students, making a dull and boring lecture far more interesting. Such possibilities could also involve, for example, altering a war-film scene to make it appear a commander and a legal adviser are discussing the application of the laws of war when, in reality, the discussion had nothing to do with that and never took place. The hypothetical scenario could also be replayed again and again.

The propensity for rewriting history will no doubt have a field day with this. Two obstacles arise at this point: “Creating modified content will raise interesting questions about intellectual property protections and the reach of the fair use exemption.”

Artistic and creative endeavors may enjoy the broadest benefits of deep-fake technology. Resurrecting dead performers for fresh roles is not always well received, but in the event of the sudden death of an actor in a movie in progress, such as Carrie Fisher (The Last Jedi), deep fakes mean filmmakers can create additional dialogue using snippets from real recordings. Deep-fake technology can also be used to drive home an activist’s point far more powerfully than words alone.

In terms of autonomy, deep-fake technology can allow individuals to avail themselves of an aspect of life not available to them in a conventional sense because of, say, a physical disability. Also described as “avatar” experiences, these enable “a person to have or perceive experiences that might otherwise be impossible, dangerous or otherwise undesirable if pursued in person,” and may not always be pursued in a serious manner – sometimes simply in pursuit of happiness.

Next week, harmful use of deep-fake technology.

The reader's comments or questions are always welcome. E-mail me at doris@dorisbeaver.com.