Seeing the Round Corners

July 8, 2019

PART II – FAKE NEWS AND SYNTHETIC MEDIA

Today begins coverage of what some in the media and journalism world believe is an alarming time for reality on the information highway. Since the advent of radio, television and all things media, readers have enjoyed being able to rely on the truthfulness of what is presented to the public, but with internet technology, that has changed dramatically.

As noted in an earlier column, a study conducted by the Pew Research Center found that 68 percent of Americans say made-up news and information greatly impacts their confidence in government institutions, and roughly 54 percent say it has a major impact on their confidence in each other.

This writer does not pretend to be an expert on the technical workings of the internet, so here are two phenomena explained by Robert Chesney and Danielle Keats Citron (Chesney/Citron) in their paper on Deep Fakes.

Two phenomena – information cascades and filter bubbles – make it possible for content to find its way to online users, but they also drive widespread circulation, “even going viral sometimes.” Chesney/Citron explain the information cascade dynamic this way (a simple simulation sketch follows the list):

  • everyday interactions involve the sharing of information;
  • because people cannot know everything, they often rely on what others say even if it contradicts their own knowledge;
  • at a certain point, in other words, it is rational for people to stop paying attention to their own information and to look to what others know;
  • and when people pass along what others think, the credibility of the original claim snowballs;
  • simply put, information cascades result when people stop paying sufficient attention to their own information and rely too much on what they assume others know;
  • they compound the problem by sharing the information onward, believing they have learned something valuable; and
  • the cycle repeats and the cascade strengthens. 
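
To make that dynamic concrete, here is a rough simulation sketch written in Python (an illustration only, not taken from the Chesney/Citron paper; every name, number and threshold in it is an assumption made for the example). Each simulated user gets a private hint about whether a claim is true; the hint is correct only most of the time, and the user also sees how many earlier users shared or ignored the claim. Once the crowd leans strongly enough one way, later users follow the crowd instead of their own hint, and the cascade feeds itself.

import random

def simulate_cascade(claim_is_true=False, n_users=20, signal_accuracy=0.7, seed=1):
    # Toy model of an information cascade (illustration only).
    # Each user gets a private signal that is correct with probability
    # signal_accuracy, but also sees how many earlier users shared or
    # ignored the claim. Once the public tally outweighs one private
    # signal, users rationally set aside their own information and
    # follow the crowd; this is the cascade.
    random.seed(seed)
    shares, ignores = 0, 0
    decisions = []
    for _ in range(n_users):
        signal_correct = random.random() < signal_accuracy
        private_says_true = (signal_correct == claim_is_true)
        public_lean = shares - ignores       # net weight of earlier decisions
        if public_lean > 1:                  # crowd strongly favors sharing
            decision = True                  # own signal is ignored
        elif public_lean < -1:               # crowd strongly favors ignoring
            decision = False
        else:
            decision = private_says_true     # fall back on own signal
        decisions.append(decision)
        if decision:
            shares += 1
        else:
            ignores += 1
    return decisions

if __name__ == "__main__":
    result = simulate_cascade(claim_is_true=False)
    print("false claim shared by", sum(result), "of", len(result), "users")

Depending on the seed, the same false claim is sometimes ignored by nearly everyone and sometimes shared by nearly everyone, because the first few decisions dominate everything that follows; that is the snowball effect of the list above in miniature.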

Ever wonder why social media developed so explosively? Chesney/Citron explain that “social media platforms are a ripe environment for the formation of information cascades spreading content of all stripes and quality.” Such cascades are so powerful that they can spill over to traditional mass-audience outlets, “overcoming whatever gatekeeping that exists.”

Social media platforms have had numerous unintended consequences, and social movements as well as bad actors have leveraged the power of information cascades. To name a few examples:

  • Black Lives Matter activists;
  • the Never Again movement of the Parkland, Florida, high school students;
  • Arab Spring protesters spread videos and photographs of police torture, leading to the toppling of tyrants;
  • Journalist Howard Rheingold refers to positive information cascades as “smart mobs,” but not every mob is smart or laudable, and the information cascade dynamic does not account for such distinctions; and
  • the Russian covert action program to sow discord in the United States during the 2016 election provides ample demonstration.

Chesney/Citron also point to the natural tendency of human beings to “propagate negative and novel information”: such information grabs our attention because we are attuned to novel threats and especially attentive to negative ones.

New to this writer was the term “data scientist.” Chesney/Citron identify a study conducted by data scientists on news stories shared on Twitter from 2006 to 2017, covering some 126,000 rumors shared online, which found that hoaxes and false rumors reached people ten times faster than accurate stories. The authors of the study hypothesized, “Fake tweets tended to elicit words associated with surprise and disgust, while accurate tweets included words associated with sadness and trust.”

Chesney/Citron address such tendencies, pointing out that “with human beings naturally primed to spread negative and novel falsehoods, automation can be weaponized to exacerbate those tendencies.”

Users of the internet are all too familiar with those ever-present checks to determine whether the user is a bot or a legitimate user, but Chesney/Citron note these other points about bots (a toy illustration follows this list):

  • bots can be used to amplify and spread negative misinformation;
  • Facebook estimates that as many as 60 million bots may be infesting its platform;
  • bots were responsible for a substantial portion of political content posted during the 2016 election; and
  • bots can manipulate algorithms used to predict potential engagement with content.
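
To illustrate that last point, consider this toy example (again an illustration only; real platform ranking models are proprietary and far more complex, and the scoring function below is entirely made up). A feed that ranks posts by predicted engagement, using early likes and shares as inputs, can be gamed when a botnet supplies those early signals.

# Toy illustration only: a feed that ranks posts by predicted engagement
# can be gamed when bots inflate the early signals the score relies on.

def predicted_engagement(early_likes, early_shares, followers):
    # Crude, made-up stand-in for an engagement-prediction score.
    return 2.0 * early_shares + 1.0 * early_likes + 0.01 * followers

organic_post = {"early_likes": 40, "early_shares": 5, "followers": 2000}
boosted_post = {"early_likes": 40, "early_shares": 5, "followers": 2000}

# A botnet adds fake early likes and shares to the second post.
boosted_post["early_likes"] += 500
boosted_post["early_shares"] += 120

for name, post in (("organic", organic_post), ("bot-boosted", boosted_post)):
    print(name, "score:", predicted_engagement(**post))

Because the ranking would then show the bot-boosted post to far more real users, a modest amount of fake engagement buys genuine reach, which is the kind of amplification described above.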

Filter bubbles are the second phenomenon that makes viral circulation of content to online audiences possible. Human beings naturally tend to surround themselves with information confirming their beliefs, according to Chesney/Citron, even without the aid of technology, and the authors use these points to show how filter bubbles work as powerful insulators against the influence of contrary information (a rough sketch follows the list):

  • social media platforms supercharge this human tendency by empowering users to endorse and re-share content;
  • platforms’ algorithms highlight popular information, especially if it has been shared by friends, surrounding us with content from relatively homogenous groups;
  • as endorsements and shares accumulate, the chances of an algorithmic boost increase;
  • after seeing friends’ recommendations online, individuals tend to share them with their networks; and
  • because people tend to share information with which they agree, social media users are surrounded by information confirming their preexisting beliefs.
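
Here is a similarly rough sketch of that feedback loop (again this writer's own illustration, not the authors' model; the viewpoints, weights and round count are arbitrary assumptions). A simulated user who favors one viewpoint re-shares only confirming items, the feed ranks items by how often they have been shared, and after a few rounds the feed is typically dominated by that single viewpoint.

import random
from collections import Counter

def simulate_filter_bubble(rounds=8, seed=2):
    # Toy model of a filter-bubble feedback loop (illustration only).
    # The user leans toward viewpoint "A". Each round the feed samples
    # items in proportion to how often they have been shared; the user
    # re-shares only items that confirm their view, which boosts those
    # items further, so the feed drifts toward a single viewpoint.
    random.seed(seed)
    user_view = "A"
    share_counts = Counter({"A": 1, "B": 1})     # start with a mixed pool
    history = []
    for _ in range(rounds):
        viewpoints = list(share_counts)
        weights = [share_counts[v] for v in viewpoints]
        feed = random.choices(viewpoints, weights=weights, k=5)
        history.append("".join(feed))
        for item in feed:
            if item == user_view:                # confirmation bias
                share_counts[item] += 1          # endorsement feeds the ranking
    return history

if __name__ == "__main__":
    for round_number, feed in enumerate(simulate_filter_bubble(), 1):
        print("round", round_number, "feed:", feed)

Nothing in the loop requires anyone to lie; the homogeneity comes entirely from popularity-based ranking plus the user's own preference for agreeable content, which is why the authors describe filter bubbles as such powerful insulators.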

Chesney/Citron noted that in a study of Facebook users, “individuals reading fact-checking articles had not originally consumed the fake news at issue, and those who consumed fake news in the first place almost never read a fact-check that might debunk it.”

Chesney/Citron summarize these common cognitive biases and social media capabilities with this statement: “Information cascades’ natural attraction to negative and novel information and filter bubbles provide an all-too-welcoming environment as deep-fake capacities mature and proliferate.”

The reader's comments or questions are always welcome. E-mail me at doris@dorisbeaver.com.