Seeing the Round Corners

August 26, 2019


Today’s column explores the possible harmful uses of deep-fake technology. As most readers have likely observed over recent decades, since computer technology took charge of the world, “human ingenuity is not limited to applying technology to beneficial ends.” Serious harms will result from the capacity to produce deep fakes, “many of them exacerbated by the combination of networked information and cognitive biases” described in previous Seeing the Round Corners columns (July 1st through August 19th).

When computers first took control of the world, the complaint was that the output was no better than the person inputting it – “idiot in, idiot out.” The sophistication of today’s technology, however – “the inherent credibility and the manner in which they hide the liar’s creative role” – has shown how “deep fakes will emerge as powerful mechanisms for some to exploit and sabotage others.” Some deep fakes merely irritate or embarrass, while others humiliate and destroy, even spur violence.

Exploitation and sabotage are two likely harmful uses of deep-fake technology. Harmful exploitation may take the form of theft or abuse – stealing people’s identities to extract financial or some other benefit, or stealing a person’s identity to harm them. Blackmailers may use deep-fake technology to extract something of value from victims who doubt their own ability to discredit the deep fakes and to reach those who saw the initial release(s).

The victims of fake sex videos will most likely be female, and far too often the videos will be created not for the sexual or financial gratification of the creator – “some will be nothing less than cruel weapons meant to inflict pain,” as in the case of a former spouse, girlfriend, boyfriend or significant other. Other examples are imagery depicting non-sexual abuse or violence, which can be used to “threaten, intimidate and inflict psychological harm on the depicted victim or those who care for that person,” or to “portray some falsely as endorsing a product, service, idea or politician.”

Sabotage is another harmful use of deep-fake technology: through reputational sabotage, competitors in every field – the workplace, romance, sports, the marketplace and politics – can deal “significant blows to the prospects of their rivals.”

Employment opportunities are probably the aspect of ordinary citizens’ lives most likely to be affected by deep-fake technology. According to a 2009 Microsoft study, “more than 90 percent of employers use search results to make decisions about candidates for employment,” and in more than 77 percent of cases the outcome is negative, with employers refusing to interview or hire people whose searches produce “inappropriate results.” Businesses are also subject to deep-fake sabotage by competitors, such as fakes falsely showing the chief executive of a rival company exhibiting disreputable behavior, bribing government officials, purchasing illegal drugs or hiring prostitutes.

Statistics on the spread of news stories related in an earlier column warrant repeating. A study by data scientists of news stories shared on Twitter from 2006 to 2010, using third-party fact-checking sites, generated a list of 126,000 rumors shared online. The study found that “hoaxes and false rumors reached people ten times faster than accurate stories,” and that “falsehoods were 70 percent more likely to get re-tweeted than accurate news,” with people more likely to re-tweet inaccurate news items.

Next week: harm to society.

The reader's comments or questions are always welcome. E-mail me at