Seeing the Round Corners

HEADS UP, the new day for Seeing the Round Corners “GOING LIVE” is Tuesday each week.

November 12, 2019


As the columns have continued on and on, readers are probably wondering whether the information on fake news and synthetic news ever ends. Sorry folks, it is only beginning! With the 2020 presidential election just getting underway, the creators of fake news and deep-fake videos are working triple overtime to defeat President Trump – anything goes, no matter how vicious, slanderous or false it may be.

That said, suing the creators of deep fakes continues. Readers should not get the impression that victims of deep fakes are just at the mercy of the purveyors of defamatory falsehoods. They are not.

Tort law is an area of law better suited to deal with deep-fake scenarios. Here are a few examples, as provided by Chesney and Citron:

  • victims can sue for defamation where falsehoods are circulated either recklessly or negligently;
  • public officials and public figures, for their part, could sue posters for defamation if clear and convincing evidence exists of actual malice, that is, the defendant had knowledge the deep fakes were false or recklessly disregarded the possibility that they were false;
  • private individuals need only show that the defendant was negligent;
  • special damages need not be shown if deep fakes injure someone's career, as would be the case if a video featured someone committing a crime or appearing in a sex video;
  • victims in some cases could sue for intentional infliction of emotional distress which requires proof of “extreme and outrageous conduct.” Particularly humiliating content like deep-fake sex videos would amount to “extreme and outrageous conduct” because such would fall outside the norms of decency.


Chesney and Citron further explain other privacy-focused torts that seem promising at first blush but fall short on closer inspection. Here are a few of those likely to run into rough sailing:

  • the “public disclosure of private fact” tort concerns the publication of private, “non-newsworthy” information that would be highly offensive to a reasonable person. Deep fakes certainly might cause such offense, but using a person’s face in a deep-fake video does not amount to the disclosure of private information if the source image was generated from content posted online;
  • the intrusion-on-seclusion tort is likewise ill-suited to the deep-fake scenario. It narrowly applies to defendants who “intrude into a private place or invade a private seclusion that the plaintiff has thrown about his person or affairs,” and deep-fake videos typically do not involve invasions of spaces in which individuals have a reasonable expectation of privacy.


Chesney and Citron are careful to distinguish the options that are likely to succeed from those that are not so likely – defamation, false light and intentional infliction of emotional distress, with some prospect for copyright infringement and rights-of-publicity claims.

Civil claims such as copyright infringement and rights of publicity have the potential to redress serious harm to specific individuals, but they cannot remedy the broad range of harms surveyed previously. What many readers have seen of late attributable to deep fakes is the case of a deep fake that goes viral and, predictably, sets off violence in a community. A victim of that violence might sue the creator of the deep fake for negligence or perhaps even some form of intentional-tort liability.

In such negligence or intentional-tort claims, however, the threshold hurdles are huge. Remedies for systemic harms may more likely turn on whether it is possible to sue the platform that spread the fake widely and thus made possible its widespread ill effects.

Next week, suing the platforms.

The reader's comments or questions are always welcome. E-mail me at