Seeing the Round Corners

HEADS UP, the new day for Seeing the Round Corners “GOING LIVE” is Tuesday each week.

November 19, 2019

PART XVII – FAKE NEWS AND THE SYNTHETIC MEDIA

Last week’s column presented the best area of law for suing creators of deep fakes – tort law rose to the top. It also discussed who could be sued in the case of a deep fake that goes viral and, predictably, sets off violence in a community. The questions, especially for systemic harms, were: is it possible to sue the creator of a particular deep fake, and is it possible to sue the platform that spread the fake widely and thus made its widespread ill effects possible?

Today’s column – suing the platforms – covers a lengthy explanation of the subject.

The subject of who is responsible in the case of harmful deep fakes – the creators of the content or the platforms that distribute it – is a complicated legal morass, to put it mildly. Chesney and Citron state: “Given the key role that content platforms play in enabling the distribution of deep fakes, the most efficient way to mitigate the harm may be to impose liability on platforms,” and go on to suggest that “Content platforms arguably are the cheapest cost avoiders because perpetrators may be difficult to find and deter.” In fact, “some content platforms may be the only realistic possibility for deterrence and redress.”

Readers should not get the impression that online platforms have no incentive to play a screening role when one considers the “impact of moral suasion, market dynamics and political pressures,” but in today’s world “civil liability does little to add to this pressure.”

In 1996, Congress graciously handed the platforms a liability shield in the form of Section 230 of the Communications Decency Act (CDA). The purpose of that “shield” was to make it hard to sue an online platform based on its hosting of harmful content created or developed by others, with exceptions for federal criminal law, the Electronic Communications Privacy Act and intellectual property law.

Chesney and Citron make this analysis of Section 230 of the CDA and the liability shield it provides platforms:

  • the example is given where an online platform displays content that is either republished from another source (such as quoting a news article) or generated by a user (such as a customer review posted on Yelp), and the content is defamatory or otherwise actionable. With the protection afforded by Section 230, a plaintiff could not sue the online platform that helped it see the light of day;

  • Section 230(c)(1) “expressly forbids treating the platform as a ‘publisher’ of the problematic content.” Courts have interpreted Section 230 as affording online platforms immunity from liability for user-generated content even if they deliberately encourage the posting of that content; and
  • Section 230(c)(2), simply put, “forbids civil suits against a platform based on good-faith acts of filtering to screen out offensive content, whether in the nature of obscenity, harassment, violence or otherwise.”

Section 230 grew out of a case in which a company, Prodigy, was held liable as a publisher of defamatory comments because it tried to filter profanity on its bulletin boards but did not do so completely. The goal was to clean up the internet by “making it easier for willing platforms to filter out offensive material, removing the risk that doing so would incur civil liability by casting them in a publisher’s role.” The result was an amendment to the CDA entitled “Protection for Private Blocking and Screening of Offensive Material,” which ultimately became Section 230.

Portions of Section 230 are good, and portions are not so good. Section 230(c)(2) shows clear intent, expressly addressing the scenario in which a platform might be deemed engaged in editorial activity based on its filtering of offensive user-posted content. Section 230(c)(1), however, lacks that narrowing language and, as a result, has proven a capable foundation for courts to interpret Section 230 immunity quite broadly.

Next week: more on how courts have stretched the immunity afforded platforms, even when they solicit or knowingly host illegal or tortious activity.

The reader's comments or questions are always welcome. E-mail me at doris@dorisbeaver.com.