Seeing the Round Corners

HEADS UP, the new day for Seeing the Round Corners “GOING LIVE” is Tuesday each week.

December 10, 2019

PART XVIII – FAKE NEWS AND THE SYNTHETIC MEDIA

Columns of recent weeks began coverage of how to deal with the harms brought on by the advent of deep fakes and deep fake videos. Specific Categories of Civil Liability considers whether the creators and distributors should be subject to civil liability for the harms they cause.

Part XVII began coverage of suing the platforms. As discussed in earlier columns, the authors of Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security (Bobby Chesney and Danielle Citron) take the position that “Given the key role that content platforms play in enabling the distribution of deep fakes, the most efficient way to mitigate the harm may be to impose liability on platforms.”

Today’s column continues with Section 230, passed by Congress in 1996 as part of the Communications Decency Act (CDA). The purpose of Section 230 was to provide platforms with a liability shield. Since its enactment, the immunity provision has been “stretched” considerably by the courts, leading to a “very permissive environment for hosting and distributing user-generated online content, but also one in which it is exceptionally hard to hold providers accountable even in egregious circumstances – including situations in which someone is purveying systematic disinformation and falsehoods (state-sponsored or otherwise).”

That “stretching” by the courts includes an assessment of “First Amendment values” that supposedly “drove the CDA.” Courts have extended the immunity provision so far that it applies to a remarkable array of scenarios, including ones in which the provider:

  • republished content knowing it violated the law;
  • solicited illegal content while ensuring that those responsible could not be identified;
  • altered its user interface to ensure that criminals could not be caught; and
  • sold dangerous products.

The courts’ handling of Section 230 has seen it evolve into a “kind of super-immunity,” preventing “the civil liability system from incentivizing the best-positioned entities to take action against the most harmful content.” Such a liberal interpretation has virtually removed any liability-based reason to take down illicit material, and has left victims with “no legal leverage to insist otherwise.”

As is the case when entities gain “power without responsibility” (Rebecca Tushnet, George Washington Law Review), that power also includes the “ability to ignore the propagation of damaging deep fakes, even ones that cause specific and immediate harms such as sabotage to reputation, interference with business prospects and the like.”

The question arises whether Section 230 should be amended to make liability possible in a wider range of circumstances. That occurred when Congress enacted the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA for short), meant to deal with websites facilitating various forms of sex trafficking. Plaintiffs (and state attorneys general) acting on behalf of victims no longer face Section 230 immunity when suing platforms for knowingly assisting, supporting or facilitating sex trafficking offenses.

Debate took place as FOSTA was considered – over concerns ranging from greater liability exposure for online platforms to the prospect of less filtering rather than more. Now debate is under way over whether Section 230 should be amended to allow platforms to be held accountable for deep fakes. Should Section 230(c)(1) immunity be made conditional, in contrast to the automatic immunity of the status quo, meaning an entity would have to take “reasonable” steps to ensure that its platform is not being used for illegal ends?

Next week, more on specific categories of civil liability.

The reader's comments or questions are always welcome. E-mail me at doris@dorisbeaver.com.