A ‘Very Good’ Law?

Advocates Push Platforms to Shoulder Civil Liability for Extreme Content

Platforms should take on more civil liability for terror- and murder-related content, advocates said in interviews two days after Attorney General Bill Barr said Section 230 of the Communications Decency Act potentially blocks victims from seeking civil recovery (see 2002190056). The topic is gaining steam on Capitol Hill (see 2001280059).

The tech industry’s broad immunity needs to be removed, and platforms mandated to work with police, said Dallas Police Sgt. Demetrick Pennie, who's running as a Republican for a congressional seat in Texas’ District 30. Two of his colleagues were killed in a 2016 Dallas police shooting. Pennie said platforms are complicit in such attacks because they promote the content and targeted activity associated with terror plots. Courts have avoided altering Section 230 because it's Congress’ job to legislate, he said.

WilmerHale partner Patrick Carome disagreed with Barr’s assertion that Section 230 plays a substantial role in denying victims relief. Platforms have largely won dismissal of terror victims’ cases under the Justice Against Sponsors of Terrorism Act, not Section 230, he said. Carome also defended the competitive benefits of Section 230: increasing liability across the board could chill startups’ ability to innovate, he said, and tightening regulation would further insulate incumbents from competition.

Pennie is a client of Keith Altman, lead counsel at Excolo Law, who has led several terror-related cases. Altman has argued that platforms provided material support to extremist groups like ISIS and Hamas, representing victims of attacks in Paris, Orlando, San Bernardino, Barcelona, Nice and other cities. Platforms have cited Section 230 immunity in their defense, he said.

When Section 230 was introduced, the internet was driven by bulletin boards, said Altman. Modern platforms are far different: they target ads and content by drawing on user data. Essentially, they’re promoting extreme content based on audience interaction, he said. Without social media, ISIS would be a handful of extremists chanting in the desert, he said.

Invoking civil liability could change platform incentives, Altman said. For example, copyrighted material and content involving child exploitation are removed quite rapidly from platforms, he said. He believes that’s because platforms are liable for such illegal content.

Meanwhile, individuals seeking removal of videos of their relatives being murdered, like Andy Parker (see 2002200049), are left to flag the content themselves, said Eric Feinberg, founding member of the Global Intellectual Property Enforcement Center. Feinberg created the technology that has helped Parker flag YouTube videos of his daughter’s murder. YouTube has no incentive to remove such videos, said Feinberg. He flagged the content Friday on YouTube and on Facebook. Google on Thursday defended its vigorous enforcement of platform policies prohibiting such videos.

Section 230 doesn’t allow survivors or victims to sue companies for nefarious content, Feinberg said, echoing Barr. Senate Judiciary Committee Chairman Lindsey Graham, R-S.C., and Sen. Richard Blumenthal, D-Conn., are working on a draft legislative proposal that would hold companies civilly liable for content involving child exploitation (see 2002070052).

There seems to be more discussion about amending Section 230 than ever before, Carome said. His fear is that knee-jerk reactions could upend a “very good law,” which, in his view, has struck the right balance in a complicated area.