PK, TechFreedom Chiefs Defend Section 230 Protections After Mass Shootings
Weakening Silicon Valley’s content liability protections could discourage platform moderation and embolden extremists on unfiltered websites like 8chan, progressive and libertarian tech observers said Monday. Public Knowledge CEO Chris Lewis and TechFreedom President Berin Szoka warned against government intervention in speech moderation as they discussed 8chan’s role in the weekend’s mass shootings.
Network provider Cloudflare terminated service Monday for 8chan, an online forum known for unmoderated discussion. The “lawless” nature of 8chan inspired and glorified several mass shootings, wrote CEO Matthew Prince. Evidence suggests the El Paso gunman “posted a screed” to 8chan immediately before opening fire and killing 22 people, he said. Prince listed similar attacks in Christchurch, New Zealand, and Poway, California, that were well-documented on 8chan. The website “has repeatedly proven itself to be a cesspool of hate,” Prince wrote. “Even if 8chan may not have violated the letter of the law in refusing to moderate their hate-filled community, they have created an environment that revels in violating its spirit.”
When platforms actively moderate content, it’s a subjective process that exposes them to claims of political bias, Szoka said in an interview. That subjectivity has helped fuel the narrative that Silicon Valley is biased against conservatives, along with calls to eliminate protections under Section 230 of the Communications Decency Act (see 1906200057). Eliminating the industry’s liability shield would mean far less content moderation, opening the door for more passive forums like 8chan to emerge, he said. Congress should stay out of the way and let industry make the tough decisions about where to draw the line, he said.
Lewis and Szoka don’t often align, but this is one area of agreement, Lewis said. Allowing extreme content to proliferate on unregulated platforms is “the direction we don’t want to go,” he said. That doesn’t mean there can’t be discussions about how the government can influence and promote safe online conversation without intervening in speech, he added. U.S. society was founded on free expression, which Section 230 helps protect, he said.
Regulators need to intervene and protect minority and marginalized communities, said Color of Change Campaign Director for Media, Culture and Economic Justice Evan Feeney: Silicon Valley’s obsession with free speech is “inextricably tied to its lack of understanding of structural racism.” While forums like 8chan serve as incubators, “that same hate can be found out in the open on Twitter, in private groups on Facebook, and in videos on YouTube,” he said.
The First Amendment sets a high bar for any congressional response to 8chan speech, said Center for Democracy & Technology Free Expression Project Director Emma Llanso: “Congress won't be able to compel 8chan, its hosts, or any other server provider to restrict access to lawful speech.” She encouraged industry, which has more flexibility, to deliver clear policies and transparent enforcement.
Industry should moderate content sparingly and free from government intervention, said Electronic Frontier Foundation Executive Director Cindy Cohn. The same arguments used to remove hateful speech are used to silence minority voices, she added. If industry moderates, there should be careful consideration and predetermined, clear standards, she said: “Otherwise, we will be establishing a powerful tool for censorship that will inevitably be exploited by repressive governments and other powerful actors.”
President Donald Trump said he will direct DOJ to work with local communities and social media companies to identify mass shooters before they strike. “The monster in the Parkland high school in Florida had many red flags against him, and yet nobody took decisive action,” Trump said Monday at the White House. “The perils of the Internet and social media cannot be ignored, and they will not be ignored.” The alleged shooter’s social media accounts were reportedly littered with threatening posts foreshadowing his assault.
Senate Judiciary Committee Chairman Lindsey Graham, R-S.C., struck a bipartisan legislative agreement with Sen. Richard Blumenthal, D-Conn., that would encourage states to adopt red flag protection order laws. Such orders would allow law enforcement to remove firearms from at-risk and mentally ill individuals. Graham said he spoke with Trump, who seemed supportive of the idea. The legislation would create a grant program to allow police to consult with mental health professionals in identifying cases, Graham said.
Trump also blamed video games, saying it’s “too easy today for troubled youth to surround themselves with a culture that celebrates violence.” Instead of blaming video games, the president should “set a good example and stop inciting hatred and bigotry,” said House Commerce Committee Chairman Frank Pallone, D-N.J.
Cloudflare about two years ago banned the Daily Stormer, a white-supremacist forum that subsequently moved to another host and claims it has more readers than ever. The Daily Stormer is “no longer Cloudflare's problem, but they remain the Internet's problem,” Prince wrote. “I have little doubt we'll see the same happen with 8chan.” He urged collaboration with policymakers to better understand the problem and offer solutions, which could mean “moving enforcement mechanisms further down the technical stack.”