Platforms Testify on Hate

Commerce Chairman Wicker Doubtful Congress Will Pass Section 230 Legislation Soon

It’s unlikely Congress will pass legislation altering Section 230 of the Communications Decency Act, Senate Commerce Committee Chairman Roger Wicker, R-Miss., told reporters Wednesday. “If I could wave a magic wand, I might make nuanced changes, but I think realistically you’re not going to see a statute passed changing that section anytime soon, considering what it takes to get a bill passed and signed into law.”

The “bulk” of legislative work on content moderation is “clearly” not going to come from his committee, Wicker said after a hearing on online extremism and violence. Witnesses came from Google, Facebook and Twitter.

DOJ should offer guidance for addressing social media-fueled violence, said ranking member Maria Cantwell, D-Wash., citing unmoderated platforms like 8chan. She appealed to the platforms for financial support. “We need to outline a continuous effort, a continuous, growing effort with more support from law enforcement nationally and internationally” to combat hate crimes online, Cantwell told us. She said Interpol’s task force on child pornography, a partnership with Microsoft and others, provides a model.

The tech industry’s Global Internet Forum to Counter Terrorism was created to stem online hate, Facebook Global Policy Management Head Monika Bickert told Cantwell. Launched by Facebook, Microsoft, Twitter and YouTube at the 2016 EU Internet Forum, GIFCT is establishing a “no-go zone” for extremist and violent content, the executive said.

Google has “nothing to announce” about a potential Chinese search product described as Dragonfly (see 1812110053), Global Director-Information Policy Derek Slater told Sen. Ted Cruz, R-Texas. Cruz pressed Slater about the reportedly censored version of Google search. Sen. Marsha Blackburn, R-Tenn., told us she will have follow-up questions for Google on how it defines hate crimes and hate speech.

Mass shootings and hate crimes shed light on the role of social media companies as a “catalyst for the spread of white nationalist propaganda both here and abroad,” said House Intelligence and Counterterrorism Subcommittee Chairman Max Rose, D-N.Y. He spoke at a separate hearing on white nationalism. Congress can no longer look at these platforms as unicorn companies started by “teenagers in hoodies,” he continued. Rose is tired of hearing about 80 and 90 percent success rates for content removal, saying it would be unacceptable if automobile airbags worked only that often. Tech companies need to be held to a standard through public-private partnerships, he said.

The companies play a “critical role” in combating violent extremism, but it's “important to recognize content removal online cannot alone solve these issues,” Twitter Public Policy Director Nick Pickles told the Senate panel. All three companies cited statistics on how much harmful content they have removed. Twitter suspended more than 1.5 million accounts for terrorism-related violations from August 2015 to December. Facebook removes “millions of pieces of content every year, much of it before any user reports it,” Bickert said.

More than 87 percent of the 9 million videos YouTube removed in Q2 were flagged by automated systems, and more than 80 percent of those were yanked "before they received a single view,” said Slater. He cited the hundreds of millions of dollars Google spends annually and the more than 10,000 staffers working on content policy issues. Bickert cited the 30,000 people Facebook has hired to focus on safety and security.

Platforms should provide more transparency, particularly on platform policies and standards, said Anti-Defamation League Senior Vice President-Programs George Selim. Better data will lead to better policies, he said. Sen. Richard Blumenthal, D-Conn., praised the industry’s increased attention to the issue but said more can be done. He asked what the companies are doing proactively to prevent gun violence, suggesting industry needs more incentive to combat harmful content.

Google’s systems aren’t perfect, but the platform is constantly improving, partly because of data sharing, said Slater. After the 2018 mass shooting at a high school in Parkland, Florida, Google began proactively contacting law enforcement to better address violent threats, he said.