‘Beyond Mere Amplification’

Tech Algorithms Under Microscope After SCOTUS Argument

Platforms shouldn’t be liable for real-world harm just because their algorithms amplify and rank content, said consumer advocates, academics and industry representatives Monday at the State of the Net Conference.

Legislators are exploring ways to amend Communications Decency Act Section 230 and increase the potential for platform liability linked to algorithmic decision-making (see 2303030041). The topic received a lot of attention during Supreme Court oral arguments in Gonzalez v. Google and Twitter v. Taamneh.

In Gonzalez, Free Press took a stance that resembles the positions of Justice Clarence Thomas and Sen. Josh Hawley, R-Mo., on Section 230, said Vice President-Policy Matt Wood: “That is not a comfortable place for me, but it’s true.” Platforms should be liable only when they “know they’re causing harm,” he said. Congress should decide the test for determining when a platform is exacerbating harm, he said.

Justices Samuel Alito, Neil Gorsuch and Thomas have been eager to trim Section 230, and yet Thomas highlighted the importance of algorithms for the internet’s functioning, said NetChoice CEO Steve DelBianco. Algorithms are only “rules” for ranking content and providing order to user feeds, he said. He disputed Justice Elena Kagan’s claim that Section 230 was written in a pre-algorithm era, saying some of the earliest internet companies relied on algorithms.

Columbia University’s Knight First Amendment Institute took the position that platforms should be immune unless their algorithms “materially contribute” to the illegal conduct in a way that goes “beyond mere amplification,” said Litigation Director Alex Abdo. A platform shouldn’t be liable for providing a “core value” to the public, he said: Imposing liability for amplification alone would seemingly hold every platform liable for simply ranking content and result in mass censorship.

Justices Kagan and Sonia Sotomayor seemed to appreciate that Big Tech has been the beneficiary of an “alarming and deferential” treatment for 20-plus years, said University of Miami School of Law professor Mary Anne Franks. Platforms might take more thoughtful action to protect their users if they faced liability for some of the decisions that affect individual users, who are the only ones bearing the brunt of the harm, she said: Kagan understood that tech is getting a free pass.

Several panelists spoke in favor of the Pact Act as a potentially helpful solution for improving Section 230. The bill, introduced by Sens. Brian Schatz, D-Hawaii, and John Thune, R-S.D., wouldn’t please everyone (see 2302160066), but it provides a middle ground in allowing “due process,” said Wood. Free Press didn’t endorse the bill because there’s more work to do, he said: “I don’t think tech companies should get a pass.” Some version of the Pact Act could be a viable model, said University of North Carolina Center on Technology Policy Director Matt Perault: Its obligations seem “unobjectionable” for large platforms. Publishing regular transparency reports and providing staff to handle takedown complaints are routine practice for a company like Facebook, he said.

Panelists debated the impacts of the 2018 Stop Enabling Sex Traffickers Act-Allow States and Victims to Fight Online Sex Trafficking Act (SESTA-FOSTA) package (see 2202240065), the first major Section 230 carve-out. Legislators and courts need to fully understand the real-world impacts of pulling back such a statute, said Reddit Senior Public Policy Lead Billy Easley. After the passage of SESTA-FOSTA, Reddit had to remove entire communities of sex workers just because they were mentioning sex work-related terms, not because they were soliciting, he said: “There was no way for us to know particularly where the line was.”

This is the wrong attitude toward amending statutes, said Digital Progress Institute President Joel Thayer: “'Let’s not touch it because bad things could happen.' I think that’s absurd. I think even from a political standpoint, that’s just not the reality.” It’s possible to make changes through targeted approaches, he said. There have been some positives from the passage of SESTA-FOSTA, including restitution for victims of sex trafficking and the takedown of major distributors like Backpage, he said: The “internet didn’t break” because of the law’s passage.