Portman, Coons Exploring Social Media Bill With FTC Role
Senate Homeland Security Committee ranking member Rob Portman, R-Ohio, and Senate Privacy Subcommittee Chairman Chris Coons, D-Del., are gathering information on legislation that would require social media platforms to open their algorithms to independent research, Portman said Thursday.
“We don’t have a bill yet,” Portman told us after a hearing on extremist content. “We’re in the process of collecting input from the outside groups.” He noted the plan could be partly modeled after some suggestions from Stanford University law professor Nathaniel Persily, who testified.
Persily said the FTC could have a role working with the National Science Foundation to “vet” independent researchers. Social media platforms would provide “research portals” so researchers can study key societal issues. Privacy safeguards would be needed to prevent another breach like the Cambridge Analytica incident, said Persily.
Chairman Gary Peters, D-Mich., told us he wants to review the effort from Portman and Coons. The committee is in the process of putting together a hearing with social media companies, he said: “It’s our intent to bring the companies before the committee.” It’s premature to say which companies, said Peters, but he focused much of his remarks and questions on Facebook spreading dangerous content. The committee requested information from major social media companies about practices and policies to address extremist content, he said. Portman said he looked forward to hearing from the social media companies at future hearings.
Algorithms determine what users want to hear and amplify those messages, said Portman. It’s important to get “under the hood” and understand design elements, he said. There are proprietary information issues yet everybody is “talking about regulation,” including Google, Facebook and Twitter, he said. “We don’t know what we’re regulating if there’s no transparency.”
Social media companies are giving people what they want to read and hear, which is what newspapers, radio and TV do, said Mitt Romney, R-Utah. Romney said he has a hard time envisioning how the government could alter social media algorithms without violating the First Amendment, particularly when it takes no such measures against traditional media. All media outlets, from traditional publications to the National Enquirer, are trying to maximize profits, he said.
Ron Johnson, R-Wis., questioned whether an objective third party exists to research platforms. He shared concerns about tech companies’ Communications Decency Act Section 230 immunity when platforms are acting as publishers. Platforms particularly stifle conservative expression, he said. Hate-filled discussions help with advertising dollars, said James Lankford, R-Okla. Congress should help social media “turn down the volume” on hate-filled discourse, he said.
Maggie Hassan, D-N.H., asked about weaknesses in algorithms that allow extremists to capture audiences. Extremists are paid to post and test content moderation until they get content through, and content moderation filters can’t keep up, said O'Neil Risk Consulting & Algorithmic Auditing CEO Cathy O'Neil. Karen Kornbluh, German Marshall Fund Digital Innovation and Democracy Initiative director, agreed, saying content moderators are simply outmatched.
Dominant companies don’t have any incentive to make this a priority, said University of Miami law professor Mary Anne Franks. She wants changes to Section 230, specifically making companies criminally liable for hosting nonconsensual pornography, also known as revenge porn.
Witnesses' written testimony and Peters' opening statement are here.