The Supreme Court should “narrow the scope” of Communications Decency Act Section 230 and reverse the 9th Circuit’s decision shielding YouTube from liability in Gonzalez v. Google (docket 21-1333), Texas Attorney General Ken Paxton (R) wrote in a merits-stage amicus brief announced Thursday (see 2212070026). The 9th U.S. Circuit Court of Appeals in June 2021 affirmed a U.S. District Court for the Northern District of California decision dismissing a lawsuit against YouTube for hosting and recommending ISIS proselytizing and recruitment videos, shielding YouTube and its algorithms from liability. Plaintiff in the litigation and SCOTUS petitioner is the estate of Nohemi Gonzalez, an American student killed in an ISIS attack in Paris in 2015. The petitioner asked SCOTUS to revisit the 9th Circuit's decision. Google didn’t comment. The case is scheduled for oral argument Feb. 21 (see 2212190042). Though Section 230 was designed in 1996 to give online publishers narrow protection from defamation liability, courts have “misinterpreted the law and allowed it to become a nearly all-encompassing blanket protection for certain companies, specifically internet and Big Tech companies,” Paxton said. These limitless legal protections prevent states from holding Big Tech accountable for law violations, even when infractions are unrelated to content publishing, he said.
Section 230
A bipartisan group of senators introduced legislation Wednesday to increase transparency into social media companies’ internal data. Introduced by Sens. Rob Portman, R-Ohio; Chris Coons, D-Del.; Bill Cassidy, R-La.; and Amy Klobuchar, D-Minn., the Platform Accountability and Transparency Act (PATA) would require social media companies to deliver internal data to independent researchers. The researchers’ proposals would be subject to review and approval from the National Science Foundation. Companies that fail to comply would face FTC enforcement and potential loss of liability protection under Communications Decency Act Section 230, sponsors said. Platforms would be required to maintain a comprehensive ad library, content moderation statistics, data about viral content, and information about platforms’ algorithm rankings and recommendations.
Senate Antitrust Subcommittee Chair Amy Klobuchar, D-Minn., struck back Tuesday against opponents of her Journalism Competition and Preservation Act (S-673) following a wave of outcry against a bid to attach the controversial bill to the FY 2023 National Defense Authorization Act (see 2212050067). Text of a pending compromise version of the annual measure, to be filed as an amendment to shell bill HR-7776, again failed to materialize by Tuesday afternoon, amid fractious negotiations.
FCC Chairwoman Jessica Rosenworcel confirmed Thursday she has received a letter from acting FAA Administrator Billy Nolen asking the agency to mandate, for the 19 other providers that bought spectrum in the record-setting C-band auction, the voluntary radio altimeter protections agreed to by Verizon and AT&T (see 2206170070). “I have seen the letter” and “we are in discussions with our colleagues at NTIA,” Rosenworcel told reporters after the FCC meeting. Commissioner Brendan Carr said he was happy to look at FAA concerns but believes the time to raise new objections has passed.
Social media platforms lack accountability for hosting harmful content because of Communications Decency Act Section 230, New York Attorney General Letitia James (D) and Gov. Kathy Hochul (D) said in a report released Tuesday. The report showed Payton Gendron, the alleged mass shooter who killed 10 Black people in Buffalo in May, was radicalized on fringe platforms like 4chan, and that platforms’ responses to his livestreaming were uneven. James’ office reviewed thousands of pages of documents and social media content to explore how the alleged shooter used platforms to “plan, prepare and publicize his attack,” James said. Gendron was radicalized through “virtually unmoderated websites and platforms that operate outside of the mainstream internet, most notably 4chan,” James said, and livestreaming platforms like Twitch were “weaponized to publicize and encourage copycat” attacks. Section 230 allows “too much legal immunity” for platforms, even “when a platform allows users to post and share unlawful content,” James said.
The Supreme Court will consider two appeals of appellate court decisions on social media companies' legal protections when their platforms are used in conjunction with terror attacks. On Monday, SCOTUS granted certiorari in docket 21-1333 in an appeal of a 9th U.S. Circuit Court of Appeals decision tossing out a suit against Google's YouTube for hosting and recommending ISIS proselytizing and recruitment videos. Plaintiff in the litigation and SCOTUS petitioner is the estate of Nohemi Gonzalez, a U.S. citizen killed in an ISIS attack in Paris in 2015. The petitioner asked SCOTUS to revisit the 9th Circuit's holding that the Communications Decency Act's Section 230 protects YouTube's algorithm for recommending videos. Google didn't comment. The court also granted cert Monday in docket 21-1496, in which Twitter is appealing another 9th Circuit decision, one finding Twitter and co-defendants Facebook and Google could be held liable for aiding and abetting an act of terrorism. Twitter and the others were sued by American relatives of Nawras Alassaf, a Jordanian killed in an ISIS attack in Istanbul in 2017. “These cases underscore how important it is that digital services have the resources and the legal certainty to deal with dangerous content online," Computer and Communications Industry Association President Matt Schruers said in a statement. “Section 230 is critical to enabling the digital sector’s efforts to respond to extremist and violent rhetoric online, and these cases illustrate why it is essential that those efforts continue.” SCOTUS "can really do something useful by constraining Section 230 protections to hosting content instead of targeting content," tweeted Matt Stoller, American Economic Liberties Project research director.
House GOP leaders formally unveiled their “Commitment to America” midterm election policy platform Friday at an event in Monongahela, Pennsylvania, but the plan’s proposals for reining in Big Tech went unmentioned by party leaders. The proposal calls for “greater privacy and data security protections for Americans,” supplying “parents with more tools to keep their kids safe online” and preventing “companies from putting politics ahead of people.” Big Tech “has tipped the scales to silence and censor those with conservative viewpoints,” the House GOP’s plan said: “Worse than crystalizing [sic] an ideological echo chamber, these apps have proven to be incredibly addictive for children with potentially devastating consequences.” House Commerce Committee ranking member Cathy McMorris Rodgers, R-Wash., who would likely become panel chair if the GOP wins control of the chamber in the November election, later talked about the tech proposal during a Fox Business Channel appearance. She cited interest in revisiting Communications Decency Act Section 230, noting major social media companies have been more interested in “censoring conservative speech online” than stopping “criminal activity” committed via their platforms. “I’m sending letters to many of the big tech companies like TikTok, Snapchat and Instagram and telling them they need to do more to stop the fentanyl sales that are killing our children,” Rodgers said. She earlier this month cited instances in which young people have had access to drugs, often laced with fentanyl, using Snapchat (see 2209150061).
Law professors and advocacy groups consider Friday’s 2-1 5th U.S. Circuit Court of Appeals decision upholding Texas social media law HB 20 in NetChoice v. Paxton an outlier with uncertain effects on social media platforms, but they widely expect the matter to go to the Supreme Court. “This is far from over; there are a lot of hurdles between here and this law taking effect,” said TechFreedom Internet Policy Counsel Corbin Barthold Monday during a livestreamed panel on the decision. “It is really unclear how platforms could continue to function,” said Blake Reid, director-University of Colorado Samuelson-Glushko Technology Law & Policy Clinic. Plaintiff NetChoice declined to comment on whether it will appeal.
A potential legislative proposal from Sens. Lindsey Graham, R-S.C., and Elizabeth Warren, D-Mass., that would create a new tech regulator (see 2209120059) is dividing the Senate.
“Holding Big Tech accountable” will be one of House Commerce Committee Republicans’ top priorities if their party wins a majority in the chamber in the November election, ranking member Cathy McMorris Rodgers, R-Wash., said during a Thursday Punchbowl News event. “We need to hold Big Tech accountable” in a bigger way than has happened during this Congress, Rodgers said: She supports “narrowing [Communications Decency Act] Section 230 protections, especially for the larger companies” that have been “bad actors,” so “they can be held accountable” for censorship. Rodgers touted Republicans’ Big Tech Censorship and Data Task Force and language in House Commerce’s stalled (see 2209010066) American Data Privacy and Protection Act (HR-8152) that “would protect” personal information for users under age 17. The GOP also aims to ensure “small companies and innovators can still have access” to a “free internet” so “they can compete,” she said. Rodgers cited TikTok and Snapchat as among the worst actors in the tech space. She cited TikTok’s “impact on kids” and the “amount of data” that app collects that’s “being stored in China or used in China.” She criticized Snapchat over instances in which young people have had access to drugs, often laced with fentanyl, using the app (see 2110260070). Snapchat and TikTok didn’t comment. Rodgers said her shorter-term goals include ensuring language to temporarily extend the FCC’s spectrum auction authority past Sept. 30 makes it into a planned continuing resolution to fund the federal government past that date (see 2209090053). The House already “did our work” by passing the Spectrum Innovation Act (HR-7624), which would renew the FCC’s authority for 18 months (see 2208090001), she said: “It would be unfortunate” if the agency’s existing authority expires and “I don’t believe anyone wants to see that” happen.