Schakowsky Thinks House Closer Than Senate to Releasing Bipartisan Privacy Bill
The House Consumer Protection Subcommittee is closer to releasing a privacy bill than bipartisan Senate negotiators, Chair Jan Schakowsky, D-Ill., told reporters Wednesday after a subcommittee hearing with Facebook. At the hearing, she and ranking member Cathy McMorris Rodgers, R-Wash., had contrasting views on the need to regulate media manipulation and deepfakes.
Facebook’s new deepfake policy wouldn’t have applied to a well-known doctored video of House Speaker Nancy Pelosi, D-Calif., Vice President-Global Policy Management Monika Bickert testified, as expected (see 2001070012). She noted the deepfake would be subject to other company policies on misinformation.
Schakowsky told reporters the hearing didn’t produce many “concrete suggestions on how the government can follow up,” but it highlighted the need for Congress to examine the issue. She noted her suggestion that tech platforms submit to independent audits on manipulation activity about every six months.
Congress should rely on innovation, not more regulation here, said McMorris Rodgers. There’s no better place to ensure responsible technology use than the U.S., she said.
Facebook’s announcement Tuesday detailed a new policy that seems “wholly inadequate,” Schakowsky said. She cited the Pelosi video amassing millions of views, which prompted no action from Facebook. “You have to have accurate characterizations of people’s beliefs,” especially public figures, Senate Commerce Committee ranking member Maria Cantwell, D-Wash., told us.
Asked whether he supports Facebook’s new policy or if it’s a step in the right direction, House Commerce Committee Chairman Frank Pallone, D-N.J., declined comment. There needs to be more transparency for consumers about deceptive practices, Pallone said during the hearing. He cited fake videos and dark patterns, which are tactics used to manipulate users into buying items or offering consent. House Commerce Committee ranking member Greg Walden, R-Ore., credited Facebook for implementing new privacy policies in response to its Cambridge Analytica data breach.
Congress should encourage and reward industry attempts to self-regulate media manipulation issues, said University of Nebraska assistant law professor Gus Hurwitz. He noted industry solutions and technologies for detecting harmful content are in their infancy, so Congress should yield.
Center for Humane Technology Executive Director Tristan Harris blamed platform business models for the polarization of constituents in political parties. Platform activity is driven by extremist content, misleading information and conspiracy theories, he said, calling it a “race to the bottom.”
In discussion with Schakowsky, Bickert confirmed Facebook’s new policy doesn’t apply to video and audio. Schakowsky regarded that as a flaw, saying all such content should be treated the same. Rep. Jerry McNerney, D-Calif., said platforms have failed to fulfill public responsibilities associated with the power they wield.
Platforms are somewhat lawless, said Harris. He claimed platforms are failing to remove and address illegal activity on their sites, which means the U.S. is moving from a lawful society to an unlawful society.
Schakowsky echoed that in closing remarks, saying consumers have clearer and better expectations in real life than in the digital world. There's an incredible and justified distrust of how platforms are protecting consumers, she said. McMorris Rodgers urged better education on the topic, referencing a Jan. 28 FTC hearing on voice cloning technology.
Pallone cited a need for a uniform standard on deepfakes across platforms like Facebook, Twitter and YouTube. Bickert’s testimony noted Facebook CEO Mark Zuckerberg’s desire for uniform standards across the industry.
McMorris Rodgers raised speech issues for content moderation. Bickert called Facebook “very much a platform for free expression,” citing its use of third-party fact checkers. In the past seven years, the company has gone from being reactive to proactive in going after “abusive content,” Bickert said: “We grade ourselves on how much we are finding before people report it to us.” She was responding to Rep. Bob Latta, R-Ohio, who raised concerns about scams and identity theft targeting elderly platform users.