Platform Regulation Increasingly in Governments' Sights in Many Countries
Last week's terrorist attack in New Zealand puts more pressure on lawmakers to regulate online platforms, some experts said. Even before that, 70-80 proposals for regulating platforms were under consideration globally, Hogan Lovells preliminarily found in an ongoing six-month survey. Early results show "by far the most proposals we tracked come from the government," emailed Hogan Lovells (Brussels) competition lawyer Falk Schoening.
That governments, not industry, are making such proposals suggests they might see the light of day, Schoening added. He noted platforms acted quickly after the Christchurch crime to take down the video of the attacks on two mosques, which shows there's "an emerging form of self-regulation in the industry." Google, Facebook, Amazon and Twitter didn't comment Monday.
EU countries seem most active in seeking to regulate platforms, but such activity is also underway elsewhere, including in China and Japan, Schoening said in an interview. Preliminary conclusions show a "perceived need" for politicians to act, to tell voters in the upcoming EU and national elections how they'll be protected from abuse by platforms. Whether the rules will actually safeguard people against tech companies is questionable, he said; the overarching theme is how to get platform regulation right internationally. One key question is whether telecom authorities or competition agencies have the power to regulate, Schoening said. Competition law figures in 25 percent of the proposals, possibly because of its perceived successful use against tech players by EU Competition Commissioner Margrethe Vestager.
U.K. lawmakers are increasingly focused on platform regulation. A yearlong probe into disinformation and fake news found democracy is threatened by relentless targeting of citizens with disinformation and personalized "dark adverts" from unidentifiable sources delivered over major social media platforms, said Member of Parliament Damian Collins, chair of the Commons Digital, Culture, Media and Sport Committee. "The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights." He was among those who attacked Facebook for not sending CEO Mark Zuckerberg to a November multicountry privacy hearing held by Collins' committee (see 1811270014).
"Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves in regulating the content of their sites," the Feb. 18 committee report said. It urged creating a new category of tech company, not necessarily a "platform" or "publisher," which would assume legal liability for content identified as harmful after it's posted by users. The panel urged governments to impose a code of ethics, overseen by an independent regulator, setting out what constitutes harmful content. "Reliance on voluntary codes, such as the one at EU level on Disinformation, is no longer an option," said European Publishers Council Executive Director Angela Mills Wade in a statement.
The EU e-commerce directive, which exempts platforms from liability unless they have specific knowledge of illegal content, was enacted before those companies began to curate content for users, the U.K. Lords Communications Committee reported March 9. "Self-regulation by online platforms which host user-generated content, including social media platforms, is failing." The panel recommended platforms hosting user content be subject to a statutory duty of care, enforced by the Office of Communications, and that a new body, the Digital Authority, coordinate regulators. The report supports his organization's "longstanding view" that the internet is subject to the same laws that apply offline, said ISP Association Chair Andrew Glover in a statement.
"Consider the areas for -- and the scope and form of -- regulation of digital platforms operating in Australia," the Communications and Media Authority told a Competition and Consumer Commission inquiry. It proposed a "stack" regulatory model with rules, when needed, targeted across each layer rather than particular delivery modes.
Meaningful regulation in the U.S. is unlikely, except possibly on privacy or political campaign funding, said Greg Sparrow, CompliancePoint senior vice president. The debate is complicated by free-speech concerns, which make it hard to pin rules on content and mean any regulation will probably focus on fair disclosure. Regulating platforms is a "hard challenge" because it comes down to each country's cultural values, the consultant said: In the U.S., it's more a brand issue for tech companies seeking customer goodwill.