FTC Comments

YouTube, Industry, Advocates Debate Wisdom of Children’s Content Carve-Outs

The FTC should continue presuming viewers of youth-directed online content are children, despite efforts by YouTube and the tech industry to secure carve-outs for adults watching it, consumer and education advocates said. It’s a central issue in the agency’s review of the Children’s Online Privacy Protection Act rule (see 1912090061), based on comments in docket 2019-0054 collected through Thursday.


Lack of clarity on guidelines threatens the viability of creators’ business models, YouTube filed. The platform took issue with current FTC guidelines that it said require platforms to treat anyone watching primarily child-directed content as under 13 years old. Adults often view children’s content for educational and other purposes, YouTube said, so the FTC should treat “adults as adults.”

The agency shouldn’t allow general-audience platforms to “rebut the presumption that the users of child-directed portions of their services are children,” nearly 20 consumer and education groups said. Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Electronic Frontier Foundation and Public Citizen signed. Expand the definition of personal information to include biometric data, they asked.

CCFC, CDD, Electronic Frontier Foundation and Public Citizen joined another group of organizations in separate comments. They urged the FTC not to adopt a privacy-related rule until it completes an FTC Act Section 6(b) study of online digital advertising, children on general-audience platforms, data brokers and education technology. Color of Change, Common Sense Media, Electronic Privacy Information Center and Public Knowledge signed. They questioned industry suggestions at a recent COPPA workshop that many viewers of child-directed content are actually adults.

On the question of allowing a “rebuttable presumption that users on child-directed portions of general-audience sites are children,” the Center for Democracy & Technology urged caution. Avoid mandating that hosts identify child-directed user-generated content, it said: “A legal regime that requires operators to review, either manually or automatically, all content uploaded to their service and affix a label of ‘child-directed’ or ‘not child-directed’ would violate the First Amendment.” The agency “is right to focus on providing clarity to operators of general-audience user-generated content” about potential liability for child-directed content, said CDT.

The Computer and Communications Industry Association wants “greater clarity on qualifying operator activities regarding monetization, product improvement, and personalization of child-directed content.” It’s appropriate for the rule to allow general-audience platforms “that have reasonably age-screened users to treat adult users interacting with child-directed content as adults,” CCIA said.

Consumer Protection Bureau Director Andrew Smith said in September that FTC staff has heard that the inability to engage in interest-based advertising on YouTube could make content creation uneconomic for certain creators of high-quality content (see 1909230062). The Information Technology and Innovation Foundation argued in October that ads targeting children aren’t really the problem (see 1910040026): The bigger issue is deceptive and misleading marketing.