One in three internet users worldwide is a child under 18, estimates show. Expect that figure to rise as smart toys and other connected devices fuel constant connection. How will personal data collected today be used tomorrow? Answers are evolving. In a marketplace where information is money, privacy should be protected at every turn.
Invasive or unlawful practices by tech companies heighten privacy concerns. According to a recent investigation by TechCrunch, Facebook recruited teens for a so-called “research” study. Along with adults, teens as young as 13 were paid to install a Virtual Private Network app and give Facebook extensive visibility into personal and behavioral data.
Facebook’s practices were dubious (research, really?). Other practices clearly violate the law. In 2018 the Federal Trade Commission announced fines against smart toy manufacturer VTech for collecting personal information from “hundreds of thousands of children” without parental consent and failing to secure the data.
In February the FTC announced a $5.7 million civil penalty against video-sharing app Musical.ly (now TikTok). According to a statement from two FTC commissioners, Musical.ly collected and exposed sensitive data, including location, of young children. “In our view,” commissioners wrote, “these practices reflected the company’s willingness to pursue growth even at the expense of endangering children.”
VTech and Musical.ly violated the Children’s Online Privacy Protection Act, a federal law requiring operators of commercial websites or services to obtain “verifiable parental consent” before collecting personal information from kids under 13. COPPA covers platforms targeting children and those with “actual knowledge” of child users.
COPPA provides important protections, but it should be updated and strengthened. Consider that COPPA was enacted two decades ago. In 2012 the FTC updated the COPPA Rule, broadening personal information to include “persistent identifiers,” such as cookies tracking online activities, along with location data, videos, and more. That was progress, but COPPA still treats teens like adults. Bad idea.
Enter new bipartisan “COPPA 2.0” legislation, introduced by U.S. Senators Markey and Hawley. If passed, COPPA 2.0 would expand protections, requiring companies to obtain consent from teens under 16 before collecting personal information.
That’s a major improvement, says Ariel Fox Johnson, senior counsel for Policy and Privacy at Common Sense Media, which supports COPPA 2.0. “We don’t have consumer privacy protections at a federal level for adults. There’s COPPA and then there’s nothing. You fall off a cliff in terms of protection when you hit 13.”
COPPA 2.0 strengthens corporate accountability, setting a more stringent “constructive knowledge” standard for companies regarding young users. That makes it harder to feign ignorance. “Right now, you have companies sticking their head in the sand,” says Johnson.
What else? The legislation requires manufacturers of kids’ connected devices to disclose data practices on packaging. It bans behaviorally targeted marketing to children.
What about school impacts? Currently, schools consent on behalf of parents if children’s personal information is used exclusively within the educational context. How teen consent would work under COPPA 2.0 remains to be determined.
More broadly, other federal laws address student privacy. Most states, including North Carolina, have laws safeguarding student privacy. These laws increasingly cover education technology companies, not just schools, says Johnson, but the burden often rests on schools to monitor providers or maintain contracts with them. Gaps remain. “It’s really important to ensure that protection puts the liability on the people who are actually handling student information and in a position to protect it. In many cases, that’s the ed tech provider,” says Johnson.
What’s the coin of the digital realm? Data. When it’s personal — and especially when it belongs to kids or teens — it should be protected.
Kristen Blair is a Chapel Hill-based education writer.