It’s time to rethink children’s privacy protection
Disney accounted for nearly 40 percent of the U.S. box office in 2019. The world’s top-earning YouTube star is an 8-year-old boy who made $26 million in a single year reviewing toys. Video sharing app TikTok has more than 800 million active users worldwide, nearly half of whom are aged between 16 and 24.
The growing prevalence of online services and apps marketing to young demographics is downright scary. And these numbers will only grow, as children are using technology earlier and, amid the current pandemic, more often than ever. The regulatory landscape must keep pace.
Consider all the data these services are amassing from the millions of children using them — how are these companies collecting and using this data? Are they selling it? If so, to whom? Most importantly, how are they protecting children’s privacy?
Enforcement actions against YouTube, Oath, and TikTok totaled over $180 million in 2019 for illegally collecting personal information about children. Most recently, TikTok came under fire for failing “to delete personal information previously collected from children” and for “still collecting kids’ personal information without notice to and consent of parents.”
More enforcement is coming.
And now that Congress is pushing to raise the age of consent for some of these online services, companies will hold even more regulated data and face even greater exposure to violations and fines.
Regulations are here, but they’re far from perfect
Over the last few years, we’ve seen renewed regulatory focus on protecting children’s privacy. The key frameworks are the proposed Kids Internet Design and Safety (KIDS) Act, the long-standing Children’s Online Privacy Protection Act (COPPA), and, to a lesser extent, the California Consumer Privacy Act (CCPA), which I’ll get to in a minute.
COPPA requires the operators of sites or online services directed at children under 13 to obtain “verifiable parental consent” before collecting their data. The KIDS Act, which has yet to be passed, would force companies like YouTube and TikTok to limit and modify the way ads, app design, and certain types of harmful content appear to children under the age of 16.
Unfortunately, however, we’re still in the early days of data privacy regulation, especially when you consider the implications of the CCPA.
The shortcomings of COPPA are giving companies a compliance loophole
The CCPA gives consumers rights over the access to, deletion of, and sharing of the personal information that businesses collect about them. Complying with the CCPA, however, is not enough to ensure compliance with COPPA, because the lines between federal and state law are blurred.
To prevent it from being used to preempt stronger state laws like the CCPA, COPPA, the federal law, should be updated in two ways: by modifying its “actual knowledge” standard and by raising its age of consent. This is not conjecture: a federal district court in the Ninth Circuit rejected the application of stronger state laws in a settlement appeal, holding that COPPA preempted inconsistent state laws. Even though the FTC, which enforces COPPA, disagreed, the debate is far from settled.
COPPA, like the CCPA, should cover not only businesses that have “actual knowledge” that the intended target is a child, but also those that “willfully disregard” a consumer’s age. Under the CCPA, a business is covered once it knows a data subject is a child. And to avoid “willfully disregarding” a consumer’s age, a business must verify that age once it has reason to believe children under 16 use its services. The same protections should be written into an updated COPPA, so that businesses holding children’s data cannot escape regulation simply by choosing to look the other way.
Take Facebook, which the FTC recently accused of failing to protect children’s privacy. Facebook is pleading ignorance, saying it was abiding by COPPA, in an effort to avoid federal fines and CCPA violations. And can you blame it?
The confusion and conflict between COPPA and state laws is troubling, and it gives businesses a loophole to collect and use data as they wish. COPPA needs to be updated to make it abundantly clear that stronger state privacy frameworks can stand, rather than being preempted.
COPPA doesn’t require online service operators to confirm the actual ages of the children watching and using their content. So any business that markets to children but doesn’t collect its users’ ages (e.g., Pokemon Go and TikTok) is technically in compliance, because it has plausible deniability.
Under COPPA, only children under 13 require parental consent. That threshold is far too low. Parents, are you comfortable with your 13-year-old, or even your 16-year-old, being solely responsible for deciding how their data is used?
COPPA, like the CCPA, should extend privacy protections to more young people, covering those under age 16, not just those under age 13. That means COPPA-covered businesses would need affirmative consent before collecting, using, or selling the data of any minor under 16. Adolescents between 13 and 16 are still passing through critical developmental milestones, and unregulated or harmful digital activity can expose them to cyberbullying, depression, social anxiety, and developmentally inappropriate content. This is why a number of recent House bills seek to raise the age of consent.
The CCPA and other state laws attempt to correct this with a willful-disregard standard, but COPPA does not; it needs to be updated to make this explicit.
Again, let’s not give companies a loophole or runaround when it comes to using and protecting children’s data.
What’s next?
To ensure privacy protections for minors, policymakers should update COPPA in the two key ways discussed above. In doing so, they would also create a national standard harmonizing the disparate state privacy laws that companies now navigate at considerable cost.
Whether it’s the CCPA, COPPA, the KIDS Act, or even the GDPR, a central component of compliance is transparency: mandating that organizations give consumers a clear understanding of what data is held about them and how it’s being used. We will see a more stringent enforcement environment, and a steep learning curve for companies that aren’t paying attention to these regulations’ requirements.
Matthew Carroll is co-founder and CEO of the automated data governance company Immuta. Prior to co-founding Immuta, he served as CTO of CSC’s Defense Intelligence Group, where he led data fusion and analytics programs and advised the U.S. Government on data management and analytics issues. He is a U.S. Army veteran of tours in Iraq and Afghanistan and holds a Bachelor of Science in Chemistry, Biochemistry and Biology from Brandeis University.