Meta faces new restrictions over FTC allegations Facebook violated kids’ privacy rules

Meta, the parent company of Facebook and Instagram, is facing proposed new restrictions on how it uses data from minors, based on allegations that the company violated children's online privacy rules by misrepresenting the capabilities of a kids' messaging app. 

The Federal Trade Commission (FTC) alleged Wednesday that, between 2017 and 2019, Meta misrepresented that children using the Messenger Kids app would only be able to communicate with contacts approved by their parents, in violation of a previous FTC order, the FTC Act and the Children's Online Privacy Protection Act (COPPA). 

Although the company said children would only be able to communicate with contacts approved by their parents, there were circumstances in which children were able to communicate with unapproved contacts in group text chats and group video calls, the FTC alleged. 

“Facebook has repeatedly violated its privacy promises,” FTC’s Bureau of Consumer Protection Director Samuel Levine said in a statement. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.” 

Meta responded in a statement by calling the move “a political stunt.”

“This is a political stunt. Despite three years of continual engagement with the FTC around our agreement, they provided no opportunity to discuss this new, totally unprecedented theory,” the company said in a statement. “Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil.”

“We have spent vast resources building and implementing an industry-leading privacy program under the terms of our FTC agreement. We will vigorously fight this action and expect to prevail,” Meta added.

The alleged violations were found as part of the FTC’s own assessments as well as an independent assessor’s review of whether Meta’s privacy program satisfied requirements under a 2020 order issued to Meta by the FTC that required the company to pay $5 billion and expand its privacy program. 

In a separate tweet, Meta spokesperson Andy Stone also pushed back on the allegations and said that the “two privacy incidents the FTC identified today to support their action were discovered by us, fixed, and publicly disclosed three years ago.” 

The agency also proposed updates to the 2020 order that would impose additional restrictions, including banning the company from monetizing the data of children and teens under 18. 

The order would restrict Meta and its related entities, which include the virtual reality (VR) brand Oculus, from collecting data on minors beyond what is needed for security purposes. It would also ban the company from using that data for commercial gain even after the users turn 18. 

Additionally, it would require Meta to pause the launch of any new products or services without written confirmation from the assessor that Meta’s privacy program is in full compliance with the order’s requirements. 

The proposed updates also call for Meta to obtain users’ affirmative consent for any future uses of facial recognition technology, a requirement that could become more crucial as Meta expands further into the VR market. 

The proposed updates were approved in a 3-0 vote by the FTC. The commission is currently made up of three Democrats, after both Republicans stepped down, leaving two open seats. 

The FTC asked Meta to respond to the agency’s allegations within 30 days.

Updated: 4:40 p.m. ET


Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
