It doesn’t look good for Facebook. According to the Federal Trade Commission, the social media giant hasn’t been complying with privacy rules and regulations on youth data. The commission is now proposing a total ban on monetizing such information.
It must be getting tiresome for the FTC, surely. First, in 2019, the commission announced a record-breaking $5 billion settlement with Facebook over allegations that the firm violated its 2012 FTC privacy order. The company was accused of deceiving users about their ability to control the privacy of their personal information.
Then, in 2020, the FTC formally approved amendments to the 2012 order and required Facebook to restructure its approach to privacy. It had to do this “from the corporate board-level down,” the commission said at the time.
But $5 billion is peanuts for Meta, Facebook’s parent company. So it’s not entirely surprising that the FTC has again announced that the firm has not been complying with the latest privacy order.
On top of “continuing to give app developers access to users’ private information,” which Meta claimed had been cut off, the FTC alleges that Facebook has caused new harm.
Young users at risk
For instance, the FTC alleges that Facebook's Messenger Kids product misled parents on who could connect to chat with minors and misrepresented who had access to private youth data.
The commission is now proposing changes to the 2020 order that would prevent Meta from launching new products on any of its platforms without first obtaining written confirmation from the FTC that it is in compliance. Future use of facial recognition technology would also be limited.
Perhaps most importantly, the company would also be completely prevented from monetizing any of the youth data it collects across Facebook, Instagram, WhatsApp, and Oculus because, according to the FTC, Facebook has violated the Children’s Online Privacy Protection Act (COPPA) Rule.
“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
The FTC has asked the company to respond to allegations that, from late 2017 until mid-2019, Facebook misrepresented the extent to which parents could control whom their children communicated with through its Messenger Kids product.
Despite the company’s promises that children using Messenger Kids would only be able to communicate with contacts approved by their parents, children in certain circumstances were able to communicate with unapproved contacts in group text chats and group video calls.
The FTC says these misrepresentations violated the 2012 order, the FTC Act, and the COPPA Rule. Under the COPPA Rule, operators of websites or online services that are directed to children under 13 must notify parents and obtain their verifiable parental consent before collecting personal information from children.
Of course, the proposal from the FTC is just the first step in the process. In seeking modifications to the 2020 order, the FTC has formally asked Meta to respond in 30 days to the proposed findings from the agency’s investigation.
However, Meta has already published its own statement, in which the company says that the FTC’s latest complaint “is a political stunt” and “a clear attempt to usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies like TikTok to operate without constraint on American soil.”
According to Meta, the FTC does not have the authority to unilaterally impose “do-overs” on court-approved, negotiated settlements. Besides, the company insists it has overhauled its approach to privacy and invested billions of dollars in a “rigorous” privacy program.