Since the start of 2019, Meta has received more than 1.1 million reports of users under the age of 13 on its Instagram platform. Yet the company disabled only a small percentage of those accounts, according to a newly unsealed legal complaint against Meta brought by the attorneys general of 33 states.
Rather than act on those reports, the social media giant allegedly continued to collect children’s personal data, such as locations and email addresses, without parental consent, in violation of a federal children’s privacy law. If the allegations are proven, Meta could face civil penalties of hundreds of millions of dollars or more.
The complaint states, “Specific knowledge within Meta that countless Instagram users are less than 13 years old is a well-documented, meticulously analyzed, and confirmed fact within the company and is fiercely guarded from public disclosure.”
These privacy concerns form part of a broader federal lawsuit that was filed last month by California, Colorado, and 31 other states in the Northern District of California’s U.S. District Court. The lawsuit contends that Meta has unfairly captured young people on its Facebook and Instagram platforms while suppressing internal studies that show user harm. It also seeks to compel Meta to stop using certain features that have allegedly hurt young users.
However, several pieces of evidence used by the states were obscured in the initial filing due to redactions.
The unsealed complaint, filed on Wednesday evening, offers fresh insights into the states’ lawsuit. Drawing on excerpts from internal emails, company presentations, and employee correspondence, it argues that Instagram has for years actively courted underage audiences even as it failed to comply with the children’s privacy law.
The unsealed document also alleges that Meta consistently failed to prioritize the creation of effective age-verification systems, and instead relied on approaches that enabled underage individuals to misrepresent their age when creating Instagram accounts. Moreover, it accuses Meta executives of asserting in congressional testimony that the company’s age-verification processes were effective and that it deleted underage accounts once it became aware of them — this despite their knowledge that Instagram had millions of underage users.
Adam Mosseri, the head of Instagram, noted in a company chat in November 2021 that “tweens currently lie about their age to gain access to Instagram.”
The following month, in his Senate testimony, Mr. Mosseri declared, “Instagram prohibits individuals younger than 13 years.”
In response, Meta stated on Saturday that it has spent a decade working to make online experiences safe and age-appropriate for teenagers, and that the states’ allegations “misrepresent our work through out-of-context quotes and selectively referenced documents.”
Meta also indicated that in the United States, Instagram’s usage policies prohibit users below the age of 13. The company asserts that they have “procedures in place to delete these accounts when identified.”
The company noted that verifying the ages of users is a “complicated” issue for online services, particularly for younger users who may not possess school IDs or driver’s licenses. Meta suggested federal legislation requiring “app stores to obtain parental consent before allowing their teens below 16 to download apps” as an alternative to having young individuals or their parents provide personal details, such as birthdates, to numerous apps.
The privacy charges revolve around a 1998 federal law known as the Children’s Online Privacy Protection Act. This stipulates that online platforms with child-targeted content must secure verifiable consent from a parent before collecting personal details from users under the age of 13. The penalty for breaching this law can exceed $50,000 per violation.
The lawsuit argues that Meta chose not to develop systems to effectively identify and hinder underage users as it viewed children as an important demographic — the future user base — necessary for its sustained growth.
Meta had ample evidence of underage users, per the filing on Wednesday. For instance, an internal company chart in the unsealed content showed how Meta tracked the daily Instagram usage of 11- and 12-year-olds, the complaint states.
The complaint further asserts that Meta knew about specific accounts belonging to underage Instagram users through internal reports. Yet it systematically disregarded certain reports of users under 13 and allowed them to keep using their accounts, as long as these users’ profiles did not contain a biography or photos.
In one 2019 incident, Meta employees deliberated why they had not deleted four accounts owned by a 12-year-old girl despite requests, as well as “complaints from the girl’s mother stating her daughter was 12.” The employees concluded that the accounts had not been removed partially because Meta representatives “were unable to conclusively identify the user as underage,” the court document divulged.
This isn’t the first time Meta has been accused of privacy infringements. In 2019, to settle charges from the Federal Trade Commission of deceiving users regarding their ability to manage their privacy, the company agreed to pay a record sum of $5 billion and to alter its data practices.
It could be simpler for the states to pursue Meta for violating children’s privacy than to prove that the company promoted compulsive social media use among young people, a comparatively novel legal claim. Since 2019, the F.T.C. has successfully brought similar children’s privacy complaints against tech giants including Google and its YouTube platform, Amazon, Microsoft, and Epic Games, the maker of Fortnite.
States Claim Meta’s Millions of Underage Users Were a ‘Well-Known Secret’