State authorities are accusing Meta, the parent company of social media giant Facebook, of knowingly allowing millions of underage users to access its platform.
The allegations suggest that the issue was well known inside Meta, treated as an ‘open secret’ within the company’s ranks.
According to documents filed in court, several U.S. states claim that Meta’s internal communications reveal a clear understanding of the presence of underage users on its platform, despite the company’s policies that prohibit users under the age of 13. This revelation raises significant concerns about Meta’s ability to protect young users and adhere to regulatory guidelines.
How did Meta allegedly allow millions of underage users to remain active on its platform, despite its policies and public commitments to safeguarding children online?
The accusations have ignited a fierce debate about the responsibility of tech giants in protecting minors from harmful online content and interactions.
Meta now faces critical questions about its internal communication practices, its commitment to enforcing age restrictions, and whether the company will face regulatory consequences for allegedly neglecting the safety of underage users. As the case unfolds, the tech giant’s reputation and the future of online child safety hang in the balance.