The systematic suppression of child safety research reveals a corporate culture that prioritizes legal protection over user welfare. Meta ordered researchers to delete evidence of children being sexually propositioned, reframed studies to obscure risks, and stopped collecting data showing kids under 10 were using the Metaverse, a likely COPPA violation. Safety measures came only after regulatory pressure, showing the company knew of the dangers but chose deniability over protecting vulnerable users.
Meta’s approach to VR and teen safety reflects responsible corporate governance in a complex regulatory environment. Beyond ensuring research compliance, the company has expanded protections in Teen Accounts, strengthened tools to block and report abuse, and introduced nudity and location safeguards. It is also extending protections to adult-managed child accounts and removing hundreds of thousands of predatory accounts. These actions show Meta is prioritizing youth safety while meeting its legal obligations.