Vulnerable young people are facing mounting mental health pressures. While some of the factors driving this decline, such as problems within the education system and the fallout from the COVID pandemic, are difficult to change, others are far more tractable, such as recommendation algorithms and social media codes of practice. Holding the relevant companies to account for their default settings, content filters, and revised guidance could help young people form healthier relationships with social media platforms.
Third-party users are largely responsible for their own content and conduct online, and platforms carry out numerous moderation and review processes intended to keep harmful content off their services. According to Meta's global head of safety, Antigone Davis, the company has developed more than 30 tools to support teens and families, encourage time limits, and identify over 99% of harmful content before it is even reported by users. Evidently, social media platforms are continually evolving to keep young people safe.
A crisis affecting mental health is not the same as a crisis of mental health. The "reification" of society, in which the effects of political arrangements of power and resources come to look like objective facts about the world, has had the consequence of swapping political problems for scientific or technical ones. Characterizing issues such as the youth mental health crisis as a problem of "social media addiction", rather than confronting unregulated tech oligopolies, skips over the core fact that societal problems like these are inherently political.