- Meta is limiting content for teens on Facebook and Instagram amid criticism that its platforms harm mental health. New measures will restrict default settings, prevent searching certain topics, and prompt privacy updates.
- The policy change follows lawsuits alleging Meta’s platforms contribute to conditions like eating disorders in teens. Critics say Meta profits from harmful effects despite knowing the risks.
- Meta says updates come after consulting mental health experts to make platforms safer and more age-appropriate. The company still faces ongoing criticism over Instagram’s negative effects on teenagers.
Meta has announced it will be limiting the type of content that teenagers can access on Facebook and Instagram. This comes as the company faces increasing criticism that its platforms are addictive and damaging to young users’ mental health.
Meta’s New Measures to Protect Teens
The new protections are intended to provide more age-appropriate experiences for teens on Meta’s apps. The updates will change teens’ default settings to the most restrictive option, prevent them from searching certain topics, and prompt users under 18 to update their Instagram privacy settings.
Meta expects to complete the rollout over the next few weeks. The changes will stop teens from seeing content about self-harm, eating disorders, restricted goods, or nudity, including such content posted by accounts they already follow.
Backlash Over Effects of Meta Platforms on Teens
The policy change follows lawsuits from 42 state attorneys general filed last October. They allege Meta’s platforms are harming teens and contributing to conditions like body dysmorphia and eating disorders.
New York’s attorney general said, “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted.” A Meta whistleblower also testified that the company knew its products were harmful but failed to act.
Meta Has Faced Ongoing Criticism
Complaints about Instagram’s negative effects on teenagers have persisted since 2021, when leaked documents showed Meta was aware the app harmed many teens’ wellbeing. A whistleblower said its algorithms prioritized engagement over user safety.
Meta paused plans for an Instagram for Kids app amid the backlash. The company has not provided updates on the initiative, which was aimed at children aged 10 to 12.
New Measures Part of Meta’s Ongoing Review
Meta said the latest changes follow regular consultation with experts on adolescent mental health and development. The goal, the company said, is to make its platforms “safe and age-appropriate” for young users, which includes better understanding “which types of content may be less appropriate for teens.”