Meta, the parent company of Facebook and Instagram, has announced new measures to restrict teenagers’ access to content related to suicide, self-harm, and eating disorders on its platforms. The move comes in response to government pressure and concerns about the impact of such content on the mental health of young users.
The changes include hiding content in sensitive categories, such as posts about suicide, self-harm, and eating disorders, from users under the age of 18. Even if a teen follows an account that shares such content, the posts will not appear in their feed. Instead, teens will be directed to expert resources for help, such as the National Alliance on Mental Illness.
In addition to these content restrictions, Meta will place teen accounts on the most restrictive content-filtering settings on Facebook and Instagram by default. These settings filter out recommended posts in Search and Explore that are deemed “sensitive” or “low quality.” Teens will start on the strictest setting, though they can choose to adjust it.
The move by Meta comes amid increased government scrutiny of how tech companies protect children on their platforms. Meta CEO Mark Zuckerberg is scheduled to testify before the US Senate on child safety on January 31, along with other tech executives. The hearing follows legislative efforts across the US to restrict children’s access to certain content online.
Governments, including the EU and the UK, have been enacting or considering regulations that hold tech companies accountable for the content shared on their platforms. The EU’s Digital Services Act includes rules on algorithmic transparency and ad targeting, while the UK’s Online Safety Act requires online platforms to comply with child safety rules under threat of fines. These regulations aim to create a safer online environment for users, particularly minors, but have drawn criticism over potential privacy implications.