Meta announced Tuesday that it will impose stricter limits on the content teenagers can see on Instagram and Facebook, a decision that comes as the company faces mounting criticism over the potential harm its platforms pose to young users.
The move follows accusations, made several months ago by multiple U.S. states, that Meta harmed the mental health of children and teenagers and misled the public about the safety of its platforms.
In a recent blog post, the company led by Mark Zuckerberg said it will restrict teenagers' access to certain types of content on Facebook and Instagram, even when that content comes from friends or accounts they follow.
The company said this category includes discussions of suicide or self-harm, as well as nudity and references to restricted goods.
According to its website, restricted goods and services on Instagram include tobacco, weapons, alcohol, contraception, cosmetic procedures, and weight-loss programs.
In addition, teen accounts on Instagram and Facebook will now default to the most restrictive content settings. These settings previously applied only to new users but will now be extended to existing ones as well.
The company said this will make it harder for teens to come across potentially sensitive content or accounts in places such as Search and Explore.
Meta also said it will expand its practice of hiding search results to cover a broader range of terms related to suicide and self-harm.
Internal research by Meta, recently made public by The Wall Street Journal and whistle-blower Frances Haugen, has indicated that the company has long been aware of the risks its platforms pose to the well-being of young people.
On social media platforms, teens are defined as users under the age of 18, based on the date of birth provided at registration.
Source: voanews.com