Meta is launching its “Teen Accounts” feature on Facebook and Messenger this Tuesday, amid ongoing criticism that it has not done enough to safeguard young users from online dangers.
The privacy enhancements and parental controls, first rolled out on Instagram last year, are meant to address concerns about teenagers’ social media use, the company said.
The push to bolster teen safety features comes as lawmakers press to advance proposed legislation, such as the Kids Online Safety Act (KOSA), aimed at protecting children from the risks of social media.
Meta, along with ByteDance’s TikTok and Google’s YouTube, already faces numerous lawsuits filed by children and school districts over the addictive nature of social media platforms.
In 2023, 33 U.S. states, including California and New York, sued the company, alleging it misled the public about the dangers of its platforms.
Meta said users under 16 will need parental permission both to go live and to turn off a feature that automatically blurs images suspected of containing nudity in direct messages.
“We will begin implementing these updates in the coming months,” the company stated.
In July 2024, the U.S. Senate passed two online safety bills, KOSA and the Children and Teens’ Online Privacy Protection Act, which would hold social media companies accountable for the effects their platforms have on children and teens.
Although the Republican-led House did not vote on KOSA last year, lawmakers indicated during a recent committee hearing that they intend to pursue new legislation aimed at strengthening online protections for children.
Major platforms like Facebook, Instagram, and TikTok permit users aged 13 and older to create accounts.