
New York’s attorney general has unveiled proposed regulations to curb addictive social media feeds for children, notably including stringent age verification rules. The move follows the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, passed last year, which bans algorithm-personalised feeds for under-18s without parental consent.
Instead, platforms like TikTok and Instagram will limit feeds to content from accounts young users follow. The law also prohibits notifications to under-18s between midnight and 6 a.m. The new rules set standards for how platforms must determine users' ages and obtain parental consent when implementing these provisions.
“Companies may confirm a user’s age using a number of existing methods, as long as the methods are shown to be effective and protect users’ data,” Attorney General Letitia James’ office said.
Options for confirming a user is at least 18, for example, include requesting an uploaded image or verifying a user’s email address or phone number to check against other information, the office said.
Users under 18 who want algorithmic feeds and nighttime notifications would first have to authorize the companies to seek consent from a parent.
Supporters of the law said curated feeds built from user data are contributing to a youth mental health crisis by vastly increasing the hours young people spend on social media.
“Children and teenagers are struggling with high rates of anxiety and depression because of addictive features on social media platforms,” James said in releasing the rules, which are subject to a 60-day public comment period.
Online age check laws, which are on the rise in the U.S., have garnered opposition from groups that advocate for digital privacy and free speech. More than 20 states have passed age verification laws, though many face legal challenges.
The New York attorney general’s office noted Instagram and other social media platforms themselves have been implementing various forms of age assurance in recent months.
“The incorporation of age assurance methods into the infrastructure of social media platforms is a positive development that demonstrates the technical and financial feasibility of age assurance methods for these platforms,” the office said. “Unfortunately, voluntary adoption of age assurance methods has not achieved the level of protection of minors required by the (SAFE) Act.”
After the rules are finalized, social media companies will have 180 days to implement the regulations.