It’s not a social media ban – it’s a ban on algorithms targeting kids

Dan Barrett

Is a blanket social media ban for teenagers 16 and under the solution? Or could there be a better approach?

On 8 June, the New York legislature passed a bill that wouldn’t stop kids from accessing social media, but would instead give them access to platforms free of “addictive” recommendation algorithms.

Any company found in violation of the Stop Addictive Feeds Exploitation (SAFE) for Kids Act would have 30 days to correct the issue or face fines of up to $5,000 per user under the age of 18. With four million kids in the state of New York, that’s a potential multi-billion-dollar fine that social media companies would most likely be keen to avoid.

What do they mean by “addictive feed”?

One of the key success metrics for social media sites is the time users spend on site. Platforms are built with features that give users reasons not to click away and do something else. When users like and share posts, the platform reads this as an indicator of interest in that topic; similarly, if a user spends more time looking at a post, that becomes another interest signal. The algorithm then determines which content will increase a user’s time on site, serves the user posts from accounts they do not follow, and reorders the timeline to prioritise content of interest in the user’s feed.
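To make the mechanism concrete, here is a deliberately simplified sketch of engagement-driven ranking. This is not any platform’s actual code; the signals, weights, and field names are all hypothetical, chosen only to illustrate how interest signals outrank recency and pull in unfollowed accounts.

```python
def engagement_score(post, user_interests):
    """Score a post by predicted engagement rather than recency.
    The weights below are illustrative, not real platform values."""
    score = post["likes"] * 1.0 + post["shares"] * 2.0
    # Past dwell time on a topic is treated as an interest signal.
    score += user_interests.get(post["topic"], 0) * 5.0
    return score

def addictive_feed(posts, user_interests):
    # Every post is eligible, even from accounts the user doesn't follow.
    return sorted(
        posts,
        key=lambda p: engagement_score(p, user_interests),
        reverse=True,
    )

posts = [
    {"id": 1, "topic": "news",  "likes": 3,  "shares": 0, "followed": True},
    {"id": 2, "topic": "diet",  "likes": 40, "shares": 9, "followed": False},
    {"id": 3, "topic": "music", "likes": 10, "shares": 1, "followed": True},
]

# Because the user has lingered on diet content before, the unfollowed
# diet post ranks first in their feed.
feed = addictive_feed(posts, user_interests={"diet": 4})
```

The point of the sketch is the sort order: the feed is driven by what the model predicts will hold attention, not by who the user follows or when posts were made.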


For adult users, this is problematic, but not considered as harmful: adults have the tools to make judgments about how they spend their time. But for children, whose minds and understanding of the world are still developing, algorithm-led social feeds are a genuine concern.

The bill passed by the New York legislature cites the following justification:

Addictive feeds have had an increasingly devastating effect on children
and teenagers since their adoption, causing young users to spend more
time on social media which has been tied to significantly higher rates
of youth depression, anxiety, suicidal ideation, and self-harm. Children
are particularly susceptible to addictive feeds, which provide a non-
stop drip of dopamine with each new piece of media, as they are less
capable of exercising the impulse control necessary to mitigate these
negative effects, which can stunt development and cause long-term harm.
Among girls, the association between poor mental health and social media
use is stronger than the associations between poor mental health and
binge drinking, obesity, or hard drug use.

Research shows that spending time on social media is ten times more
dangerous than time spent online on non-social media. Self-regulation by
social media companies has and will not work, because the addictive
feeds are profitable, designed to make users stay on services so that
children can see more ads and the companies can collect more data.

The change proposed by the bill is to strip the algorithm out and restrict users under the age of 18 to a linear, reverse-chronological feed (newest posts first), containing only posts from accounts the user has actively chosen to follow.
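Continuing the earlier hypothetical sketch, the feed the bill mandates for under-18 users is far simpler: filter to followed accounts, then sort by recency. No ranking model is involved.

```python
def chronological_feed(posts, followed_accounts):
    """Illustrative sketch of the bill's mandated feed: only posts from
    accounts the user follows, ordered newest to oldest."""
    allowed = [p for p in posts if p["account"] in followed_accounts]
    return sorted(allowed, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"id": 1, "account": "friend_a", "timestamp": 100},
    {"id": 2, "account": "stranger", "timestamp": 300},
    {"id": 3, "account": "friend_b", "timestamp": 200},
]

# The unfollowed account is filtered out entirely; the rest appear
# newest-first, regardless of how engaging any post might be.
feed = chronological_feed(posts, followed_accounts={"friend_a", "friend_b"})
```

The contrast with the engagement-ranked version is the whole of the proposed change: the platform loses its lever for extending time on site.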

The content experience via social media is then no different to other online browsing. Users opt into specific content feeds and have a greater expectation that they will see the sort of content they signed up for.


Not a perfect solution. But there is no perfect solution.

The New York bill is a realistic compromise. There are benefits to kids being on social media, just as there are for adults: a sense of connection with friends and online communities, and access to information and entertainment. But the bill isn’t a cure for the other problems that remain with social media.

Users under 18 are still exposed to problems like the reward loop created by receiving likes and comments on posts. Facebook whistleblower Frances Haugen cited internal research conducted by the company that found “32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Removing the algorithm doesn’t remove that reward loop, but it does remove the additional posts that amplify and feed the negativity.

If, for example, a teenage girl lingers a few moments longer than usual on a post with diet tips, or likes or comments on it, she won’t then be fed more diet content. That sense of reinforced negativity is removed from her feed.

Julie Scelfo is a former New York Times journalist who created the group Mothers Against Media Addiction (MAMA). She is an advocate for the bill and told NBC News:

“We’re in the middle of a national emergency in youth mental health and it’s abundantly clear that one major contributing source of that is social media and its addictive algorithms,” Scelfo said. “It’s not social media in and of itself, but it’s the addictive design that is contributing to children’s emotions being exploited for profit.” 


Dan Barrett is the Head of Content at EducationDaily's publisher. He is a Brisbane-based writer/producer/comms professional who has worked for organisations including SBS, Mediaweek, National Seniors Australia, iSentia, the NSW Dept of Customer Service, and Radio National. He is passionate about the Oxford comma and is one of Australia's earliest podcasters.