
Banning TikTok won’t make kids safer, but the House is ignoring what will

(Photo: The TikTok logo is seen on a cellphone on Oct. 14, 2022, in Boston. Michael Dwyer, Associated Press File)

Now we know the House of Representatives’ dirty little secret. The chamber that’s become infamous for hyper-partisanship and dysfunction can move fast when it wants to. Case in point: A bill that would ban TikTok in the U.S. unless it is sold to a new owner moved from introduction through committee and floor passage in just eight days.

Which raises the question: Where is that urgency when it comes to children’s safety online?

Make no mistake: whether or not TikTok poses a unique national security threat, banning the app or forcing a sale won’t make kids safer or address the social media-fueled mental health crisis among our nation’s youth. Not as long as companies like Instagram, Snap, Discord, YouTube and TikTok are allowed to do anything and everything to trap kids in an endless scroll.

It’s been 25 years since Congress last passed a bill protecting children online — long before smartphones or social media. And that law, the Children’s Online Privacy Protection Act, only covers children under 13. That means teens have no protections whatsoever against the sophisticated design choices and psychological manipulations that Big Tech deploys to earn billions of dollars in ad revenue from young people.

That business model of maximizing engagement at all costs hurts young people in two interrelated ways. First, when one-third of teens say they are on social media “almost constantly,” that naturally means less time for sleep, exercise, studying and face-to-face interaction. Second, the design features social media companies use to hook kids are deeply harmful — like sending children down rabbit holes of pernicious content or enticing them to attempt dangerous viral challenges — and often make kids vulnerable to predation, drug dealers, and cyberbullying. 

Every week on Instagram, 1 in 6 users aged 13-15 report seeing content that promotes self-harm; more than 1 in 8 users that age receive unwanted sexual advances and more than 1 in 5 report being cyberbullied.

The Kids Online Safety Act would change that. At its heart is a “duty of care” requiring online platforms to ensure their design is not contributing to some of the most serious problems facing youth today — suicide, eating disorders, cyberbullying, sexual exploitation, fentanyl and social media addiction.

The tech industry and its defenders claim the bill will lead to widespread censorship, prohibiting youth from accessing content about sensitive topics or needed resources. Yet, the Kids Online Safety Act explicitly protects children’s ability to search by stating that nothing in the bill can be used to “prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content.”

What the act will do is require that social media platforms stop pushing truly toxic material into children’s feeds, such as targeting kids who show signs of depression with guides on how to cut themselves and hide it from their parents, or sending kids struggling with body image information about how to eat just 500 calories a day — an extremely dangerous diet.

Internal documents make it clear that social media companies know their design choices hurt kids. In 2020, Meta conducted an internal study that found a link between “Like” counts and “constant negative comparisons.” Researchers further linked those negative comparisons “to multiple negative well-being outcomes” such as increased loneliness, worse body image and negative mood.

Meta then conducted “Project Daisy,” in which it hid visible “Like” counts from some Instagram users. That change reduced those users’ experiences of negative social comparison and had a “statistically significant impact” on their well-being. Yet, because likes keep young people on Instagram longer, Meta chose not to implement the program, defunded that research team and retained the visible “Like” feature.

Or consider TikTok, which received major kudos for implementing a “take a break” feature for teens. Turns out, kids can instantly override their “break” by entering a passcode. What is less well known is that TikTok also has a feature that withholds videos it knows a user will love until the moment it senses the user is about to disengage from the app. 

If the Kids Online Safety Act were law, Meta and TikTok wouldn’t be allowed to manipulate young people like this or ignore their own internal research showing they are harming kids. The bill would also force them to report those harms publicly.

The Kids Online Safety Act is the only bill that would address the wide range of design-caused harms children experience online. And it could be law in a matter of weeks — if the House wants it to be. The bill has 67 Senate cosponsors, including Majority Leader Chuck Schumer (D-N.Y.), setting it up to sail through a floor vote. The House hasn’t even introduced this lifesaving bill.

We know that, when motivated, the House can pass tech legislation, and that the Energy and Commerce Committee, led by Chair Cathy McMorris Rodgers (R-Wash.) and ranking member Frank Pallone (D-N.J.), can move fast.

So, the question is: Does the House care more about protecting children or about Big Tech billionaires? Keep an eye on the Kids Online Safety Act to find out.

Josh Golin is the executive director of Fairplay, a nonprofit advocate that opposes child-targeted marketing and the excessive screen time it encourages. 


