Use the TikTok moment before Congress to make all social media work better

If I were still in the House, I would have voted to make TikTok break up with China, because the risk that the Chinese government will use the platform to influence Americans is real. If the “TikTok bill” in its current form is enacted, and all goes according to plan, the company and its data will be owned by Americans. 

And then, I’m sorry to say, absolutely nothing else will change. 

TikTok will still be a major amplifier of Chinese and Russian propaganda, of antisemitism, and of every variety of crazy, conspiracy-minded extremism. It will still deepen Americans' political divisions and distort our sense of reality. It will still drive teenagers to depression and suicide.

The algorithm TikTok uses to push out content like drug dealers push cocaine (which its new owners will undoubtedly wish to retain) makes these effects inevitable, whether a foreign adversary has its hands on the dial or not. So do the recommendation algorithms of all social media companies, though TikTok's is arguably the most addictive of them all.

So the question for senators as they take up the TikTok bill is: Are you going to protect Americans only from something that one foreign-owned social media platform might do to us in the future? Or will you also seize this chance to end the immense harm that this platform — and others — are already doing to our children and to our democracy?

Imagine that a corporation was tracking troubled kids' online behavior and then calling them on an old-fashioned phone to say, "Hi, we've noticed you may be having suicidal thoughts — so we're going to mail you a how-to guide." Or that it was calling people who watched a Holocaust denial video to say, "Hey, great stuff! We'd love to connect you with your local neo-Nazi group." This is how social media algorithms work. They follow our movements online, figure out what content is most likely to maximize our "engagement" — the time we spend on their platform — and feed us more and more of that content, automatically and instantaneously.
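To make that loop concrete, here is a deliberately simplified sketch, in Python, of how an engagement-based ranker of this kind could work. The names, features and scoring rule are illustrative assumptions, not any platform's actual code: the system scores each candidate post by how long it predicts this user will watch, serves the top scorers, and feeds the resulting watch time back into the next round.

    # Simplified illustration of an engagement-based feed ranker.
    # The model, features and names are assumptions for explanation,
    # not any platform's real system.
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        post_id: str
        topic: str

    @dataclass
    class UserHistory:
        # Seconds this user has spent watching each topic so far.
        watch_seconds_by_topic: dict = field(default_factory=dict)

    def predicted_engagement(user: UserHistory, post: Post) -> float:
        # Crude stand-in for a learned model: past watch time on a topic
        # is treated as a prediction of future watch time on that topic.
        return user.watch_seconds_by_topic.get(post.topic, 1.0)

    def rank_feed(user: UserHistory, candidates: list, k: int = 3) -> list:
        # Serve whatever the model thinks will keep this user watching longest.
        return sorted(candidates, key=lambda p: predicted_engagement(user, p), reverse=True)[:k]

    def record_watch(user: UserHistory, post: Post, seconds: float) -> None:
        # The feedback loop: time spent on a post makes similar posts rank
        # higher next time, automatically and instantaneously.
        prior = user.watch_seconds_by_topic.get(post.topic, 0.0)
        user.watch_seconds_by_topic[post.topic] = prior + seconds

Notice that nothing in this loop asks whether the content is true, healthy or good for the person watching; it only asks what holds attention.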

If what keeps us glued to the screen is cute puppy videos mixed with legitimate, balanced news about the world, then great, we get more of that. But as TikTok and other social media companies have learned, what holds most people's attention for the most time is content that plays on our fear, anger and even self-loathing, and that reinforces our preexisting political passions. Again, foreign adversaries like China don't need to control this mechanism in real time if they want to hurt Americans — all they have to do is introduce divisive content and watch it propagate. Mostly, we do it to ourselves.

Consider: How could thousands of otherwise normal Americans have decided it would be a good idea to violently storm the United States Capitol on Jan. 6, and where did they get the illusion that the country would be with them? How have millions of Americans come to reject vaccines, or to believe teachers are pedophiles, or to develop such blind loyalty to a political leader that when he's found liable for sexual abuse and indicted for felonies, they turn against our courts and the FBI instead of doubting him? How can educated college students profess absolute conviction that Hamas killed no civilians on Oct. 7?

I think one reason is the profound change in how Americans, and the 8 billion people on our planet who spend an average of six and a half hours online each day, get information. Most of what enters our brains each day used to be moderated by human beings — journalists, experts, teachers — many of whom at least tried to distinguish truth from lies and right from wrong. Today, virtually everything we learn (whether the original source is the New York Times, FOX News, or an internet influencer) is filtered through social media, and individually selected for us by an algorithmic software program that privileges negativity and conflict to maximize our screen time. 

Yes, social media has added much good to our lives. But it is also the most perfect machine ever invented for making people hate each other, and for splintering humanity into tribes. To keep treating the symptoms of this calamity — the wars, the loss of trust in democratic institutions, the loneliness and suicide — without attacking the cause is a losing game. To regulate one of the companies most responsible without addressing the worst harm it does would be a lost opportunity. 

Fortunately, the fact that this company is owned by China and poses national security risks offers a unique opportunity for regulation. The Senate should strengthen the House TikTok bill — for example, by requiring the company's prospective owners to disclose details about its recommendation algorithm, to stop using the algorithm with kids (all ideas drawn from existing bipartisan bills), and to adopt a plan to replace its addictive, engagement-maximizing system with one that minimizes harm to mental health and limits foreign propaganda and extremism.

Can social media survive with a different approach to recommending content? 

A few years ago, Facebook asked an experimental group of users to rate every post on their news feed as either "good" or "bad" for the world, and then trained an algorithm to give those users more of what they labeled as good. In other words, it replaced an engagement-based algorithm, which tries to maximize our screen time by responding to our subconscious impulses, with what might be called a contentment-based algorithm, which gives us what we consciously say makes our lives and communities better. The experiment worked, giving users more family photo albums and legitimate news, and less dark content. The only problem was that users spent slightly less time with the nice news feed than with Facebook's normal product, and thus saw fewer ads, so the contentment-based system was never implemented.
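Sketched in the same illustrative terms as the engagement example above, the only real change in a contentment-based ranker is the objective: rank by the ratings users consciously give, not by the time they unconsciously spend. (Again, the names and logic below are assumptions for explanation, not Facebook's actual experimental code.)

    # Simplified illustration of a contentment-based ranker: the same
    # ranking step as before, with a different objective.
    from collections import defaultdict

    class ContentmentRanker:
        def __init__(self):
            # Running "good for the world" ratings (+1 good, -1 bad) per topic.
            self.rating_sum = defaultdict(float)
            self.rating_count = defaultdict(int)

        def record_rating(self, topic: str, is_good: bool) -> None:
            # Explicit, conscious feedback from the user, not inferred screen time.
            self.rating_sum[topic] += 1.0 if is_good else -1.0
            self.rating_count[topic] += 1

        def score(self, topic: str) -> float:
            if self.rating_count[topic] == 0:
                return 0.0
            return self.rating_sum[topic] / self.rating_count[topic]

        def rank_feed(self, candidates: list, k: int = 3) -> list:
            # "Good" beats "gripping": serve what users say improves their lives,
            # assuming each candidate carries a topic label as in the sketch above.
            return sorted(candidates, key=lambda p: self.score(p.topic), reverse=True)[:k]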

A new TikTok should adopt this system. As for Meta, Google and the rest — they probably can't be compelled to do so, because the First Amendment allows them to recommend content as they please. But as FOX News knows, the First Amendment is not a total liability shield. If social media companies, like traditional media, could be sued for harm, they would have a powerful incentive to change. So once it's done with TikTok, Congress can take the fight to other social media companies that deserve equal scrutiny — by reforming or repealing Section 230 of the Communications Decency Act, the law that protects them from legal accountability.

Our adversaries will use social media against us whether they own it or not. And turning a terrible Chinese company into a terrible American company will achieve little. Let's instead use TikTok's moment before Congress to make it — and all social media — work better for Americans and the world.

Tom Malinowski represented New Jersey's 7th District in the House from 2019 to 2023. He is a senior fellow at the McCain Institute.
