
TikTok has a political influencer problem targeted at Gen Z voters

FILE - The TikTok logo is seen on a mobile phone in front of a computer screen which displays the TikTok home screen, on March 18, 2023. (AP Photo/Michael Dwyer, File)

Government officials across the U.S. continue to panic about TikTok. Fears about whether the Chinese-owned social media platform collects U.S. citizen data for the Chinese government have led to a recent statewide ban in Montana and rare bipartisan action at the federal level.

But in the short run, direct manipulation of the app’s vast user base will likely only become more effective. In fact, groups in the U.S. are already using new strategies to manipulate public opinion in the lead-up to the 2024 presidential election. 

Shadowy marketing firms now pay vast armies of TikTok influencers to promote political content with the goal of swaying young voters. Many of these companies, and the political campaigns that hire them to coordinate well-placed TikTokers, are based stateside. They take advantage of weak disclosure regulations to hide their relationships with mainstream political groups and to give their synthetic influence campaigns the illusion of being “authentic.” Foreign governments and other international groups have employed them too.

According to TikTok’s own numbers, more than 150 million Americans use the video-sharing platform. Lawmakers and intelligence officials have been quick to point out that many, even most, of these users are young. The firm closely guards up-to-date statistics on user demographics, but in 2022, Pew Research reported that “67 percent of [U.S.] teens say they ever use TikTok, with 16 percent of all [U.S.] teens saying they use it almost constantly.”

Researchers and political parties alike tout the revolutionary voting power of young people. As a demographic, young voters are particularly concerned with hot-button issues like abortion and gun control. Because of this, some politicos are engaged in near-overt efforts to stymie the youth vote. For example, here in Texas, recently proposed legislation would close polling places on college campuses.

Concerningly, research from UNICEF suggests that young voters are especially vulnerable to political manipulation online. As Pew and children’s advocacy groups demonstrate, these same young people spend much of their time on the internet, forming aspects of their social and political identity through social media. And TikTok has been described as a “key platform for youth political expression.”

Marketing companies focused on digital political communication in the U.S. began organizing paid influencers on TikTok and other social media platforms as early as 2020. Democratic-aligned PR firms such as Main Street One (now renamed People First) pioneered such tactics, claiming they were an authentic, human-based response to bot-driven disinformation campaigns. But political influencers are very much a bipartisan phenomenon. Republican and conservative hype houses on TikTok also campaign on behalf of political candidates.

Influencer-driven marketing firms now claim to control massive, immediately deployable stables of small-scale influencers on behalf of a given campaign. These “nano” and “micro” influencers aren’t what most people think of when they imagine an influencer. Most aren’t celebrities, either online or offline.

Instead, these are everyday people with captive, intimate social media audiences that identify with demographics especially appealing to U.S. political campaigns: Latinos in South Florida, Black voters in Atlanta and college-educated women in the Rust Belt.

In a soon-to-be-released issue of the journal Social Media + Society, we brought together several research teams to study political influencers across six countries and nine social media platforms. We show that political influencers can emerge spontaneously in response to a particularly controversial topic, as a real “grassroots” phenomenon. Crucially, though, this work also looks behind the curtain of coordinated “astroturf” campaigns. These are made up of political influencers who peddle conspiracy theories, false narratives about voter fraud, pro-Kremlin propaganda and white supremacy.

The problem, despite claims from the firms using these tactics, is that paid political influencers’ efforts are not authentic. They are often highly coordinated from the top down to achieve specific goals. At best, firms like People First are able to find users who actually agree with the political message they are hired to promote. But this isn’t always the case. Sometimes, the people paid to promote partisan stances are simply hustling as part of the wider gig economy — working to make money regardless of the content they are compensated to share.  

It is also difficult to track whether influencers are paid, let alone paid as part of broader, coordinated efforts. While some political influencers label their posts as paid advertising, others do not. Because these payments take place off social media platforms, companies like TikTok are struggling to respond, and regulators in the U.S. are slow to catch up. Voters should have a right to know when someone speaking to them is paid by, or working on behalf of, political candidates or groups. Ultimately, this amounts to a new form of computational propaganda: falsely amplified content that is deceptively pushed to unwitting users via platforms’ trending and recommendation algorithms.

While there are some similarities, political influencers on TikTok are different from their counterparts on comparable platforms like Instagram and YouTube. TikTok enables often unexpected, near-instantaneous virality, even for non-celebrity users. In fact, viral content is central to its platform logic. One viral post is all it takes for a user to be noticed by the digital marketing firms that recruit and organize them. Other times, such reach isn’t even a prerequisite; the user in question simply needs to appeal to a particular niche voting population.

TikTok appears to be completely unprepared to tackle its political influencer problem, let alone broader coordinated inauthentic activity on its platform. But it isn’t alone. Facebook has also been exploited by political influencers to spread election fraud conspiracies. Similarly, on Twitter, far-right influencers have been found guilty of trying to suppress the vote.

Despite the growing need to understand influencers and other problems online, social media platforms such as Twitter have made it more difficult for researchers to access their data. More transparent and readily available data access for independent research is critically necessary to understand how political influencers use TikTok and other platforms to manipulate the most vulnerable groups in our democracy. Unfortunately, social media companies don’t appear to be moving in that direction.

Samuel Woolley, Ph.D., directs the Propaganda Research Lab at the University of Texas at Austin’s Center for Media Engagement, and is an assistant professor in UT Austin’s School of Journalism and Media. He is the author of the new book “Manufacturing Consensus: Propaganda in the Era of Automation and Anonymity.”

Martin Riedl, Ph.D., is a postdoctoral researcher with the Propaganda Research Lab, and an incoming assistant professor at the School of Journalism and Media at the University of Tennessee, Knoxville.

Josephine Lukito, Ph.D., is a senior faculty research associate at the Propaganda Research Lab and an assistant professor in UT Austin’s School of Journalism and Media.