In Russian meddling, tech giants don’t know what they’re up against


As Facebook, Twitter, and Google executives continue their testimony this week before the Senate and House intelligence committees about Russia’s use of online media to interfere in the 2016 election, they will need to demonstrate that they understand the Kremlin’s penetration of U.S. social media is extensive, serious, and ongoing. They should also show they are willing to invest the human and technological resources necessary to counter this subversive propaganda.

Even as investigative journalists break new stories every week about the extent of Russian propaganda and the different, sometimes bizarre forms it took during the election, U.S. tech companies have consistently downplayed the number of fake or automated accounts on their platforms as well as the scope of Russia’s interference.

Twitter said it identified 201 fake accounts, while Facebook announced it discovered 470 accounts linked to Russia’s Internet Research Agency, a notorious “troll farm” funded by a prominent Kremlin crony. If Twitter and Facebook executives think these numbers are good approximations of the number of fake accounts Russia deployed in 2016, they are deluding not only themselves but the public.

When it comes to online propaganda, Russia is no amateur. After years of trolling the democratic opposition within Russia, today’s Kremlin trolls can impersonate black, white, Native American, gay, straight, male, female, pro-gun, anti-gun, and just about every other conceivable stripe of American voter, not to mention Germans, Dutch, Italians, and more.

Do social media companies really think the Kremlin’s online propaganda machine is limited to one small building in suburban St. Petersburg? Why would the Kremlin put all of its resources into this one operation, which was exposed to the world by Russian reporters in August 2013, a full three years before the U.S. presidential election (and well before the story hit the U.S. media)?  

We know already that the Kremlin excels at creating diffuse networks of hackers, and that Russia’s intelligence services have co-opted cyber criminals and pro-Putin youth movements to conduct cyber operations. The Nashi youth group, for example, took credit for Russia’s 2007 cyber attack on Estonia. The notorious hacker Profexer, who reportedly developed malware for Russia’s military intelligence agency as part of the “Fancy Bear” operation that targeted the DNC, was revealed to be a Ukrainian whose services were contracted over the Dark Web.  

The evidence suggests the Kremlin subcontracts its social media trolling the same way it contracts out its hacking. The discovery of a small army of trolls commenting on U.S. political topics in the tiny Macedonian town of Veles provides a window into how this ecosystem works. Until U.S. social media companies begin to apply their advanced analytic capabilities to tracking down the foreign trolls who have run amok on their networks, they will only be scratching the surface of the problem.

It took social media companies a number of years to wake up to — and be browbeaten into accepting — the reality that ISIS and other radical extremists were using their platforms as tools for online recruitment and radicalization. Eventually, after denying the extent of the problem and downplaying their ability to cope with it, social media companies got serious and got better at rooting out extremist accounts.

Today, a similar epiphany is required before a comprehensive strategy for countering state-sponsored disinformation can take root. Russia poses different challenges than ISIS, but the bottom line is that both are national security threats.

Recent announcements by Twitter and Facebook that they will provide more transparency about political advertising are a good start, as is the decision to ban advertising by state-sponsored propaganda outlets like RT and Sputnik. But more needs to be done.

First, social media companies need to get more aggressive about using both algorithms and human analysts to root out bots and automated accounts. According to researchers at the Oxford Internet Institute, almost 50 percent of Twitter traffic may be automated, an astounding figure.
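
To make the combination of algorithms and human review concrete, here is a minimal sketch in Python of heuristic bot scoring, assuming hypothetical account metadata (posting rate, account age, follower ratio, default profile). Real platform detection relies on machine learning over proprietary signals; this only illustrates the idea of scoring accounts and routing the most suspicious ones to human reviewers.

    from dataclasses import dataclass

    @dataclass
    class Account:
        """Hypothetical, simplified account metadata for illustration only."""
        handle: str
        posts_per_day: float     # average posting rate
        account_age_days: int    # time since the account was created
        followers: int
        following: int
        default_profile: bool    # still using a default avatar and empty bio

    def bot_score(acct: Account) -> float:
        """Return a crude 0-1 score; higher means more bot-like.
        Thresholds are illustrative guesses, not any platform's policy."""
        score = 0.0
        if acct.posts_per_day > 100:            # superhuman posting volume
            score += 0.4
        if acct.account_age_days < 30:          # very new account
            score += 0.2
        if acct.following and acct.followers / acct.following < 0.05:
            score += 0.2                        # follows many, followed by few
        if acct.default_profile:
            score += 0.2
        return min(score, 1.0)

    # Accounts above a review threshold would be queued for human analysts,
    # not suspended automatically.
    accounts = [
        Account("real_person", 4, 2200, 310, 280, False),
        Account("amplifier_9000", 450, 12, 8, 1900, True),
    ]
    for a in accounts:
        print(a.handle, round(bot_score(a), 2))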

Second, social media companies must screen divisive content more carefully, not only to ensure it complies with guidelines on hate speech and offensive commentary, but also to determine whether it is being distributed by a foreign sponsor. This will not be easy, because we know from former Russian trolls that they constantly switched their accounts among different servers and used different SIM cards. But persistent weeding out of fake foreign accounts is necessary, even if it requires a significant investment in human resources.
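
As a rough illustration of what that weeding might involve, here is a minimal sketch, assuming hypothetical session records (account, timestamp, country, device identifier), that surfaces accounts appearing from an implausible number of countries or devices. Real attribution is far harder, since determined operators rotate servers and SIM cards precisely to defeat checks like this, and anything flagged would still need human review.

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical session records: (account, timestamp, country, device_id).
    # The schema and thresholds are illustrative, not any platform's actual data.
    sessions = [
        ("acct_a", datetime(2017, 10, 1, 9),  "US", "sim-1"),
        ("acct_a", datetime(2017, 10, 1, 11), "RU", "sim-2"),
        ("acct_a", datetime(2017, 10, 1, 14), "DE", "sim-3"),
        ("acct_b", datetime(2017, 10, 1, 9),  "US", "phone-7"),
        ("acct_b", datetime(2017, 10, 2, 9),  "US", "phone-7"),
    ]

    def churn_flags(records, max_countries=2, max_devices=2):
        """Flag accounts seen from more than max_countries countries or
        more than max_devices devices: a crude proxy for server/SIM hopping."""
        countries, devices = defaultdict(set), defaultdict(set)
        for acct, _ts, country, device in records:
            countries[acct].add(country)
            devices[acct].add(device)
        return [
            acct for acct in countries
            if len(countries[acct]) > max_countries
            or len(devices[acct]) > max_devices
        ]

    print(churn_flags(sessions))  # ['acct_a']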

Finally, social media companies should partner with organizations like FactCheck.org and Hamilton 68, based in the United States, or StopFake, based in Ukraine, to better inform the public and their own staff about what messages the Kremlin is pushing. Partnering with academic institutions can also help raise awareness of how we process online information and how foreign actors seek to manipulate that cognitive process.

The social media revolution has fundamentally changed how people get information and how they interact. It has produced many useful innovations, but also polarization, echo chambers, and foreign manipulation. For the companies that pioneered this brave new world, it is worth recognizing the seriousness of the problem posed by malign foreign actors. And for the rest of us, it is worth remembering that social media companies themselves are not the problem; if we forge smart partnerships, we can help them develop the right solutions.

Michael Carpenter is senior director at the Penn Biden Center for Diplomacy and Global Engagement and nonresident senior fellow at the Atlantic Council. Carpenter is a former deputy assistant secretary of Defense for Russia, Ukraine, and Eurasia, foreign policy advisor to Vice President Biden, and NSC Director for Russia.
