YouTube continues to push extremist videos, including white supremacist content, as well as other clips that could serve as a gateway to extremist content for users already susceptible to racial hatred, according to a new report from the Anti-Defamation League (ADL).
YouTube says it has removed thousands of extremist videos, but such content is still circulating among a subset of highly engaged users encountering the videos through recommended content, the ADL said in the report released Friday.
Nine percent of YouTube users who participated in the ADL’s nationwide survey said they had viewed at least one video from an extremist channel during the course of the study.
Additionally, 22 percent said they had viewed at least one video from an “alternative channel,” which the ADL identified in its report as possibly serving as a “gateway to extremist content.”
The study identified a list of 322 “alternative” channels, including channels from conservative commentators, and 290 “extremist or white supremacist channels.”
Of the 612 channels across the two lists, 515 were still active as of Jan. 21. Among the 97 channels no longer active, six were encountered by participants during the study, and the other 91 were taken down either before the study began or at some later point before Jan. 21.
YouTube defended its handling of extremist content in a statement responding to the report.
“We have clear policies that prohibit hate speech and harassment on YouTube, and terminated over 235,000 channels in the last quarter for violating those policies,” the company said in a statement.
“Beyond removing content, since 2019 we’ve also limited the reach of content that does not violate our policies but brushes up against the line, by making sure our systems are not widely recommending it to those not seeking it. We welcome more research on this front, but views this type of content gets from recommendations has dropped by over 70% in the US, and as other researchers have noted, our systems often point to authoritative content.”
ADL CEO Jonathan Greenblatt said YouTube has not done enough to tackle the widespread extremist content.
“It is far too easy for individuals interested in extremist content to find what they are looking for on YouTube over and over again,” Greenblatt said in a statement.
“Tech platforms including YouTube must take further action to ensure that extremist content is scrubbed from their platforms, and if they do not, then they should be held accountable when their systems, built to engage users, actually amplify dangerous content that leads to violence,” he added.
The study found that almost all of the 9 percent of users who watched a video from an extremist channel watched videos from alternative channels as well. Those numbers were driven by a “small but highly engaged subset” of users with high consumption levels, the report stated.
The audience for videos from the identified channels is “dominated by people who already have high levels of racial resentment,” making up more than 90 percent of views for both types, according to the report.
While recommendations of potentially harmful videos from other types of videos are "rare," such recommendations "frequently" appear alongside videos from alternative or extremist channels, the ADL said.
The report analyzed data from a sample of 915 survey respondents between July and October and included browser history dating back to April.
Democrats have been pushing for tech giants to ramp up efforts to remove extremist content, especially after the deadly Jan. 6 riot at the Capitol. Rep. Anna Eshoo (D-Calif.) led dozens of House Democrats in a letter to YouTube CEO Susan Wojcicki last month urging the platform, along with Twitter and Facebook, to change the algorithms that facilitate the spread of extremist content.
Eshoo said the ADL report further shows that YouTube needs to change its algorithm and pushed for Congress to pass a bill she introduced with Rep. Tom Malinowski (D-N.J.) designed to hold social media platforms more accountable.
“The ADL released a well-researched and troubling report showing that white supremacy and extremist content is proliferating on YouTube and that YouTube’s recommendation algorithms are pushing users to more extreme content. These findings are damning. They make a crystal-clear case for why YouTube needs to rethink the core design of its product,” Eshoo said in a statement.