Five takeaways from YouTube’s week of controversies
YouTube infuriated critics across the political spectrum this week in its handling of controversies over harassment, extremism and exploitation.
And pressure on the Google-owned company is likely to increase in the near future.
The platform was accused of enabling child sexual exploitation, failing to protect LGBTQ creators from harassment, sweeping up activists and journalists amid a purge of extremist videos, and being biased against conservatives.
And that was in the span of just one week.
Much of the controversy began with Vox journalist Carlos Maza’s allegation that YouTube has allowed a popular creator on the platform, right-wing commentator and comedian Steven Crowder, to harass him with homophobic and racist slurs for the past two years.
YouTube said Crowder’s videos did not violate its policies against hate speech and harassment. The company nevertheless decided to prohibit Crowder from selling ads on his channel until he deals with some of the “issues” that have been raised, spurring accusations on the right of anti-conservative bias.
In the midst of the back-and-forth over Maza, YouTube announced it would begin banning videos promoting extremist ideologies such as white supremacy and caste superiority, a move that is expected to result in hundreds of thousands of videos being taken down.
As part of those changes, YouTube said it would prohibit creators who violate its hate speech policies from running ads on their channels.
Meanwhile, lawmakers on Capitol Hill stewed over a New York Times report that said YouTube’s recommendation algorithm was enabling child sexual exploitation.
Here are five takeaways from YouTube’s week of public relations challenges.
Liberals and conservatives are furious
YouTube’s middle-of-the-road approach failed to please liberals and conservatives, who both pummeled the company for its decision to “demonetize” Crowder until he addresses concerns about his account, including his sale of T-shirts that say “Socialism Is For Fags.”
Maza’s supporters said demonetizing would have a minimal financial impact, a position Crowder backed up in a video saying he makes most of his money from merchandise and other crowdfunding sources.
“The problem isn’t Crowder and the problem isn’t monetization,” Maza tweeted. “The problem is that @YouTube allows monsters and bullies to become superstars, break YouTube’s rules, build an army of loyal, radicalized followers, and then make millions selling them merch that sustains their work.”
But while YouTube faced pressure from the left, conservatives began coalescing around Crowder’s case, holding it up as an example of alleged anti-conservative bias. Right-wing commentators and GOP lawmakers lashed out at YouTube’s decision, with some calling it ideologically driven and an infringement on free speech.
Sen. Ted Cruz (R-Texas), who has blasted tech companies for what he says is bias against conservatives, jumped into the fray.
“This is ridiculous,” Cruz wrote on Twitter. “YouTube is not the Star Chamber — stop playing God & silencing those voices you disagree with. This will not end well. #LouderWithCrowder.”
Cruz has appeared on Crowder’s show multiple times in recent years.
The pressure from both sides highlighted the no-win position YouTube faces when it comes to content moderation: crackdowns on harassment and hate speech are often met with allegations of political bias.
YouTube faces pressure to rethink its business model
Critics pointed to YouTube’s business model as the main reason for inaction against controversial figures. The platform’s revenue stream is determined in large part by the amount of time users spend on the platform, how many views videos accrue and the number of interactions with ads.
A host of industry watchers noted that some of the top-performing creators on YouTube often rely on incendiary content to rack up significant amounts of money for themselves and the platform.
“When people say, ‘This is all about free speech,’ it’s really not,” Matt Rivitz, the founder of social media advocacy organization Sleeping Giants, told The Hill. “This is about their ability to make money off engagement, and that’s really always going to guide their position.”
A YouTube spokesperson pushed back on that assessment, telling The Hill, “We consistently apply our policies across all creators, whether that be hate speech or harassment.”
YouTube says it is pursuing a strategy that promotes “responsibility” by working to remove policy-violating content, promote more authoritative content and minimize the spread of harmful videos.
Maza said YouTube would take an enormous hit to its bottom line if it started to seriously address the scourge of hate speech and harassment on its platform.
“They’ve created such a network of monsters they can no longer enforce any of these policies without engaging in a major overhaul in their design and operating policy,” Maza said.
YouTube’s new extremism policy to cause headaches
Activists, journalists and right-wing commentators have accused YouTube of censorship as accounts and videos have been taken down for “hate speech” violations in the wake of the extremism ban.
YouTube faced backlash for demonetizing and taking down videos from Ford Fischer, an independent journalist who regularly reports on white supremacists and white nationalists, as well as educators who flagged what they said was improper removal of content.
All of YouTube’s removals are subject to human review, a YouTube spokesperson told The Hill. The company has often touted artificial intelligence as a partial solution to tackling harmful content. But experts said there will likely be many examples of people arguing their videos were wrongfully taken down as YouTube ramps up the new policy.
“I am sure that there will be any number of anecdotal incidents that people are angry about,” Jesse Blumenthal, head of tech and innovation policy for the Koch network, told The Hill. “And that’s because content moderation at scale is really difficult.”
YouTube has said about 500 hours of new footage is uploaded to the platform every minute.
But not everyone is criticizing YouTube’s new policy on extremist videos. Some of the country’s top civil rights groups, many of which have been pushing the company to take stronger action against neo-Nazi and supremacist content for years, applauded the move while noting that they’re waiting to see how YouTube chooses to enforce the ban.
The Anti-Defamation League, a Jewish organization that has mobilized against online hate speech, and Color of Change, a digital civil rights group, both called the move an “important step forward.”
The company’s problems are expanding
Children’s safety groups and lawmakers on Capitol Hill are demanding answers from YouTube after researchers found that its recommendation algorithm created playlists of scantily clad children, heightening concerns about predators on the platform.
And it’s not the first time the company has dealt with the issue. YouTube disabled comments on videos featuring minors in February after a user detailed how the sections were being used to identify and share exploitative videos of young girls. Subsequent reports found that YouTube’s algorithm led users to more videos of young girls.
This week, researchers from Harvard University’s Berkman Klein Center for Internet and Society found that the algorithm is still pushing viewers to more exploitative videos.
Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) condemned the platform for allowing the exploitation of children.
“We are dismayed at YouTube’s slow and inadequate response to repeated stories about child exploitation of its platform,” the senators wrote. “Action is overdue.”
The company laid out in a blog post what it has done to protect children, saying it has been making “regular improvements to the machine learning classifier that helps us protect minors and families.”
Scrutiny only set to intensify on Capitol Hill
Sen. Josh Hawley (R-Mo.) announced he will introduce a bill that would require sites such as YouTube to leave videos of children out of their recommendation engines.
While Hawley’s bill is unlikely to become law, other senators have weighed in. Sen. Mark Warner (D-Va.) told The Hill he believes Congress should pay more attention to “the extent to which YouTube has been used by a range of bad actors, including far-right groups, to facilitate targeted harassment, spread extremist content and radicalize an entire generation of young users.”
Sen. Brian Schatz (D-Hawaii) said YouTube acts only when it’s under “pressure.”
The increased scrutiny comes at an awkward time for YouTube. On Monday, the House Judiciary Committee announced it is launching a bipartisan investigation into whether large tech companies — including YouTube’s parent company Google — are using their vast market power to suppress competition.
The company’s handling of harmful content is likely to come up as the probe moves forward, with lawmakers increasingly raising concerns about whether Congress should step in to take on some of the complicated questions surrounding speech online.
Facebook CEO Mark Zuckerberg has called for an outside body to set standards for speech on social media, a move that would take the onus off platforms that have amassed massive amounts of power and influence with relatively little oversight.
Maza and Blumenthal both raised concerns about government intervention in interviews with The Hill.
Government oversight “is not going to end well for us as creators, not going to end well for the platform, not going to end well for the conservatives who think this is some free speech crusade,” Maza said. “When the government starts intervening … YouTube’s going to wish they had done something.”
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.