YouTube cracking down on hate speech

YouTube is implementing a new feature it says will help curb hate speech on its platform.

The company is targeting videos that contain “controversial religious or supremacist content” but do not violate its terms of service. Instead of taking down the videos in question, YouTube will place them in a stripped-down digital purgatory where they will be harder to find and will lack the features normal videos have.

Users viewing the videos will be greeted with a message warning them that they are about to see offensive content. The videos will not appear in sections where YouTube recommends content to users, and key features such as comments, likes and suggested videos will be disabled. The videos also will not be monetized.

Google, which owns YouTube, declined to comment on the record about the changes.

In June, the company vowed it “will be taking a tougher stance on videos that do not clearly violate [its] policies.”

“These videos will have less engagement and be harder to find,” Google’s general counsel Kent Walker said at the time. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

Increasingly, videos on YouTube are testing the company’s line between problematic speech and speech that clearly violates its standards. A growing number of videos deride certain groups but stop short of calling for violence.

The alt-right and white supremacists have also turned to the platform to spread their views.

The company was criticized earlier in the year for profiting off ads next to videos with hate content, such as those from white supremacists and Islamic State in Iraq and Syria (ISIS) sympathizers.

In March, major companies like Verizon and Johnson & Johnson pulled their ads from YouTube temporarily.

YouTube and Google in response announced they would take more aggressive action against hateful and offensive content, including better detection of such videos.

YouTube says it will begin notifying users whose content is affected on Thursday. The Google subsidiary stressed that Thursday is only a starting point and that more videos will be targeted on a rolling basis as content is flagged and reviewed.

The company also noted the process requires human review and acknowledged there may occasionally be incorrect assessments of what is offensive. Users who believe their content has been wrongly marked as offensive can appeal the decision.

YouTube’s move is the latest step internet companies are taking to reduce hate speech and the presence of white supremacists on their platforms in the wake of deadly violence in Charlottesville, Va.

Facebook, Twitter, Spotify, GoDaddy, Google and others announced that they would remove white supremacist content from their sites.
