YouTube to Remove Videos Containing Supremacist Content

YouTube is placing stricter restrictions on the type of content available on its platform

YouTube made an important announcement on its blog: the service stated that it would be taking down supremacist and hate speech videos.

In its blog post, the video sharing platform went into great detail about its policy against hate speech. YouTube aims to bring an end to content that promotes supremacist views or attempts to justify actions such as discrimination, exclusion, and segregation.

At the same time, the service made it clear that it is going to ban content that favors Nazi ideology. The platform will no longer host videos that deny the Holocaust or promote fascist views. It is a welcome move, as the company had been facing flak for its role in spreading conspiracy theories and far-right views among the masses. However, the post did not name specific videos or channels that would be targeted as part of the purge.

Earlier, YouTube was not strict about these types of videos, citing its support for free speech. Tommy Robinson, a far-right activist, was able to upload material to the site even after other social media platforms such as Twitter, Instagram, and Facebook had banned him.

A month ago, Facebook acted against Alex Jones, David Duke, and Louis Farrakhan because of their discriminatory views, removing their professional, personal, and fan pages. Recently, the White House launched a tool that allows people to report to the government if they believe a platform suspended or banned their account for political reasons. Under constant scrutiny and pressure from advertisers, the public, and the media, the video sharing platform had to take action.

In the blog post, YouTube stated that channels violating its hate speech policies would no longer be part of its Partner Program. As a result, these channel owners will not be able to generate ad revenue from their content.

In January, YouTube introduced changes to its algorithm as part of its effort to combat videos that spread misinformation and harmful content. The change reduces the viewership of videos that make false claims.

YouTube did not explain how it would track violations, given that users upload over 50 hours of content every minute. Since the service's algorithm tends to suggest videos based on users' interests, the platform is also taking a precautionary measure: for those who watch conspiracy, supremacist, and hate speech videos, YouTube will recommend content from authoritative sources instead.
