YouTube & TikTok crack down on conspiracy theories

Since YouTube launched in 2005, falling down rabbit holes of conspiracy-theory videos has become an almost universal rite of passage. The platform gradually displaced the cable TV shows that offered alternative takes on controversial but popular subjects like John F. Kennedy’s assassination or aliens at Area 51, topics ultimately far removed from daily life.

Conspiracy theories like QAnon and Pizzagate, however, are not merely controversial but dangerous. Since 2016, numerous far-right groups have formed around the ludicrous belief that prominent Democratic figures in US politics secretly run the government through paedophile rings. Some adherents have acted on those theories with violence, and many affiliated groups now pose a growing threat to the stability of the US in the run-up to the 2020 presidential election.

For this reason, YouTube has made the unprecedented move of outright banning entire channels and deleting QAnon videos without warning. The owners of those channels are understandably unhappy and have since sued YouTube, but their claim that the crackdown is causing users “irreparable harm” is dubious at best.

YouTube is not the only platform taking stricter measures against QAnon-related content, though. TikTok, which has also designated QAnon a ‘hateful ideology’ and banned all related content, recently updated its Community Guidelines to reinforce those measures. However, reports indicate that many users get around the ban by using coded hashtags, such as “#RedOctober,” “#TheStormIsComing,” “#TheStormIsHere,” and “#TheStorm.”

We think it is about time platforms took harsher steps in moderating hate speech, with real consequences for those who spread it. But will this be enough?

Learn more here (YouTube) and here (TikTok)
