YouTube announced on Thursday that it is adopting stricter rules against the spread of conspiracy theories, targeting the QAnon movement in particular. Twitter and Facebook had already taken measures against the far-right movement.
Google’s video-sharing service said it is tightening its anti-hate and harassment policies “to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”
This could mean removing videos that threaten or harass people by suggesting they are complicit in a conspiracy such as Pizzagate, the debunked theory that a child sex trafficking ring with ties to former Democratic presidential candidate Hillary Clinton operated out of a Washington pizzeria.
The QAnon movement has grown sharply during the pandemic, acting as a binding force that mixes its central anti-Semitic and white supremacist tenets with long-standing conspiracy theories about vaccines and 5G mobile technology, as well as far-right and libertarian politics.
YouTube said it had already removed “tens of thousands of QAnon videos” and closed some channels used by the movement, namely those that explicitly threaten violence or deny the existence of major violent events.
Earlier this month, Facebook banned accounts linked to QAnon on its main social network and on Instagram. Twitter began a crackdown on QAnon earlier this year.
YouTube’s latest move comes amid heightened tensions over misinformation spreading on social media, with some conservatives accusing the platforms of bias in taking down content.
Published on Plataforma