YouTube, along with Twitter and Facebook, says it is taking further steps to limit QAnon and other baseless conspiracy theories that could lead to real-world violence.
Examples include videos that threaten or harass someone by suggesting they are complicit in a conspiracy such as QAnon, which portrays President Donald Trump as a secret warrior against a supposed ring of child traffickers run by celebrities and “deep state” government officials.
Pizzagate, a predecessor of QAnon, is another internet conspiracy theory that would fall into the restricted category. Its promoters claimed that children were being harmed at a pizza restaurant in Washington, DC. A man who believed in the conspiracy entered the restaurant in December 2016 and fired an assault rifle inside. He was sentenced to prison in 2017.
YouTube is the third major social platform to announce policies aimed at curbing QAnon.
Twitter announced a crackdown on QAnon in July, although it stopped short of banning its supporters from the platform. It removed thousands of accounts associated with QAnon content and blocked the sharing of related URLs. Twitter also said it would stop highlighting and recommending tweets related to QAnon.
Facebook, meanwhile, announced last week that it was banning groups that openly support QAnon. It said it would remove pages, groups and Instagram accounts that represent QAnon, even if they do not promote violence.
The social network said it would consider various factors in deciding whether a community meets the criteria for a ban, including the group’s name, its biography or “About” section, and the discussion on the Facebook page, in the group or on the Facebook-owned Instagram account.
Facebook’s move came two months after it announced a softer crackdown intended to stop the group and its followers from being promoted. But spotty enforcement undermined that effort.
YouTube says it has already removed thousands of QAnon videos and terminated hundreds of channels under its existing policies, particularly those that threaten violence or deny the existence of major violent events.
“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the company said in a blog post on Thursday.
This story has been published from a wire agency feed without modifications to the text. Only the headline has been changed.