Facebook has banned all accounts related to the QAnon conspiracy theory from its platforms.
“From today we will remove Facebook pages, groups and Instagram accounts” representing QAnon, the company announced.
The move is a significant escalation of Facebook’s previous decision to remove or restrict groups and accounts that share and promote QAnon material.
QAnon is a conspiracy theory that claims President Trump is waging a secret war against a cabal of Satan-worshipping elite pedophiles.
In a statement released on Tuesday, Facebook said its employees had started removing content and deleting groups and pages. “However, this work will take time and will continue in the days and weeks to come.”
“Our Dangerous Organizations Operations team will continue to enforce this policy and proactively identify any content for removal rather than relying on user reports,” the statement added.
Facebook said it is updating the measures implemented in August aimed at “disrupting QAnon’s ability” to organize and operate through its networks.
This policy, which was introduced to limit the public safety risks posed by QAnon, “offline anarchist groups”, and US militia organizations, resulted in restrictions on more than 1,950 Facebook groups and over 10,000 Instagram accounts.
This is a big step by Facebook, which has set out plans to proactively remove all QAnon content from its platforms, however the movement evolves.
It comes after I asked Facebook Vice President for Global Affairs Nick Clegg why QAnon can continue to spread political disinformation on the site to US voters and beyond using hashtags like #SaveOurChildren.
Facebook’s initial crackdown on this dangerous conspiracy theory focused on violent content posted by supporters, and removed a number of groups and pages.
But those who support QAnon soon adapted, using new, innocuous-sounding hashtags to reach parenting groups, local forums, and the average Instagram feed. And the movement kept growing.
This latest step will be welcomed, but it will also be very difficult to implement, especially since QAnon has grown so large and spread under new guises.
I recently spoke to U.S. voters about how QAnon’s disinformation about candidates and child trafficking rings could have affected their friends and neighbors even before election day.
They explained how people they know now believe totally unsubstantiated claims on Instagram and Facebook that the Democrats are running a child trafficking ring or that presidential candidate Joe Biden abuses children.
Could this step – like the last – also be too late?
Facebook isn’t the only social media giant engaged in fighting the QAnon conspiracy movement.
In July, Twitter banned thousands of accounts and announced that it would no longer recommend QAnon-linked content, in order to prevent “offline harm”. It also said it would block URLs associated with the movement from being shared on the platform.
What is QAnon?
In October 2017, an anonymous user published a series of posts on the 4chan message board. The user signed off as “Q”, claiming to hold a level of US security clearance known as “Q clearance”.
These messages became known as “Q drops” or “breadcrumbs”, often written in cryptic language and peppered with slogans, pledges and pro-Trump themes.
QAnon-related traffic on mainstream social networks such as Facebook, Twitter, Reddit and YouTube has exploded since 2017, and there is evidence it increased further during the coronavirus pandemic.
Judging by social media, hundreds of thousands of people believe at least some of the bizarre theories QAnon offers.
QAnon grew out of the 2016 “Pizzagate” saga, a debunked conspiracy theory that Democratic Party politicians were running a pedophile ring out of a Washington pizza restaurant.