
Facebook Tried to Limit QAnon. It Failed.

The social network tried cracking down on the spread of the conspiracy theory and other extremist material. But QAnon groups are still flourishing on the site.

QAnon followers at a demonstration outside the Massachusetts State House last month. Despite Facebook’s attempts to quash their growth, groups promoting the conspiracy theory are thriving on the site. Credit: Brian Snyder/Reuters

OAKLAND, Calif. — Last month, Facebook said it was cracking down on activity tied to QAnon, a vast conspiracy theory that falsely claims that a satanic cabal runs the world, as well as other potentially violent extremist movements.

Since then, a militia movement on Facebook that called for armed conflict on the streets of U.S. cities has gained thousands of new followers. A QAnon Facebook group has also added hundreds of new followers while questioning common-sense pandemic medical practices, like wearing a mask in public and staying at home while sick. And a campaign that claimed to raise awareness of human trafficking has steered hundreds of thousands of people to conspiracy theory groups and pages on the social network.

Perhaps the most jarring part? At times, Facebook’s own recommendation engine — the algorithm that surfaces content for people on the site — has pushed users toward the very groups that were discussing QAnon conspiracies, according to research conducted by The New York Times, despite the company’s assurances that this would not happen.

None of this was supposed to take place under new Facebook rules targeting QAnon and other extremist movements. The Silicon Valley company’s inability to quash extremist content, despite frequent flags from concerned users, is now renewing questions about the limits of its policing and whether it will be locked in an endless fight with QAnon and other groups that see it as a key battleground in their online war.

The stakes are high ahead of the Nov. 3 election. QAnon groups, which have cast President Trump as the hero in their baseless conspiracy, have spread and amplified misinformation surrounding the election. Among other things, they have shared false rumors that widespread voter fraud is already taking place and have raised questions about the Postal Service’s ability to handle mail-in ballots.

“In allowing QAnon groups to get to this point and continue to grow, Facebook has created a huge problem for themselves and for society in a more general sense,” said Travis View, a host of QAnon Anonymous, a podcast that seeks to explain the movement.

The QAnon movement has proved extremely adept at evading detection on Facebook under the platform’s new restrictions. Some groups have simply renamed themselves or avoided key terms that would set off alarm bells. The changes were subtle, like replacing “Q” with “Cue” or adopting a name that includes the number 17, a nod to Q being the 17th letter of the alphabet. Militia groups have changed their names to phrases from the Bible, or to claims of being “God’s Army.”

Others simply tweaked what they wrote to make it more palatable to the average person. Facebook communities that had otherwise remained insulated from the conspiracy theory, like yoga groups or parenting circles, were suddenly filled with QAnon content disguised as health and wellness advice or concern about child trafficking.

A Facebook spokeswoman said the company was continuing to evaluate its best practices. “Our specialists are working with external experts on ways to disrupt activity designed to evade our enforcement,” said the spokeswoman.

Facebook and other social media companies began taking action against the extremist groups this summer, prompted by QAnon’s rapid growth and by real-world violence linked to the movement and to militia-style groups on social media.

Twitter moved first. On July 21, it announced that it was removing thousands of QAnon accounts and blocking trends and key phrases related to the movement from appearing in its search and Trending Topics section. But many of the QAnon accounts returned within weeks of the initial ban, according to researchers who study the platform.

In a statement on Thursday, Twitter said that impressions, or views, of QAnon content had dropped by 50 percent since it had rolled out its restrictions.

Then on Aug. 19, Facebook followed. The social network said it was removing 790 QAnon groups from its site and introducing new rules to clamp down on movements that discuss “potential violence.” The effect was to restrict pages, groups and accounts belonging to extremist movements, in the company’s most sweeping action against QAnon and others that had used Facebook to call for violence.

In the month after the rules were instituted, about 100 QAnon groups on Facebook tracked by The Times continued to grow at a combined pace of more than 13,600 new followers a week, according to an analysis of data from CrowdTangle, a Facebook-owned analytics platform.

That was down from the period before the new restrictions, when the same groups added between 15,000 and 25,000 new members a week. Even so, it indicated that QAnon was still recruiting new followers.

Members of those groups were also more active than before. Comments, likes and posts within the QAnon groups grew to more than 600,000 a week after Facebook’s rules went into effect, according to CrowdTangle data. In the weeks before, the groups had averaged fewer than 530,000 interactions a week.

“The groups, including QAnon, feel incredibly passionate about their cause and will do whatever they can to attract new people to their conspiracy movement. Meanwhile, Facebook has nowhere near the same type of urgency or mandate to contain them,” Mr. View said. “Facebook is operating with constraints and these extremist movements are not.”

Researchers who study QAnon said the movement’s continued growth was partly related to Facebook’s recommendation engine, which pushes people to join groups and pages related to the conspiracy theory.

Marc-André Argentino, a Ph.D. candidate at Concordia University who is studying QAnon, said he had identified 51 Facebook groups that branded themselves as anti-child trafficking organizations, but which were actually predominantly sharing QAnon conspiracies. Many of the groups, which were formed at the start of 2020, spiked in growth in the weeks after Facebook and Twitter began enforcing new bans on QAnon.

The groups previously added dozens to hundreds of new members each week. Following the bans, they attracted tens of thousands of new members weekly, according to data published by Mr. Argentino.

Facebook said it was studying the groups, but it has not taken action against them.

The company is increasingly facing criticism, including from Hollywood celebrities and civil rights groups. On Wednesday, celebrities including Kim Kardashian West, Katy Perry and Mark Ruffalo said they were freezing their Instagram accounts for 24 hours to protest Facebook’s policies. (Instagram is owned by Facebook.)

The Anti-Defamation League also said it was pressing Facebook to take action on militia groups and other extremist organizations. “We have been warning Facebook safety teams literally for years about the problem of dangerous and potentially violent extremists using their products to organize and to recruit followers,” Jonathan Greenblatt, the chief executive of the A.D.L., said.

The A.D.L., which has been meeting with Facebook for months about its concerns, has publicly posted lists of hate groups and conspiracy organizations present on the social network. David L. Sifry, the vice president of the A.D.L.’s Center for Technology and Society, said the group had had similar conversations about extremist content with other platforms, like Twitter, Reddit, TikTok and YouTube, which had been more receptive.

“The response we get back is markedly different with Facebook,” he said. “There are people of good conscience at every single one of these platforms. The core difference is leadership.”

Sheera Frenkel reported from Oakland, Calif., and Tiffany Hsu from Hoboken, N.J. Davey Alba contributed reporting from New York and Ben Decker from Boston.

Sheera Frenkel is a prize-winning technology reporter based in San Francisco. In 2021, she and Cecilia Kang published “An Ugly Truth: Inside Facebook's Battle for Domination.” More about Sheera Frenkel

Tiffany Hsu is a tech reporter covering misinformation and disinformation. More about Tiffany Hsu

A version of this article appears in print in Section B, Page 1 of the New York edition with the headline: Facebook Tried to Limit QAnon. It Failed.
