Facebook officials have long known how the platform’s recommendations can lead users into “rabbit holes” linked to conspiracy theories. We now know how clear that picture was thanks to documents provided by Facebook whistleblower Frances Haugen.
During the summer of 2019, a Facebook researcher found that it took the platform just five days to start recommending QAnon groups and other disturbing content to a fictitious account, according to an internal report whose findings were reported Friday. The document, titled “Carol’s Journey to QAnon,” was among the cache of records Haugen submitted to the Securities and Exchange Commission as part of her whistleblower complaint.
The report reportedly describes a Facebook researcher opening a brand-new account for “Carol,” a fictional “conservative mom.” After the account liked several conservative but “mainstream” pages, Facebook’s algorithms began suggesting increasingly fringe and conspiratorial material. Within five days of joining Facebook, “Carol” was seeing “groups with open ties to QAnon,” conspiracy theories about “white genocide,” and other material the researcher described as “extreme, conspiratorial, and explicit content.”
That Facebook helped QAnon and similar movements grow has been well known outside the company for some time. Researchers and journalists also documented how conspiracy theories spread on the platform during the coronavirus pandemic in 2020. But the documents show that Facebook’s own researchers were raising alarms about conspiracy theories before the pandemic. The Wall Street Journal notes that researchers proposed measures such as preventing or slowing down content sharing, but Facebook officials generally decided not to take those steps.
Facebook did not immediately answer questions about the document. “We have been working since 2016 to invest in people, technologies, policies and processes to ensure we are ready, and we began our planning for the 2020 elections themselves two years in advance,” Facebook’s vice president of integrity, Guy Rosen, said in a statement Friday. In the statement, Rosen reiterated a number of measures he said Facebook had taken in the weeks and months leading up to the 2020 election — including bans on QAnon and militia groups — but did not directly address the company’s recommendations of QAnon content before the ban.
The documents come at an uncertain time for Facebook. Haugen, who submitted them to the SEC, alleges that the company deceived investors and prioritized growth and profit over user safety. Scrutiny is likely to intensify further now that more than a dozen news organizations have access to some of these documents.