White House Dispute Exposes Facebook Blind Spot on Misinformation


“The suggestion we haven’t put resources toward combating Covid misinformation and supporting the vaccine rollout is just not supported by the facts,” said Dani Lever, a Facebook spokeswoman. “With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes — measuring whether people who use Facebook are accepting of Covid-19 vaccines.”

Executives at Facebook, including its chief executive, Mark Zuckerberg, have said the company has been committed to removing Covid-19 misinformation since the start of the pandemic, and that it has removed over 18 million pieces of such misinformation in that time.

Experts who study disinformation said the number of pieces that Facebook removed was not as informative as how many were uploaded to the site, or in which groups and pages people were seeing the spread of misinformation.

“They need to open up the black box that is their content ranking and content amplification architecture. Take that black box and open it up for audit by independent researchers and government,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a nonprofit that aims to combat disinformation. “We don’t know how many Americans have been infected with misinformation.”

Mr. Ahmed’s group, using publicly available data from CrowdTangle, a Facebook-owned program, found that 12 people were responsible for 65 percent of the Covid-19 misinformation on Facebook. The White House, including Mr. Biden, has repeated that figure in the past week. Facebook says it disagrees with the characterization of the “disinformation dozen,” adding that some of their pages and accounts were removed, while others no longer post content that violates Facebook’s rules.

Renée DiResta, a disinformation researcher at Stanford’s Internet Observatory, called on Facebook to release more granular data, which would allow experts to understand how false claims about the vaccine were affecting specific communities within the country. The information, which is known as “prevalence data,” essentially looks at how widespread a narrative is, such as what percentage of people in a community on the service see it.

“The reason more granular prevalence data is needed is that false claims don’t spread among all audiences equally,” Ms. DiResta said. “In order to effectively counter specific false claims that communities are seeing, civil society organizations and researchers need a better sense of what is happening within those groups.”