Posted on: Friday, April 5th, 2024 at 3:00 am

Written By: GreenMedInfo Research Group

When the Center for Countering Digital Hate (CCDH) released its “Disinformation Dozen” report, it ignited a firestorm of criticism and calls for censorship against the named individuals. However, the report’s questionable methodology and contradictory data from Meta raise concerns about the potential for overreach and the suppression of legitimate debate under the guise of combating misinformation.

The CCDH released a 40-page report on March 24, 2021, titled “The Disinformation Dozen,” which claimed that just 12 individuals were responsible for 65% of anti-vaccine content on social media platforms.1 The report named the following individuals:

  1. Dr. Joseph Mercola
  2. Robert F. Kennedy Jr.
  3. Ty and Charlene Bollinger
  4. Dr. Sherri Tenpenny
  5. Rizza Islam
  6. Dr. Rashid Buttar
  7. Erin Elizabeth
  8. Sayer Ji
  9. Dr. Kelly Brogan
  10. Dr. Christiane Northrup
  11. Dr. Ben Tapper
  12. Kevin Jenkins

The report’s title, “Disinformation Dozen,” returns 192,000 results on Google Search as of April 3rd, 2024. Yet, despite its global reach and the profound damage it has done to the reputations and platforms of these 12 individuals, a closer examination of the report reveals significant flaws in its methodology and conclusions, particularly in light of data provided by Meta (formerly Facebook).

Meta’s Response and Data 

In an August 18, 2021, statement, Meta disputed the CCDH’s findings, stating that there was no evidence to support the claim that the “Disinformation Dozen” were responsible for such a significant portion of anti-vaccine content on its platforms.2 Meta revealed that these 12 individuals were actually responsible for only about 0.05% of all views of vaccine-related content on Facebook, including both accurate and inaccurate posts.2

Flawed Methodology 

The CCDH report’s methodology is a critical weakness that leads to misleading conclusions. The report analyzed only 483 pieces of content over six weeks from just 30 groups, some with as few as 2,500 members.1 This sample is not representative of the billions of posts about COVID-19 vaccines shared on Facebook and Instagram.3

Furthermore, the CCDH failed to provide clear explanations for how they identified content as “anti-vax” or chose the groups they included in their analysis.2 This lack of transparency raises questions about the report’s objectivity and the validity of its findings.4

Disregarding Meta’s Efforts 

The CCDH report also failed to acknowledge the steps Meta had taken to combat what it called “vaccine misinformation.” Meta reported removing over three dozen pages, groups, and Facebook or Instagram accounts linked to the “Disinformation Dozen,” as well as imposing penalties on nearly two dozen additional pages, groups, or accounts.2 It would later be revealed in the “Facebook Files” email leaks that these targeted individuals were not, in fact, in violation of Meta’s policies but were deplatformed by Meta staff anyway, most likely due to White House pressure.

In essence, by focusing on a small group of individuals and ignoring Meta’s efforts to address misinformation, the CCDH report presented an incomplete and biased picture of the situation.5

Widespread Condemnation and Calls for Censorship 

Despite its flaws, the CCDH report led to widespread condemnation of the named individuals and calls for censorship. The report went to great lengths to disparage those named, publishing screenshots of their social media posts alongside accusations of spreading misinformation.1

The impact was far-reaching, with top government officials, state attorneys general, and members of Congress referencing it to pressure tech companies to take action.6 In July 2021, President Joe Biden himself stated that the 12 people named were “killing people” by spreading vaccine misinformation, putting immense pressure on social media companies to censor them.7

Fourteen state attorneys general sent a letter to Facebook CEO Mark Zuckerberg, demanding answers about whether the “Disinformation Dozen” received special treatment exempting them from content moderation rules.8 The U.S. Surgeon General and Department of Homeland Security also cited the report in their efforts to combat alleged COVID-19 misinformation.9 10

During a Congressional hearing, lawmakers questioned the CEOs of Facebook, Twitter, and Google, pressing them to immediately deplatform the 12 individuals named.11

Defamatory Claims and Unintended Consequences

Given the flaws in the report’s methodology and the contradictory data provided by Meta, the CCDH’s claims appear to be misleading and potentially defamatory. Critics argue that the campaign against the “Disinformation Dozen” amounts to an attack on free speech and sets a dangerous precedent of government officials pressuring private companies to censor individuals with whom they disagree.12

The targeting of these individuals, based on a flawed report, could lead to the suppression of legitimate debate and dissent on important issues.13 By focusing on a handful of people, the CCDH report risks oversimplifying the complex issue of medical misinformation and diverting attention from the need for a more comprehensive approach.14

Conclusion

The CCDH’s “Disinformation Dozen” report, while ostensibly well-intentioned, is undermined by its flawed methodology, lack of transparency, and failure to acknowledge Meta’s efforts to combat misinformation on its platform. The report’s claims, when viewed in light of Meta’s data, appear to be misleading and potentially defamatory.

As the controversy surrounding the report continues, it raises critical questions about the role of government, tech companies, and advocacy groups in policing online speech and the potential for overreach and abuse in the name of combating misinformation.15

To effectively address medical misinformation, we must rely on accurate, transparent, and unbiased research. Overstating the influence of a small group risks diverting attention from broader issues and may lead to unintended consequences. The focus should be on developing comprehensive, evidence-based strategies to counter misinformation and promote accurate information across all platforms.16
