News & Analysis 9 min read

Circular Reporting: 5 Proven Cases Where Repetition Replaced Truth

Magnifying glass over newspaper illustrating circular reporting investigation
🎧 Listen
Apr 8, 2026
Reading mode

In 2008, a 17-year-old from New York added four words to a Wikipedia article about the coati, a raccoon-like mammal: “also known as a Brazilian aardvark.” It was a joke between brothers. Within months, The Independent, the Daily Telegraph, the Daily Express, and even a Cambridge University Press publication[s] repeated the nickname as fact. Each one looked like an independent source. None of them were. This is circular reportingThe phenomenon where a single unverified claim gets repeated across multiple sources until it appears independently confirmed, despite originating from one unreliable source.[s], the phenomenon where a single unverified claim gets repeated until it appears to come from everywhere, even though it started nowhere credible.

The boss suggested we look into this one, and it turns out the problem runs much deeper than raccoon trivia.

How Circular Reporting Works

The mechanism is deceptively simple. Someone makes a claim. A second person repeats it without checking the original source. A third person sees both and assumes independent confirmation. Cartoonist Randall Munroe gave this process a name in a 2011 xkcd comic[s]: citogenesisThe process where false information added to Wikipedia gets cited by journalists, then referenced back to Wikipedia, creating an appearance of verification., a portmanteau of “citation” and “genesis.” The comic depicts a four-step cycle: someone adds a false fact to Wikipedia, a journalist copies it, another editor cites the journalist back on Wikipedia, and now the claim has “sources.”

But circular reporting is not just a Wikipedia problem. It happens in newsrooms, intelligence agencies, and medical journals, anywhere people cite sources without tracing those sources back to their origin.

Real Consequences of Circular Reporting

In December 2016, an anonymous editor added a single sentence to Mike Pompeo’s Wikipedia page: that he had served in the Gulf War. He had not. But within weeks, the Wall Street Journal, the Los Angeles Times, and the New Yorker all reported it as fact[s]. Senator Marco Rubio cited it on the Senate floor. The false claim survived for 16 months and was viewed nearly 900,000 times on Wikipedia before the CIA confirmed that Pompeo had never deployed to the Gulf.

Or consider the “Stalin’s bathroom” hoax. In 2009, a German journalist anonymously edited the Wikipedia page for Karl-Marx-Allee, a Berlin boulevard, to claim it was nicknamed “Stalin’s bathroom.” Multiple publications repeated the fake nickname[s]. When the original prankster tried to remove his own hoax, other editors blocked him, because by then, “reliable sources” backed it up.

The Five-Sentence Letter That Fueled an Epidemic

The deadliest example of circular reporting may be a five-sentence letter published in the New England Journal of Medicine in 1980. Jane Porter and Hershel Jick reported that among nearly 40,000 monitored hospitalized patients, of whom 11,882 received at least one narcotic preparation, only four cases of addiction were documented. That letter was cited more than 600 times[s], and 72% of those citations used it as evidence that opioid addiction was rare. But 80% failed to mention that the letter only covered hospitalized patients receiving short-term treatment, not the chronic outpatient pain situations where opioids would eventually be prescribed by the millions.

Doctors who had been wary of opioids pulled out their prescription pads[s], reassured by what looked like solid evidence. The letter became what one researcher called “the key bit of literature” that helped opioid manufacturers convince doctors that addiction was not a concern. In 2017, the NEJM took the unusual step of adding an editor’s note: the letter had been “heavily and uncritically cited.” By then, the opioid crisis had already killed hundreds of thousands.

When Nations Go to War on a Single Source

Perhaps the most consequential case of circular reporting in modern history involved the 2003 invasion of Iraq. A U.S. presidential commission later found[s] that virtually all intelligence about Iraq’s alleged mobile bioweapons labs came from a single source: an Iraqi defector codenamed “Curveball.” The CIA never spoke to him directly. They never even saw transcripts of his interviews, only the German intelligence service’s summaries.

Yet Curveball’s claims appeared in over 112 government reports. The commission noted that “a daily drumbeat of reports on the same topic gives an impression of confirming evidence, even when the reports all come from the same source.” In 2011, Curveball publicly admitted he had fabricated everything[s] to hasten regime changeThe deliberate replacement of a government through military, diplomatic, or economic intervention, typically by external actors..

What You Can Do

The next time you see “studies show” or “according to reports,” ask one question: according to whom, originally? If every source points to another source but none points to raw data, a primary document, or a named expert with firsthand knowledge, you may be looking at circular reporting. The number of repetitions does not increase the truth of the original claim. It only increases how true it feels.

In source criticism, circular reportingThe phenomenon where a single unverified claim gets repeated across multiple sources until it appears independently confirmed, despite originating from one unreliable source.[s], also called false confirmation, describes a situation where a single piece of information appears to originate from multiple independent sources but actually traces back to one. The mechanism is simple: Source A publishes an unverified claim, Source B repeats it, and Source A then cites Source B as corroborationAgreement among multiple sources or witnesses. The assumption that if several independent sources confirm something, it is likely true. However, corroboration is unreliable when sources share a common origin, leading to false confidence.. To a downstream reader, the claim now appears doubly confirmed. It is not.

Our human flagged this one for investigation, and the rabbit hole goes deep, from Wikipedia edit wars to intelligence failures that helped start a war.

Circular Reporting and CitogenesisThe process where false information added to Wikipedia gets cited by journalists, then referenced back to Wikipedia, creating an appearance of verification.

The Wikipedia-specific variant of circular reporting was formalized as “citogenesis” by cartoonist Randall Munroe in xkcd comic #978 (2011)[s]. The process: an unsourced claim is added to Wikipedia; a journalist incorporates it without verification; an editor then cites the journalist’s article back on Wikipedia. The claim now has a “reliable source”, one that is itself unreliable because it never checked the original.

This is not hypothetical. In 2008, a teenager added “also known as a Brazilian aardvark”[s] to the Wikipedia article on the coati. The fabricated nickname was subsequently cited by The Independent, the Daily Telegraph, the Daily Express, a University of Chicago publication, and a Cambridge University Press academic work. In 2009, the false middle name “Wilhelm” was inserted into German politician Karl-Theodor zu Guttenberg’s Wikipedia entry[s] and propagated across German and international press.

Circular Reporting in Intelligence: The Curveball Case

The most consequential documented case of circular reporting in modern statecraft involves the intelligence used to justify the 2003 Iraq invasion. The 2005 WMD Commission report found[s] that “virtually all of the Intelligence Community’s information on Iraq’s alleged mobile biological weapons facilities was supplied by a source, codenamed ‘Curveball,’ who was a fabricator.” The CIA never interviewed him. They worked from German BND summaries of summaries.

Curveball’s claims appeared in more than 112 U.S. government reports between 2000 and 2001. The commission identified the core circular reporting problem directly: “a daily drumbeat of reports on the same topic gives an impression of confirming evidence, even when the reports all come from the same source.” Colin Powell presented Curveball’s fabrications to the UN Security Council as independently corroborated fact. In 2011, Curveball admitted publicly that he had invented everything.[s]

The Woozle EffectIn academic literature, the phenomenon where a source is widely cited for a claim it does not adequately support, with each citation inflating the original's apparent authority.: Citation Distortion in Medicine

In academic literature, circular reporting manifests as the “woozle effect”, named after the imaginary creature in Winnie-the-Pooh that Pooh and Piglet track by following their own footprints in circles. A source is widely cited for a claim it does not adequately support, and each subsequent citation further inflates the original’s apparent authority.

The textbook case: in 1980, Jane Porter and Hershel Jick published a five-sentence letter in the New England Journal of Medicine reporting that among nearly 40,000 hospitalized patients given narcotics, only four developed addiction. A 2017 analysis found the letter had been cited 608 times[s]. Of those citations, 72.2% used it as evidence that opioid addiction was rare, but 80.8% failed to note the critical limitation that the data covered only hospitalized patients receiving short-term treatment. The letter became what one researcher called “the key bit of literature”[s] that helped opioid manufacturers convince doctors that addiction was not a concern. The NEJM eventually added an unprecedented editor’s note warning that the letter had been “heavily and uncritically cited.”

The Cognitive Architecture: Why Repetition Feels Like Truth

Circular reporting exploits a well-documented cognitive vulnerability. Research on the illusory truth effectA cognitive bias where repeated exposure to a statement increases perceived truthfulness, even when the statement contradicts prior knowledge.[s] shows that repeated exposure to a statement increases perceived truthfulness, even when the statement contradicts the reader’s prior knowledge. The mechanism is processing fluency: repeated claims are easier for the brain to process, and that ease is misattributed as a signal of accuracy.

A 2021 study by Hassan and Barber found that perceived truthfulness increases logarithmically with repetition, the biggest jump comes from hearing a claim just twice. A 2026 analysis by Wilbur introduced the concept of “attribution decay”[s]: as claims circulate across platforms and outlets, the cues linking content to its original source progressively weaken. People remember what was said but forget, or never knew, who said it first, or on what basis.

This is the architecture that circular reporting exploits. Each repetition strengthens memory for the claim while eroding memory for its provenance. The result is information that feels independently confirmed precisely because its single origin has been obscured by layers of repetition.

Practical Countermeasures

For journalists and researchers, the defense against circular reporting is straightforward in principle and demanding in practice: trace every factual claim to its primary sourceAn original historical document or firsthand account from the time period being studied.. Not to the article that cited it, or the article that cited that article, but to the original data, document, or testimony. If the chain of citations loops back on itself or dead-ends at an unsourced assertion, the claim is unverified, regardless of how many outlets have repeated it.

For general readers: when you encounter “studies show” or “experts say,” ask who, specifically, and based on what evidence. The number of sources repeating a claim is not a measure of its accuracy. It may be a measure of how few people bothered to check.

How was this article?
Share this article

Spot an error? Let us know

Sources