Wednesday, February 02, 2005

100,000 and not counting

Daniel Davies, who back in November responded on the liberal Anglo-American social-sciences blog Crooked Timber to numerous attacks on the Lancet study -

“This isn’t an estimate. It’s a dart board”. The critique here, from Slate, is that the 95% confidence interval for the estimate of excess deaths (8,000 to 200,000) is so wide that it’s meaningless. It’s wrong. Although there are a lot of numbers between 8,000 and 200,000, one of the ones that isn’t is a little number called zero. That’s quite startling. One might have hoped that there was at least some chance that the Iraq war might have had a positive effect on death rates in Iraq. But the confidence interval from this piece of work suggests that there would be only a 2.5% chance of getting this sort of result from the sample if the true effect of the invasion had been favourable. A curious basis for a humanitarian intervention; “we must invade, because Saddam is killing thousands of his citizens every year, and we will kill only 8,000 more”.

- recently muses over the response:

Les Roberts, the principal author, is going through long dark nights of the soul, wondering if it was a tactical mistake to request accelerated peer review and to have been so vocal about the US elections (btw, the Chronicle reiterates the point we made here earlier; that accelerated peer review is uncommon but by no means unknown with important papers). The Lancet editor Richard Horton refuses to comment, and well he might given that he wrote an entirely misleading summary of the paper which referred to “100,000 civilian deaths” when the paper did not make this distinction.

But there is no way on earth that I am going to write a comment harping on about this or that minor faux pas on the part of the authors.

Because the fundamental point that Roberts makes in the article is absolutely correct; it is a far greater disgrace that 100,000 people can be needlessly killed and everybody carries on as they were before. You don’t have to accept an entirely consequentialist view of wars to accept that the consequences of wars have to be relevant to assessing whether they’ve succeeded or not. The best evidence that we have is that the consequences of this one were bloody disastrous.
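Davies's point about zero falling outside the interval can be made concrete with a quick back-of-the-envelope calculation. The sketch below is only a rough normal approximation built from the published interval endpoints (8,000 to 194,000 in the paper itself; Davies quotes the rounder 200,000), with the midpoint standing in for the point estimate; the study's own cluster-survey machinery is nothing like this shortcut.

```python
# A back-of-the-envelope sketch of Davies's confidence-interval point, not
# the study's actual estimator.  Assumes the excess-death estimate is
# roughly normal and treats the midpoint of the published 95% interval
# (8,000 to 194,000) as the point estimate.
from statistics import NormalDist

ci_low, ci_high = 8_000, 194_000        # published 95% interval for excess deaths
point = (ci_low + ci_high) / 2          # ~101,000, near the ~100,000 headline figure
se = (ci_high - ci_low) / (2 * 1.96)    # standard error implied by a normal 95% CI

p_favourable = NormalDist(mu=point, sigma=se).cdf(0)
print(f"Implied chance the war actually lowered mortality: {p_favourable:.1%}")
# Prints a figure under 2%, which is the sense in which "zero isn't in the
# interval" does real work in Davies's argument.
```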

...And links to a new article in the Chronicle of Higher Education documenting how rushing the study into print has ironically sunk rather than augmented its impact:

In late October, a study was published in The Lancet, a prestigious British medical journal, concluding that about 100,000 civilians had been killed in Iraq since it was invaded by a United States-led coalition in March 2003. On the eve of a contentious presidential election -- fought in part over U.S. policy on Iraq -- many American newspapers and television news programs ignored the study or buried reports about it far from the top headlines.

The paper, written by researchers at the Johns Hopkins University, Columbia University, and Baghdad's Al-Mustansiriya University, was based on a door-to-door survey in September of nearly 8,000 people in 33 randomly selected locations in Iraq. It was dangerous work, and the team of researchers was lucky to emerge from the survey unharmed.

The paper that they published carried some caveats. For instance, the researchers admitted that many of the dead might have been combatants. They also acknowledged that the true number of deaths could fall anywhere within a range of 8,000 to 194,000, a function of the researchers' having extrapolated their survey to a country of 25 million.

But the statistics do point to a number in the middle of that range. And the raw numbers upon which the researchers' extrapolation was based are undeniable: Since the invasion, the No. 1 cause of death among households surveyed was violence. The risk of death due to violence had increased 58-fold since before the war. And more than half of the people who had died from violence and its aftermath since the invasion began were women and children.

Neither the Defense Department nor the State Department responded to the paper, nor would they comment when contacted by The Chronicle. American news-media outlets largely published only short articles, noting how much higher the Lancet estimate was than previous estimates. Some pundits called the results politicized and worthless.

Les F. Roberts, a research associate at Hopkins and the lead author of the paper, was shocked by the muted or dismissive reception. He had expected the public response to his paper to be "moral outrage."

On its merits, the study should have received more prominent play. Public-health professionals have uniformly praised the paper for its correct methods and notable results.

"Les has used, and consistently uses, the best possible methodology," says Bradley A. Woodruff, a medical epidemiologist at the U.S. Centers for Disease Control and Prevention.

Indeed, the United Nations and the State Department have cited mortality numbers compiled by Mr. Roberts on previous conflicts as fact -- and have acted on those results.

What went wrong this time? Perhaps the rush by researchers and The Lancet to put the study in front of American voters before the election accomplished precisely the opposite result, drowning out a valuable study in the clamor of the presidential campaign.
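For readers wondering how a door-to-door survey of nearly 8,000 people turns into a national figure of about 100,000, the sketch below walks through the bare extrapolation arithmetic the Chronicle alludes to. The death counts and recall periods in it are hypothetical placeholders chosen only to land near the reported scale; the actual study measured its own mortality rates and adjusted for its cluster-sampling design, which is also what stretches the interval out to 8,000-194,000.

```python
# An illustrative sketch of the extrapolation arithmetic, not the Lancet
# team's cluster-survey estimator.  The death counts and recall periods
# below are hypothetical placeholders; only the sample size and national
# population come from the Chronicle's description.

SAMPLE_PEOPLE = 7_900          # the "nearly 8,000" people surveyed
POPULATION = 25_000_000        # Iraq's approximate population

pre_deaths, pre_years = 47, 1.2      # hypothetical: deaths and recall period before the invasion
post_deaths, post_years = 91, 1.5    # hypothetical: deaths and elapsed time since the invasion

pre_rate = pre_deaths / (SAMPLE_PEOPLE * pre_years)     # deaths per person-year, pre-war
post_rate = post_deaths / (SAMPLE_PEOPLE * post_years)  # deaths per person-year, post-war

excess = (post_rate - pre_rate) * POPULATION * post_years
print(f"Pre-war rate:  {pre_rate * 1000:.1f} per 1,000 per year")
print(f"Post-war rate: {post_rate * 1000:.1f} per 1,000 per year")
print(f"Extrapolated excess deaths: {excess:,.0f}")
```

The point is simply that a shift of a couple of deaths per thousand per year, multiplied across 25 million people and a year and a half of occupation, is what produces a six-figure total, and the sampling uncertainty around that small shift is what produces such a wide range.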
