The October 21 issue of the Lancet carries a controversial
article on civilian deaths in Iraq since the invasion. The article reports on a survey which suggests that an estimated 654,965 (95% CI 392,979–942,636) excess deaths may have occurred in Iraq between March 2003 and July 2006 as a result of the war (
news release). An earlier study by the same authors estimated that there were around 100,000 excess deaths during the early part of the war. The death rate has steadily increased since the war began. The study found that most of the additional deaths were of men and caused by gunshots or bombs. The Washington Post provides further information on the study in a discussion
transcript with Dr. Gilbert Burnham, lead author of the study and Co-Director, Center for Refugee and Disaster Response, Johns Hopkins University Bloomberg School of Public Health. As for the reasons for doing the study, he says: “Our Center in Baltimore is one that is focused on public health in disasters--and we look at many types of disasters--being heavily involved in estimating deaths in Aceh after the tsunami and also from famine in North Korea. Of all the disasters--war is the worst--any way you sample, any way you count.”
This is a much higher estimate than is given by other sources. Jefferson Morley in his Washington Post
blog provides a good summary of the reaction to this article. Additional summaries can be found at various news sites (
1,
2,
3). (The variation between 600,000 and 655,000 in different reports reflects either violent deaths (the lower number) or total excess deaths.) The political reaction was predictable: President Bush denounced the estimate as far too high. His supporters generally agreed, citing figures of 30,000–60,000 dead. U.S. and Iraqi government estimates that have attempted to go beyond official reports are higher than most other estimates but around five times lower than this new report. Even some less biased groups such as the
Iraq Body Count think the new estimates may be a factor of 5 to 10 too high. At the other extreme, some antiwar groups were immediately convinced.
The reliance on official sources by most others seems to be the source of much of the disparity among the estimates. This study actually went out and asked people about deaths in their family (or living group). Most (92%) of the reported family deaths were backed up by death certificates. If I understand Dr. Burnham’s discussion answers correctly, the study group found that official reporting channels had recorded only 20% of these deaths, or, in some cases, none of them.
Many pollsters, demographers, and others knowledgeable in statistical survey methods agree that the study was well designed and that its conclusions may be correct. But there are numerous critics. Some commentators simply distrust statistical methods or assume that the authors made some elementary mistake (counting the same deaths multiple times, etc.). These critics often point out that the study would imply that officials who currently acknowledge a death rate of 86 per day are overlooking more than 500 additional bodies every day. As one of the study’s authors points out in the BBC
article, “There have to be ~300 deaths per day from natural cause even if Iraq was the healthiest 26 million people in the world.” Clearly, the officials are not reporting most of the country’s deaths whether or not this study is correct.
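That figure is easy to check with back-of-the-envelope arithmetic. A minimal sketch, assuming the 26 million population from the quote and a crude death rate of 4 per 1,000 per year (my assumption; roughly the lowest rate observed anywhere in the world):

```python
# Sanity check on the "~300 natural deaths per day" claim.
# Assumed inputs: 26 million people, crude death rate of 4 per 1,000
# per year (an optimistic, "healthiest country" baseline).
population = 26_000_000
crude_death_rate = 4.0 / 1000        # deaths per person per year
deaths_per_day = population * crude_death_rate / 365
print(round(deaths_per_day))         # prints 285
```

So even a best-case baseline puts natural deaths alone well above the official 86-per-day count.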
Additional criticism of the study was written by
Steven Moore in the Wall Street Journal. Moore thinks the choice of only 47 sampling clusters was far too low: “I wouldn't survey a junior high school… using only 47 cluster points.” It would be nice to have data from more clusters. The small number of clusters, and the few deaths actually reported, are the major cause of the uncertainty in the death estimate. But a real bias would show up only if the study happened to include a cluster or two with an unusually high number of deaths.
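One way to see what the cluster count buys you is a toy simulation; this is not the study’s actual method, and every number below is invented for illustration. Households within a cluster share a local rate, which is what widens cluster-survey confidence intervals, and the standard error shrinks as clusters are added:

```python
import random
import statistics

random.seed(1)

def survey(n_clusters, households_per_cluster=40, true_rate=0.10):
    """Simulate a clustered survey of a rare event (invented rates)."""
    estimates = []
    for _ in range(n_clusters):
        # Each cluster's local rate varies around the true rate,
        # mimicking within-cluster correlation.
        local = max(0.0, random.gauss(true_rate, 0.05))
        hits = sum(random.random() < local
                   for _ in range(households_per_cluster))
        estimates.append(hits / households_per_cluster)
    mean = statistics.mean(estimates)
    std_err = statistics.stdev(estimates) / n_clusters ** 0.5
    return mean, std_err

for n in (12, 47, 200):
    mean, se = survey(n)
    print(f"{n:3d} clusters: estimate {mean:.3f} +/- {1.96 * se:.3f}")
```

More clusters narrow the interval roughly with the square root of the cluster count, which is why 47 clusters gives a wide but not meaningless range.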
There are knowledgeable critics who fault the study for failing to collect demographics on the sampled households and match them against the overall Iraqi population (assuming valid figures are available). This would have improved the accuracy of the study, but the authors felt that collecting such data would make families less likely to participate. Many of the report’s critics dislike its use of a pre-war death rate of 5.5 per 1,000; a UN study used 9 per 1,000. However, the lower 5.5 rate is what the study obtained from the family reports, and it is in line with the official CIA estimate. The higher pre-war death rate would make fewer of the deaths “excess.”
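The effect of the baseline choice is simple arithmetic. A rough sketch, assuming a population of 26 million, the 40-month study window, and a post-invasion rate of 13.3 per 1,000 per year (the study’s widely reported figure, but treat it here as my assumption):

```python
# How the pre-war baseline changes the excess-death total.
# Assumed inputs: 26 million people, a 40-month window (March 2003 to
# July 2006), and a post-invasion rate of 13.3 per 1,000 per year.
population = 26_000_000
years = 40 / 12
post_rate = 13.3 / 1000              # deaths per person-year

for pre in (5.5, 9.0):               # the two baselines discussed above
    excess = (post_rate - pre / 1000) * population * years
    print(f"pre-war rate {pre}/1,000 -> {excess:,.0f} excess deaths")
```

With these inputs the 5.5 baseline yields roughly 676,000 excess deaths, close to the published figure, while the 9 per 1,000 baseline cuts the total to roughly 373,000, which is why the baseline matters so much to critics.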
A news article in Science magazine quotes physicists at Oxford University who are concerned that the study’s avoidance of back alleys introduced a “main street bias” which could have inflated the mortality estimates. A similar concern was expressed elsewhere, but no one has offered a clear explanation of why this would lead to a higher death rate. It would be nice to see this concern addressed, but security considerations kept the sampling on larger streets.
I have read the study (as a mathematically knowledgeable non-expert) and find it fairly convincing. My biggest worry is that a sampling bias occurred in spite of the authors' serious effort to randomize their selection of sampling clusters. In a war zone, though, I don’t know why the biases would favor a high rather than a low estimate. The “limitations” section of the report has some chilling hints suggesting that the estimate could be low rather than high. The study directly measures the death rate, so if a substantial number of people have fled Iraq, the number of deaths computed from that rate would be lower. It boggles my mind that the death rate could be so high without much official notice, but I think someone will have to do a well-funded, credible, and complete study before these numbers can be dismissed (or accepted).
Critics are skeptical of the “October surprise” timing of the release and of the fact that it was a statistical survey instead of a direct body count. They also don’t like the fact that some news outlets reported the 655K excess deaths while others reported the 600K violent deaths. I have yet to find an entirely compelling argument against the study, but it seems destined to be ignored because it both annoys the war’s supporters and disagrees with studies generated by groups that would like a big club to use on the Bush administration. Radical groups will adopt it, but their approval won’t help the study’s credibility. The authors seemed to think they were just publishing a nonpartisan disaster evaluation.