It’s bad enough that the editors of the New York Times have so far refused to tell the truth about what we know of the magnitude of the death toll in Iraq resulting from the US invasion and occupation of the country since 2003, judged by the standards the paper uses to describe human tragedies for which the U.S. government does not bear primary responsibility. If the New York Times applied the same standards of evidence regardless of the degree of U.S. responsibility, it would report that “hundreds of thousands of Iraqis have died” as a result of the US war, a fact we know with the same level of confidence as similar facts the paper publishes as a matter of routine (such as a recent report that “hundreds of thousands of Iraqis died” in the Iraq-Iran war). The New York Times is reluctant to publish this fact about the U.S. war, perhaps because it is awkward to acknowledge for those in Washington who support the status quo policy of permanent war.
But now the New York Times has exacerbated the harm of its denial about the Iraqi death toll, by using its own failure to accurately report the death toll in Iraq as a benchmark for comparison to other human tragedies: in particular, to claim that murder in Venezuela claimed more lives in 2009 than did violence in Iraq. The New York Times editors are like the boy who killed his parents and demanded mercy on the grounds he was an orphan.
In a front page article this week headlined "Venezuela, More Deadly Than Iraq, Wonders Why," NYT reporter Simon Romero claims:
Some here [in Caracas] joke that they might be safer if they lived in Baghdad. The numbers bear them out.
In Iraq, a country with about the same population as Venezuela, there were 4,644 civilian deaths from violence in 2009, according to Iraq Body Count; in Venezuela that year, the number of murders climbed above 16,000.
Note that the headline and the first two paragraphs of this piece depend crucially on the assumption that the partial tally of Iraqi deaths constructed by the NGO Iraq Body Count from press reports gives an accurate picture of the magnitude of the Iraqi death toll. If the Iraq Body Count partial tally is not an accurate picture of the magnitude of the Iraqi death toll, if it understates the true toll by a large factor, then the comparison in the lede and the headline of the New York Times article is baseless.
But we know, by the standards ordinarily used to establish such things, that the Iraq Body Count partial tally is not an accurate measure of the magnitude of the Iraqi death toll.
In January 2008 the World Health Organization reported the results of the "Iraq Family Health Survey," published in the New England Journal of Medicine. The WHO study estimated 151,000 deaths due to violence, with a 95% confidence interval of 104,000 to 223,000, from March 2003 through June 2006.
The New York Times reported at the time,
[…] The World Health Organization on Wednesday waded into the controversial subject of Iraqi civilian deaths, publishing a study that estimated that the number of deaths from the start of the war through June 2006 was at least twice as high as the oft-cited Iraq Body Count.
The Iraq Body Count, a nongovernmental group based in Britain that bases its numbers on news media accounts, put the number of civilians dead at 47,668 during the same period of time as the World Health Organization study, the W.H.O. report said. President Bush in the past used a number that was similar to one put forward at the time by the Iraq Body Count.
About this, the WHO said at the time:
"Our survey estimate is three times higher than the death toll detected through careful screening of media reports by the Iraq Body Count project and about four times lower than a smaller-scale household survey conducted earlier in 2006," added Naeema Al Gasseer, the WHO Representative to Iraq.
The latter reference is to the Johns Hopkins/Lancet study, which, as the WHO official noted, estimated a death toll from violence about four times higher than the WHO’s. If the Lancet estimate were correct, then the Iraq Body Count number would be about 12 times too small, since the WHO estimate is roughly three times the Iraq Body Count tally and the Lancet estimate roughly four times the WHO’s.
But here I emphasize the WHO study because it makes the stronger case that treating the Iraq Body Count partial tally as if it were a picture of the magnitude of the overall death toll is very wrong. The Lancet numbers have been disputed as too high. The WHO numbers have been disputed as too low, but as far as I am aware, no serious critic claims that they are too high.
What does the WHO study tell us about whether the Iraq Body Count tally captures the magnitude of the Iraqi death toll?
It tells us, by the standards ordinarily used in statistics, that it does not.
A 95% confidence interval means, roughly, that you assess a 95% probability that the interval covers the true value you are trying to estimate, with about 2.5% of the remaining probability lying below the interval’s lower bound and about 2.5% above its upper bound. If the WHO study was correct, then the probability that the true death toll from violence as of June 2006 was 47,668, or any other number less than 100,000 (below the interval’s lower bound of 104,000), was extremely small: less than 2.5%.
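As a rough back-of-the-envelope illustration of my own, and not a calculation from the WHO paper, one can note that the reported interval is nearly symmetric around the 151,000 estimate on a log scale and treat the estimate as approximately log-normal; the question is then how much probability such a distribution puts below 100,000, or below the Iraq Body Count tally:

```python
from math import exp, log
from statistics import NormalDist

# Figures reported by the WHO Iraq Family Health Survey
# (violent deaths, March 2003 through June 2006)
ci_low, ci_high = 104_000, 223_000     # reported 95% confidence interval

# Assumption (mine): the interval is symmetric on the log scale, so model
# log(deaths) as Normal, with the interval spanning +/- 1.96 standard errors.
mu = (log(ci_low) + log(ci_high)) / 2
sigma = (log(ci_high) - log(ci_low)) / (2 * 1.96)
dist = NormalDist(mu, sigma)

print(f"implied central estimate: {exp(mu):,.0f}")              # ~152,000, close to 151,000
print(f"P(toll < 100,000):  {dist.cdf(log(100_000)):.3f}")      # ~0.015
print(f"P(toll <= 47,668):  {dist.cdf(log(47_668)):.1e}")       # ~1e-9
```

Under that assumption the chance of a true toll below 100,000 comes out around 1.5 percent, and the chance of a toll as low as the Iraq Body Count tally is vanishingly small; the exact figures depend on the distributional assumption, but the qualitative conclusion does not.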
In response to my request for a correction or clarification, a New York Times editor wrote that Romero did not
declare the Iraq Body Count correct; he simply used an official figure, even if one subject to debate, to make a comparison with the violence in Venezuela.
But this explanation is inaccurate and does not make sense.
As the New York Times correctly reported in January 2008, the Iraq Body Count partial tally of Iraqi deaths is not an “official figure.” It is compiled by “a nongovernmental group based in Britain that bases its numbers on news media accounts.” You could say it’s “official” because George W. Bush implicitly endorsed it, but I don’t think that’s a definition of “official” that the New York Times editors would want to try to defend.
And if you want to say that X is bigger than Y, you have to know how big Y is, or at least have a handle on how big Y might be. It makes a big difference if your “subject to debate” way of measuring Y produces a number that is too small by a large factor, and certainly if the error might be great enough that, in truth, Y is bigger than X. According to the numbers given in Romero’s New York Times piece, if the Iraq Body Count tally of 4,644 is only three times too small, then Romero’s claim might still be true (roughly 13,900 Iraqi deaths against more than 16,000 Venezuelan murders); but if it is four times too small, then Romero’s claim is false (roughly 18,600 deaths). If Iraq Body Count is too small by a factor of 10 or more, as the Lancet study suggested, then Romero’s claim is way off. Thus, to judge the leading claim of Romero’s article and the NYT headline that accompanied it on the front page, you have to make a judgment about the competing estimates of the scale of Iraqi deaths.
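The break-even point is easy to compute from the two figures quoted in the article; the following sketch (my own, not the Times’) simply scales the Iraq Body Count tally by various hypothetical undercount factors and compares the result to the Venezuelan murder count:

```python
# Figures quoted in Romero's article for 2009
IBC_TALLY = 4_644            # Iraq Body Count civilian deaths from violence
VENEZUELA_MURDERS = 16_000   # murders "climbed above 16,000"

print(f"break-even undercount factor: {VENEZUELA_MURDERS / IBC_TALLY:.2f}")  # ~3.45

# How the headline comparison changes under hypothetical undercount factors
for factor in (1, 2, 3, 4, 10, 12):
    implied_toll = IBC_TALLY * factor
    verdict = "Venezuela deadlier" if implied_toll < VENEZUELA_MURDERS else "Iraq deadlier"
    print(f"undercount x{factor:>2}: implied Iraqi toll {implied_toll:>6,} -> {verdict}")
```

Any undercount factor above roughly 3.4 reverses the headline’s comparison, and the WHO and Lancet figures for 2003-2006 imply factors of about 3 and 12 respectively; whether the factor for 2009 was above or below that threshold is exactly the judgment the article never makes.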
Isn’t that obvious?
It’s true, of course, that the degree to which Iraq Body Count is a poor measure of the magnitude of Iraqi deaths might not be constant over time. You might reasonably expect that it captured a smaller share of deaths at the times of greatest violence, and therefore that it captured a greater share of deaths in 2009, when violence, by all accounts, was much lower than at the peak of the civil war. (Of course, noting that violence was much lower in Iraq in 2009 suggests Romero’s comparison was misleading in another way: when people think of "violent Iraq," they are more likely thinking of Iraq at the height of the civil war than in 2009.) But there is no evidence that the Times made any effort to judge these issues. They just acted as if the Iraq Body Count partial tally was a picture of the magnitude of deaths, which it manifestly is not – unless you’re George Bush.
It is reasonable to expect that the overwhelming majority of people who saw and will see the front-page story in the New York Times won’t be aware of any of this. They will see the headline "More Killings in Venezuela Than in Iraq," complete with a huge color photo of a funeral with grieving relatives of a murder victim. This will go all over the world, and many people will think, "Isn’t that amazing! More killings in Venezuela than Iraq! That Hugo Chavez has really made a mess out of the country."
Arguably, that is the point of such an article: to produce exactly this result.
The publication of this article coincides with an all-out effort to make violence and insecurity in Venezuela the main opposition campaign theme in the September congressional elections in Venezuela.
Since most of the Venezuelan media, as measured by audience, is controlled by the opposition, violence and insecurity have been the dominant theme in that media lately. CNN en Español did its part by airing, four times, a documentary on violence in Venezuela that blamed the government. Now the Times has provided international validation for this campaign. No meetings to establish collaboration are necessary: a New York Times reporter in Caracas seeking to attack the Venezuelan government can easily take his cues from the opposition media.
There has indeed been a large increase in the murder rate in Venezuela over the last decade. There is something to be explained, since poverty, a standard explanation for increased criminal violence, was sharply reduced in Venezuela over that period.
But Romero offers almost nothing in the way of explanation, and most of what he does offer is wrong or makes no sense:
Reasons for the surge are complex and varied, experts say. While many Latin American economies are growing fast, Venezuela’s has continued to shrink.
This could possibly explain some of the crime in the first quarter of 2010, when the Venezuelan economy did shrink. But Venezuela’s economic growth was the fastest in the hemisphere from 2003 to 2008, so the fact that there was one quarter in which most of Latin America was growing and Venezuela was not doesn’t explain a ten-year trend.
The gap between rich and poor remains wide, despite spending on anti-poverty programs, fueling resentment.
A few months ago the UN Economic Commission for Latin America and the Caribbean published a report showing that Venezuela had reduced inequality more than any other country in Latin America from 2003 to 2008, and now had the lowest level of inequality in the region. Not surprisingly, this has not been reported in the Times.
Police salaries remain low, sapping motivation. And in a country with the highest inflation rate in the hemisphere, more than 30 percent a year, some officers have turned to supplementing their incomes with crimes like kidnappings.
Inflation has averaged about 20 percent annually over the last seven years; however, since nominal incomes grew much more rapidly than that, most people gained quite a bit in real terms, which is what matters.
This has been standard for NYT reporting on Venezuela over the past seven years: high inflation is reported regularly, but the real income gains have almost never been noted, leaving the reader to think that most Venezuelans are worse off each year as inflation erodes their real income, which is the opposite of what has happened in nearly six of the last seven years.
Would you feel sorry for someone whose cost of living went up 20% last year, while they got a 30% raise? Then you should feel just as sorry for someone whose cost of living stayed flat while they got a 10% raise: the two end the year with roughly the same gain in purchasing power.
But if you understand that, then you understand that it’s meaningless to report high inflation, as if high inflation intrinsically made people poor, without telling the reader what was happening to real incomes.
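A minimal sketch of that arithmetic, using the round numbers from the example above rather than any official Venezuelan statistics: what matters for purchasing power is the ratio of nominal income growth to price growth, not the inflation rate by itself.

```python
def real_income_change(nominal_growth: float, inflation: float) -> float:
    """Fractional change in purchasing power, given nominal income growth
    and price inflation (both expressed as fractions, e.g. 0.20 for 20%)."""
    return (1 + nominal_growth) / (1 + inflation) - 1

# The two people in the example above end up in roughly the same place:
print(f"{real_income_change(0.30, 0.20):.1%}")   # 30% raise, 20% inflation -> 8.3% real gain
print(f"{real_income_change(0.10, 0.00):.1%}")   # 10% raise, flat prices   -> 10.0% real gain

# High inflation only makes people poorer when incomes fail to keep pace:
print(f"{real_income_change(0.15, 0.20):.1%}")   # incomes lag prices -> about a 4.2% real loss
```

On these illustrative numbers, the 30 percent raise in a 20 percent inflation year and the 10 percent raise with stable prices are worth roughly the same, which is the point of the comparison above.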
Many may say “so what else is new” regarding the tendency of the Times to slant the news in the direction of a hawkish U.S. foreign policy. But the Times’ influence on the US media is so great that the Times affects the thinking of many people who never read it. That’s why it’s important to call them to account.