Articles in the media about studies on health, safety and other matters often suffer from subtle statistical flaws that can cause major misunderstandings, says an MIT researcher. The same is true for advertisements that cite scientific research, he adds.
Dr. Arnold I. Barnett contends that some statistical flaws were already present in the original studies, while others reflect misconceptions by journalists about what the studies actually found.
"The most cautious general course for the reader is to treat such reports more as public announcements that studies have been done than as clear guides to their content or reliability," said Dr. Barnett, professor of operations research/statistics at MIT's Sloan School of Management and himself an author of well-known studies on topics such as air safety, voting patterns, war casualties and urban homicide risk.
"Think tanks, government agencies, special-interest groups and academics conduct myriad studies about health, safety, and the physical and social sciences," he writes in his article, "How Numbers Are Tricking You," which appeared in the October issue of Technology Review, a magazine of technology policy published at MIT.
Dr. Barnett said the popular press usually summarizes studies by relying on a few statistics. "Unfortunately," he said, "many such reports are compromised by errors in statistical reasoning. And while people have developed a healthy skepticism about advertisements that also appear in the media, the numbers in these paid-for messages can be even more distorted than we cynically imagine."
In an interview, Dr. Barnett said that most media reports on research are accurate. Where problems do arise, he said in his article, a substantial fraction of statistical misunderstandings fall into a half-dozen categories. These make up what he calls the "six deadly sins of statistical misrepresentation": generalizing from nonrandom samples; constructing trends from temporary fluctuations; verbal imprecision; "unjust" laws of averages; unsound comparisons; and hidden defects.
His illustrations of troubled media entries include the following:
(1) A report that episodes of anger can double the risk of heart attack made no allowance for the possibility that releasing tension through anger might prevent some heart attacks. Thus, its implied advice to "stay cool" may have been misguided.
(2) An airline that carried eight percent of passenger traffic between New York and Chicago announced in an advertisement that it was the favorite airline of 84 percent of travelers on that route. It reached this conclusion through a "random" survey of travelers that was restricted to its own passengers.
(3) An International Airline Passengers Association study dividing US airlines into two categories by safety failed to note that modest changes in the time period under consideration would drastically alter the groupings.
(4) A report suggesting that a "glut" of doctors was pushing down physicians' salaries did not adjust for a huge increase in the number of young doctors, which would have reduced the overall average salary even if doctors' earnings were going up at every particular age.
(5) NASA's estimate of the probability that debris from the falling Skylab would threaten people on earth was distorted by the inadvertent assumption that the debris could under no circumstances hit more than one person.
(6) Reports that four of the 10 largest US cities reached all-time highs in killings in 1991 failed to point out that these four cities also reached all-time highs in population. Thus, even if their per capita murder rates "had been unchanging since Cain slew Abel," they would have had more murders in 1991 than before.
(7) Like the media, the US Supreme Court misunderstood the conclusions of a study about racial disparities in death sentencing because it erroneously viewed odds and probabilities as interchangeable.
(8) A study reporting that "safe" 40-year-old drivers were less likely to die on a 600-mile journey by car than by airplane included several peculiar assumptions, the effect of which may have been to dismiss a result five times more favorable to air travel.
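The averaging pitfall in item (4) can be made concrete with a short numerical sketch. The figures below are invented for illustration, not drawn from the study the article describes; they simply show how an influx of lower-paid young doctors can pull the overall average salary down even while salaries rise within every age group.

```python
# Hypothetical illustration of item (4): a shift in the age mix can lower the
# overall average salary even though pay rises in every individual age group.

# Made-up figures: (age group, head count, mean salary in $ thousands)
year_1 = [("young", 100, 90.0), ("senior", 100, 150.0)]
year_2 = [("young", 300, 95.0), ("senior", 100, 155.0)]  # pay is up in BOTH groups

def overall_mean(groups):
    """Head-count-weighted mean salary across all groups."""
    total_pay = sum(n * salary for _, n, salary in groups)
    total_heads = sum(n for _, n, _ in groups)
    return total_pay / total_heads

print(overall_mean(year_1))  # 120.0
print(overall_mean(year_2))  # 110.0 -- the average falls despite raises everywhere
```

The overall mean drops from 120 to 110 only because the mix shifted toward the lower-paid group, which is exactly the adjustment the report described in item (4) failed to make.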
"Ultimately," Dr. Barnett concluded, "should the conclusions of the study matter to the reader, then there is no avoiding the arduous task of finding the study and reading it." But he offered the upbeat assessment that "for the alert individual, statistical humbug should be no harder to ferret out than other forms of illogical argument. It just takes practice and time."
A version of this article appeared in the October 19, 1994 issue of MIT Tech Talk (Volume