Funny, but this is not a correct statement. The test was also correct when it determined that 990,000 people were negative, and correct when it determined that the 1 infected person was positive. It was incorrect only 9,999 times out of 1 million tests, which is exactly what "99% accurate" means.
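For anyone who wants to check that arithmetic, here's a minimal Python sketch using the thread's round numbers (990,000 correct negatives, 9,999 false positives, one correctly detected case; those exact figures are the thread's simplification, not the original article's):

    # Sanity-check of the breakdown above: 1,000,000 people tested for a
    # one-in-a-million condition with a 99%-accurate test.
    population = 1_000_000
    true_positives = 1         # the one infected person, correctly flagged
    false_positives = 9_999    # healthy people the test wrongly flags
    true_negatives = 990_000   # healthy people the test correctly clears

    correct = true_positives + true_negatives   # 990,001
    accuracy = correct / population             # 0.990001
    print(f"overall accuracy: {accuracy:.2%}")  # -> 99.00%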
If you have an hour or so, do yourself a favor and listen to the Radiolab episode on Stochasticity: http://www.radiolab.org/2009/jun/15/ and if you tear through that, the one called "Numbers" will blow your mind.
I found this thread because I'm reading Doctorow's "Little Brother", which by the way contains a few paragraphs about false positives, just like this article. I think you're splitting hairs to say it's not a correct statement. It seems obvious to me that there was an implied "of the times the test detected an infection" in there. That's his entire point: a 99% accurate test, for a one-in-a-million event, will tell you that event occurred 10,000 times, and be wrong 9,999 of those times.
I agree that I might be splitting hairs. But I'm in research, and I guess that makes me sensitive to incorrect statistical language. When I think about the accuracy of a test, I'm thinking about the degree to which I can trust any result the test gives. Doctorow was talking about the accuracy of one kind of result the test gives (a positive), and then painting the value of the whole test in that light. It's worth considering, but it should be remembered that all those correct negative results are potentially valuable too.
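To make the hair being split concrete, here's a small sketch (same assumed numbers as above) of the two quantities being conflated: overall accuracy, which counts every result the test gives, versus the chance that any single positive result is actually right:

    # Overall accuracy vs. "how much should I believe a positive result?"
    true_positives = 1
    false_positives = 9_999
    true_negatives = 990_000
    population = true_positives + false_positives + true_negatives  # 1,000,000

    # Accuracy counts every correct result, positive or negative.
    accuracy = (true_positives + true_negatives) / population       # ~0.99

    # Positive predictive value only asks: given a positive, is it real?
    ppv = true_positives / (true_positives + false_positives)       # 1 / 10,000

    print(f"accuracy: {accuracy:.2%}")              # -> 99.00%
    print(f"chance a positive is real: {ppv:.2%}")  # -> 0.01%

Both numbers come from the same test; Doctorow's point rests on the second one, while the first is what "99% accurate" usually refers to.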
People do not support statistically insignificant things because they don't understand "statistically insignificant." They support them because any statistical analysis more in-depth than "most of the time" or "almost never" is unknowable. More than that, every example he lists is in the category of some large organization saying "we know the odds so you don't have to."

From an intellectual/psychological/emotional standpoint, people are giving over their trust to the TSA because then they don't have to think about it. They worry about random pedophiles because they think they can handle the non-random ones. What do we know about terrorists? Nothing. So we trust someone who ostensibly does.

This is not a "we don't understand statistics" problem; it's a "we're interested in minimizing risks that we don't have a firm grasp of" problem. Plenty of people know that they're more likely to be hit by lightning twice than to die in a terrorist attack... but they also know that if they stay indoors when it rains, they can minimize their likelihood of being struck by lightning. "Die in a terrorist attack?" That you have to leave up to the experts.
Fear is a great story, and terrorism is fear incarnate. It makes for great news, and great government contracts. The MSM could convince the public of almost anything. I agree the public quickly leaves it to the experts. They even leave the decision of 'what is risky?' to the experts.
With cars, the way we avoid crashes is by "not driving." Which we're not going to do, and besides, when we're driving, we're in charge. We can out-drive any crash! So in the end, reporting on car crashes isn't going to have that necessary "the world will kill you in unpredictable, random ways" quality that drives our irrational fear.
We just got a baby crib that is a few years old. It's a nice sturdy crib. However, what we choose to ignore is that since the side can go down, we will be napping our baby in a DEATH TRAP. http://thechart.blogs.cnn.com/2011/06/28/dangerous-drop-side... I'm sure that in a couple of years we will find that current cribs are actually infant execution devices. Cribs are pretty mundane, but we can be taught to fear them.
And have. http://www.forbes.com/2010/03/12/toyota-autos-hoax-media-opi... http://jalopnik.com/5493693/america-you-brought-the-toyota-h...