Sometimes "common sense" hurts you in a simple statistical calculation, leading you to a wrong answer. And sometimes "common sense" helps you, by allowing you to understand why the answer is wrong.
Suppose you're going to select five random numbers between 0 and 9, with each selection being independent of the others. One way to do this is to throw an icosahedral die--a die with 20 triangular faces instead of the well-known six square faces--with each digit from 0 to 9 appearing on two faces. The die is thrown five times and the digit that comes up is recorded each time. What is the probability that at least one of the selected numbers will be an 8?
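(If you'd rather see the selection procedure than imagine it, here's a rough sketch in Python--the random module standing in for the physical die; the code is mine, not part of the original setup:)

    import random

    # Simulate the icosahedral die: 20 faces, with each digit 0-9 appearing on two of them.
    faces = list(range(10)) * 2           # [0, 1, ..., 9, 0, 1, ..., 9]

    def throw_die():
        return random.choice(faces)       # same as picking a digit uniformly at random

    # Five independent throws, recording the digit that comes up each time.
    digits = [throw_die() for _ in range(5)]
    print(digits)                         # e.g. [3, 8, 0, 8, 5]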
"Common sense" says the probability is 50 percent, or 0.50. The reasoning is thus: On one throw, the probability of getting an 8 is ten percent, or 0.10. Because there are five throws, the probability of getting the target number is five times as good, or 0.10 * 5 = 0.50.
This argument sounds really good and would be utterly convincing to over 80 percent of an average group of adults. (Source: statistics I've pulled out of my own rear end.)
But the computation is wrong. And with a little "common sense," it's easy to see that it is indeed wrong.
By the same "logic," the chance of getting an 8 in six throws would be sixty percent (0.60), and the chance of getting an 8 in seven throws would be seventy percent (0.70). Carrying the logic further, the chance of getting an 8 in ten throws would be one hundred percent (1.00), or a certainty.
At this point, common sense kicks in and detects that something is wrong. Can it really be true that in ten throws the number 8 is guaranteed to come up at least once? After all, that's what one hundred percent means, doesn't it, that the result is guaranteed? That doesn't seem quite right. Sure, an 8 is very likely to come up at least once in ten throws, but it's not a certainty, is it?
Similarly, if there are twelve throws, "common sense" says that it is very likely that an 8 will come up at least once, but it's not an absolute certainty. And yet, by the "logic" discussed above, the chances of at least one 8 appearing in twelve throws are one hundred and twenty percent, which means ... what?? Better than certainty?
Also, who said that the number had to be an 8? It could have been a 9 or a 2 or a 0. By the "logic" discussed above, each of these numbers is guaranteed to make at least one appearance in ten throws. It would follow, wouldn't it, that in ten throws each digit from 0 through 9 makes one and only one appearance, with no duplications?
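(If you're skeptical, it's easy to see for yourself: simulate a batch of ten throws and look at how the digits actually fall. This is just a quick sketch using Python's random module, but duplicates and missing digits show up almost every time:)

    import random

    # Ten throws of the die: repeats and gaps are the norm, not the exception.
    throws = [random.randrange(10) for _ in range(10)]
    print(sorted(throws))                        # e.g. [0, 2, 2, 3, 5, 5, 7, 8, 8, 9]
    print(len(set(throws)), "distinct digits")   # usually well under 10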
"Common sense" detects something wrong here, although it not clear exactly what is wrong. The logic seemed so straightforward, and the calculations seemed so simple. But they were still quite wrong.
By the way, the correct way to figure this out is to turn the question around: the chance of not getting an 8 on a single throw is 0.90, and because the throws are independent, the chance of getting no 8s at all in five throws is 0.90 multiplied by itself five times, or about 0.59. So the chance of getting at least one 8 (or any other particular digit) in five throws is about 41 percent. To get a better than fifty percent chance of getting an 8, you need seven throws. And you have only about a sixty-five percent chance of getting an 8 in ten throws, a far cry from certainty.
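(If you want to check those numbers yourself, here's one way to do it in Python--the exact calculation from the paragraph above, plus a quick simulation of my own as a sanity check:)

    import random

    # The chance of at least one 8 in n throws is 1 minus the chance of no 8s at all.
    # Since the throws are independent, the chance of no 8s is 0.9 multiplied by itself n times.
    for n in (5, 6, 7, 10, 12):
        print(f"{n:2d} throws: {1 - 0.9 ** n:.4f}")
    #  5 throws: 0.4095   (about 41 percent)
    #  6 throws: 0.4686
    #  7 throws: 0.5217   (first time past fifty percent)
    # 10 throws: 0.6513   (about 65 percent -- not a certainty)
    # 12 throws: 0.7176   (and certainly not 120 percent)

    # A quick Monte Carlo check for five throws; the estimate hovers around 0.41.
    trials = 100_000
    hits = sum(any(random.randrange(10) == 8 for _ in range(5)) for _ in range(trials))
    print("simulated, 5 throws:", hits / trials)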