Alpha levels and Type I and Type II errors.


Why do we use an alpha level of .05 and not a smaller one like .01 or .001? -E.L. Thorndyke


The alpha level is defined as the probability of what is called a Type I error in statistics: the probability of rejecting H0 when in fact it is true.
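This definition can be checked with a quick simulation (a sketch using only the Python standard library, with a one-sample z-test and known sigma as an illustrative test; the function names here are just for this example). When H0 is true, the fraction of samples whose p-value falls below alpha should come out close to alpha itself:

```python
import math
import random

random.seed(0)

def z_test_p(sample, mu0=0.0, sigma=1.0):
    # Two-sided p-value for a one-sample z-test with known sigma:
    # p = 2 * (1 - Phi(|z|)), computed via the complementary error function.
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))

alpha = 0.05
trials = 20000

# Draw samples from a population where H0 (mu = 0) really is true,
# and count how often we (wrongly) reject it.
rejections = sum(
    z_test_p([random.gauss(0.0, 1.0) for _ in range(25)]) < alpha
    for _ in range(trials)
)

print(rejections / trials)  # close to alpha, since H0 is true in every trial
```

The rejection rate hovers near .05 no matter the sample size, which is exactly what "alpha is the probability of a Type I error" means.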

Now, why should we select an alpha level of .05? If we are really worried about the possibility that we will reject H0 when it is true, then why don't we use a smaller alpha level like .01 or even .001? That would minimize the chance that we would incorrectly reject H0.

The thing is that there is another error we could make (which statisticians call a Type II error). That is the error of not rejecting H0 when it is false. If you think about it, the stricter the criterion you set for rejecting H0 (i.e., the smaller the alpha level), the more likely it is that there will be cases where you *should* reject H0, but you don't.

The exact probability of a Type II error (failing to reject H0 when it is actually false) cannot be determined from the alpha level alone. It also depends on other things, such as the sample size and how large the true effect is (the details are not important here). The main thing is that as you set a more stringent (smaller) alpha level, like .01 or .001 (which decreases the probability of making a Type I error), you increase the likelihood of making a Type II error. Past experience has suggested that an alpha level of .05 is a good compromise between the likelihoods of making Type I and Type II errors, and so that is what we adopt in science.
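The tradeoff can also be seen in a simulation (again a standard-library sketch; the one-sample z-test, the true mean of 0.5, and the sample size of 25 are all illustrative assumptions, not values from the text). Here H0 (mu = 0) is false in every trial, so any trial where we fail to reject is a Type II error. Shrinking alpha makes those misses more common:

```python
import math
import random

random.seed(1)

def z_test_p(sample, mu0=0.0, sigma=1.0):
    # Two-sided p-value for a one-sample z-test with known sigma.
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2))

def type2_rate(alpha, true_mean=0.5, n=25, trials=5000):
    # Fraction of trials where H0 (mu = 0) is NOT rejected
    # even though the true mean is 0.5, i.e., H0 is false.
    misses = sum(
        z_test_p([random.gauss(true_mean, 1.0) for _ in range(n)]) >= alpha
        for _ in range(trials)
    )
    return misses / trials

for alpha in (0.05, 0.01, 0.001):
    print(alpha, type2_rate(alpha))
```

The printed Type II error rate climbs as alpha drops from .05 to .001, which is the compromise the paragraph above describes: protecting yourself harder against one error exposes you more to the other.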
