The Classical Theory of Two Simple Hypotheses
We have a large shipment of devices delivered to our manufacturing
plant. Suppose we know with certainty that the proportion of
defective devices is either .01 or .001. We take a random sample and
compute p̂. Given p̂, how do we decide between .01 and .001?
In the classical theory of two simple hypotheses we denote these two
possibilities as:
H0: p = p0
H1: p = p1
where H0 is known as the Null Hypothesis and H1 is known as the
Alternative Hypothesis.
Given a decision, there are four possibilities:
Accept H0 and H0 is True.
Accept H0 and H1 is True.
Reject H0 and H0 is True.
Reject H0 and H1 is True.
We can represent these possibilities in a two-by-two table:
                     True State of World
                        H0        H1
                    ---------------------
         Accept H0 |  1 - α  |    β     |
Decision           |---------|----------|
         Reject H0 |    α    |  1 - β   |
                    ---------------------
Where:
α = P[Type I Error] = P[Reject H0 | H0 is True]
and
β = P[Type II Error] = P[Accept H0 | H1 is True]
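These error probabilities can also be estimated by simulation. The
following Python sketch does this for the defective-devices example
above; the sample size n = 5000, the cutoff of .005 on p̂, and the
assignment H0: p = .01 versus H1: p = .001 are all illustrative
assumptions, not part of the notes:

import numpy as np

# Monte Carlo estimate of the Type I and Type II error probabilities.
# Assumed decision rule: Reject H0 (p = .01) if p-hat < .005.
rng = np.random.default_rng(0)
n, p0, p1, cutoff, reps = 5000, 0.01, 0.001, 0.005, 100_000

p_hat_H0 = rng.binomial(n, p0, reps) / n   # p-hat when H0 is true
p_hat_H1 = rng.binomial(n, p1, reps) / n   # p-hat when H1 is true

alpha = np.mean(p_hat_H0 < cutoff)    # P[Reject H0 | H0 is True]
beta  = np.mean(p_hat_H1 >= cutoff)   # P[Accept H0 | H1 is True]
print(f"estimated alpha = {alpha:.4f}, estimated beta = {beta:.4f}")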
Hypothesis Test Between Two Means for the Normal Distribution
(σ² is known)
H0: μ = μ0
H1: μ = μ1 > μ0
A reasonable decision rule for this problem is:
If X̄n > μ0 + c then Reject H0
If X̄n < μ0 + c then Do Not Reject H0
Where c is some constant. In most circumstances μ0 < μ0 + c < μ1.
The α and β errors are:
α = P[X̄n > μ0 + c | μ = μ0]
β = P[X̄n < μ0 + c | μ = μ1]
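These two probability statements translate directly into code. Here is
a minimal Python sketch, using scipy.stats.norm; the function name and
arguments are our own, chosen for illustration:

from math import sqrt
from scipy.stats import norm

def error_probabilities(mu0, mu1, sigma, n, c):
    """Return (alpha, beta) for the rule: Reject H0 if X-bar_n > mu0 + c."""
    se = sigma / sqrt(n)                    # standard deviation of X-bar_n
    alpha = norm.sf(c / se)                 # P[X-bar_n > mu0 + c | mu = mu0]
    beta = norm.cdf((mu0 + c - mu1) / se)   # P[X-bar_n < mu0 + c | mu = mu1]
    return alpha, beta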
Example: Suppose n = 25, σ² = 400, and α = .05, and we have the
hypothesis test:
H0: μ = 100
H1: μ = 110
α = P[X̄n > 100 + c | μ = 100] = .05
  = P[(X̄n - 100)/(20/5) > (100 + c - 100)/4] = P[Z > c/4]
Now, since P[Z > 1.645] = .05, c/4 = 1.645 and c = 6.58.
β = P[X̄n < 106.58 | μ = 110]
  = P[(X̄n - 110)/4 < (106.58 - 110)/4]
  = P[Z < -.855] = Φ(-.855) = 1 - Φ(.855) ≈ .1963
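These numbers can be checked with scipy.stats.norm; the exact
computation agrees with the table-based values above up to rounding:

from scipy.stats import norm

# n = 25 and sigma = 20, so the standard error of X-bar_n is 20/5 = 4.
se = 20 / 25**0.5                        # = 4
c = se * norm.ppf(0.95)                  # c/4 = 1.645, so c ~= 6.58
beta = norm.cdf((100 + c - 110) / se)    # P[Z < -.855]
print(f"c = {c:.2f}, beta = {beta:.3f}")   # c = 6.58, beta = 0.196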
The only way to simultaneously reduce α and β is to increase the
sample size. With a fixed sample size n, reducing α causes β to
increase and vice versa.
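The tradeoff can be made concrete with the standard sample-size
formula for this one-sided test, n ≥ ((zα + zβ)σ/(μ1 - μ0))². The
Python sketch below applies it to the example above; the target error
rates α = .05 and β = .10 are assumptions chosen for illustration:

from math import ceil
from scipy.stats import norm

mu0, mu1, sigma = 100, 110, 20    # values from the example above
z_a = norm.ppf(0.95)              # z_alpha for alpha = .05
z_b = norm.ppf(0.90)              # z_beta for beta = .10
n = ceil(((z_a + z_b) * sigma / (mu1 - mu0)) ** 2)
print(f"need n >= {n}")           # n = 35 at these settings

So achieving α = .05 and β = .10 simultaneously requires raising n
from 25 to 35.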
There are many situations in which it is desirable to make α or β as
small as possible even at the cost of greatly increasing the other
error. A good example of this is disease testing:
                                  True State of World
                                  Has           Does Not
                                  Disease       Have Disease
                                -----------------------------
           Patient Has Disease |   1 - α    |      β       |
Doctor's                       |------------|--------------|
Decision   Patient Does Not    |     α      |    1 - β     |
           Have Disease        |            |              |
                                -----------------------------
Clearly, telling a patient that he/she does not have a disease when
he/she in fact has the disease -- the Type I error -- is much more
costly than telling a patient that he/she has a disease when he/she in
fact does not -- the Type II error. In the first instance, a sick
person can go on to infect other people and cause great harm. In the
second, the harm is to scare a healthy person. Hence, in disease
testing, minimizing the α probability makes sense.
Recall that the Hypothesis Test Between Two Means for the Normal
Distribution where σ² is known is:
H0: μ = μ0
H1: μ = μ1 > μ0
The decision rule for this problem is:
If X̄n > μ0 + c then Reject H0
If X̄n < μ0 + c then Do Not Reject H0
Which is equivalent to:
If (X̄n - μ0)/(σ/√n) > zα then Reject H0
If (X̄n - μ0)/(σ/√n) < zα then Do Not Reject H0
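A minimal Python sketch of the rule in this z-statistic form (the
function name, signature, and default α are our own, not part of the
notes):

from math import sqrt
from scipy.stats import norm

def z_test(x_bar, mu0, sigma, n, alpha=0.05):
    """Reject H0 if (X-bar_n - mu0)/(sigma/sqrt(n)) exceeds z_alpha."""
    z = (x_bar - mu0) / (sigma / sqrt(n))
    z_alpha = norm.ppf(1 - alpha)     # upper-tail critical value
    return "Reject H0" if z > z_alpha else "Do Not Reject H0"

# With the example's numbers and a sample mean of 106: z = 6/4 = 1.5,
# which is below 1.645, so we do not reject H0.
print(z_test(106, 100, 20, 25))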