45-733 PROBABILITY AND STATISTICS I Notes #7B
February 2000
If E(θ̂) = θ, then θ̂ is an unbiased estimator of θ.

Efficiency:
If the variance of one estimator is smaller than that of a second estimator, then the first is relatively more efficient.

Example: E[(X₁ + X₂)/2] = μ, so (X₁ + X₂)/2 is an unbiased estimator of μ. However, if n > 2, the variance of this estimator (σ²/2) is larger than the variance of X̄ₙ (σ²/n), so it is less efficient.
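The efficiency comparison above can be checked by simulation. The following sketch (my illustration, not from the notes; the distribution, parameter values, and variable names are assumed) estimates the sampling variance of (X₁ + X₂)/2 and of the full sample mean X̄ₙ:

```python
# Simulation sketch: compare two unbiased estimators of mu -- the average
# of just the first two observations, (X1 + X2)/2, versus the full sample
# mean X-bar_n. Both are unbiased, but X-bar_n has the smaller variance.
import random

random.seed(1)
mu, sigma, n, trials = 10.0, 2.0, 20, 20000

two_obs_estimates = []
full_mean_estimates = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    two_obs_estimates.append((sample[0] + sample[1]) / 2)
    full_mean_estimates.append(sum(sample) / n)

def variance(xs):
    # Population-style variance of the simulated estimates.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# VAR[(X1+X2)/2] = sigma^2/2 = 2.0; VAR[X-bar_n] = sigma^2/n = 0.2.
print(variance(two_obs_estimates))   # should be near 2.0
print(variance(full_mean_estimates)) # should be near 0.2
```

The simulated variance of the two-observation estimator comes out roughly n/2 times larger than that of X̄ₙ, matching the σ²/2 versus σ²/n comparison.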
Consistency:

If E(θ̂) → θ and VAR(θ̂) → 0 as n → +∞, then θ̂ is a consistent estimator of θ; that is, θ̂ converges to θ in probability as the sample size grows.
In particular, E(σ̂²) = [(n−1)/n]σ².

To see this, recall that:

    VAR(X) = E(X²) − [E(X)]², so that E(X²) = σ² + μ²; and
    VAR(X̄ₙ) = E[(X̄ₙ)²] − [E(X̄ₙ)]², so that E[(X̄ₙ)²] = σ²/n + μ².

Therefore:

    E(σ̂²) = E[Σᵢ₌₁,ₙ (xᵢ − X̄ₙ)²]/n
           = (1/n)E{[Σᵢ₌₁,ₙ xᵢ²] − 2X̄ₙ[Σᵢ₌₁,ₙ xᵢ] + n(X̄ₙ)²}
           = (1/n)E[Σᵢ₌₁,ₙ xᵢ²] − E[(X̄ₙ)²]      (since Σᵢ₌₁,ₙ xᵢ = nX̄ₙ)
           = (1/n)[n(σ² + μ²)] − (σ²/n + μ²)
           = [(n−1)/n]σ²
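The conclusion of this derivation can be verified numerically. The sketch below (my check, with assumed parameter values) averages the maximum likelihood estimator σ̂² = Σ(xᵢ − X̄ₙ)²/n over many samples; the average should come out near [(n−1)/n]σ², not σ²:

```python
# Numerical check: with n = 5 and sigma^2 = 4, the ML estimator's average
# across many samples should be near (4/5) * 4 = 3.2, exhibiting the bias.
import random

random.seed(3)
mu, sigma, n, trials = 0.0, 2.0, 5, 50000

total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    # Divide the sum of squared deviations by n (the ML / biased form).
    total += sum((x - xbar) ** 2 for x in xs) / n

avg_biased = total / trials
print(avg_biased)  # should be near 3.2, visibly below sigma^2 = 4
```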
Although σ̂² is the maximum likelihood estimator of σ², it is biased. The bias stems from the fact that X̄ₙ is used in the formula; replacing X̄ₙ with μ would remove it. In a finite random sample, X̄ₙ is very unlikely to fall exactly on μ, so the sum of squared deviations about X̄ₙ slightly underestimates the sum of squared deviations about μ. To obtain an unbiased estimator, denoted s², the sum of squares is divided by n−1 rather than n; this slightly inflates the value and produces an unbiased estimator, namely:

    s² = [Σᵢ₌₁,ₙ (xᵢ − X̄ₙ)²]/(n−1)

and it is easy to show that E(s²) = σ².
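Repeating the same experiment with the n−1 divisor illustrates E(s²) = σ² (again a sketch with assumed parameter values, not part of the original notes):

```python
# Sketch: averaging s^2 = sum((x_i - xbar)^2)/(n-1) over many samples
# should land near sigma^2 itself, unlike the /n version above.
import random

random.seed(4)
mu, sigma, n, trials = 0.0, 2.0, 5, 50000

total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    # Divide by n - 1 (the unbiased form).
    total += sum((x - xbar) ** 2 for x in xs) / (n - 1)

avg_unbiased = total / trials
print(avg_unbiased)  # should be near sigma^2 = 4
```

Note that the n−1 divisor only corrects the mean of the estimator; for small n, any single s² can still be far from σ².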