LEGACY CONTENT.

This site is an archived version of Voteview.com archived from University of Georgia on May 23, 2017. This point-in-time capture includes all files publicly linked on Voteview.com at that time. We provide access to this content as a service to ensure that past users of Voteview.com have access to historical files. This content will remain online until at least January 1st, 2018. UCLA provides no warranty or guarantee of access to these files.


February 2000

Properties of Estimators

  1. Properties of Estimators
         The three most important properties of estimators are:

    1. Unbiasedness:
        If E(θ̂) = θ, then θ̂ is an unbiased estimator of θ.
    2. Efficiency:
       If the variance of one unbiased estimator is smaller than
       that of a second, then the first is relatively more
       efficient.
       Example:  Note that E[(X₁ + X₂)/2] = μ, so it is unbiased.
       However, if n > 2, the variance of this estimator, σ²/2, is
       larger than the variance of X̄n, σ²/n, so it is less
       efficient.
    3. Consistency:
        If E(θ̂) → θ as n → +∞, then θ̂
        is a consistent estimator of θ.
        (More precisely, θ̂ is consistent if it converges in
        probability to θ as n → +∞.)
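The efficiency comparison above can be checked by simulation. The sketch below (not part of the original notes; the distribution, sample size, and parameter values are illustrative assumptions) draws many samples and compares the sampling variance of the two-observation average with that of the full-sample mean X̄n:

```python
import numpy as np

# Illustrative setup (assumed, not from the notes): normal data with
# mean mu = 5 and sigma = 2, samples of size n = 10.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))

two_obs_mean = samples[:, :2].mean(axis=1)   # (X1 + X2)/2
full_mean = samples.mean(axis=1)             # Xbar_n

# Both estimators are unbiased: their averages are close to mu.
print(two_obs_mean.mean(), full_mean.mean())

# But the full-sample mean is more efficient:
# Var[(X1+X2)/2] = sigma^2/2 = 2.0  vs  Var[Xbar_n] = sigma^2/n = 0.4
print(two_obs_mean.var(), full_mean.var())
```

Both estimators center on μ, but the variance of X̄n is smaller by a factor of 2/n, matching the efficiency claim.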
  2. σ̂² is a biased estimator of σ².
     In particular, E(σ̂²) = [(n-1)/n]σ².
     To see this, recall that:
     Var(X) = E(X²) - [E(X)]², so that E(X²) = σ² + μ²; and
     Var(X̄n) = E[(X̄n)²] - [E(X̄n)]², so that E[(X̄n)²] = σ²/n + μ².
     Therefore:
       E(σ̂²) = E[Σi=1,n (xi - X̄n)²]/n =
       (1/n)E{[Σi=1,n xi²] - 2X̄n[Σi=1,n xi] + n(X̄n)²} =
       (1/n)E[Σi=1,n xi²] - E[(X̄n)²]     (using Σi=1,n xi = nX̄n) =
       (1/n)[n(σ² + μ²)] - (σ²/n + μ²) =
       σ² - σ²/n = [(n-1)/n]σ²
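The bias formula E(σ̂²) = [(n-1)/n]σ² derived above can be verified numerically. This minimal sketch (the normal distribution and the values σ² = 4, n = 5 are assumptions for illustration) averages the divide-by-n estimator over many samples:

```python
import numpy as np

# Assumed parameters for illustration: sigma^2 = 4, samples of size n = 5.
rng = np.random.default_rng(1)
mu, sigma2, n, reps = 0.0, 4.0, 5, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

# Divide-by-n variance estimator sigma2_hat for each sample.
sigma2_hat = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / n

print(sigma2_hat.mean())       # simulated E(sigma2_hat)
print((n - 1) / n * sigma2)    # theoretical value: (4/5) * 4 = 3.2
```

The simulated mean of σ̂² lands near 3.2 rather than 4, matching the [(n-1)/n] shrinkage factor.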
  3. Although σ̂² is the maximum likelihood estimator for σ², it is biased.
     This bias stems from the fact that X̄n is used in the formula.
     Replacing X̄n with μ removes the bias.  This is because
     X̄n in a finite random sample is very unlikely to be exactly
     equal to μ.  Since X̄n minimizes the sum of squared deviations
     within the sample, the sum of squares about X̄n slightly
     underestimates the sum of squares about μ.  To obtain an
     unbiased estimator, denoted s², the sum of squares of the
     sample is divided by n-1; this serves to slightly inflate the
     value and produces an unbiased estimator; namely:
     s² = [Σi=1,n (xi - X̄n)²]/(n-1)
     and it is easy to show that:  E(s²) = σ²
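The contrast between σ̂² (divide by n) and s² (divide by n-1) can be seen directly with NumPy, whose `ddof` argument to `np.var` selects the divisor n - ddof. The parameter values below are illustrative assumptions:

```python
import numpy as np

# Assumed setup: sigma^2 = 9, small samples of size n = 4.
rng = np.random.default_rng(2)
sigma2, n, reps = 9.0, 4, 200_000

x = rng.normal(0.0, 3.0, size=(reps, n))

# ddof=0 divides by n (biased MLE); ddof=1 divides by n-1 (unbiased s^2).
biased = np.var(x, axis=1, ddof=0).mean()    # near [(n-1)/n]*9 = 6.75
unbiased = np.var(x, axis=1, ddof=1).mean()  # near 9
print(biased, unbiased)
```

With small n the bias is substantial (6.75 vs 9), which is why the n-1 divisor matters most for small samples.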