This site is an archived version of Voteview.com, archived from the University of Georgia.

45-733 PROBABILITY AND STATISTICS I Notes #7A

February 2000

- In parametric estimation we assume that we know the type of distribution
(e.g., Normal, Poisson, Bernoulli, etc.) from which our random sample is drawn,
and on the basis of that random sample we must infer the values of the
*parameters* of the distribution. For example, we take a random sample from a Poisson distribution and, on the basis of the random sample, we decide what the value of λ is. In addition, we use the random sample to make statements about how confident we are in our guess about the values of the parameters.

- An Estimator is a formula, or a rule, that
we use to get values for the parameters. For example, we have an urn with
a large number of balls in it. There are only two colors of balls -- green and red -- in
the urn. We draw 10 balls
*with replacement* and note their color. Clearly, the best guess about the proportion of green balls in the urn is the number of green balls drawn divided by 10. However, note that this is just the sample mean (counting each green ball as 1 and each red ball as 0).
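The urn example can be sketched as a quick simulation. The true proportion of green balls (0.6) and the sample size of 10 are illustrative assumptions, not values from the notes:

```python
import random

random.seed(42)

TRUE_P = 0.6   # hypothetical true proportion of green balls in the urn
N_DRAWS = 10

# Draw 10 balls with replacement: 1 = green, 0 = red
sample = [1 if random.random() < TRUE_P else 0 for _ in range(N_DRAWS)]

# The natural estimate of the proportion of green balls is the
# number of green balls drawn divided by 10 -- i.e., the sample mean.
p_hat = sum(sample) / N_DRAWS
print(p_hat)
```

Because each draw is with replacement, the draws are independent Bernoulli trials, which is why the sample mean is the natural estimate.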

Technically, an *estimator* is a real-valued function of the sample.

- Maximum Likelihood Method of Obtaining Estimators

We need a systematic way of getting estimators. The method of maximum likelihood is a very powerful way of doing so and is strongly intuitive. It is not fool-proof, but for almost all important distributions of interest it provides us with plausible and useful estimators of the underlying parameters.

- The Maximum Likelihood Method has three steps. With respect to
the Bernoulli distribution these are:

- First: Form the Joint distribution of the sample (the Likelihood function
of the sample)

f(x_{1}, x_{2}, ..., x_{n} | p) = f_{1}(x_{1})f_{2}(x_{2})f_{3}(x_{3})...f_{n}(x_{n}) = Π_{i=1,n} f_{i}(x_{i}) =

p^{x1}(1 - p)^{(1 - x1)} p^{x2}(1 - p)^{(1 - x2)} p^{x3}(1 - p)^{(1 - x3)} ... p^{xn}(1 - p)^{(1 - xn)} =

p^{(Σi=1,n xi)}(1 - p)^{(n - Σi=1,n xi)}

- Second: Take the natural log of the Likelihood function

L(x_{1}, x_{2}, ..., x_{n} | p) = ln{f(x_{1}, x_{2}, ..., x_{n} | p)} =

[Σ_{i=1,n} x_{i}]ln(p) + (n - Σ_{i=1,n} x_{i})ln(1 - p)

- Third: Find the maximum by taking the first
derivative of the log of the Likelihood function and setting it equal to zero

∂L/∂p = [Σ_{i=1,n} x_{i}]/p - (n - Σ_{i=1,n} x_{i})/(1 - p) = 0

Multiplying through by p(1 - p):

(1 - p)[Σ_{i=1,n} x_{i}] - p(n - Σ_{i=1,n} x_{i}) = Σ_{i=1,n} x_{i} - p[Σ_{i=1,n} x_{i}] - np + p[Σ_{i=1,n} x_{i}] = Σ_{i=1,n} x_{i} - np = 0

Hence: p̂ = X̄_{n}
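As a check on the derivation above, a short sketch can compare the closed-form estimator p̂ = X̄_{n} against a brute-force grid maximization of the Bernoulli log-likelihood. The simulated sample and grid resolution are assumptions made for illustration:

```python
import math
import random

random.seed(0)
sample = [random.randint(0, 1) for _ in range(50)]  # hypothetical Bernoulli sample
s = sum(sample)
n = len(sample)

def log_likelihood(p):
    # L(x | p) = (sum x_i) ln(p) + (n - sum x_i) ln(1 - p)
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Brute-force maximization over a fine grid of interior values of p
grid = [i / 1000 for i in range(1, 1000)]
p_star = max(grid, key=log_likelihood)

p_hat = s / n  # closed-form MLE: the sample mean
print(p_star, p_hat)
```

The grid maximizer agrees with the sample mean, as the derivative calculation predicts.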

- Example: Find the Maximum Likelihood Estimator for
λ in the Poisson Distribution.

f(x_{1}, x_{2}, ..., x_{n} | λ) = [(e^{-λ}λ^{x1})/x_{1}!] [(e^{-λ}λ^{x2})/x_{2}!] [(e^{-λ}λ^{x3})/x_{3}!] ... [(e^{-λ}λ^{xn})/x_{n}!] =

[(e^{-nλ}λ^{Σi=1,n xi})/(Π_{i=1,n} x_{i}!)]

L(x_{1}, x_{2}, ..., x_{n} | λ) = -nλ + (Σ_{i=1,n} x_{i}) ln(λ) - ln(Π_{i=1,n} x_{i}!)

∂L/∂λ = -n + (Σ_{i=1,n} x_{i})/λ = 0

Hence: λ̂ = X̄_{n}

- Example: Find the Maximum Likelihood Estimators for
μ and σ^{2} in the Normal Distribution.

f(x_{1}, x_{2}, ..., x_{n} | μ, σ^{2}) = {1/[(2π)^{1/2}σ]} [e^{-(x1 - μ)2/2σ2}]

{1/[(2π)^{1/2}σ]} [e^{-(x2 - μ)2/2σ2}] {1/[(2π)^{1/2}σ]} [e^{-(x3 - μ)2/2σ2}] ...

{1/[(2π)^{1/2}σ]} [e^{-(xn - μ)2/2σ2}] =

{1/[(2π)^{n/2}(σ^{2})^{n/2}]} {e^{-[1/(2σ2)] [Σi=1,n (xi - μ)2]}}

L(x_{1}, x_{2}, ..., x_{n} | μ, σ^{2}) = -(n/2)ln(2π) - (n/2)ln(σ^{2}) - [1/(2σ^{2})] [Σ_{i=1,n}(x_{i} - μ)^{2}]

∂L/∂μ = [1/σ^{2}] [Σ_{i=1,n}(x_{i} - μ)] = 0

∂L/∂σ^{2} = -[n/(2σ^{2})] + [1/(2σ^{4})] [Σ_{i=1,n}(x_{i} - μ)^{2}] = 0

Hence:

μ̂ = X̄_{n} (from ∂L/∂μ = 0), and substituting μ̂ into ∂L/∂σ^{2} = 0 gives σ̂^{2} = [Σ_{i=1,n}(x_{i} - X̄_{n})^{2}]/n
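The Poisson and Normal results can be spot-checked numerically: the log-likelihood should not increase when the closed-form estimates are perturbed. The simulated samples below are illustrative assumptions (the "Poisson" sample is just a stand-in set of non-negative counts, which suffices here because the Poisson log-likelihood is maximized at the sample mean for any counts):

```python
import math
import random

random.seed(1)

pois = [random.randint(0, 9) for _ in range(200)]   # stand-in count data
norm = [random.gauss(5.0, 2.0) for _ in range(200)] # hypothetical Normal sample

n = len(pois)
lam_hat = sum(pois) / n  # Poisson MLE: the sample mean

def pois_loglik(lam):
    # L = -n*lam + (sum x_i) ln(lam), dropping the -ln(prod x_i!) term,
    # which is constant in lam
    return -n * lam + sum(pois) * math.log(lam)

# Perturbing lambda-hat should never increase the log-likelihood
for eps in (-0.1, 0.1):
    assert pois_loglik(lam_hat) >= pois_loglik(lam_hat + eps)

m = len(norm)
mu_hat = sum(norm) / m                              # Normal MLE for mu
var_hat = sum((x - mu_hat) ** 2 for x in norm) / m  # Normal MLE for sigma^2 (divide by n)

def norm_loglik(mu, var):
    return (-(m / 2) * math.log(2 * math.pi) - (m / 2) * math.log(var)
            - sum((x - mu) ** 2 for x in norm) / (2 * var))

for dmu in (-0.1, 0.1):
    for dvar in (-0.1, 0.1):
        assert norm_loglik(mu_hat, var_hat) >= norm_loglik(mu_hat + dmu, var_hat + dvar)
print("closed-form MLEs maximize the log-likelihoods")
```

Note that σ̂^{2} divides by n, not n - 1; the maximum likelihood estimator of the variance is not the usual unbiased sample variance.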