# Method of Moments

Recall that $E[m'_r]=\mu'_r$, that is, the expected value of the $r$th sample moment is equal to the $r$th population moment. This is the basis for the method of moments for estimation.
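The $r$th sample moment is straightforward to compute; a minimal sketch in Python (the function name `sample_moment` is illustrative, not standard):

```python
def sample_moment(xs, r):
    """r-th sample moment: m'_r = (1/n) * sum(x_i ** r)."""
    return sum(x ** r for x in xs) / len(xs)

data = [1, 2, 3, 4]
m1 = sample_moment(data, 1)  # sample mean: 10/4 = 2.5
m2 = sample_moment(data, 2)  # (1 + 4 + 9 + 16)/4 = 7.5
```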

### Procedure

- Assume a pdf (or pmf) for your sample data.
- Determine which parameters need to be estimated.
- Set $E[X^k]=m'_k$ for $k=1$ to the total number of parameters. You should have as many equations as you have parameters.
- Solve the system of equations to provide unique solutions to all parameters.
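To see the multi-equation case, a distribution with two parameters can serve as a sketch (the normal model here is an illustration, not part of the worked example below): for a normal$(\mu, \sigma^2)$ sample, $E[X]=\mu$ and $E[X^2]=\sigma^2+\mu^2$, so solving the system gives $\hat{\mu}=m'_1$ and $\hat{\sigma}^2=m'_2-(m'_1)^2$.

```python
def mom_normal(xs):
    """Method of moments for a normal(mu, sigma^2) sample.
    System: E[X] = mu = m'_1 and E[X^2] = sigma^2 + mu^2 = m'_2,
    solved as mu_hat = m'_1, sigma2_hat = m'_2 - m'_1 ** 2."""
    n = len(xs)
    m1 = sum(xs) / n              # first sample moment
    m2 = sum(x * x for x in xs) / n  # second sample moment
    return m1, m2 - m1 ** 2

mu_hat, sigma2_hat = mom_normal([2.0, 4.0, 6.0])
# mu_hat = 4.0; sigma2_hat = 56/3 - 16 = 8/3
```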

EXAMPLE: Suppose you have a random sample drawn from a population that follows a Bernoulli distribution. Find an estimator for the parameter of this distribution using the method of moments.

Using the procedure outlined above:

1. The sample data follows a Bernoulli distribution, that is, the pmf is $f(x)=p^x(1-p)^{1-x}$.

2. The only parameter that needs to be estimated is $p$, the probability of success.

3. We set:

$$ E[X^1] = m'_1 $$

4. To solve the equation, we first need to remember that $E[X^1]=E[X]=p$ and $m'_1=\frac{1}{n} \sum\limits_{i=1}^n x_i$. Therefore, the method of moments estimator is

$$ \hat{p}=\frac{1}{n} \sum\limits_{i=1}^n x_i = \bar{X}, $$

the sample mean.
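A quick simulation of this estimator (the true parameter value, seed, and sample size are arbitrary choices for illustration):

```python
import random

random.seed(0)
p_true = 0.3   # illustrative "unknown" parameter
n = 10_000

# Simulate a Bernoulli(p_true) sample of 0s and 1s.
xs = [1 if random.random() < p_true else 0 for _ in range(n)]

# Method of moments estimator: the sample mean.
p_hat = sum(xs) / n
```

With a large sample, `p_hat` lands close to `p_true`, as the law of large numbers suggests.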