Unbiasedness

We say $\hat{\theta}$ is an unbiased estimator of $\theta$ iff

$$E[\hat{\theta}(X_1,\dots,X_n)] = \theta$$

If an estimator $\hat{\theta}$ is biased for $\theta$, the amount of bias is given by

$$\operatorname{bias}(\hat{\theta}) = E(\hat{\theta} - \theta) = E(\hat{\theta}) - \theta$$

An estimator $\hat{\theta}(X_1,\dots,X_n)$ is asymptotically unbiased for $\theta$ if

$$\lim_{n\to\infty} \operatorname{bias}\big(\hat{\theta}(X_1,\dots,X_n)\big) = 0$$

i.e. $E(\hat{\theta}) \to \theta$ as $n \to \infty$.
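These definitions can also be checked numerically by approximating $E[\hat{\theta}]$ with a Monte Carlo average. A minimal sketch, assuming a $\operatorname{Uniform}(0,\theta)$ model and a made-up `empirical_bias` helper purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_bias(estimator, sample_size, theta=2.0, n_trials=100_000):
    """Approximate bias(theta_hat) = E[theta_hat] - theta by simulation.

    Draws n_trials samples of size sample_size from Uniform(0, theta)
    (an arbitrary model choice) and averages the estimator over them.
    """
    samples = rng.uniform(0.0, theta, size=(n_trials, sample_size))
    return estimator(samples).mean() - theta

# The sample maximum underestimates theta on average (see the example below).
print(empirical_bias(lambda s: s.max(axis=1), sample_size=10))
```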

Examples

Using the maximum statistic $X_{(n)}$ as an estimator of $\theta$, with $X_i \sim \operatorname{Uniform}(0,\theta)$:

$$E(X_{(n)}) = \int x\, f_{X_{(n)}}(x)\,dx = \int_0^\theta x\,\frac{n x^{n-1}}{\theta^n}\,dx = \frac{n}{\theta^n}\left[\frac{1}{n+1}x^{n+1}\right]_0^\theta = \frac{n}{n+1}\theta \quad \text{(a biased underestimator)}$$

An unbiased estimator would be $\hat{\theta} = \frac{n+1}{n} X_{(n)}$.
$X_{(n)}$ is still asymptotically unbiased, since $\frac{n}{n+1}\theta \to \theta$ as $n \to \infty$.
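A quick simulation sketch of both claims (seed, sample size, and $\theta$ are arbitrary illustration values): the raw maximum comes in low by about $\theta/(n+1)$, while the rescaled estimator averages to $\theta$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, trials = 5.0, 10, 200_000        # arbitrary illustration values

samples = rng.uniform(0.0, theta, size=(trials, n))
x_max = samples.max(axis=1)                # X_(n) for each simulated sample

print(x_max.mean())                        # ~ n/(n+1) * theta = 4.545 (biased low)
print(((n + 1) / n * x_max).mean())        # ~ theta = 5.0 (bias-corrected)
```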

Estimating the variance $\sigma^2$ of a distribution from a sample:

$$S_1^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \mu)^2, \qquad E(S_1^2) = \sigma^2$$

so $S_1^2$ is unbiased (when $\mu$ is known), since each term has $E[(X_i - \mu)^2] = \sigma^2$.

Using the sample mean $\bar{X}$ in place of $\mu$:

$$
\begin{align}
S_{2}^2 &= \frac{1}{n}\sum_{i=1}^n (X_{i}-\bar{X})^2 \\
E(S_{2}^2) &= E\left( \frac{1}{n}\sum_{i=1}^n (X_{i}-\bar{X})^2 \right) \\
&= \frac{1}{n}E\left( \sum_{i=1}^n (X_{i}-\mu + \mu -\bar{X})^2 \right) \\
&= \frac{1}{n} E\left( \sum_{i=1}^n \left[ (X_{i}-\mu)^2 + 2(X_{i}-\mu)(\mu-\bar{X}) + (\mu-\bar{X})^2 \right] \right) \\
&= \frac{1}{n}E\left( \sum_{i=1}^n (X_{i}-\mu)^2 \right) + \frac{1}{n}E\left( \sum_{i=1}^n 2(X_{i}-\mu)(\mu-\bar{X}) \right) + \frac{1}{n} E\left( \sum_{i=1}^n (\mu-\bar{X})^2 \right) \\
&= \frac{1}{n}\, n\sigma^2 + \frac{2}{n}E\left( (\mu-\bar{X})\sum_{i=1}^n (X_{i}-\mu) \right) + \frac{1}{n}\, n\, E\!\left((\mu-\bar{X})^2\right) \\
&= \sigma^2 + 2E\!\left((\mu-\bar{X})(\bar{X}-\mu)\right) + \frac{\sigma^2}{n} \\
&= \sigma^2 - 2E\!\left((\bar{X}-\mu)^2\right) + \frac{\sigma^2}{n} \\
&= \sigma^2 - 2\,\frac{\sigma^2}{n} + \frac{\sigma^2}{n} \\
&= \frac{n-1}{n}\sigma^2 \quad \text{(a biased estimator for } \sigma^2\text{)}
\end{align}
$$

That's why we estimate the variance with

$$\frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$$

An unbiased estimator for $\sigma^2$ (though not the MLE).

The key fact used in the derivation above is that the mean squared error of $\bar{X}$ equals its variance:

$$E[(\bar{X}-\mu)^2] = \operatorname{Var}(\bar{X}) = \frac{\sigma^2}{n}$$
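A short simulation sketch (normal data, with $\mu$, $\sigma$, $n$ chosen only for illustration) confirming all three facts: the $1/n$ estimator averages to $\frac{n-1}{n}\sigma^2$, the $1/(n-1)$ estimator to $\sigma^2$, and $E[(\bar{X}-\mu)^2] \approx \sigma^2/n$.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, trials = 0.0, 3.0, 8, 200_000   # arbitrary illustration values

samples = rng.normal(mu, sigma, size=(trials, n))
xbar = samples.mean(axis=1)
dev2 = (samples - xbar[:, None]) ** 2          # squared deviations from X_bar

print(dev2.mean(axis=1).mean())                # ~ (n-1)/n * sigma^2 = 7.875
print((dev2.sum(axis=1) / (n - 1)).mean())     # ~ sigma^2 = 9.0
print(((xbar - mu) ** 2).mean())               # ~ sigma^2 / n = 1.125 = Var(X_bar)
```

(numpy's `np.var(x, ddof=1)` computes the $1/(n-1)$ version directly.)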