Rao-Blackwell theorem
Rao-Blackwell Theorem: Let $\hat\theta$ be an unbiased estimator for $\theta$, and let $T$ be a minimal sufficient statistic for $\theta$.
Define $$\hat\theta^* = E(\hat\theta|T).$$
Then $E(\hat\theta^*) = \theta$ and $Var(\hat\theta^*) \leq Var(\hat\theta)$.
Recall:
$E(y|x)$ is a random variable, $E[E(y|x)] = E(y)$, and $Var(y) = E[Var(y|x)] + Var[E(y|x)]$.
$E(\hat\theta^*) = E[E(\hat\theta|T)] = E(\hat\theta) = \theta$, so $\hat\theta^*$ is unbiased.
$Var(\hat\theta) = E[Var(\hat\theta|T)] + Var[E(\hat\theta|T)] = \underbrace{E[Var(\hat\theta|T)]}_{\geq 0} + Var(\hat\theta^*) \geq Var(\hat\theta^*)$
When is $E(\hat\theta|T) = \hat\theta$, so that equality holds and no improvement is possible? When $\hat\theta$ is already a function of $T$.
Conclusion: the lowest variance is obtained when $\hat\theta$ is a function of the minimal sufficient statistic.
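As an illustration (this example is not in the notes above): for $x_1, \dots, x_n \overset{\text{i.i.d.}}{\sim} \text{Poisson}(\lambda)$ and target $\theta = P(X=0) = e^{-\lambda}$, the crude unbiased estimator $\mathbf{1}\{x_1 = 0\}$ can be conditioned on the sufficient statistic $T = \sum x_i$, giving $E[\mathbf{1}\{x_1=0\} \mid T] = \left(\frac{n-1}{n}\right)^T$. A minimal Monte Carlo sketch (assuming NumPy) checks that both are unbiased and that the Rao-Blackwellized version has smaller variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, reps = 10, 2.0, 200_000

x = rng.poisson(lam, size=(reps, n))

# Crude unbiased estimator of theta = P(X=0) = exp(-lam): indicator that x_1 = 0
crude = (x[:, 0] == 0).astype(float)

# Rao-Blackwellized version: E[1{x_1=0} | T = sum(x_i)] = ((n-1)/n)^T
T = x.sum(axis=1)
rb = ((n - 1) / n) ** T

print(np.exp(-lam))             # true value, about 0.135
print(crude.mean(), rb.mean())  # both means are close to the true value
print(crude.var(), rb.var())    # rb has the smaller variance
```

Conditioning on $T$ averages out the noise in the crude estimator without changing its expectation, exactly the mechanism in the variance decomposition above.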
Theorem: The minimum variance unbiased estimator (MVUE) must be a function of the minimal sufficient statistic.
Ex: $x_1, x_2, \dots, x_n \overset{\text{i.i.d.}}{\sim} N(\mu, \sigma^2)$; the minimal sufficient statistic is the pair $(\bar x, S^2)$.
The MVUE of $\mu$ is $\bar x$, since $E(\bar x) = \mu$.
The MVUE of $\sigma^2$ is $S^2$, since $E(S^2) = \sigma^2$.
The MLE of $\sigma^2$ is $\frac{n-1}{n}S^2$, which is biased.
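A quick simulation check of these two claims (a sketch assuming NumPy; `ddof=1` gives the $n-1$ divisor of $S^2$, `ddof=0` gives the $n$ divisor of the MLE):

```python
import numpy as np

rng = np.random.default_rng(2)
n, mu, sigma2, reps = 8, 1.0, 4.0, 300_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

s2 = x.var(axis=1, ddof=1)   # S^2, divisor n-1, unbiased
mle = x.var(axis=1, ddof=0)  # MLE, divisor n, equals (n-1)/n * S^2

print(s2.mean())   # close to sigma^2 = 4.0
print(mle.mean())  # close to (n-1)/n * sigma^2 = 3.5
```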
How to find the MVUE?
- Find the MLE.
- If the MLE is unbiased, it is the MVUE.
- If the MLE is biased, adjust it to obtain an unbiased estimator.
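The recipe above can be sketched on an example not covered in these notes: for $x_1, \dots, x_n \overset{\text{i.i.d.}}{\sim} \text{Exponential}(\lambda)$ (rate parametrization), the MLE of $\lambda$ is $n/\sum x_i$, which has expectation $\frac{n\lambda}{n-1}$; the bias-corrected estimator $(n-1)/\sum x_i$ is unbiased. A Monte Carlo check, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam, reps = 5, 3.0, 400_000

x = rng.exponential(scale=1 / lam, size=(reps, n))
s = x.sum(axis=1)

mle = n / s              # MLE of the rate; E[mle] = n*lam/(n-1), biased upward
adjusted = (n - 1) / s   # bias-corrected estimator, unbiased

print(mle.mean())        # close to n*lam/(n-1) = 3.75
print(adjusted.mean())   # close to lam = 3.0
```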
Properties of the MLE
- It is a function of the minimal sufficient statistic.
- It is consistent.
- It is the most efficient estimator as $n \to \infty$.
- Variance of the MLE: when $n$ is large, $\hat\theta_{MLE} \overset{\text{approx}}{\sim} N\!\left(\theta, \frac{1}{I_n(\theta)}\right)$, where $\theta$ is the true value of the parameter and $I_n(\theta)$ is the Fisher information.
- $I_n(\theta) = -E[\underbrace{l''(\theta)}_{2^{nd}\text{ derivative of log-likelihood}}] \overset{\text{approx. since $\theta$ unknown}}{\approx} -l''(\hat\theta_{MLE})$
- Hence $Var(\hat\theta_{MLE}) \approx \frac{1}{I_n(\theta)}$.
- $\hat{Var}(\hat\theta_{MLE}) = \frac{-1}{\underbrace{l''(\hat\theta_{MLE})}_{<0}}$
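These approximations can be checked numerically on an example not in the notes: for $x_1, \dots, x_n \overset{\text{i.i.d.}}{\sim} \text{Poisson}(\lambda)$, the log-likelihood satisfies $l''(\lambda) = -\sum x_i / \lambda^2$, so at the MLE $\hat\lambda = \bar x$ the plug-in variance estimate is $-1/l''(\hat\lambda) = \bar x / n$, while the exact Fisher information gives $1/I_n(\lambda) = \lambda/n$. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(3)
n, lam, reps = 50, 2.0, 100_000

x = rng.poisson(lam, size=(reps, n))
mle = x.mean(axis=1)   # MLE of lambda is the sample mean

# Plug-in variance estimate -1/l''(lam_hat):
# l''(lam) = -sum(x)/lam^2, so -1/l''(lam_hat) = lam_hat / n.
plug_in = mle / n

print(mle.var())   # observed sampling variance of the MLE
print(lam / n)     # 1/I_n(lam) = lam/n = 0.04
print(plug_in.mean())  # plug-in estimates also average near 0.04
```

All three numbers agree closely, illustrating that $\hat{Var}(\hat\theta_{MLE}) = -1/l''(\hat\theta_{MLE})$ tracks the true large-sample variance $1/I_n(\theta)$.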