Next: Assignment 13 Up: 22S:193 Statistical Inference I Previous: Assignment 12

Solutions

5.8
a.

$\displaystyle \sum(X_{i}-\overline{X})^{2}$ $\displaystyle = \sum X_{i}^{2} - \frac{1}{n}\left(\sum X_{i}\right)\left(\sum X_{j}\right)$    
  $\displaystyle = \frac{1}{2n}\left(2\sum_{i}\sum_{j}X_{i}^{2} -2\sum_{i}\sum_{j} X_{i}X_{j}\right)$    
  $\displaystyle = \frac{1}{2n}\left(\sum_{i}\sum_{j}X_{i}^{2} -2\sum_{i}\sum_{j} X_{i}X_{j} +\sum_{i}\sum_{j}X_{j}^{2}\right)$    
  $\displaystyle = \frac{1}{2n}\sum_{i}\sum_{j}(X_{i}-X_{j})^{2}$    
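The identity in part a. can be confirmed numerically; the sketch below (an illustrative addition, with arbitrary data values) checks it in exact rational arithmetic:

```python
# Exact numerical check of the identity
#   sum_i (X_i - Xbar)^2 = (1 / (2n)) * sum_i sum_j (X_i - X_j)^2
# on an arbitrary (illustrative) sample.
from fractions import Fraction

x = [Fraction(v) for v in (3, 1, 4, 1, 5, 9, 2, 6)]
n = len(x)
xbar = sum(x) / n

lhs = sum((xi - xbar) ** 2 for xi in x)
rhs = Fraction(1, 2 * n) * sum((xi - xj) ** 2 for xi in x for xj in x)
assert lhs == rhs  # both sides agree exactly in rational arithmetic
```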

b.
Assume, without loss of generality, that $ E[X_{i}]=\theta_{1}=0$. Then

$\displaystyle E[S^{2}] = \sigma^{2} = \theta_{2}$    

and

$\displaystyle E[S^{4}] = \frac{1}{4n^{2}(n-1)^{2}} \sum_{i}\sum_{j}\sum_{k}\sum_{\ell} E[(X_{i}-X_{j})^{2}(X_{k}-X_{\ell})^{2}]$    

If $ i = j$ or $ k = \ell$, then $ E[(X_{i}-X_{j})^{2}(X_{k}-X_{\ell})^{2}]=0$. If all $ i,j,k,\ell$ are different, then

$\displaystyle E[(X_{i}-X_{j})^{2}(X_{k}-X_{\ell})^{2}] = E[(X_{1}-X_{2})^{2}]^{2} = (2\sigma^{2})^{2} = 4 \theta_{2}^{2}$    

If $ \{i,j\} \cap \{k,\ell\} = \{i\}$, say $ k=i$, then

$\displaystyle E[(X_{i}-X_{j})^{2}(X_{i}-X_{\ell})^{2}]$ $\displaystyle = E[(X_{i}^{2}-2X_{i}X_{j}+X_{j}^{2}) (X_{i}^{2}-2X_{i}X_{\ell}+X_{\ell}^{2})]$
  $\displaystyle = E[X_{i}^{4}-2X_{i}^{3}X_{\ell}+X_{i}^{2}X_{\ell}^{2} -2X_{i}^{3}X_{j}+4X_{i}^{2}X_{j}X_{\ell}-2X_{i}X_{j}X_{\ell}^{2} +X_{i}^{2}X_{j}^{2}-2X_{i}X_{j}^{2}X_{\ell}+X_{j}^{2}X_{\ell}^{2}]$
  $\displaystyle = \theta_{4}+3\theta_{2}^{2}$

since every term containing a lone factor of $ X_{j}$ or $ X_{\ell}$ has expectation zero by independence and $ \theta_{1}=0$.

If $ \{i,j\}=\{k,\ell\}$ with $ i \ne j$, then

$\displaystyle E[(X_{i}-X_{j})^{2}(X_{k}-X_{\ell})^{2}]$ $\displaystyle = E[(X_{i}-X_{j})^{4}]$
  $\displaystyle = E[X_{i}^{4}-4X_{i}^{3}X_{j}+6X_{i}^{2}X_{j}^{2}-4X_{i}X_{j}^{3} + X_{j}^{4}]$    
  $\displaystyle = 2 \theta_{4}+6\theta_{2}^{2}$    

Counting index quadruples ($ n(n-1)(n-2)(n-3)$ with all four indices distinct, $ 4n(n-1)(n-2)$ with exactly one index in common, and $ 2n(n-1)$ with $ \{i,j\}=\{k,\ell\}$, $ i \ne j$),

$\displaystyle E[S^{4}]$ $\displaystyle = \frac{1}{4n^{2}(n-1)^{2}} \left[n(n-1)(n-2)(n-3)\,4\theta_{2}^{2} + 4n(n-1)(n-2)(\theta_{4}+3\theta_{2}^{2}) + 2n(n-1)(2\theta_{4}+6\theta_{2}^{2})\right]$
  $\displaystyle = \frac{1}{4n(n-1)}\left[ 4(n-2)(n-3)\theta_{2}^{2} + 4(n-2)(\theta_{4}+3\theta_{2}^{2}) + 4(\theta_{4}+3\theta_{2}^{2})\right]$    
  $\displaystyle = \frac{1}{n(n-1)} [(n-1)\theta_{4}+((n-2)(n-3)+3(n-2)+3)\theta_{2}^{2}]$    
  $\displaystyle = \frac{1}{n(n-1)} [(n-1)\theta_{4}+(n^{2}-2n+3)\theta_{2}^{2}]$    

So

Var$\displaystyle (S^{2})$ $\displaystyle = E[S^{4}]-\frac{n(n-1)}{n(n-1)}\theta_{2}^{2}$    
  $\displaystyle =\frac{1}{n(n-1)}[(n-1)\theta_{4}+(n^{2}-2n+3-n^{2}+n)\theta_{2}^{2}]$    
  $\displaystyle =\frac{1}{n(n-1)}[(n-1)\theta_{4}-(n-3)\theta_{2}^{2}]$    
  $\displaystyle =\frac{1}{n}\left[\theta_{4}-\frac{n-3}{n-1}\theta_{2}^{2}\right]$    
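This variance formula can be checked exactly against a small non-normal population. The sketch below (an illustrative addition) enumerates all $ 2^{4}$ equally likely samples of size $ n=4$ from a Bernoulli(1/2) distribution, for which $ \theta_{2}=1/4$ and $ \theta_{4}=1/16$:

```python
# Exact check of Var(S^2) = (1/n)[theta_4 - ((n-3)/(n-1)) theta_2^2]
# for X_i iid Bernoulli(1/2), where theta_2 = 1/4 and theta_4 = 1/16.
from fractions import Fraction
from itertools import product

n = 4
theta2 = Fraction(1, 4)
theta4 = Fraction(1, 16)

def s2(xs):
    """Sample variance S^2 of a tuple of 0/1 values, in exact arithmetic."""
    xbar = Fraction(sum(xs), n)
    return sum((x - xbar) ** 2 for x in xs) / (n - 1)

# All 2^n samples are equally likely under Bernoulli(1/2).
samples = list(product((0, 1), repeat=n))
prob = Fraction(1, len(samples))
e_s2 = sum(prob * s2(xs) for xs in samples)
e_s4 = sum(prob * s2(xs) ** 2 for xs in samples)
var_s2 = e_s4 - e_s2 ** 2

formula = Fraction(1, n) * (theta4 - Fraction(n - 3, n - 1) * theta2 ** 2)
assert e_s2 == theta2     # S^2 is unbiased for theta_2
assert var_s2 == formula  # exact match with the derived formula
```

Both sides come out to $ 1/96$ exactly for this population.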

c.
Still assume $ \theta_{1}=0$. By independence, every term with $ i=j$ or $ k \notin \{i,j\}$ has expectation zero, leaving the $ 2n(n-1)$ terms with $ i \ne j$ and $ k \in \{i,j\}$, which all equal $ E[(X_{1}-X_{2})^{2}X_{1}]$ by symmetry:

$\displaystyle E[\overline{X}S^{2}]$ $\displaystyle = \frac{1}{2n^{2}(n-1)} \sum_{i}\sum_{j}\sum_{k} E[(X_{i}-X_{j})^{2}X_{k}]$
  $\displaystyle = \frac{1}{2n^{2}(n-1)} 2n(n-1) E[(X_{1}-X_{2})^{2}X_{1}]$
  $\displaystyle = \frac{1}{n} E[X_{1}^{3}-2X_{1}^{2}X_{2}+X_{1}X_{2}^{2}]$
  $\displaystyle = \frac{1}{n} E[X_{1}^{3}] = \frac{1}{n}\theta_{3}$

So $ \overline{X}$ and $ S^{2}$ are uncorrelated if and only if $ \theta_{3}=0$.
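The covariance formula Cov$ (\overline{X},S^{2}) = \theta_{3}/n$ can likewise be checked by exact enumeration; the sketch below (an illustrative addition) uses a skewed Bernoulli(1/4) population, for which $ \theta_{3} = p(1-p)(1-2p) = 3/32$:

```python
# Exact check of Cov(Xbar, S^2) = E[Xbar*S^2] - E[Xbar]E[S^2] = theta_3 / n
# for X_i iid Bernoulli(p) with p = 1/4, a skewed population where
# theta_3 = E[(X - p)^3] = p(1-p)(1-2p) = 3/32.
from fractions import Fraction
from itertools import product

n = 3
p = Fraction(1, 4)
theta3 = p * (1 - p) * (1 - 2 * p)

e_xbar = e_s2 = e_prod = Fraction(0)
for xs in product((0, 1), repeat=n):
    k = sum(xs)
    prob = p ** k * (1 - p) ** (n - k)  # P(X_1, ..., X_n = xs)
    xbar = Fraction(k, n)
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    e_xbar += prob * xbar
    e_s2 += prob * s2
    e_prod += prob * xbar * s2

cov = e_prod - e_xbar * e_s2
assert cov == theta3 / n        # Cov(Xbar, S^2) = theta_3 / n exactly
assert e_xbar == p              # sanity: E[Xbar] = p
assert e_s2 == p * (1 - p)      # sanity: E[S^2] = theta_2
```

Note that the covariance is location-invariant, so the result derived under $ \theta_{1}=0$ applies here even though $ E[X_{i}] = 1/4 \ne 0$.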
5.10
If $ n$ is odd,

$\displaystyle \int z^{n}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}dz = 0$    

If $ n$ is even,

$\displaystyle \int z^{n}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}dz = 2\frac{1}{\sqrt{2\pi}}\int_{0}^{\infty}z^{n}e^{-z^{2}/2}dz$    

Let $ y=z^{2}/2$, $ z=\sqrt{2y}$, $ dz=\frac{1}{\sqrt{2y}}dy$. So

$\displaystyle E[Z^{n}]$ $\displaystyle = \frac{2}{\sqrt{2\pi}} \int_{0}^{\infty}(2y)^{n/2}e^{-y}\frac{1}{\sqrt{2y}}dy$    
  $\displaystyle = \frac{2^{n/2}}{\sqrt{\pi}}\int_{0}^{\infty}y^{\frac{n-1}{2}}e^{-y}dy$    
  $\displaystyle = \frac{2^{n/2}}{\sqrt{\pi}}\int_{0}^{\infty}y^{\frac{n+1}{2}-1}e^{-y}dy$    
  $\displaystyle = \frac{2^{n/2}}{\sqrt{\pi}}\Gamma\left(\frac{n+1}{2}\right)$    

Now $ \Gamma(\frac{n+1}{2}) = \frac{n-1}{2}\Gamma(\frac{n-1}{2})$, so the first few even cases are:

$ n$ | $ \Gamma(\frac{n+1}{2})$ | $ E[Z^{n}]$
0 | $ \sqrt{\pi}$ | 1
2 | $ \frac{1}{2}\sqrt{\pi}$ | 1
4 | $ \frac{3}{2}\times\frac{1}{2}\sqrt{\pi}$ | $ 3 \times 1$
6 | $ \frac{5}{2}\times\frac{3}{2}\times\frac{1}{2}\sqrt{\pi}$ | $ 5 \times 3 \times 1$
So for even $ n$,

$\displaystyle E[Z^{n}] = (n-1)\times(n-3)\times \cdots\times 3\times1 = \frac{n!}{\left(\frac{n}{2}\right)!2^{n/2}}$    

and

$\displaystyle E[(X-\mu)^{n}] = \begin{cases}0 & \text{$n$ odd} \\ \frac{n!}{\left(\frac{n}{2}\right)!2^{n/2}}\sigma^{n} & \text{$n$ even} \end{cases}$

if $ X \sim N(\mu,\sigma^{2})$. So for $ X_{i} \sim N(\mu,\sigma^{2})$

$\displaystyle \theta_{4}$ $\displaystyle = 3\sigma^{4}$    
$\displaystyle \theta_{2}$ $\displaystyle = \sigma^{2}$    

and

Var$\displaystyle (S^{2})$ $\displaystyle = \frac{1}{n}\left(3\sigma^{4}-\frac{n-3}{n-1}\sigma^{4}\right)$    
  $\displaystyle = \frac{\sigma^{4}}{n} \frac{2n}{n-1} = \frac{2}{n-1}\sigma^{4}$    
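The gamma-function form of $ E[Z^{n}]$ and the double-factorial closed form can be cross-checked numerically; the range of $ n$ in this sketch (an illustrative addition) is arbitrary:

```python
# Check that  2^(n/2) Gamma((n+1)/2) / sqrt(pi)  agrees with
# (n-1)(n-3)...3.1 = n! / ((n/2)! 2^(n/2))  for the first several even n.
import math

for n in range(0, 13, 2):
    gamma_form = 2 ** (n / 2) * math.gamma((n + 1) / 2) / math.sqrt(math.pi)
    closed_form = math.factorial(n) // (math.factorial(n // 2) * 2 ** (n // 2))
    assert math.isclose(gamma_form, closed_form)
```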

Using $ S^{2} \sim \frac{\sigma^{2}}{n-1}\chi^{2}_{n-1}$,

Var$\displaystyle (S^{2}) = \frac{\sigma^{4}}{n-1}$Var$\displaystyle (\chi^{2}_{n-1}) = \frac{\sigma^{4}}{(n-1)^{2}}2(n-1) = \frac{2}{n-1}\sigma^{4}$    
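The value $ \frac{2}{n-1}\sigma^{4}$ can also be confirmed by simulation. In the sketch below, the settings ($ n=5$, $ \sigma=2$, the seed, and the 10% tolerance) are illustrative choices, not part of the derivation:

```python
# Monte Carlo sketch: for normal data, the simulated variance of S^2
# should be close to 2*sigma^4/(n-1).  Seeded for reproducibility; the
# loose 10% tolerance is an arbitrary, illustrative choice.
import random
import statistics

random.seed(0)
n, sigma, reps = 5, 2.0, 20000

s2_draws = [
    statistics.variance([random.gauss(0.0, sigma) for _ in range(n)])
    for _ in range(reps)
]

mc_var = statistics.variance(s2_draws)
theory = 2 * sigma ** 4 / (n - 1)  # = 8.0 for these settings
assert abs(mc_var - theory) / theory < 0.10
```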

5.11
This follows either from Jensen's inequality or from the fact that $ E[S^2] - E[S]^2 =$   Var$ (S) \ge 0$ and thus $ E[S] \le \sqrt{E[S^2]} = \sigma$. Equality holds if and only if Var$ (S) = 0$, i.e. $ S$ is constant with probability one. If we assume finite fourth moments and Var$ (S) = 0$, then we also have Var$ (S^2) = 0$ and

$\displaystyle 0 =$   Var$\displaystyle (S^2) = \frac{1}{n}\left(\theta_4-\frac{n-3}{n-1}\theta_2^2\right) \ge \frac{1}{n}\left(\theta_2^2-\frac{n-3}{n-1}\theta_2^2\right) = \frac{2}{n(n-1)}\theta_2^2 = \frac{2}{n(n-1)}\sigma^4$

since $ \theta_4 \ge \theta_2^2$ by Jensen's inequality. So equality implies $ \sigma=0$, and thus $ \sigma > 0$ implies $ E[S] < \sigma$.


Luke Tierney 2004-12-03