
Solutions

4.27
Approach from class: Let $Z_{1}, Z_{2}$ be independent standard normals and let

\begin{align*}
X &= \mu + \sigma Z_{1} \\
Y &= \gamma + \sigma Z_{2}
\end{align*}

Then

\begin{align*}
U &= X + Y = \mu + \gamma + \sigma Z_{1} + \sigma Z_{2} \\
V &= X - Y = \mu - \gamma + \sigma Z_{1} - \sigma Z_{2}
\end{align*}

In matrix form, $(U,V)^{T} = (\mu+\gamma,\, \mu-\gamma)^{T} + B\,(Z_{1},Z_{2})^{T}$ with

\[
B = \begin{bmatrix} \sigma & \sigma \\ \sigma & -\sigma \end{bmatrix},
\qquad
C = B B^{T} = \begin{bmatrix} 2\sigma^{2} & 0 \\ 0 & 2\sigma^{2} \end{bmatrix}
\]

and

\begin{align*}
f_{U,V}(u,v) &= \frac{1}{2\pi \cdot 2\sigma^{2}} \exp\left\{ -\frac{(u-(\mu+\gamma))^{2}}{4\sigma^{2}} - \frac{(v-(\mu-\gamma))^{2}}{4\sigma^{2}} \right\} \\
&= f_{U}(u)\, f_{V}(v)
\end{align*}

where $U \sim N(\mu+\gamma,\, 2\sigma^{2})$ and $V \sim N(\mu-\gamma,\, 2\sigma^{2})$; thus $U$ and $V$ are independent.
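As a quick sanity check (not part of the solution), a short Monte Carlo sketch in Python; the parameter values for $\mu$, $\gamma$, $\sigma$ are arbitrary choices:

    import numpy as np

    # Simulate X = mu + sigma*Z1 and Y = gamma + sigma*Z2, form U and V,
    # and compare sample moments with the claimed distributions.
    rng = np.random.default_rng(0)
    mu, gamma, sigma = 1.0, -2.0, 1.5
    z1 = rng.standard_normal(100_000)
    z2 = rng.standard_normal(100_000)
    x = mu + sigma * z1
    y = gamma + sigma * z2
    u, v = x + y, x - y

    print(u.mean(), v.mean())       # approx mu+gamma = -1.0 and mu-gamma = 3.0
    print(u.var(), v.var())         # approx 2*sigma^2 = 4.5 for each
    print(np.corrcoef(u, v)[0, 1])  # approx 0; together with joint normality
                                    # this is consistent with independence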

4.30
a.
Here $X \sim$ uniform$(0,1)$ and $Y \mid X = x \sim N(x, x^{2})$, so $E[Y \mid X] = X$ and $\mathrm{Var}(Y \mid X) = X^{2}$. The mean of $Y$ is

\[
E[Y] = E[E[Y \mid X]] = E[X] = 1/2 .
\]

The variance is

\begin{align*}
\mathrm{Var}(Y) &= E[\mathrm{Var}(Y \mid X)] + \mathrm{Var}(E[Y \mid X]) = E[X^{2}] + \mathrm{Var}(X) \\
&= 1/3 + 1/12 = 5/12 .
\end{align*}

The covariance is

\begin{align*}
\mathrm{Cov}(X,Y) &= E[(Y-\mu_{Y})(X-\mu_{X})] = E[E[Y-\mu_{Y} \mid X]\,(X-\mu_{X})] \\
&= E[(X-\mu_{X})^{2}] = \mathrm{Var}(X) = 1/12 .
\end{align*}
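These calculations use the standard uniform$(0,1)$ moments, worked out here for reference:

\[
E[X] = \int_{0}^{1} x\,dx = \frac{1}{2}, \qquad
E[X^{2}] = \int_{0}^{1} x^{2}\,dx = \frac{1}{3}, \qquad
\mathrm{Var}(X) = \frac{1}{3} - \frac{1}{4} = \frac{1}{12} .
\]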

b.
The conditional distribution of $Z = Y/X$ given $X = x$ is $N(1,1)$. Since this conditional distribution does not depend on $x$, $Z$ and $X$ are independent.
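In more detail: given $X = x$, we have $Z = Y/x$, a linear function of $Y \sim N(x, x^{2})$, so

\[
Z \mid X = x \;\sim\; N\!\left( \frac{x}{x},\, \frac{x^{2}}{x^{2}} \right) = N(1,1) .
\]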

5.2
a.
Condition on $X_{1} = x$:

\[
P(Y > y \mid X_{1} = x) =
\begin{cases}
1 & \text{if } y \le 0 \\
F(x)^{y} & \text{if } y = 1, 2, \ldots
\end{cases}
\]

i.e., $Y \mid X_{1} = x$ is geometric with $p = 1 - F(x)$. So for $y \ge 1$,

\begin{align*}
P(Y > y) &= E[F(X_{1})^{y}] = \int_{0}^{1} u^{y}\,du \\
&= \frac{1}{y+1}
\end{align*}

since $F(X_{1})$ is uniform on $[0,1]$ by the probability integral transform. So for $y = 1, 2, \ldots,$

\[
P(Y = y) = P(Y > y-1) - P(Y > y) = \frac{1}{y} - \frac{1}{y+1} = \frac{1}{y(y+1)} .
\]

Alternative argument: For $y = 1, 2, \ldots,$

\begin{align*}
P(Y > y) &= P(X_{1} > \max\{X_{2}, \ldots, X_{y+1}\}) \\
&= P(\operatorname{argmax}(X_{1}, \ldots, X_{y+1}) = 1) \\
&= \frac{1}{y+1}
\end{align*}

by symmetry.
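A simulation sketch (not part of the solution) can confirm the point masses; here the $X_{i}$ are taken to be iid uniform$(0,1)$, though any continuous distribution yields the same distribution for $Y$:

    import numpy as np

    # Y = index of the first of X2, X3, ... to exceed X1;
    # compare empirical frequencies with 1/(y*(y+1)).
    rng = np.random.default_rng(0)

    def draw_y():
        x1 = rng.random()
        y = 1
        while rng.random() <= x1:  # X_{y+1} failed to exceed X1
            y += 1
        return y

    samples = np.array([draw_y() for _ in range(200_000)])
    for y in (1, 2, 3):
        print(y, (samples == y).mean(), 1 / (y * (y + 1)))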
b.
Using $\lfloor y \rfloor$ to denote the largest integer less than or equal to $y$, we have $P(Y > y) = 1 - F_{Y}(y) = 1/(\lfloor y \rfloor + 1)$ for all $y \ge 0$. So

\begin{align*}
E[Y] &= \int_{0}^{\infty} (1 - F_{Y}(y))\,dy = \int_{0}^{\infty} \frac{1}{\lfloor y \rfloor + 1}\,dy \\
&\ge \int_{0}^{\infty} \frac{1}{y+1}\,dy = \infty .
\end{align*}
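Equivalently, since $Y$ is a nonnegative integer-valued random variable, summing the tail probabilities gives a harmonic series:

\[
E[Y] = \sum_{y=0}^{\infty} P(Y > y) = \sum_{y=0}^{\infty} \frac{1}{y+1} = \infty .
\]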

5.4
a.
Suppose $X_{i_{1}}, \ldots, X_{i_{k}} \mid P$ are independent Bernoulli($P$), with $P$ uniform on $(0,1)$. Then

\begin{align*}
P(X_{i_{1}}=x_{1}, \ldots, X_{i_{k}}=x_{k} \mid P = p) &= p^{\sum_{i=1}^{k} x_{i}} (1-p)^{\sum_{i=1}^{k} (1-x_{i})} \\
&= p^{\sum_{i=1}^{k} x_{i}} (1-p)^{k - \sum_{i=1}^{k} x_{i}} \\
&= p^{t} (1-p)^{k-t}
\end{align*}

where $t = \sum_{i=1}^{k} x_{i}$.

So

\begin{align*}
P(X_{i_{1}}=x_{1}, \ldots, X_{i_{k}}=x_{k}) &= \int_{0}^{1} p^{t} (1-p)^{k-t}\,dp \\
&= \frac{\Gamma(t+1)\,\Gamma(k-t+1)}{\Gamma(k+2)} \\
&= \frac{t!\,(k-t)!}{(k+1)!}
\end{align*}
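The middle step is the beta integral:

\[
\int_{0}^{1} p^{a-1} (1-p)^{b-1}\,dp = \frac{\Gamma(a)\,\Gamma(b)}{\Gamma(a+b)},
\qquad \text{here with } a = t+1,\; b = k-t+1 .
\]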

b.
$P(X_{1}=x_{1}) = \frac{x_{1}!\,(1-x_{1})!}{2!} = \frac{1}{2}$ for $x_{1} \in \{0,1\}$. So

\[
P(X_{1}=x_{1}) \times \cdots \times P(X_{k}=x_{k}) = \frac{1}{2^{k}} .
\]

For $k = 2$ the two assignments compare as follows:

    $(x_{1},x_{2})$      (0,0)   (0,1)   (1,0)   (1,1)
    $t!(2-t)!/3!$         1/3     1/6     1/6     1/3
    independent           1/4     1/4     1/4     1/4

so the joint probabilities do not factor into the product of the marginals: the $X_{i}$ are identically distributed but not independent.
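A brief check of the $k = 2$ table in Python (an illustration, not part of the solution):

    from math import factorial
    from itertools import product

    # Exchangeable probability t!(k-t)!/(k+1)! versus the
    # product of marginals (1/2)^k, for each outcome (x1, x2).
    k = 2
    for xs in product((0, 1), repeat=k):
        t = sum(xs)
        exch = factorial(t) * factorial(k - t) / factorial(k + 1)
        print(xs, round(exch, 4), (1 / 2) ** k)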


Luke Tierney 2004-12-03