
Solutions

3.23
LaTeX2HTML is giving me grief about this one; the solution is available in the PDF version.
3.24
a.
$ Y=X^{1/\gamma}$, $ X \sim$ Exponential($ \beta$).

$\displaystyle f_{Y}(y) = \begin{cases}\frac{\gamma}{\beta}y^{\gamma-1}e^{-y^{\gamma}/\beta} & y > 0 \\ 0 & y \le 0 \end{cases}$    

$\displaystyle E[Y^{k}]$ $\displaystyle = \int_{0}^{\infty}x^{k/\gamma}\frac{1}{\beta}e^{-x/\beta}dx = \Gamma\left(\frac{k}{\gamma}+1\right)\beta^{k/\gamma}$    
$\displaystyle E[Y]$ $\displaystyle = \Gamma(1/\gamma+1)\beta^{1/\gamma}$    
Var$\displaystyle (Y)$ $\displaystyle = \Gamma(2/\gamma+1)\beta^{2/\gamma} -\Gamma(1/\gamma+1)^{2}\beta^{2/\gamma}$    
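
As a quick Mathematica check of the moment formula (a sketch; here b, g, k stand for $ \beta$, $ \gamma$, $ k$ and are assumed positive):

(* E[Y^k] for Y = X^(1/g), X ~ Exponential with scale b;
   should return b^(k/g) Gamma[1 + k/g] *)
Integrate[x^(k/g) Exp[-x/b]/b, {x, 0, Infinity},
  Assumptions -> {b > 0, g > 0, k > 0}]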

b.
$ Y=(2X)^{1/2}$, $ X \sim$   Gamma$ (1,1) =$   Exponential$ (1)$.

$\displaystyle f_{Y}(y) = \begin{cases}y e^{-\frac{1}{2}y^{2}} & y > 0 \\ 0 & y \le 0 \end{cases}$    

$\displaystyle E[Y]$ $\displaystyle = \int_{0}^{\infty} \sqrt{2} \sqrt{x} e^{-x} dx = \sqrt{2}\int_{0}^{\infty}x^{3/2-1}e^{-x}dx = \sqrt{2}\Gamma(3/2)$    
  $\displaystyle = \sqrt{2}\frac{1}{2}\sqrt{\pi}=\sqrt{\frac{\pi}{2}}$    
$\displaystyle E[Y^{2}]$ $\displaystyle = 2 \int_{0}^{\infty}x e^{-x} dx = 2 \Gamma(2) = 2$    
Var$\displaystyle (Y)$ $\displaystyle = 2-\frac{\pi}{2}$    
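
This can be checked in Mathematica with TransformedDistribution (a sketch; ExponentialDistribution takes a rate parameter, which is 1 here, matching Exponential$ (1)$):

distB = TransformedDistribution[Sqrt[2 x],
   x \[Distributed] ExponentialDistribution[1]];
{Mean[distB], Variance[distB]}   (* should agree with {Sqrt[Pi/2], 2 - Pi/2} *)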

c.
$ X \sim$   Gamma$ (\alpha,\beta)$, $ Y = 1/X$.

$\displaystyle f_Y(y) = \frac{e^{-1/(\beta y)}}{\Gamma(\alpha) \beta^{\alpha} y^{\alpha+1}}$    

for $ y > 0$. Moments:

$\displaystyle E[Y^k] = E[X^{-k}] = \int_0^\infty \frac{x^{\alpha-k-1} e^{-x/\beta}}{\Gamma(\alpha) \beta^\alpha}\,dx = \frac{\Gamma(\alpha-k)}{\Gamma(\alpha)\beta^k}$    

for $ \alpha > k$ and $ E[Y^{k}] = \infty$ for $ \alpha \le k$. So for $ \alpha > 2$

$\displaystyle E[Y]$ $\displaystyle = \frac{\Gamma(\alpha-1)}{\Gamma(\alpha)\beta} = \frac{1}{(\alpha-1)\beta}$    
$\displaystyle E[Y^2]$ $\displaystyle = \frac{\Gamma(\alpha-2)}{\Gamma(\alpha)\beta^2} = \frac{1}{(\alpha-1)(\alpha-2)\beta^2}$    
Var$\displaystyle (Y)$ $\displaystyle = \frac{1}{(\alpha-1)\beta^2} \left(\frac{1}{\alpha-2}-\frac{1}{\alpha-1}\right) = \frac{1}{(\alpha-1)^2(\alpha-2)\beta^2}$    
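
A Mathematica check of the negative-moment formula (a sketch; a, b stand for $ \alpha$, $ \beta$, and the integral converges only for $ a > k$):

(* should return Gamma[a - k]/(b^k Gamma[a]) *)
Integrate[x^(a - k - 1) Exp[-x/b]/(Gamma[a] b^a), {x, 0, Infinity},
  Assumptions -> {b > 0, k > 0, a > k}]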

d.
$ X \sim$   Gamma$ (3/2,1)$, $ Y = X^{1/2}$.

$\displaystyle f_{Y}(y) = \begin{cases}\frac{2}{\Gamma(3/2)}y^{2}e^{-y^{2}} & y > 0 \\ 0 & y \le 0 \end{cases}$    

$\displaystyle E[Y^{k}]$ $\displaystyle = E[X^{k/2}] = \int_{0}^{\infty}\frac{1}{\Gamma(3/2)}x^{k/2+3/2-1}e^{-x}dx$    
  $\displaystyle = \Gamma\left(\frac{k+3}{2}\right)/\Gamma(3/2)$    
$\displaystyle E[Y]$ $\displaystyle = \Gamma(2)/\Gamma(3/2) = 1/(\sqrt{\pi}/2) = \frac{2}{\sqrt{\pi}}$    
$\displaystyle E[Y^{2}]$ $\displaystyle = \Gamma(5/2)/\Gamma(3/2) = \frac{3}{2}$    
Var$\displaystyle (Y)$ $\displaystyle = \frac{3}{2}-\frac{4}{\pi}$    
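
Again this can be checked with TransformedDistribution (GammaDistribution[3/2, 1] is shape $ 3/2$, scale $ 1$):

distD = TransformedDistribution[Sqrt[x],
   x \[Distributed] GammaDistribution[3/2, 1]];
{Mean[distD], Variance[distD]}   (* should agree with {2/Sqrt[Pi], 3/2 - 4/Pi} *)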

e.
$ X \sim$   Exponential$ (1)$, $ Y = \alpha - \gamma \log X$

$\displaystyle f_{Y}(y) = \frac{1}{\gamma}\exp\{-e^{-(y-\alpha)/\gamma}-(y-\alpha)/\gamma\}$    

$\displaystyle E[Y]$ $\displaystyle = \alpha-\gamma E[\log X]$    
$\displaystyle M_{\log X}(t)$ $\displaystyle = E[e^{t \log X}] = E[X^{t}] = \int_{0}^{\infty}x^{(t+1)-1}e^{-x}dx = \begin{cases}\Gamma(t+1) & t > -1 \\ \infty & t \le -1 \end{cases}$    
$\displaystyle E[\log X]$ $\displaystyle = \Gamma'(1) = -($Euler's constant$\displaystyle ) = -0.577216$    
$\displaystyle E[(\log X)^{2}]$ $\displaystyle = \Gamma''(1) = ($Euler's constant$\displaystyle )^{2} + \pi^{2}/6 = 1.97811$    
Var$\displaystyle (\log X)$ $\displaystyle = \Gamma''(1)-\Gamma'(1)^{2} = \pi^{2}/6$    

So $ E[Y] = \alpha + 0.577216\,\gamma$ and Var$ (Y) = \gamma^{2}\pi^{2}/6$.

Once the moment generating function has been obtained, the rest can be done in Mathematica. The derivative of the Gamma function is obtained as
In[23]:= D[Gamma[x],x]

Out[23]= Gamma[x] PolyGamma[0, x]
The value at $ x=1$ is obtained using the "slash-dot" operator:
In[24]:= % /. x->1

Out[24]= -EulerGamma
The symbol % refers to the last output expression. The numerical value is obtained to 10 digits by
In[25]:= N[%,10]

Out[25]= -0.5772156649

The second derivative of the Gamma function is

In[26]:= D[Out[23],x]

Out[26]= Gamma[x] PolyGamma[0, x]^2 + Gamma[x] PolyGamma[1, x]
Substituting $ x=1$ produces
In[27]:= % /. x->1

Out[27]= EulerGamma^2 + Pi^2/6

In[28]:= N[%,10]

Out[28]= 1.978111991

3.25

$\displaystyle h_{T}(t)$ $\displaystyle = \lim_{\delta \downarrow 0}\frac{P(t \le T < t+\delta\vert T \ge t)}{\delta}$    
  $\displaystyle = \lim_{\delta \downarrow 0}\frac{1}{\delta} \frac{F(t+\delta)-F(t)}{1-F(t)}$    
  $\displaystyle = \lim_{\delta \downarrow 0}\frac{\frac{1}{\delta}(F(t+\delta)-F(t))}{1-F(t)}$    
  $\displaystyle = \frac{F'(t)}{1-F(t)} = \frac{f(t)}{1-F(t)}$    

and

$\displaystyle - \frac{d}{dt}\log(1-F(t))=\frac{f(t)}{1-F(t)} = h_{T}(t)$    

The quantity

$\displaystyle H_{T}(t) = -\log(1-F_{T}(t)) = \int_{0}^{t}h_{T}(u)du$    

is called the cumulative hazard function, and

$\displaystyle F_{T}(t) = 1-\exp\{-H_{T}(t)\}$    

Constant hazard: $ h_{T}(t) \equiv c$.

$\displaystyle F_{T}(t) = 1-e^{-ct}$    

an exponential distribution.

Power hazard: $ h_{T}(t) = \frac{\gamma}{\beta}t^{\gamma-1}$ gives a Weibull distribution (see 3.26(b)).
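
A Mathematica sketch of the reconstruction $ F_{T}(t) = 1-\exp\{-H_{T}(t)\}$ for these two hazards (c, g, b stand for $ c$, $ \gamma$, $ \beta$ and are assumed positive):

(* constant hazard c: should return 1 - E^(-c t), an exponential cdf *)
1 - Exp[-Integrate[c, {u, 0, t}]]

(* power hazard (g/b) u^(g-1): should return 1 - E^(-t^g/b), a Weibull cdf *)
1 - Exp[-Integrate[(g/b) u^(g - 1), {u, 0, t},
    Assumptions -> {g > 0, b > 0, t > 0}]]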

3.26
a.

$\displaystyle f_{T}(t)$ $\displaystyle = \frac{1}{\beta} e^{-t/\beta}$    
$\displaystyle h_{T}(t)$ $\displaystyle = \frac{\frac{1}{\beta}e^{-t/\beta}}{e^{-t/\beta}} = \frac{1}{\beta}$    

b.

$\displaystyle f_{T}(t)$ $\displaystyle = \frac{\gamma}{\beta}t^{\gamma-1}e^{-t^{\gamma}/\beta}$    
$\displaystyle F_{T}(t)$ $\displaystyle = P(X^{1/\gamma}\le t) = P(X\le t^{\gamma}) = 1-e^{-t^{\gamma}/\beta}$    
$\displaystyle h_{T}(t)$ $\displaystyle = \frac{\gamma}{\beta} t^{\gamma-1}$    

c.

$\displaystyle F_{T}(t)$ $\displaystyle = \frac{1}{1+e^{-(t-\mu)/\beta}}$    
$\displaystyle f_{T}(t)$ $\displaystyle = \frac{1}{(1+e^{-(t-\mu)/\beta})^{2}} \frac{1}{\beta}e^{-(t-\mu)/\beta}$    
  $\displaystyle = \frac{1}{\beta} F(t) (1-F(t))$    

So $ h(t) = \frac{1}{\beta} F(t)$.
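
Mathematica's built-in HazardFunction can be used to check all three (a sketch; m, b, g stand for $ \mu$, $ \beta$, $ \gamma$; note the parametrizations: ExponentialDistribution takes the rate $ 1/\beta$, and WeibullDistribution[g, b^(1/g)] has cdf $ 1-e^{-t^{\gamma}/\beta}$):

HazardFunction[ExponentialDistribution[1/b], t]     (* should give 1/b *)
Simplify[HazardFunction[WeibullDistribution[g, b^(1/g)], t],
  {t > 0, g > 0, b > 0}]                            (* should give (g/b) t^(g-1) *)
FullSimplify[HazardFunction[LogisticDistribution[m, b], t] ==
  CDF[LogisticDistribution[m, b], t]/b]             (* should return True *)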

3.28
a.

$\displaystyle f(x\vert\mu,\sigma)$ $\displaystyle = \frac{1}{\sqrt{2\pi}\sigma} \exp\left\{-\frac{(x-\mu)^2}{2\sigma^{2}}\right\}$    
  $\displaystyle = \underbrace{\frac{1}{\sqrt{2\pi}\sigma}e^{-\mu^{2}/(2\sigma^{2})}}_{c(\theta)} \exp\left\{\underbrace{x^{2}}_{t_{1}(x)}\underbrace{\left(-\frac{1}{2\sigma^{2}}\right)}_{w_{1}(\theta)} + \underbrace{x}_{t_{2}(x)}\underbrace{\frac{\mu}{\sigma^2}}_{w_{2}(\theta)}\right\} \underbrace{1}_{h(x)}$    
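
As a sanity check, Mathematica should confirm that this factorization reproduces the normal density (m, s stand for $ \mu$, $ \sigma$):

FullSimplify[1/(Sqrt[2 Pi] s) Exp[-m^2/(2 s^2)] *
    Exp[-x^2/(2 s^2) + m x/s^2] ==
  PDF[NormalDistribution[m, s], x],
  Assumptions -> s > 0]                             (* should return True *)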

b.

$\displaystyle f(x\vert\alpha,\beta)$ $\displaystyle = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}x^{\alpha-1}e^{-x/\beta} 1_{(0,\infty)}(x)$    
  $\displaystyle = \underbrace{\frac{1}{\Gamma(\alpha)\beta^{\alpha}}}_{c(\theta)} \exp\left\{\underbrace{(\alpha-1)}_{w_{1}(\theta)}\underbrace{\log x}_{t_{1}(x)} - \underbrace{x}_{t_{2}(x)}\underbrace{\frac{1}{\beta}}_{w_{2}(\theta)}\right\} \underbrace{1_{(0,\infty)}(x)}_{h(x)}$    

c.

$\displaystyle f(x\vert\alpha,\beta)$ $\displaystyle = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)} x^{\alpha-1}(1-x)^{\beta-1} 1_{[0,1]}(x)$    
  $\displaystyle = \underbrace{\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}}_{c(\theta)} \exp\left\{\underbrace{(\alpha-1)}_{w_{1}(\theta)}\underbrace{\log x}_{t_{1}(x)} + \underbrace{(\beta-1)}_{w_{2}(\theta)}\underbrace{\log (1-x)}_{t_{2}(x)}\right\} \underbrace{1_{[0,1]}(x)}_{h(x)}$    

d.

$\displaystyle f(x\vert\lambda) = \frac{\lambda^{x}}{x!}e^{-\lambda} = \underbrace{e^{-\lambda}}_{c(\lambda)}\underbrace{\frac{1}{x!}}_{h(x)}\exp\left\{\underbrace{x}_{t(x)}\underbrace{\log \lambda}_{w(\lambda)}\right\}$    

e.

$\displaystyle f(x\vert r,p)$ $\displaystyle = \binom{r+x-1}{x}p^{r}(1-p)^{x}$    
  $\displaystyle = \underbrace{\binom{r+x-1}{x}}_{h(x)} \underbrace{p^{r}}_{c(p)} \exp\left\{\underbrace{x}_{t(x)}\underbrace{\log(1-p)}_{w(p)}\right\}$    
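
The discrete factorizations can be checked against the built-in pmfs; a sketch for the Poisson (l stands for $ \lambda$; FullSimplify should resolve the Piecewise pmf under the stated assumptions):

FullSimplify[Exp[-l] (1/x!) Exp[x Log[l]] ==
  PDF[PoissonDistribution[l], x],
  Assumptions -> {l > 0, x >= 0, Element[x, Integers]}]   (* should return True *)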

3.30
a.
For the binomial, $ w(p) = \log(p/(1-p))$, $ c(p)=(1-p)^n$, and $ t(x)=x$. The variance Var$ (t(X))=$Var$ (X)$ satisfies

$\displaystyle (w'(p))^2$   Var$\displaystyle (X) = -\frac{d^2}{dp^2}\log c(p) - w''(p)E[X]$    

Now

$\displaystyle w'(p)$ $\displaystyle = \frac{1}{p}+\frac{1}{1-p}=\frac{1}{p(1-p)}$    
$\displaystyle w''(p)$ $\displaystyle = -\frac{1}{p^2}+\frac{1}{(1-p)^2}$    
$\displaystyle \frac{d^2}{dp^2}\log c(p)$ $\displaystyle = -\frac{n}{(1-p)^2}$    

So

$\displaystyle \left(\frac{1}{p(1-p)}\right)^2$   Var$\displaystyle (X)$ $\displaystyle = \frac{n}{(1-p)^2} -\left(-\frac{1}{p^2}+\frac{1}{(1-p)^2}\right)np$    
  $\displaystyle = n \left(\frac{1}{1-p}+\frac{1}{p}\right) = \frac{n}{p(1-p)}$    

and thus Var$ (X) = n p (1-p)$.
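
The whole computation can be verified symbolically in Mathematica; the following sketch plugs $ E[X] = np$ and Var$ (X) = np(1-p)$ into the identity:

With[{w = Log[p/(1 - p)], logc = n Log[1 - p]},
 Simplify[D[w, p]^2 n p (1 - p) ==
   -D[logc, {p, 2}] - D[w, {p, 2}] n p]]            (* should return True *)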
b.
For the Beta distribution $ t_1(x)=\log x$ and $ t_2(x)=\log(1-x)$. The function $ f(x)=x$ cannot be expressed as a linear combination of $ t_1(x)$ and $ t_2(x)$, so the identities in Theorem 3.4.2 cannot be used to find the mean and variance of $ X$.

If $ X \sim$   Poisson$ (\lambda)$ then $ t(x)=x$, $ w(\lambda) =
\log\lambda$, and $ c(\lambda) = e^{-\lambda}$. So

$\displaystyle w'(\lambda)$ $\displaystyle = 1/\lambda$   $\displaystyle w''(\lambda)$ $\displaystyle = -1/\lambda^2$    
$\displaystyle \frac{\partial}{\partial\lambda} \log c(\lambda)$ $\displaystyle = -1$   $\displaystyle \frac{\partial^2}{\partial\lambda^2} \log c(\lambda)$ $\displaystyle = 0$        

So Theorem 3.4.2 produces the equations

$\displaystyle E[X/\lambda]$ $\displaystyle = 1$    
Var$\displaystyle (X/\lambda)$ $\displaystyle = E[X/\lambda^2]$    

with solutions $ E[X] = \lambda$ and Var$ (X) = \lambda$.
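
The same symbolic check works for the Poisson (l stands for $ \lambda$; both Theorem 3.4.2 identities are verified with $ E[X] = \lambda$ and Var$ (X) = \lambda$ plugged in):

With[{w = Log[l], logc = -l},
 Simplify[{D[w, l] l == -D[logc, l],
   D[w, l]^2 l == -D[logc, {l, 2}] - D[w, {l, 2}] l}]]   (* should return {True, True} *)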


Luke Tierney 2004-12-03