- Markov Chain Monte Carlo
- Miscellaneous Topics and Final Notes

We will continue to look at Markov chain Monte Carlo methods. You should read Monahan’s Chapter 13 and Chapter 7 in Givens and Hoeting. Also explore the R packages related to Markov chain Monte Carlo that are available on the Linux systems and on CRAN. In particular, look at the `coda` and `boa` packages for output analysis.
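The output analysis that `coda` and `boa` perform starts from the effective sample size of a chain: a correlated chain of length n carries less information than n independent draws. A minimal Python sketch (not the packages' actual algorithm; the truncation rule used here is just one common simple choice):

```python
import numpy as np

def effective_size(x, max_lag=None):
    """Effective sample size n / (1 + 2 * sum(rho_k)), truncating the
    autocorrelation sum at the first negative estimate (a simple rule)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    if max_lag is None:
        max_lag = n // 2
    var = np.dot(x, x) / n
    s = 0.0
    for k in range(1, max_lag):
        rho = np.dot(x[:-k], x[k:]) / (n * var)
        if rho < 0:
            break
        s += rho
    return n / (1.0 + 2.0 * s)

rng = np.random.default_rng(0)
iid = rng.normal(size=5000)          # independent draws: ESS near n
ar = np.empty(5000)                  # AR(1) chain with strong positive
ar[0] = 0.0                          # autocorrelation: ESS far below n
for t in range(1, 5000):
    ar[t] = 0.9 * ar[t - 1] + rng.normal()
print(effective_size(iid), effective_size(ar))
```

The strongly autocorrelated chain reports a much smaller effective size, which is why MCMC runs are judged by effective rather than nominal sample size.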

- Markov Chain Monte Carlo

We will start to look at Markov chain Monte Carlo methods. You should read Monahan’s Chapter 13 and Chapter 7 in Givens and Hoeting. Also explore the R packages related to Markov chain Monte Carlo that are available on the Linux systems and on CRAN. In particular, look at the `coda` and `boa` packages for output analysis.
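The central algorithm these chapters develop is the Metropolis sampler. A minimal random-walk Metropolis sketch in Python (illustrative only; the standard-normal target and tuning constants are our own choices):

```python
import math, random

random.seed(1)

def metropolis(log_target, x0, n, scale=1.0):
    """Random-walk Metropolis: propose x' = x + N(0, scale^2) and
    accept with probability min(1, pi(x') / pi(x))."""
    x, chain = x0, []
    lp = log_target(x)
    for _ in range(n):
        xp = x + random.gauss(0.0, scale)
        lpp = log_target(xp)
        if math.log(random.random()) < lpp - lp:  # accept/reject step
            x, lp = xp, lpp
        chain.append(x)                           # rejected moves repeat x
    return chain

# Target: standard normal, via its log density up to a constant
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
print(sum(chain) / len(chain))  # sample mean should be near 0
```

Note that only ratios of the target density are needed, which is what makes the method usable when the normalizing constant is unknown.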

- Markov Chain Monte Carlo

We will continue to look at Markov chain Monte Carlo methods. You should read Monahan’s Chapter 13 and Chapter 7 in Givens and Hoeting. Also explore the R packages related to Markov chain Monte Carlo that are available on the Linux systems and on CRAN. In particular, look at the `coda` and `boa` packages for output analysis.

- Markov Chain Monte Carlo

We will continue to look at Markov chain Monte Carlo methods. You should read Monahan’s Chapter 13 and Chapter 7 in Givens and Hoeting. Also explore the R packages related to Markov chain Monte Carlo that are available on the Linux systems and on CRAN. In particular, look at the `coda` and `boa` packages for output analysis.

- Variance reduction
- Markov Chain Monte Carlo

We will continue to look at variance reduction ideas. Givens and Hoeting discuss these in Section 7.3.

We will start to look at Markov chain Monte Carlo methods. You should read Monahan’s Chapter 13 and Chapter 7 in Givens and Hoeting. Also explore the R packages related to Markov chain Monte Carlo that are available on the Linux systems and on CRAN. In particular, look at the `coda` and `boa` packages for output analysis.

- Generating random variables and random vectors

- Simulation (Slides).

We will continue to look at methods of generating random variables from non-uniform distributions. We will also look at some ways to generate random vectors from multivariate distributions.
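A standard route to multivariate normal vectors, for example, is through a Cholesky factor of the covariance matrix. A Python sketch (the mean and covariance here are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(3)

# To draw X ~ N(mu, Sigma): factor Sigma = L L^T (Cholesky),
# then X = mu + L z with z a vector of iid standard normals.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
L = np.linalg.cholesky(Sigma)

z = rng.standard_normal((10000, 2))   # rows of iid N(0, 1) draws
x = mu + z @ L.T                      # rows of N(mu, Sigma) draws

print(x.mean(axis=0))   # should be near mu
print(np.cov(x.T))      # should be near Sigma
```

The same transform idea underlies `mvrnorm` in the `MASS` package, though implementations may use an eigendecomposition instead of Cholesky.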

- Brief introduction to uniform pseudo-random variables
- Generating random variables and random vectors

- Simulation (Slides).

We will look at methods of generating random variables from non-uniform distributions. Monahan discusses this in Chapter 11. You should also read Chapter 6 in Givens and Hoeting through 6.2.3.

You should look at the facilities R provides for generating variates from standard distributions (`rnorm`, `rgamma`, etc.). Also look at the control provided by `RNGkind` and `RNGversion` over the underlying method for generating uniform pseudo-random numbers.
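The simplest of the non-uniform generation methods in these chapters is the inverse-CDF transform. A Python sketch for the exponential distribution (our own illustration of the general idea):

```python
import math, random

random.seed(4)

# Inverse-CDF method: if U ~ Uniform(0,1) and F is a continuous CDF,
# then F^{-1}(U) has distribution F.  For Exponential(rate):
#   F(x) = 1 - exp(-rate * x)   =>   F^{-1}(u) = -log(1 - u) / rate
def rexp(rate):
    return -math.log(1.0 - random.random()) / rate

draws = [rexp(2.0) for _ in range(50000)]
print(sum(draws) / len(draws))  # Exponential(2) has mean 1/2
```

The method is exact whenever the inverse CDF has a closed form; when it does not (the normal, the gamma), the rejection and ratio-of-uniforms methods in the readings take over.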

- Some Machine Learning
- Brief introduction to uniform pseudo-random variables
- Generating random variables and random vectors

We will briefly review methods for and issues in generating uniform pseudo-random numbers.
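A classic, if now superseded, uniform generator is the linear congruential generator. A Python sketch (the constants are old ANSI-C-style textbook parameters, shown for illustration only; R's default is the far better Mersenne Twister):

```python
# Linear congruential generator:
#   x_{n+1} = (a * x_n + c) mod m,   u_n = x_n / m
class LCG:
    def __init__(self, seed, a=1103515245, c=12345, m=2**31):
        self.state, self.a, self.c, self.m = seed, a, c, m

    def next_uniform(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

gen = LCG(seed=42)
us = [gen.next_uniform() for _ in range(10000)]
print(min(us), max(us), sum(us) / len(us))  # in [0, 1), mean near 1/2
```

The sequence is fully determined by the seed, which is exactly what makes simulations reproducible; the known weaknesses of LCGs (short periods, lattice structure in low-order bits) are the issues that motivated the modern generators `RNGkind` exposes.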

We may start to look at methods of generating random variables from non-uniform distributions. Monahan discusses this in Chapter 11. You should also read Chapter 6 in Givens and Hoeting through 6.2.3.

You should start to look at the facilities R provides for generating variates from standard distributions (`rnorm`, `rgamma`, etc.). Also look at the control provided by `RNGkind` and `RNGversion` over the underlying method for generating uniform pseudo-random numbers.

- Additive Models
- Some Machine Learning

- Density estimation and smoothing.

You should read Chapters 10 and 11 in Givens and Hoeting. Monahan also discusses density estimation on pages 344–349 and curve fitting on 159–163. You should also explore the function `density`, the `KernSmooth` package, and the functions `gam`, `loess`, and other methods based on smoothing.
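What `density` computes is, at heart, a kernel density estimate. A Python sketch with a Gaussian kernel and a rule-of-thumb bandwidth (a simplified version of what the R function does, evaluated at a single point; real implementations use the FFT over a grid):

```python
import math, random

random.seed(5)

# Kernel density estimate with Gaussian kernel:
#   f_hat(x) = (1 / (n h)) * sum_i K((x - x_i) / h)
def kde(x, data, h):
    n = len(data)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data) / (
        n * h * math.sqrt(2 * math.pi))

data = [random.gauss(0.0, 1.0) for _ in range(5000)]

# Rule-of-thumb bandwidth for roughly Gaussian data: 1.06 * sd * n^(-1/5)
sd = (sum(d * d for d in data) / len(data)) ** 0.5
h = 1.06 * sd * len(data) ** -0.2

# At x = 0 the true standard normal density is 1/sqrt(2*pi), about 0.399
print(kde(0.0, data, h))
```

The bandwidth h, not the kernel shape, dominates the quality of the estimate, which is why the readings spend most of their effort on bandwidth selection.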

You should also read Chapter 12 in Givens and Hoeting and explore some of the packages and functions implementing related methods in R. These include `SemiPar`, `mgcv`, and `acepack`, among others. Package `MASS` also implements several relevant functions.

- Brief introduction to optimization.
- Density estimation and smoothing.

You should read Chapters 8 and 9 in Monahan and Chapters 2 and 4 in Givens and Hoeting. You should also explore the `optim` and `optimize` functions and the Optimization task view.
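R's `optimize` combines golden-section search with parabolic interpolation for one-dimensional minimization. A Python sketch of the golden-section part alone, on an arbitrary test function:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by shrinking the bracket by a
    factor of 1/phi ~ 0.618 at each step."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0
    c = b - invphi * (b - a)   # two interior points placed so that
    d = a + invphi * (b - a)   # one can be reused after each shrink
    while b - a > tol:
        if f(c) < f(d):        # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                  # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

xmin = golden_section(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
print(xmin)  # minimum of (x - 2)^2 + 1 is at x = 2
```

Like bisection for root finding, the method needs no derivatives and converges linearly, which is why production routines add a faster parabolic step when the function is smooth.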

You should also read Chapters 10 and 11 in Givens and Hoeting. Monahan also discusses density estimation on pages 344–349 and curve fitting on 159–163. You should also explore the function `density`, the `KernSmooth` package, and the functions `gam`, `loess`, and other methods based on smoothing.

- Matrices with special structure.
- Iterative methods for solving linear equations.
- Linear algebra software.
- Brief introduction to optimization.

You should read Chapters 8 and 9 in Monahan and Chapters 2 and 4 in Givens and Hoeting. You should also explore the `optim` and `optimize` functions and the Optimization task view.

- Brief introduction to numerical linear algebra.

This week we will start a brief review of numerical linear algebra, very briefly covering the material in Monahan’s Chapters 3–6. You should start to read these chapters now and continue next week. The objective is not to understand every detail, but to get a general sense of the issues and the methods available.

As you read, explore which methods are available in R and how they can be used. Some functions to examine are `lm.fit`, `solve`, and `qr`.
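`lm.fit` solves least-squares problems through a QR factorization rather than the normal equations, since forming X'X squares the condition number. A Python sketch of the QR route (using NumPy for the factorization; the simulated data are our own illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# With X = QR, the normal equations X'X b = X'y reduce to the
# triangular system R b = Q'y, avoiding the cross-product X'X.
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

Q, R = np.linalg.qr(X)                    # thin QR factorization
beta_qr = np.linalg.solve(R, Q.T @ y)     # back-substitution on R

beta_ref = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_qr, beta_ref)                  # the two solutions agree
```

R's own `qr` function exposes the same factorization (via Householder reflections in LINPACK/LAPACK) for direct use.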

- Outline of computer architecture.
- Overview of computer arithmetic.

This week we will finish reviewing basic computer architecture and look at computer arithmetic.
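A quick way to see the computer-arithmetic issues concretely is to poke at IEEE 754 doubles directly; a few Python one-liners:

```python
import sys

# Decimal fractions are generally not exact in binary: both sides
# below are rounded, and the rounding errors differ.
print(0.1 + 0.2 == 0.3)                  # False
print(abs((0.1 + 0.2) - 0.3) < 1e-15)    # but the error is tiny

# Machine epsilon: the gap between 1.0 and the next larger double
eps = sys.float_info.epsilon
print(eps == 2.0 ** -52)                 # True for IEEE 754 doubles

# Adding half an epsilon to 1.0 is lost entirely to rounding
print(1.0 + eps / 2 == 1.0)              # True
```

These two facts, relative rather than absolute precision and a fixed number of significand bits, drive most of the numerical-analysis issues in the weeks ahead.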

- Review of syllabus and course outline.
- Brief overview of tools.
- Outline of computer architecture.

To brush up on prerequisites you should read the first chapter of Givens and Hoeting.

Read the web pages