OPTIMIZATION TECHNIQUES - 22M:174/22C:174

SPRING SEMESTER 1999


Instructor: Laurent O. Jay, 225L MLH, ljay@math.uiowa.edu

Classes: 8:30-9:20 MWF in 218 MLH

Prerequisites: 22M:100 or equivalent, and a working knowledge of a programming language such as Fortran or Matlab.

22M:174/22C:174 deals with numerical optimization techniques. Optimization is more important than ever in modern science and industry: optimization algorithms are being called on to handle problems far larger and more mathematically complex than in years past. The best methods available today are highly sophisticated, and their manner of operation is far from obvious.

Course description: This course will cover modern numerical techniques for both constrained and unconstrained optimization with continuous (as opposed to discrete) variables. Because of its strong links to optimization, the numerical solution of nonlinear equations will also be considered. By the end of the course the student should master most of the central issues in numerical optimization. No particular textbook will be followed; my class notes, based on several sources, will be available to students. See http://www.math.uiowa.edu/~ljay/m174.html for current information.

Contents: Characterization of solutions (such as optimality conditions in optimization) and convergence analysis of the algorithms will be essential to this course. Below is a partial list of topics and algorithms to be treated in connection with three classes of problems:

  1. Nonlinear equations: the bisection method, Newton's method, finite-difference Newton's method, secant methods, modified Newton methods, inexact Newton methods, and quasi-Newton methods (Broyden's method); a short Newton iteration is sketched below.
  2. Unconstrained optimization: convexity, line-search methods, step-length selection algorithms, trust-region methods, steepest-descent methods, linear and nonlinear conjugate gradient methods, Newton-like methods, and quasi-Newton methods; a steepest-descent iteration with a backtracking line search is sketched below.
  3. Constrained optimization with equality and inequality constraints: Lagrange multipliers, active-set methods (for inequality constraints), linear programming, quadratic programming, interior-point methods (for inequality constraints), penalty function methods, barrier methods (for inequality constraints), reduced-gradient methods, gradient-projection methods, augmented Lagrangian methods, projected Lagrangian methods, sequential linearly constrained programming, sequential quadratic programming, and duality; a quadratic penalty iteration is sketched below.
As many topics as possible will be covered during the semester.
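
By way of illustration for item 1 above, here is a minimal sketch of Newton's method for a scalar equation f(x) = 0, written in Python for concreteness (the course itself assumes a language such as Fortran or Matlab). The routine name, test function, tolerance, and starting point are placeholder choices for illustration, not material from the course notes.

    # Newton's method for f(x) = 0: iterate x_{k+1} = x_k - f(x_k)/f'(x_k).
    def newton(f, fprime, x0, tol=1e-12, max_iter=50):
        x = x0
        for _ in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:        # stop once the residual is small
                return x
            x = x - fx / fprime(x)   # Newton step
        return x

    # Example: x^2 - 2 = 0 has the positive root sqrt(2) ~ 1.41421356.
    print(newton(lambda x: x**2 - 2.0, lambda x: 2.0*x, x0=1.0))

The fast (quadratic) convergence of this iteration near a simple root, and its possible failure far from one, are typical of the convergence questions analyzed in the course.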
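
Similarly, for item 2, the following is a hedged sketch of the steepest-descent method with a backtracking (Armijo) line search. The quadratic test function and the constants c = 1e-4 and rho = 0.5 are illustrative choices only, and NumPy is assumed for the vector arithmetic.

    import numpy as np

    # Steepest descent: step along -grad f(x), with the step length chosen by
    # backtracking until the Armijo sufficient-decrease condition holds.
    def steepest_descent(f, grad_f, x0, tol=1e-8, max_iter=1000):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad_f(x)
            if np.linalg.norm(g) < tol:       # near a stationary point
                break
            alpha, c, rho = 1.0, 1e-4, 0.5    # trial step, Armijo constant, shrink factor
            while f(x - alpha*g) > f(x) - c*alpha*g.dot(g):
                alpha *= rho                  # backtrack until sufficient decrease
            x = x - alpha*g                   # accepted steepest-descent step
        return x

    # Example: minimize f(x) = x1^2 + 10*x2^2, whose minimizer is the origin.
    f = lambda x: x[0]**2 + 10.0*x[1]**2
    grad_f = lambda x: np.array([2.0*x[0], 20.0*x[1]])
    print(steepest_descent(f, grad_f, [1.0, 1.0]))    # close to (0, 0)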
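
Finally, for item 3, a quadratic penalty method is perhaps the simplest constrained technique to sketch: the problem of minimizing f(x) subject to c(x) = 0 is replaced by a sequence of unconstrained problems, minimizing f(x) + (mu/2) c(x)^2 for a growing penalty parameter mu. The sketch below uses SciPy's BFGS routine (a quasi-Newton method, cf. item 2) for the inner solves; the example problem, penalty schedule, and iteration counts are illustrative assumptions only.

    import numpy as np
    from scipy.optimize import minimize

    # Quadratic penalty method for: minimize f(x) subject to c(x) = 0.
    def penalty_method(f, grad_f, c, grad_c, x0, mu0=1.0, factor=10.0, outer_iters=6):
        x, mu = np.asarray(x0, dtype=float), mu0
        for _ in range(outer_iters):
            phi = lambda x, mu=mu: f(x) + 0.5*mu*c(x)**2             # penalized objective
            grad_phi = lambda x, mu=mu: grad_f(x) + mu*c(x)*grad_c(x)
            x = minimize(phi, x, jac=grad_phi, method='BFGS').x      # inner unconstrained solve
            mu *= factor                                             # tighten the penalty
        return x

    # Example: minimize x1 + x2 subject to x1^2 + x2^2 - 2 = 0;
    # the constrained minimizer is x1 = x2 = -1.
    f = lambda x: x[0] + x[1]
    grad_f = lambda x: np.array([1.0, 1.0])
    c = lambda x: x[0]**2 + x[1]**2 - 2.0
    grad_c = lambda x: np.array([2.0*x[0], 2.0*x[1]])
    print(penalty_method(f, grad_f, c, grad_c, [0.0, 0.0]))          # near (-1, -1)

The ill-conditioning of the penalized subproblems as mu grows is exactly the drawback that motivates several of the other constrained methods listed above, such as augmented Lagrangian and barrier methods.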

Grading procedures: One mid-term examination (30%), one final examination (30%), and homework (40%).