OPTIMIZATION TECHNIQUES - 22M:174/22C:174
SPRING SEMESTER 2007
Instructor: Laurent O. Jay, 225L MLH, ljay@math.uiowa.edu
Classes: 10:30-11:20 MWF in 214 MLH
Prerequisites: 22M:100 or equivalent, and a knowledge of computer programming.
22M:174/22C:174 deals with numerical optimization techniques.
Optimization is becoming more important than ever before
in the modern world of science and industry. Optimization
algorithms are being called on to handle problems that are
much larger and more mathematically complex than in years
past. The best methods available today are sophisticated,
and their manner of operation is far from obvious.
Course description:
This course will cover modern numerical techniques
for both unconstrained and constrained optimization with
continuous (as opposed to discrete) variables.
Because of its strong links to optimization, the
numerical solution of nonlinear equations will also be
considered. By the end of the course the student
should have a solid command of the central issues in numerical optimization.
My class notes, based on several sources, will be available to students.
See
http://www.math.uiowa.edu/~ljay/m174_07.html
for current information.
Contents:
Characterization of solutions (such as optimality conditions in
optimization) and convergence analysis of the algorithms will be
essential to this course. We give below a partial list of topics
and algorithms to be treated in connection with three general
classes of problems:
- Unconstrained optimization:
Steepest-descent method (see the sketch after this sublist);
Newton-like methods;
Quasi-Newton methods;
Linear/nonlinear conjugate gradient methods;
Interval reduction methods;
Line-search methods;
Trust-region methods;
Local and global convergence;
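To give a flavor of these methods, here is a minimal Python sketch of
the steepest-descent method combined with a backtracking (Armijo)
line search. The test problem, function names, and all parameter
values are illustrative assumptions, not course material.

import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=500):
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:    # first-order optimality test
            break
        p = -g                         # steepest-descent direction
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease condition
        # f(x + t*p) <= f(x) + c*t*g'p holds (here c = 1e-4, p = -g).
        while f(x + t * p) > f(x) - 1e-4 * t * np.dot(g, g):
            t *= 0.5
        x = x + t * p
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose unique minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(steepest_descent(f, grad, np.zeros(2)))  # approx. (0.2, 0.4)

Even on this well-conditioned quadratic the method converges only
linearly; the Newton-like and conjugate gradient methods listed above
are designed to do better.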
- Nonlinear equations:
Newton's method (see the sketch after this sublist);
Modified Newton's methods;
Broyden's (quasi-Newton) method;
Inexact Newton methods;
The bisection method;
Line-search methods and merit functions;
Trust-region methods;
Local and global convergence;
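As a small illustration, here is a minimal Python sketch of Newton's
method for a system F(x) = 0; the test problem and the stopping
parameters are illustrative assumptions only.

import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        # Solve the Newton system J(x) s = -F(x), then update x.
        s = np.linalg.solve(J(x), -Fx)
        x = x + s
    return x

# Example: intersect the circle x^2 + y^2 = 4 with the parabola y = x^2.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[1] - x[0]**2])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]],
                        [-2.0 * x[0], 1.0]])
print(newton(F, J, np.array([1.0, 1.0])))  # approx. (1.2496, 1.5616)

Near a root with a nonsingular Jacobian the iteration converges
quadratically; the inexact, line-search, and trust-region variants
listed above address its cost and its lack of global convergence.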
- Constrained optimization with equality and inequality constraints:
Lagrange multipliers;
Karush-Kuhn-Tucker conditions;
Line-search methods and merit functions;
Active-set methods (for inequality constraints);
Penalty function methods (for equality constraints; see the sketch after this sublist);
Reduced-gradient and gradient-projection methods;
Augmented Lagrangian and projected Lagrangian methods;
Barrier methods (for inequality constraints);
Interior-point methods (for inequality constraints);
Sequential linearly constrained programming;
Sequential quadratic programming;
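To illustrate one of these ideas, here is a minimal Python sketch of
the quadratic penalty method for a single equality constraint. The
inner solver (plain gradient descent with a mu-dependent step size),
the test problem, and all parameter values are illustrative
assumptions, not a recommended implementation.

import numpy as np

def penalty_method(f, gf, c, gc, x0, mu=1.0, outer=8):
    x = x0.astype(float)
    for _ in range(outer):
        # Minimize the penalized objective P(x) = f(x) + (mu/2) c(x)^2
        # by crude gradient descent, warm-started from the previous x.
        gP = lambda x, mu=mu: gf(x) + mu * c(x) * gc(x)
        t = 0.05 / mu                  # step size shrinks as mu grows
        for _ in range(2000):
            x = x - t * gP(x)
        mu *= 10.0                     # tighten the penalty
    return x

# Example: minimize x + y subject to x^2 + y^2 = 2;
# the solution is x = y = -1 with Lagrange multiplier 1/2.
f  = lambda x: x[0] + x[1]
gf = lambda x: np.array([1.0, 1.0])
c  = lambda x: x[0]**2 + x[1]**2 - 2.0
gc = lambda x: 2.0 * x
x = penalty_method(f, gf, c, gc, np.array([-0.5, -1.5]))
print(x, c(x))                         # approx. (-1, -1), c(x) near 0

The penalized subproblems become increasingly ill-conditioned as mu
grows, which is one motivation for the augmented Lagrangian methods
listed above.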
- More topics in optimization:
Convexity;
Linear programming and the simplex method;
Quadratic programming;
Duality;
Nonlinear least-squares problems;
Variational calculus;
Nonsmooth optimization;
Dynamic optimization and the maximum principle of Pontryagin;
Dynamic programming and the Hamilton-Jacobi-Bellman equation;
Neural networks and the backpropagation algorithm;
Stochastic optimization;
Simulated annealing (see the sketch after this list);
Genetic algorithms;
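As a final illustration, here is a minimal Python sketch of simulated
annealing on a one-dimensional function with many local minima; the
proposal distribution, the cooling schedule, and all parameters are
illustrative assumptions.

import math, random

def simulated_annealing(f, x0, T=1.0, cooling=0.995, steps=5000):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(steps):
        y = x + random.gauss(0.0, 0.5)   # random neighbor proposal
        fy = f(y)
        # Metropolis criterion: always accept downhill moves, and
        # accept uphill moves with probability exp(-(fy - fx) / T).
        if fy < fx or random.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling                     # geometric cooling schedule
    return best, fbest

# Example: a 1-D Rastrigin-type function with local minima near every
# integer and its global minimum f(0) = 0 at x = 0.
f = lambda x: x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0
print(simulated_annealing(f, x0=4.0))    # best x typically near 0

Unlike the descent methods above, simulated annealing accepts
occasional uphill moves, which lets it escape local minima at the
cost of many function evaluations.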
As many of these topics as possible (though of course not all!) will be covered during the semester.
Grading procedures: One mid-term examination (30%),
one final examination (35%), and homework (35%).