Preliminary Announcement

Methods of Optimization
Instructor:
Professor Andrej V. Cherkaev, 

Department of Mathematics
Office: JWB 225 
Email: cherk@math.utah.edu
Tel : +1 801 - 581 6822 
Fax: +1 801 - 581 4148 

Search for the Perfection: an image from the Bridgeman Art Library

The course is addressed to senior undergraduate and graduate students in
Applied Mathematics, Science, and Engineering.

The desire for optimality (perfection) is inherent in humans. The search for extremes inspires mountaineers, scientists, mathematicians, and the rest of the human race. A beautiful and practical mathematical theory of optimization (that is, of search-for-optimum strategies) has been developed since the sixties, when computers became available. Every new generation of computers allows us to attack new types of problems and calls for new methods. The goal of the theory is the creation of reliable methods to catch the extremum of a function by an intelligent arrangement of its evaluations (measurements). This theory is vitally important for modern engineering and planning, which incorporate optimization at every step of a complicated decision-making process.

This course discusses classical direct search-for-optimum methods, such as the Golden Mean, Conjugate Gradients, and the Modified Newton Method, as well as methods for constrained optimization, including Linear and Quadratic Programming, and others. We will also briefly review genetic algorithms, which mimic evolution, and stochastic algorithms, which account for uncertainties in mathematical models. The course work includes several homework assignments that ask students to implement the studied methods, and a final project that will also be presented orally in class.

The textbook Practical Optimization by Philip Gill, Walter Murray, and Margaret H. Wright is interesting and readable (a British colleague of mine recommended it as "a bloody good book"); the authors are among the top experts in the field. The book discusses the pros and cons of various methods, often in the context of specific applications. The review part of the course will also use the instructor's notes.

Prerequisite: Calculus, ODE, elementary programming.

To learn more about the course, please visit the instructor's website at
www.math.utah.edu/~cherk and follow the online instructions.



Introduction to Optimization

Everyone who has studied calculus knows that an extremum of a smooth function is attained at a stationary point, where its gradient vanishes. Some may also remember the Weierstrass theorem, which states that a continuous function on a closed bounded domain attains its minimum and its maximum. Does this mean that the problem is solved?

A small thing remains: to actually find that maximum. This problem is the subject of optimization theory, which deals with algorithms for the search for the extremum. More precisely, we are looking for an algorithm that approaches a proximity of this maximum, while we are allowed to evaluate the function (to measure it) at only a finite number of points. Below are some links to mathematical societies and groups in optimization; they testify to how popular optimization theory is today: many hundreds of groups around the globe are working intensively on it.
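As a minimal illustration of searching by a finite number of evaluations, here is a sketch of the simplest possible strategy, a uniform grid search (the function, interval, and grid size below are hypothetical, chosen only for the example):

```python
def grid_search_max(f, a, b, n=1000):
    """Approximate the maximizer of f on [a, b] using n + 1 evaluations."""
    best_x, best_val = a, f(a)
    for i in range(1, n + 1):
        x = a + (b - a) * i / n
        val = f(x)
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: f(x) = -(x - 2)^2 attains its maximum at x = 2.
x_star, f_star = grid_search_max(lambda x: -(x - 2) ** 2, 0.0, 4.0)
```

Of course, the intelligent methods discussed in the course get by with far fewer evaluations than a blind grid; this sketch only fixes the rules of the game.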





Optimization and Modeling

  • The modeling of the optimized process is conducted along with the optimization. Inaccuracy of the model is emphasized in an optimization problem, since optimization usually pushes the control parameters to the edge, where the model may fail to accurately describe the prototype. For example, when a linearized model is optimized, the optimum often corresponds to an infinite value of the linearized control. On the other hand, the roughness of a model should not be viewed as a negative factor, since the simplicity of a model is as important as its accuracy. Recall the joke about the most accurate geographical map: it is drawn at the 1:1 scale.
  • Unlike models of physical phenomena, optimization models depend critically on the designer's will. Firstly, different aspects of the process are emphasized or neglected depending on the optimization goal. Secondly, it is not easy to set the goal and the specific constraints for the optimization.
  • Naturally, one wants to produce more goods at the lowest cost and with the highest quality. To optimize the production, one may constrain the cost and the quality at some levels and maximize the quantity; or constrain the quantity and the quality and minimize the cost; or constrain the quantity and the cost and maximize the quality. There is no way to avoid the difficult choice of the values of the constraints. Mathematical tricks go no further than: "Better to be healthy and wealthy than poor and ill." True, but still not too exciting.
  • It may seem that maximizing the profit solves the problem by applying a universal criterion. Still, short-term and long-term profits require very different strategies; and it is necessary to assign a level of insurance, to account for possible market variations, and so on.
  • Sometimes the solution of an optimization problem shows unexpected features: for example, an optimal trajectory may zigzag infinitely often. Such behavior points to an unexpected, but optimal, property of the solution. It should not be rejected as a mathematical extravagance, but thought through!
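The choice between the formulations of the production problem can be sketched in a few lines of code. The designs and numbers below are entirely hypothetical; the point is only that swapping the objective and the constraints generally selects different solutions:

```python
# Hypothetical production designs: (quantity, cost, quality) triples.
designs = [
    (100, 50, 0.90),
    (120, 70, 0.85),
    (150, 95, 0.80),
    (90, 40, 0.95),
]

def max_quantity(designs, cost_cap, quality_min):
    """Constrain cost and quality, maximize quantity."""
    feasible = [d for d in designs if d[1] <= cost_cap and d[2] >= quality_min]
    return max(feasible, key=lambda d: d[0]) if feasible else None

def min_cost(designs, quantity_min, quality_min):
    """Constrain quantity and quality, minimize cost."""
    feasible = [d for d in designs if d[0] >= quantity_min and d[2] >= quality_min]
    return min(feasible, key=lambda d: d[1]) if feasible else None

# The two formulations pick different designs from the same list:
best_q = max_quantity(designs, cost_cap=70, quality_min=0.85)
best_c = min_cost(designs, quantity_min=100, quality_min=0.85)
```

Either answer is "optimal"; which one is wanted is a decision the designer, not the mathematics, must make.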




Assumptions of the Theory

  • Smoothness, or sometimes convexity, of the function is assumed a priori. Without some assumptions, no rational algorithm can be suggested. For example, it is impossible to suggest a nontrivial algorithm for choosing the oldest person from an alphabetical telephone directory. Search methods approximate -- directly or indirectly -- the behavior of the function in the neighborhood of the measurements. Various methods assume different types of approximation.
  • My maximum is higher than your maximum! The optimized function may have more than one local maximum. Generally, there is no way to predict the behavior of the function everywhere in the permitted domain. Most methods pilot the search to a local maximum without a guarantee that this maximum is also the global one.
  • Several classical optimization problems are of special interest: the maximum of a one-dimensional unimodal function, the mean-square approximation, and linear and quadratic programming. These problems and their modifications serve as testing grounds for optimization algorithms and are used to evaluate them.
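The first of these classical problems, maximizing a one-dimensional unimodal function, is solved by the golden-section search mentioned among the course topics. A minimal sketch (the test function below is a hypothetical example):

```python
import math

def golden_section_max(f, a, b, tol=1e-6):
    """Maximize a unimodal function f on [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # The maximum lies in [x1, b]; reuse x2 as the new left probe.
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
        else:
            # The maximum lies in [a, x2]; reuse x1 as the new right probe.
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
    return (a + b) / 2

# f(x) = x * exp(-x) is unimodal on [0, 5] with its maximum at x = 1.
x_star = golden_section_max(lambda x: x * math.exp(-x), 0.0, 5.0)
```

Each iteration shrinks the bracket by the factor 1/phi at the cost of a single new evaluation, which is what makes the method efficient for unimodal functions.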


Classification of Optimization Problems

To explain how knotty optimization problems are, one may try to classify them. I like the optimization tree by NEOS. Here are some of my comments and examples of optimization problems. In any practical problem, the researcher meets a unique combination of the factors mentioned and has to decide what numerical tools to use or modify to reach the goal. Therefore, optimization always involves creativity and intuition. It is said that optimization belongs to both science and art.


Related Links

Below are several links to interesting popular and tutorial materials on optimization theory on the Internet. There are hundreds more.
Please send me more links: cherk@math.utah.edu.
  • Personal websites
  • Institutions and Societies
