
Course Description

This course will focus on fundamental subjects in (deterministic) optimization that are connected through the themes of convexity, Lagrange multipliers, and duality.

The aim is to develop the core analytical issues of continuous optimization, duality, and saddle point theory, using a handful of unifying principles that can be easily visualized and readily understood.

The mathematical theory of convex sets and functions will be central, and will allow an intuitive, highly visual, geometrical approach to the subject. This theory will be developed in detail and in parallel with the optimization topics.

  1. Basic Convexity Concepts (4 lectures): Review of linear algebra and real analysis. Convex sets and functions. Convex and affine hulls. Closure, relative interior, and continuity issues. Recession cones.
  2. Convexity and Optimization (4 lectures): Global and local minima. Directions of recession and existence of optimal solutions. Hyperplanes. Elementary form of duality. Saddle points and minimax theory.
  3. Polyhedral Convexity (3 lectures): Polyhedral cones and sets. Extreme points. Polyhedral aspects of optimization. Polyhedral aspects of duality.
  4. Subgradients and Constrained Optimization (3 lectures): Directional derivatives. Subgradients and subdifferentials. Conical approximations. Optimality conditions.
  5. Lagrange Multipliers (3 lectures): Enhanced Fritz John optimality conditions. Informative Lagrange multipliers. Pseudonormality and constraint qualifications. Exact penalty functions.
  6. Lagrangian Duality (2 lectures): Geometric multipliers. Constrained optimization duality. Linear and quadratic programming duality. Strong duality theorems.
  7. Conjugate Duality (2 lectures): Conjugate functions. The Fenchel duality theorem. Exact penalty functions.
  8. Dual Computational Methods (4 lectures): Classical subgradient and cutting plane methods. Incremental subgradient methods. Applications in Lagrangian relaxation and combinatorial optimization.
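To give a flavor of the classical subgradient method listed under topic 8, here is a minimal sketch for the nondifferentiable convex function f(x) = |x - 3|. The objective, starting point, and diminishing step-size rule a_k = 1/(k+1) are illustrative assumptions, not material from the course itself.

```python
def subgradient(x):
    """Return a subgradient of f(x) = |x - 3|.

    f is not differentiable at x = 3; there, any value in [-1, 1]
    is a valid subgradient, and we pick 0.
    """
    if x > 3:
        return 1.0
    if x < 3:
        return -1.0
    return 0.0

# Subgradient iteration x_{k+1} = x_k - a_k * g_k with diminishing,
# non-summable step sizes a_k = 1/(k+1), a standard convergence condition.
x = 0.0
for k in range(10000):
    step = 1.0 / (k + 1)
    x = x - step * subgradient(x)

print(x)  # close to the minimizer x* = 3
```

Unlike gradient descent, the iterates need not decrease f monotonically; convergence relies on the step sizes shrinking, which is why step-size rules are a central issue in the dual methods covered in the lectures.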

This course places emphasis on proofs as well as geometric intuition and understanding. It is more mathematically sophisticated than Linear Programming (6.251) and Nonlinear Programming (6.252), and has some, but not much, overlap with those two courses.


Textbook

Bertsekas, Dimitri P. Convex Analysis and Optimization. Belmont, MA: Athena Scientific, 2003. ISBN: 1886529450.


Prerequisites

A course in linear algebra and a course in real analysis.


Grading

Homework Assignments 40%
Term Paper 60%


Instructor

Prof. D. P. Bertsekas.