This course covers the fundamentals of convex optimization. We will discuss the mathematical underpinnings, modeling (how to set up optimization problems for different applications), and algorithms.

Instructor: Justin Romberg

Download the syllabus

Go to Piazza

Course Notes

Notes 1, introduction and example (see also: intro slides)
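
To give a flavor of the modeling theme mentioned above, here is a minimal sketch of how a small convex problem can be posed and solved in code. The data are synthetic, and the cvxpy package is used only as one convenient illustration; the course is not tied to any particular software.

    import numpy as np
    import cvxpy as cp

    # Synthetic data for an illustrative problem: recover x from noisy measurements b = A x + noise.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true + 0.1 * rng.standard_normal(30)

    # Model: least squares with a norm-ball constraint -- a simple convex program.
    x = cp.Variable(10)
    objective = cp.Minimize(cp.sum_squares(A @ x - b))
    constraints = [cp.norm(x, 2) <= 10]
    problem = cp.Problem(objective, constraints)
    problem.solve()

    print("optimal value:", problem.value)
    print("estimate:", x.value)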

I. Convexity
Notes 2, convex sets
Notes 3, convex functions

II. Unconstrained Optimization
Notes 4, optimality conditions, iterative descent methods, line search
Notes 5, gradient descent
Notes 6, accelerated first-order methods
Notes 7, Newton’s method and quasi-Newton methods
Notes 8, nonsmooth optimization, subgradients and subdifferentials
Notes 9, proximal algorithms
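
As a quick illustration of the flavor of this part (not a substitute for the notes above), here is a minimal gradient descent sketch with backtracking line search on a made-up strongly convex quadratic.

    import numpy as np

    def gradient_descent(f, grad, x0, step=1.0, beta=0.5, tol=1e-8, max_iter=500):
        """Gradient descent with a backtracking (Armijo) line search."""
        x = x0.astype(float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            t = step
            # Backtrack until the sufficient-decrease condition holds.
            while f(x - t * g) > f(x) - 0.5 * t * np.dot(g, g):
                t *= beta
            x = x - t * g
        return x

    # Illustrative quadratic: f(x) = 0.5 x^T Q x - b^T x, minimized where Q x = b.
    Q = np.array([[3.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -2.0])
    f = lambda x: 0.5 * x @ Q @ x - b @ x
    grad = lambda x: Q @ x - b

    x_star = gradient_descent(f, grad, x0=np.zeros(2))
    print("gradient descent:", x_star)
    print("direct solve:   ", np.linalg.solve(Q, b))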

III. Constrained Optimization
Notes 10, geometric conditions for solving constrained problems
Notes 11, KKT conditions
Notes 12, Lagrange duality
Notes 13, interlude on duality: support functions, Fenchel conjugates, and Fenchel duality
(these notes were significantly reworked over spring break; the original version is here)
Notes 14, algorithms for constrained optimization I
Notes 15, alternating primal-dual methods
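
In the same spirit, a minimal projected gradient sketch for a made-up constrained problem (nonnegative least squares); the step size choice and data are purely illustrative.

    import numpy as np

    def projected_gradient(grad, project, x0, step, max_iter=1000):
        """Projected gradient method: gradient step, then projection onto the feasible set."""
        x = x0.astype(float)
        for _ in range(max_iter):
            x = project(x - step * grad(x))
        return x

    # Illustrative problem: minimize 0.5 ||A x - b||^2 subject to x >= 0.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    grad = lambda x: A.T @ (A @ x - b)
    project = lambda x: np.maximum(x, 0.0)   # projection onto the nonnegative orthant

    # A step size of 1/L, with L the largest eigenvalue of A^T A, keeps the iteration stable.
    L = np.linalg.norm(A, 2) ** 2
    x_hat = projected_gradient(grad, project, x0=np.zeros(5), step=1.0 / L)
    print("nonnegative solution:", x_hat)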

IV. Further Topics
Notes 16, distributed optimization
Notes 17, decentralized optimization
Notes 18, convex relaxations
Notes 19, stochastic gradient descent
Notes 20, the minimax theorem and matrix games
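
And as a small illustration of the stochastic gradient descent topic, a sketch on a synthetic least-squares problem; the constant step size is an ad hoc choice for this toy example.

    import numpy as np

    def sgd_least_squares(A, b, epochs=50, step=0.01, seed=0):
        """Stochastic gradient descent for 0.5 ||A x - b||^2, one random row per step."""
        rng = np.random.default_rng(seed)
        m, n = A.shape
        x = np.zeros(n)
        for _ in range(epochs):
            for i in rng.permutation(m):
                # Gradient of the single-sample loss 0.5 (a_i^T x - b_i)^2.
                x -= step * (A[i] @ x - b[i]) * A[i]
        return x

    # Synthetic data: b = A x_true + noise.
    rng = np.random.default_rng(2)
    A = rng.standard_normal((200, 10))
    x_true = rng.standard_normal(10)
    b = A @ x_true + 0.05 * rng.standard_normal(200)

    x_hat = sgd_least_squares(A, b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))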

Homework

Homework 1, due Thursday January 16.
Homework 2, due Thursday January 23. You will need the file hw02_prob06.py.
Homework 3, due Thursday January 30.
Homework 4, due Friday February 14. You will need the file hw04p6.py.
Homework 5, due Friday February 21. You will need the file hw05p6.mat.
Homework 6, due Friday February 28.
Homework 7, due Friday March 14.
Homework 8, due Friday April 4. You will need the files hw08_prob4.py and hw8p5_noisy_signal.mat.
Homework 9, due Friday April 18. You will need the files hw9_grouptest.py, hw9p4.mat, hw9p6.mat, and hw9p7_G.mat.
