This course focuses on algorithms for solving convex optimization problems, a topic that has recently attracted considerable attention in continuous optimization. The course begins with basic theoretical results, after which well-known algorithms are analyzed and discussed.
Algorithms for solving large-scale convex optimization problems have recently become an important topic in continuous optimization. This lecture provides the basic mathematical tools needed to understand these algorithms, with a focus on computational aspects of solving large-scale problems.
1. Convex sets and related results
2. Properties of Lipschitz continuously differentiable functions
3. Optimality conditions for differentiable functions
4. Complexity analysis of algorithms for minimizing unconstrained functions
5. Properties of convex differentiable functions
6. Worst cases for gradient-based methods
7. Steepest descent methods for differentiable convex and differentiable strongly convex functions
8. Accelerated gradient methods
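As a brief illustration of topics 7 and 8, the following is a minimal sketch (not course material, just an assumed-standard formulation) of fixed-step steepest descent and Nesterov's accelerated gradient method, applied to a strongly convex quadratic. The step size 1/L, where L is the Lipschitz constant of the gradient, is the usual textbook choice.

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    # Steepest descent with fixed step: x_{k+1} = x_k - step * grad(x_k)
    x = x0.copy()
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov_accelerated(grad, x0, step, iters):
    # Nesterov's accelerated gradient for convex f with L-Lipschitz gradient,
    # using the classical t_k momentum sequence.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - step * grad(y)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Example (hypothetical data): minimize f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose unique minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of the gradient
x0 = np.zeros(2)

x_gd = gradient_descent(grad, x0, 1.0 / L, 200)
x_agd = nesterov_accelerated(grad, x0, 1.0 / L, 200)
x_star = np.linalg.solve(A, b)
print(x_gd, x_agd, x_star)
```

On this well-conditioned example both methods reach the minimizer quickly; the course's complexity analysis makes precise why the accelerated variant improves the worst-case rate from O(1/k) to O(1/k^2) on convex problems.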
D. P. Bertsekas, Nonlinear Programming, 2nd edition, (Athena Scientific, Belmont, Massachusetts, 2003).
D. G. Luenberger and Y. Ye, Linear and Nonlinear Programming, 3rd edition, (Springer, New York, 2008).
O. L. Mangasarian, Nonlinear Programming, (SIAM, Philadelphia, PA, 1994).
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, (Kluwer Academic Publishers, Boston, 2004).
J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, (Springer, New York, 2006).
Basic knowledge of linear algebra, calculus, topology, and computational complexity is required.
Final exam and/or reports.