This course will cover the basic notions needed to understand gradient-based methods for convex optimization problems arising in mathematical optimization, machine learning, and image processing. It starts from the basics, with the definition of convex sets, and gradually focuses on continuously differentiable convex functions. The lectures will also cover the characterization of solutions of optimization problems (optimality conditions), and numerical methods for general problems such as the steepest descent method, Newton's method, conjugate gradient methods, and quasi-Newton methods. In the latter part, Nesterov's accelerated gradient method for differentiable convex functions with Lipschitz continuous gradients will be detailed.
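For a concrete picture of the simplest method named above, here is a minimal steepest descent sketch in Python; the quadratic objective, fixed step size, and stopping tolerance are illustrative assumptions, not course material:

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a differentiable function by stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # approximate first-order optimality
            break
        x = x - step * g  # move along the negative gradient direction
    return x

# Illustrative example: minimize f(x) = ||x - 1||^2, whose gradient is 2(x - 1);
# the unique minimizer is the all-ones vector.
x_min = steepest_descent(lambda x: 2 * (x - np.ones(2)), np.zeros(2))
```

The fixed step size is chosen small enough for this well-conditioned example; the course's Class 4 treats step-size selection and convergence analysis properly.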
Objectives: Learn, from the basics, the mathematical concepts and notions necessary for numerical methods for convex optimization problems. Definitions and proofs of theorems will be carefully explained. The objective is to understand the role of the basic theorems of convex optimization in scientific articles, and to be prepared to apply them to other problems in mathematical optimization and machine learning.
Theme: In the first part, important theorems for analyzing convex optimization problems will be introduced. In the second part, Nesterov's accelerated gradient method, which has received much attention in recent years, will be explained from a mathematical point of view.
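As a preview of the method studied in the second part, the sketch below implements the textbook form of Nesterov's accelerated gradient method for an L-smooth convex function (gradient step at an extrapolated point plus momentum update); the quadratic test problem and iteration count are illustrative assumptions:

```python
import numpy as np

def nesterov_agm(grad, x0, L, max_iter=500):
    """Nesterov's accelerated gradient method for a convex function
    with L-Lipschitz continuous gradient (O(1/k^2) objective decrease)."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(max_iter):
        x_new = y - grad(y) / L  # gradient step at the extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Illustrative example: f(x) = x^T A x with A = diag(1, 10), so grad f(x) = 2Ax
# and the gradient is Lipschitz continuous with constant L = 20.
A = np.diag([1.0, 10.0])
x_star = nesterov_agm(lambda x: 2 * A @ x, np.array([5.0, 5.0]), L=20.0)
```

The momentum coefficient (t_k - 1)/t_{k+1} is exactly the quantity analyzed via estimate sequences in Classes 11 and 12.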
Convex function, algorithm analysis, convex optimization problem, numerical methods in optimization, differentiable convex functions with Lipschitz continuous gradients, accelerated gradient method
|Intercultural skills||Communication skills||✔ Specialist skills||Critical thinking skills||Practical and/or problem-solving skills|
All lectures will include proofs of theorems and explanations of the concepts behind each method or definition.
|Course schedule||Required learning|
|Class 1||Convex sets and related results||Grading criteria will be explained|
|Class 2||Differentiable functions with Lipschitz continuous gradients|
|Class 3||Optimality conditions for differentiable functions|
|Class 4||Minimization algorithms for unconstrained optimization problems, steepest descent method|
|Class 5||Newton's method, conjugate gradient methods, quasi-Newton methods|
|Class 6||Differentiable convex functions|
|Class 7||General assignment to check comprehension|
|Class 8||Differentiable convex functions with Lipschitz continuous gradients|
|Class 9||Worst-case analysis of gradient-based methods|
|Class 10||Steepest descent method for differentiable convex functions|
|Class 11||Estimate sequence in accelerated gradient methods for differentiable convex functions|
|Class 12||Accelerated gradient method for differentiable convex functions|
|Class 13||Accelerated gradient methods for min-max problems|
|Class 14||Extensions of the accelerated gradient methods|
D. P. Bertsekas, Nonlinear Programming, 2nd edition, (Athena Scientific, Belmont, Massachusetts, 2003).
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, (Kluwer Academic Publishers, Boston, 2004).
Y. Nesterov, Lectures on Convex Optimization, 2nd edition, (Springer, Cham, Switzerland, 2018).
J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, (Springer, New York, 2006).
Understand the basic theorems related to convex sets and convex functions, and the basic numerical methods for solving mathematical optimization problems. Grades will be based on mid-term and final exams, or on reports assigned during the course.
Attendees should have a basic knowledge of mathematics, such as linear algebra and calculus, and should be familiar with basic proof techniques.