2020 Mathematical Optimization: Theory and Algorithms

Academic unit or major
Graduate major in Mathematical and Computing Science
Fukuda Mituhiro  Yamashita Makoto 
Class Format
Lecture    (ZOOM)
Media-enhanced courses
Day/Period(Room No.)
Mon3-4(W834)  Thu3-4(W834)

Course description and aims

This course covers the basic notions needed to understand gradient-based methods for the convex optimization problems that arise in mathematical optimization, machine learning, and image processing. It starts from the basics, beginning with the definition of convex sets, and gradually focuses on continuously differentiable convex functions. Along the way, it also covers the characterization of solutions of optimization problems (optimality conditions) and numerical methods for general problems, such as steepest descent methods, Newton's method, conjugate gradient methods, and quasi-Newton methods. In the latter part, Nesterov's accelerated gradient method for convex functions with Lipschitz continuous gradients is treated in detail.
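The steepest descent method named above can be sketched in a few lines. A minimal sketch, assuming an L-smooth objective so that the constant step size 1/L is valid; the quadratic test problem and the function names are illustrative choices, not course material:

```python
import numpy as np

def steepest_descent(grad_f, x0, step, tol=1e-8, max_iter=1000):
    """Gradient descent with a fixed step size (1/L for an L-smooth f)."""
    x = x0
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is nearly zero
            break
        x = x - step * g
    return x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x with gradient A x - b;
# its unique minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A).max()       # Lipschitz constant of the gradient
x_star = steepest_descent(lambda x: A @ x - b, np.zeros(2), 1.0 / L)
```

For this strongly convex quadratic the iterates converge linearly, with a rate governed by the condition number of A.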

Student learning outcomes

Objectives: Learn from the basics the mathematical concepts and notions needed for numerical methods for convex optimization problems. Definitions and proofs of theorems will be explained carefully. The objective is to understand the role of the basic theorems of convex optimization in scientific articles, and to be prepared to apply them to other problems in mathematical optimization and machine learning.
Theme: The first part introduces important theorems for analyzing convex optimization problems. The second part explains, from a mathematical point of view, Nesterov's accelerated gradient method, which has received much attention in recent years.
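The accelerated method referred to above can be sketched as follows. A minimal sketch, assuming an L-smooth convex objective and the classical momentum schedule t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2; the test problem and function names are illustrative, not taken from the course:

```python
import numpy as np

def nesterov_agm(grad_f, L, x0, max_iter=2000):
    """Nesterov's accelerated gradient method for an L-smooth convex f.

    The momentum schedule below yields the O(1/k^2) worst-case bound on
    the objective gap, versus O(1/k) for plain gradient descent.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(max_iter):
        x_next = y - grad_f(y) / L                        # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x; minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A).max()
x_hat = nesterov_agm(lambda x: A @ x - b, L, np.zeros(2))
```

The only change relative to steepest descent is that the gradient step is taken at the extrapolated point y rather than at x, which is what improves the worst-case rate.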


Keywords

Convex function, algorithm analysis, convex optimization problem, numerical methods in optimization, differentiable convex functions with Lipschitz continuous gradients, accelerated gradient method

Competencies that will be developed

Specialist skills, Intercultural skills, Communication skills, Critical thinking skills, Practical and/or problem-solving skills

Class flow

All lectures will include proofs of theorems and explanations of the concepts behind each method or definition.

Course schedule/Required learning

Class 1: Convex sets and related results (the grading criteria will be explained)
Class 2: Lipschitz continuous differentiable functions
Class 3: Optimality conditions for differentiable functions
Class 4: Minimization algorithms for unconstrained optimization problems; the steepest descent method
Class 5: Newton's method, conjugate gradient methods, and quasi-Newton methods
Class 6: Convex differentiable functions
Class 7: General assignment to check comprehension
Class 8: Differentiable convex functions with Lipschitz continuous gradients
Class 9: Worst-case analysis of gradient-based methods
Class 10: Steepest descent method for differentiable convex functions
Class 11: Estimate sequences in accelerated gradient methods for differentiable convex functions
Class 12: Accelerated gradient method for differentiable convex functions
Class 13: Accelerated gradient methods for min-max problems
Class 14: Extensions of the accelerated gradient methods

Out-of-Class Study Time (Preparation and Review)

To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterwards (including assignments) for each class.
They should do so by referring to textbooks and other course material.



Reference books, course materials, etc.

D. P. Bertsekas, Nonlinear Programming, 2nd edition, (Athena Scientific, Belmont, Massachusetts, 2003).
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, (Kluwer Academic Publishers, Boston, 2004).
Y. Nesterov, Lectures on Convex Optimization, 2nd edition, (Springer, Cham, Switzerland, 2018).
J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, (Springer, New York, 2006).

Assessment criteria and methods

Students should understand the basic theorems on convex sets and convex functions, and the basic numerical methods for solving mathematical optimization problems. Grades will be based either on mid-term and final exams or on reports assigned during the course.

Related courses

  • MCS.T506 : Mathematical Models and Computer Science
  • IEE.A430 : Numerical Optimization

Prerequisites (i.e., required knowledge, skills, courses, etc.)

Attendees must have a basic knowledge of mathematics, such as linear algebra and calculus, and be familiar with basic proof techniques.
