2019 Mathematical Optimization: Theory and Algorithms

Academic unit or major
Graduate major in Mathematical and Computing Science
Instructor(s)
Fukuda Mituhiro  Yamashita Makoto 
Class Format
Lecture     
Media-enhanced courses
Day/Period(Room No.)
Mon3-4(S422)  Thu3-4(S422)  
Group
-
Course number
MCS.T402
Credits
2
Academic year
2019
Offered quarter
3Q
Syllabus updated
2019/3/18
Lecture notes updated
2019/11/25
Language used
English

Course description and aims

This course will cover the basic notions needed to comprehend gradient-based methods for the convex optimization problems that arise in mathematical optimization, machine learning, and image processing. It starts from the basics, with the definition of convex sets, and gradually focuses on continuously differentiable convex functions. The lectures also cover the characterization of solutions of optimization problems (optimality conditions) and numerical methods for general problems, such as the steepest descent method, Newton's method, conjugate gradient methods, and quasi-Newton methods. In the latter part, Nesterov's accelerated gradient method for differentiable convex functions with Lipschitz continuous gradients will be detailed.
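
As a rough illustration (not part of the official course materials), the sketch below contrasts a constant-step steepest descent method with one common variant of Nesterov's accelerated gradient method on a smooth convex quadratic. The test function, the step size 1/L, and all names here are illustrative assumptions, not the formulations used in the lectures.

# Minimal sketch: steepest descent vs. an accelerated gradient variant
# on f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b (A symmetric).
import numpy as np

def f_grad(A, b, x):
    """Gradient of f(x) = 0.5 x^T A x - b^T x."""
    return A @ x - b

def steepest_descent(A, b, x0, iters=500):
    L = np.linalg.eigvalsh(A).max()        # Lipschitz constant of the gradient
    x = x0.copy()
    for _ in range(iters):
        x -= (1.0 / L) * f_grad(A, b, x)   # constant step 1/L
    return x

def nesterov_accelerated(A, b, x0, iters=500):
    """One common variant of Nesterov's accelerated gradient method."""
    L = np.linalg.eigvalsh(A).max()
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - (1.0 / L) * f_grad(A, b, y)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # extrapolation (momentum) step
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M.T @ M + np.eye(50)               # symmetric positive definite
    b = rng.standard_normal(50)
    x_star = np.linalg.solve(A, b)         # exact minimizer, for comparison
    x0 = np.zeros(50)
    print(np.linalg.norm(steepest_descent(A, b, x0) - x_star))
    print(np.linalg.norm(nesterov_accelerated(A, b, x0) - x_star))

On such an example the accelerated iterates typically approach the minimizer noticeably faster than plain steepest descent, which is the behavior quantified by the worst-case analysis covered in the course.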

Student learning outcomes

Objectives: Learn, from the basics, the mathematical concepts and notions necessary for numerical methods for convex optimization problems. Definitions and proofs of theorems will be carefully explained. The objective is to understand the role of the basic theorems of convex optimization in scientific articles, and to be able to apply them to other problems in mathematical optimization and machine learning.
Theme: In the first part, important theorems for analyzing convex optimization problems will be introduced. In the second part, Nesterov's accelerated gradient method, which has received much attention in recent years, will be explained from a mathematical point of view.
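
For reference, a standard textbook statement of these two parts (one common formulation; the course develops the accelerated method via estimate sequences, which may be presented differently): for a convex function $f$ whose gradient is $L$-Lipschitz continuous, the gradient step with step size $1/L$ satisfies
\[
x_{k+1} = x_k - \frac{1}{L}\nabla f(x_k), \qquad f(x_k) - f(x^*) \le \frac{L\,\|x_0 - x^*\|^2}{2k},
\]
while one common form of Nesterov's accelerated scheme (with $x_{-1} = x_0$ and $t_0 = 1$),
\[
y_k = x_k + \frac{t_{k-1}-1}{t_k}\,(x_k - x_{k-1}), \qquad x_{k+1} = y_k - \frac{1}{L}\nabla f(y_k), \qquad t_k = \frac{1+\sqrt{1+4t_{k-1}^2}}{2},
\]
attains the improved rate $f(x_k) - f(x^*) \le \dfrac{2L\,\|x_0 - x^*\|^2}{(k+1)^2}$.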

Keywords

Convex function, algorithm analysis, convex optimization problem, numerical methods in optimization, differentiable convex functions with Lipschitz continuous gradients, accelerated gradient method

Competencies that will be developed

  • Specialist skills
  • Intercultural skills
  • Communication skills
  • Critical thinking skills
  • Practical and/or problem-solving skills

Class flow

All lectures will include proofs of theorems and explanations of the concepts behind each method or definition.

Course schedule/Required learning

Class 1: Convex sets and related results (Required learning: grading criteria will be explained)
Class 2: Lipschitz continuous differentiable functions
Class 3: Optimality conditions for differentiable functions
Class 4: Minimization algorithms for unconstrained optimization problems
Class 5: Steepest descent method and Newton's method
Class 6: Conjugate gradient methods and quasi-Newton methods
Class 7: General assignment to check comprehension
Class 8: Differentiable convex functions
Class 9: Differentiable convex functions with Lipschitz continuous gradients
Class 10: Worst-case analysis for gradient-based methods
Class 11: Steepest descent method for differentiable convex functions
Class 12: Estimate sequences in accelerated gradient methods for differentiable convex functions
Class 13: Accelerated gradient method for differentiable convex functions
Class 14: Accelerated gradient methods for min-max problems
Class 15: Extensions of the accelerated gradient methods

Textbook(s)

None.

Reference books, course materials, etc.

D. P. Bertsekas, Nonlinear Programming, 2nd edition, (Athena Scientific, Belmont, Massachusetts, 2003).
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, (Kluwer Academic Publishers, Boston, 2004).
Y. Nesterov, Lectures on Convex Optimization, 2nd edition, (Springer, Cham, Switzerland, 2018).
J. Nocedal and S. J. Wright, Numerical Optimization, 2nd edition, (Springer, New York, 2006).

Assessment criteria and methods

Students are expected to understand the basic theorems related to convex sets and convex functions, and the basic numerical methods for solving mathematical optimization problems. Grades will be based on mid-term and final exams, or on reports assigned during the course.

Related courses

  • MCS.T506 : Mathematical Models and Computer Science
  • IEE.A430 : Numerical Optimization

Prerequisites (i.e., required knowledge, skills, courses, etc.)

Attendees must have a basic knowledge of mathematics, such as linear algebra and calculus, and be familiar with basic proof techniques.
