This course is composed of lectures and exercises. The lecture part first introduces the linear programming problem, the most fundamental mathematical optimization problem, and then overviews the simplex method and the theoretical aspects of linear programming. Topics on nonlinear optimization include optimality conditions, the steepest descent method, and the interior-point method. In the exercise part, students apply the simplex method to linear programming problems to understand its computational steps, and work through parts of the mathematical proofs underlying optimization methods.
Mathematical optimization is a mathematical approach to finding an optimal candidate from a set of candidates that satisfy given conditions. It is closely related to scientific problems and is widely employed to solve practical problems. For example, the diet problem, in which we seek a food recipe that minimizes calories while providing sufficient nutrients, is a classic mathematical optimization problem. In this course, students learn both theoretical and computational aspects of optimization methods.
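As a concrete sketch of the diet problem mentioned above, the fragment below formulates a tiny instance as a linear program (minimize calories subject to nutrient lower bounds) and finds the optimum by enumerating vertices of the feasible region. The foods, nutrient values, and requirements are invented for illustration, and vertex enumeration stands in for the simplex method only because it is short enough to show here; both rest on the same fact that an optimum of a feasible bounded LP is attained at a vertex.

```python
from itertools import combinations

# Hypothetical diet data (illustrative numbers, not from the course):
# two foods, objective vector = calories per unit, constraints A x >= b.
calories = [200.0, 150.0]          # objective coefficients c
A = [[10.0, 4.0],                  # protein per unit of each food
     [2.0, 8.0]]                   # vitamin per unit of each food
b = [20.0, 16.0]                   # required protein and vitamin

# All constraint boundaries, including non-negativity x_i >= 0.
lines = A + [[1.0, 0.0], [0.0, 1.0]]
rhs = b + [0.0, 0.0]

def solve2x2(p, q, r, s):
    # Intersect the boundary lines p.x = r and q.x = s (Cramer's rule).
    det = p[0] * q[1] - p[1] * q[0]
    if abs(det) < 1e-12:
        return None                # parallel boundaries, no vertex
    x = (r * q[1] - p[1] * s) / det
    y = (p[0] * s - r * q[0]) / det
    return (x, y)

def feasible(pt):
    # A point is feasible if it satisfies every >= constraint.
    return all(l[0] * pt[0] + l[1] * pt[1] >= v - 1e-9
               for l, v in zip(lines, rhs))

# Candidate vertices: feasible intersections of two boundaries.
vertices = []
for i, j in combinations(range(len(lines)), 2):
    pt = solve2x2(lines[i], lines[j], rhs[i], rhs[j])
    if pt is not None and feasible(pt):
        vertices.append(pt)

best = min(vertices, key=lambda p: calories[0] * p[0] + calories[1] * p[1])
print(best)  # amounts of each food at the cheapest feasible vertex
```

Vertex enumeration checks every basic solution, which grows combinatorially; the simplex method studied in Classes 3-5 instead walks from vertex to adjacent vertex, improving the objective at each step.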
At the end of this course, students will be able to:
(1) Solve linear programming problems using the simplex method
(2) Understand theoretical properties of linear programming, such as the duality theorem
(3) Understand the relation between optimal solutions and optimality conditions
(4) Explain the framework of numerical methods for nonlinear optimization problems, such as the steepest descent method and the interior-point method
Linear programming, Simplex method, Duality theorem, Sensitivity analysis, Shortest path problem, Maximum flow problem, Convex function, Optimality conditions for nonlinear optimization problems, Karush-Kuhn-Tucker condition, Steepest descent method, Newton method, Successive quadratic method, Interior-point method.
|✔ Specialist skills||Intercultural skills||Communication skills||Critical thinking skills||✔ Practical and/or problem-solving skills|
The lecture part overviews numerical methods and their theoretical aspects. In the exercise part, exercise problems are assigned: students apply the simplex method to linear programming problems by hand and work through proofs concerning optimization methods. Students submit a report for each exercise class.
|Course schedule||Required learning|
|Class 1||Overview of mathematical optimization, Linear programming||Formulate some problems using the standard form of linear programming.|
|Class 2||Exercise: Linear programming and Simplex method||Solve exercise problems related to linear programming.|
|Class 3||Simplex method||Solve linear programming problems by the simplex method.|
|Class 4||Bland's rule, Two-phase simplex method||Apply Bland's rule and/or two-phase simplex method to linear programming problems.|
|Class 5||Exercise: Two-phase simplex method||Solve exercise problems related to the simplex method and two-phase simplex method.|
|Class 6||Dual problem, Duality theorem||Derive the dual problem of a linear programming problem. Derive theoretical properties from the weak duality theorem.|
|Class 7||Complementary theorem, Sensitivity analysis||Derive an optimal solution using the complementary theorem. Apply the simplex method in a matrix style.|
|Class 8||Exercise: Duality theorem and complementary theorem||Solve exercise problems related to the duality theorem and the complementary theorem.|
|Class 9||Shortest path problem||Analyze the sensitivity of linear programming problems. Apply Dijkstra's method to obtain a shortest path.|
|Class 10||Maximum flow problem and question time||Apply the Ford-Fulkerson algorithm to solve the maximum flow problem.|
|Class 11||Exercise: Shortest path problem and Maximum flow problem||Solve exercise problems related to the shortest path problem and the maximum flow problem.|
|Class 12||Unconstrained optimization and Convex function||Formulate some problems as nonlinear optimization problems. Understand the definitions of convex sets and convex functions.|
|Class 13||Exercise: Convex function||Solve exercise problems related to convex functions.|
|Class 14||Optimality condition for unconstrained optimization problem||Solve exercise problems related to convexity and optimality conditions.|
|Class 15||Steepest descent method||Derive a proof of the convergence of the steepest descent method.|
|Class 16||Exercise: Optimality condition and Steepest descent method||Solve exercise problems related to optimality condition and steepest descent method.|
|Class 17||Newton method, Karush-Kuhn-Tucker condition||Explain the framework of the Newton method. Derive relations between the KKT condition and optimal solutions.|
|Class 18||Lagrange function, Duality theorem||Derive relations between Lagrange function and the KKT condition.|
|Class 19||Exercise: Newton method, Karush-Kuhn-Tucker condition, Duality theorem||Solve exercise problems related to the Newton method, the Karush-Kuhn-Tucker condition, and the duality theorem.|
|Class 20||Successive quadratic method||Explain the framework of the successive quadratic method.|
|Class 21||Interior-point method||Explain the framework of the interior-point method.|
|Class 22||Exercise: Successive quadratic method||Solve exercise problems related to the successive quadratic method.|
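As a minimal illustration of the steepest descent method covered in Classes 14-16, the sketch below applies the method with an Armijo backtracking line search to a strictly convex quadratic. The objective function, starting point, and tolerances are illustrative choices rather than course material.

```python
# Sketch of the steepest descent method with Armijo backtracking.
# The objective f(x, y) = (x - 1)^2 + 10 y^2 is a strictly convex
# quadratic whose unique minimizer is (1, 0).

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:    # stop when ||grad f|| is small
            break
        d = [-gi for gi in g]                        # steepest descent direction
        fx = f(x)
        slope = sum(gi * di for gi, di in zip(g, d)) # directional derivative (< 0)
        t = 1.0                                      # Armijo backtracking: halve t
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

f = lambda x: (x[0] - 1) ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * (x[0] - 1), 20 * x[1]]
print(steepest_descent(f, grad, [5.0, 3.0]))  # converges to about [1.0, 0.0]
```

The zigzagging that slows this method on ill-conditioned problems is one motivation for the Newton and successive quadratic methods of Classes 17 and 20.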
To enhance effective learning, students are encouraged to spend a certain length of time outside of class on preparation and review (including for assignments), as specified by the Tokyo Institute of Technology Rules on Undergraduate Learning (東京工業大学学修規程) and the Tokyo Institute of Technology Rules on Graduate Learning (東京工業大学大学院学修規程), for each class.
They should do so by referring to textbooks and other course material.
None required. Parts of the course materials are based on the reference books below.
・ Mathematical Optimization, Takahito Kuno, Maiko Shigeno, Junnya Goto, Ohmsha, 2012 (in Japanese)
・ Optimization Methods, Akihisa Tamura, Masakazu Muramatsu, Kyouritsu-shuppan, 2002 (in Japanese)
・ Linear and Nonlinear Optimization, Igor Griva, Stephen G. Nash, Ariela Sofer, SIAM
・ Linear Programming, Vasek Chvatal, Freeman, 1983
・ Linear Optimization, Glenn H. Hurlbert, Springer, 2010
・ Introductory Lectures on Convex Optimization, Yurii Nesterov, Kluwer Academic Publishers, 2004
Students will be assessed on their understanding of the simplex method for linear programming problems, the duality theorem, network optimization problems, and the numerical methods and optimality conditions for nonlinear optimization problems.
Students' course scores are based on the final exam (80%) and exercise reports (20%).
(In the case that the final exam cannot be held, a final report will be used instead.)
Students must have studied Linear Algebra and Its Applications (MCS.T203) or have equivalent knowledge.
(FY2022) The final exam will be given in person during the examination period. Depending on the COVID-19 situation, it may instead be given as an online examination or replaced by a final report.