This course covers advanced topics and theory in statistics and machine learning. More specifically, it covers PAC learnability, the VC dimension, statistical properties of training and prediction errors, kernel methods (a class of nonparametric methods), high-dimensional data analysis, and efficient optimization techniques for machine learning.
[Objectives] Statistical science and machine learning are disciplines in which useful information is extracted from data to aid human decision making. Students will learn these methods not merely as received knowledge, but together with the underlying theory that establishes their validity, in order to understand their essence. Students will apply a broad range of techniques to a variety of problems and learn to construct new techniques on their own.
[Topics] Students in this course will learn several of the more advanced techniques of statistical science, with emphasis on their connections to various application fields. We focus in particular on the connection with machine learning, introducing central topics from both statistical science and machine learning.
machine learning, statistics, PAC learning, high dimensional data analysis, kernel method, support vector machine, optimization, convex analysis
|✔ Specialist skills||Intercultural skills||Communication skills||✔ Critical thinking skills||Practical and/or problem-solving skills|
Lectures are given mainly on the blackboard.
|Course schedule||Required learning|
|Class 1||Machine learning and statistics||Get an overview of machine learning and understand its relation to statistics. Review probability theory and some basic probabilistic inequalities.|
|Class 2||Problem setup of statistical learning theory||Understand the problem setup of statistical learning theory. Learn the definitions of the training error, prediction error, Bayes error, and Bayes rule.|
|Class 3||PAC learnability and VC dimension||Learn PAC learning theory to analyze why machine learning works well.|
|Class 4||Fundamental theorem of statistical learning||Learn the Fundamental Theorem of Statistical Learning. Understand the relationship among PAC learnability, VC dimension and uniform convergence property.|
|Class 5||Rademacher complexity||Understand how to bound prediction errors using Rademacher complexity.|
|Class 6||Theory of surrogate losses||Learn the relation between surrogate losses and prediction errors in classification problems.|
|Class 7||Regression analysis: least squares, regularization, and cross-validation||Learn some basic methods for statistical data analysis in regression problems.|
|Class 8||High dimensional sparse regression analysis||Learn sparse regression analysis for high dimensional data.|
|Class 9||Regression analysis and kernel methods||Understand statistical modeling with kernel functions in regression analysis.|
|Class 10||Kernel methods I: positive definite kernels||Learn some properties of kernel functions.|
|Class 11||Kernel methods II: reproducing kernel Hilbert space||Learn a kernel method that is a nonparametric method on reproducing kernel Hilbert space.|
|Class 12||Spline smoothing and kernel methods||Learn the relationship between spline smoothing methods and kernel methods.|
|Class 13||Classification analysis and kernel methods: support vector machine||Learn kernel-based support vector machine for classification problems.|
|Class 14||Computation algorithms for support vector machines||Learn an efficient computation algorithm for support vector machines.|
|Class 15||Multi-class classification||Understand learning algorithms for multi-class classification problems such as error correcting output coding method.|
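To illustrate the kind of method covered in Classes 9–12, the following is a minimal sketch of kernel ridge regression with a Gaussian (RBF) kernel. By the representer theorem, the minimizer of the regularized squared loss over a reproducing kernel Hilbert space is a finite kernel expansion over the training points. The function names and the specific regularization scaling below are illustrative choices, not necessarily those used in the lectures.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gram matrix of the Gaussian kernel k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def kernel_ridge_fit(X, y, lam=0.1, sigma=1.0):
    # Minimize (1/n) sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2 over the RKHS.
    # The representer theorem gives f(x) = sum_i alpha_i k(x_i, x), where
    # alpha solves the linear system (K + lam * n * I) alpha = y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return alpha

def kernel_ridge_predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate the kernel expansion at new points.
    return gaussian_kernel(X_new, X_train, sigma) @ alpha
```

For example, fitting noiseless samples of a sine curve with a small `lam` recovers the curve closely on the training points; increasing `lam` smooths the fit, which connects to the spline-smoothing view in Class 12.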
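Connecting Classes 6, 13, and 14: the hinge loss is a convex surrogate for the 0-1 loss, and minimizing the regularized empirical hinge loss yields the (linear) support vector machine. One simple computation algorithm is Pegasos-style stochastic subgradient descent; the sketch below assumes this method, which may differ from the algorithm presented in class, and the function name is illustrative.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=50, seed=0):
    # Minimize (lam/2)||w||^2 + (1/n) sum_i max(0, 1 - y_i <w, x_i>)
    # (labels y_i in {-1, +1}) by stochastic subgradient descent
    # with the decreasing step size eta_t = 1 / (lam * t).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)
            w *= 1.0 - eta * lam          # subgradient step for the regularizer
            if y[i] * (X[i] @ w) < 1:     # hinge loss is active: add its subgradient
                w += eta * y[i] * X[i]
    return w
```

On well-separated data, `np.sign(X @ w)` then recovers the labels of almost all training points; the hinge surrogate makes the problem convex, which is what the convex-analysis material in the keywords refers to.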
Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
Evaluated by report submission.
No prerequisites, but familiarity with the basics of statistics and probability theory is preferred.
Contact by e-mail in advance.