This course covers advanced topics in statistics and machine learning. More specifically, it covers a nonparametric approach known as the kernel method, statistical properties of training and prediction errors, generalization error bounds based on Rademacher complexity, and recent deep learning algorithms.
[Objectives] Statistical science and machine learning are disciplines that extract useful information from data to aid human decision making. Students will learn the methodology not merely as knowledge, but together with the background theory, including the validity of the methods, in order to understand their essence. Students will learn to apply a broad range of techniques to a variety of problems and to construct new techniques on their own.
[Topics] Students in this course will learn several of the more advanced techniques of statistical science, with attention to their connections to various application fields. We focus in particular on the connection with machine learning, introducing central topics from both statistical science and machine learning.
machine learning, statistics, kernel methods, prediction error, Rademacher complexity, deep learning
|Intercultural skills||Communication skills||Specialist skills||Critical thinking skills||Practical and/or problem-solving skills|
Lectures are given mainly on the blackboard.
|Course schedule||Required learning|
|Class 1||Regression analysis and kernel methods||Understand statistical modeling with kernel functions in regression analysis.|
|Class 2||Kernel methods I: positive definite kernels||Learn some properties of kernel functions.|
|Class 3||Kernel methods II: reproducing kernel Hilbert space||Learn a kernel method that is a nonparametric method on reproducing kernel Hilbert space.|
|Class 4||Spline smoothing and kernel methods||Learn the relationship between spline smoothing methods and kernel methods.|
|Class 5||Classification analysis and kernel methods: support vector machine||Learn kernel-based support vector machine for classification problems.|
|Class 6||Multi-class classification||Understand learning algorithms for multi-class classification problems such as error correcting output coding method.|
|Class 7||Kernel embedding methods||Learn kernel embedding methods and their applications to various statistical inference problems.|
|Class 8||Problem setup of statistical learning theory||Understand the problem setup of statistical learning theory. Learn definitions of training errors, prediction errors, Bayes errors and Bayes rules.|
|Class 9||Inequalities in Probability Theory||Review the probability theory and understand some probabilistic inequalities.|
|Class 10||Prediction error bound for a finite hypothesis class||Understand the relationship between the complexity of the model and the prediction error for a finite hypothesis class.|
|Class 11||Rademacher Complexity||Learn the Rademacher complexity of the model.|
|Class 12||Generalization Error bound using Rademacher Complexity||Understand how to calculate prediction errors using Rademacher complexity.|
|Class 13||Theory of surrogate losses||Learn the relationship between surrogate losses and prediction errors in classification problems.|
|Class 14||Computation algorithms for support vector machines||Learn an efficient computation algorithm for support vector machines.|
|Class 15||Generative Adversarial Networks (GANs)||Learn the algorithm and statistical properties of Generative Adversarial Networks (GANs), which are used as generative models for images.|
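As a concrete illustration of the kernel regression material in Classes 1-4, the sketch below fits Gaussian-kernel ridge regression with NumPy. All function names and the regularization convention (lam * n * I added to the Gram matrix) are illustrative choices, not part of the course materials.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq_dist = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dist)

def kernel_ridge_fit(X, y, lam=0.01, gamma=1.0):
    """Solve the dual system (K + lam * n * I) alpha = y."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    """Predict with f(x) = sum_i alpha_i k(x_i, x)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a noisy sine curve (synthetic data for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = kernel_ridge_fit(X, y, lam=0.01, gamma=1.0)
y_hat = kernel_ridge_predict(X, alpha, X)
```

By the representer theorem (covered in the reproducing kernel Hilbert space lectures), the minimizer of the regularized empirical risk lies in the span of the kernel functions centered at the training points, which is why only the dual coefficients `alpha` need to be computed.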
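The empirical Rademacher complexity discussed in Classes 11-12 can be estimated by Monte Carlo when the hypothesis class is finite: draw i.i.d. sign variables sigma_i, take the supremum of the correlation (1/n) * sum_i sigma_i h(x_i) over hypotheses h, and average over draws. The following sketch (all names are hypothetical) checks the estimate against a class of two constant hypotheses, for which the complexity is E|sum_i sigma_i| / n, approximately sqrt(2 / (pi * n)).

```python
import numpy as np

def empirical_rademacher(H_outputs, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_S(H) = E_sigma[ max_h (1/n) * sum_i sigma_i * h(x_i) ],
    where H_outputs is an (|H|, n) array of hypothesis outputs on the sample S."""
    rng = np.random.default_rng(seed)
    _, n = H_outputs.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
        total += np.max(H_outputs @ sigma) / n   # supremum over the finite class
    return total / n_draws

# Two constant hypotheses h(x) = +1 and h(x) = -1 on a sample of size n.
n = 100
H = np.vstack([np.ones(n), -np.ones(n)])
r = empirical_rademacher(H)
# Here max_h sum_i sigma_i h(x_i) = |sum_i sigma_i|, so R_S(H) ~ sqrt(2 / (pi * n)).
```

This quantity is exactly what enters the generalization error bound of Class 12: with high probability, the prediction error exceeds the training error by at most a term of order R_S(H) plus a concentration term.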
Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press, 2014.
Students are evaluated based on submitted reports.
It is preferred that students know the basics of statistics and probability theory.
Contact by e-mail in advance.