
Pattern Information Processing
(Sugiyama Masashi)


Tue 3-4 Session, W831

Credits: Lecture 2, Exercise 0, Experiment 0 / Course code: 76013
Updated: 2012/6/20
Spring Semester

Purpose of lecture
Inferring an underlying input-output dependency from input and output examples is called supervised learning. This course focuses on a statistical approach to supervised learning and introduces its basic concepts as well as state-of-the-art techniques.
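As a concrete illustration of this supervised-learning setting (and of the least-squares and regularization topics listed in the plan below), the following minimal Python sketch fits a kernel model to noisy input-output examples by regularized least squares. The data, the Gaussian-kernel basis, and the regularization parameter are illustrative assumptions, not material taken from the course itself.

```python
import numpy as np

# Illustrative training data: noisy observations of an underlying input-output dependency.
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(x.shape)

# Linear-in-parameters model with Gaussian kernel basis functions centered at the training inputs.
def design_matrix(x_query, centers, width=1.0):
    return np.exp(-(x_query[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

Phi = design_matrix(x, x)

# Regularized least-squares solution: minimize ||Phi w - y||^2 + lam * ||w||^2.
lam = 0.1
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

# Predict at new inputs using the learned weights.
x_test = np.linspace(-3, 3, 200)
y_pred = design_matrix(x_test, x) @ w
print("training mean squared error:", np.mean((Phi @ w - y) ** 2))
```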
Plan of lecture
1. Introduction
2. Statistical Formulation of Supervised Learning
3. Linear, Kernel, and Non-Linear Models
4. Least-Squares Learning
5. Weighted Least-Squares Learning
6. Regularization Learning
7. Sparse Learning
8. Robust Learning
9. Error Back-Propagation Algorithm
10. Cross-Validation
11. Input-Dependent Estimation of Generalization Error
12. Active Learning
13. Concluding Remarks and Future Prospects
Textbook and reference
None. Handouts are distributed if necessary.
Related and/or prerequisite courses
Probability and Statistics, Pattern Recognition, Advanced Data Analysis
Evaluation
Small reports related to machine learning and students' projects.
Comments from lecturer
Statistical machine learning is an interdisciplinary subject with a wide range of applications. Students are expected not only to learn the basic foundations of machine learning but also to apply the acquired knowledge to their own research topics.
Office Hours
Anytime, when available.
