This course introduces machine learning theory and statistical mechanics. In the first half, hierarchical learning machines for accurate prediction and knowledge discovery are explained and their mathematical laws are derived. In the second half, a statistical mechanical approximation theory for handling massive information processing is introduced.
The purpose of statistical learning is to estimate the true information source from empirical samples. In this course, several learning machines that have high-dimensional parameters are introduced. Statistical mechanics plays an important role in studying such learning machines. Two other courses, "Theory of statistical mathematics" and "Machine learning", are strongly recommended.
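The estimation goal stated above is commonly quantified by the relative entropy (Kullback-Leibler divergence) between the true distribution and the model; the following standard definition is given here as background (the symbols $q$, $p$, and $w$ are illustrative, not taken from the course materials):

```latex
% Relative entropy between the true source q(x) and a
% parametric model p(x|w) with high-dimensional parameter w:
\[
  D\!\left(q \,\|\, p_w\right)
  = \int q(x)\,\log \frac{q(x)}{p(x \mid w)} \, dx \;\ge\; 0 ,
\]
% with equality iff p(x|w) = q(x) almost everywhere.
% Learning seeks a parameter w that makes D(q || p_w) small,
% using only empirical samples x_1, ..., x_n drawn from q.
```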
Statistics, information theory, statistical mechanics, free energy, entropy
✔ Specialist skills | Intercultural skills | Communication skills | Critical thinking skills | ✔ Practical and/or problem-solving skills |
This course consists of two parts: machine learning theory and statistical mechanics.
Class | Course schedule | Required learning |
---|---|---|
Class 1 | Introduction to statistical learning theory | Statistical learning theory |
Class 2 | Neural network architecture | neural network |
Class 3 | Learning in neural networks | learning in neural networks |
Class 4 | Boltzmann machine | Boltzmann machine |
Class 5 | Deep learning | Deep learning |
Class 6 | Information and relative entropy | Information and relative entropy |
Class 7 | Prediction Theory | Prediction theory |
Class 8 | Discovery theory | Discovery theory |
Class 9 | Monte Carlo methods | Computational difficulty of sampling in high dimensional spaces |
Class 10 | Markov chain Monte Carlo methods | Metropolis-Hastings method |
Class 11 | Advanced Markov chain Monte Carlo methods | Advanced Markov chain Monte Carlo methods |
Class 12 | Variational inference methods | Variational inference methods |
Class 13 | Variational Mixture of Gaussians | Variational Mixture of Gaussians |
Class 14 | Tree graphs and belief propagation | Belief propagation |
Class 15 | Loopy graphs and loopy belief propagation | Loopy belief propagation |
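As a concrete illustration of the sampling topics in Classes 9-10 (the computational difficulty of sampling in high-dimensional spaces and the Metropolis-Hastings remedy), the following is a minimal random-walk Metropolis-Hastings sketch for a one-dimensional target; the function names and step size are illustrative choices, not taken from the course materials:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_p   : unnormalized log-density of the target distribution
    x0      : initial state
    n_steps : number of Markov chain steps
    step    : standard deviation of the Gaussian proposal
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Propose a symmetric Gaussian random-walk move.
        x_new = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(x_new)/p(x));
        # only the ratio matters, so the normalizing constant
        # of the target never needs to be computed.
        if math.log(rng.random()) < log_p(x_new) - log_p(x):
            x = x_new
        samples.append(x)
    return samples

# Target: standard normal, known only up to a constant.
log_p = lambda x: -0.5 * x * x
samples = metropolis_hastings(log_p, x0=0.0, n_steps=20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

The chain's empirical mean and variance should approach 0 and 1, the moments of the standard normal target, even though the density was supplied unnormalized.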
None.
None. Two other lectures, "Theory of statistical mathematics" and "Machine Learning", are strongly recommended.
Reports.
Knowledge of probability theory and statistics is required.