2019 Machine Learning (ICT)

Academic unit or major
Undergraduate major in Information and Communications Engineering
Instructor(s)
Kumazawa Itsuo, Nakahara Hiroki
Day/Period (Room No.)
Mon 7-8 (W331), Thu 7-8 (W331)

Course description and aims

Various machine learning techniques and the mathematics needed to understand them are studied, together with programming techniques for putting them into practice. The techniques studied are (1) multi-layer neural networks, (2) convolutional neural networks (CNN), and (3) other popular non-neural machine learning techniques. The mathematics behind these techniques, such as differentiation, gradient descent, the chain rule, backpropagation, and the nonlinear functions used for activation, is studied. Programming techniques and libraries of standard functions for efficient programming and implementation are practiced.
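The gradient descent and chain rule mentioned above can be illustrated with a short sketch. This is a minimal, self-contained example in plain Python; it is not taken from the course materials, and the function f and all names are illustrative.

```python
# A minimal sketch of gradient descent on f(w) = (w - 3)^2,
# whose minimum is at w = 3.  The derivative f'(w) = 2*(w - 3)
# follows from the chain rule; each step moves w against the gradient.

def f(w):
    return (w - 3.0) ** 2

def grad_f(w):
    # derivative of (w - 3)^2 via the chain rule
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad_f(w)   # update rule: w <- w - lr * f'(w)
    return w

w_star = gradient_descent(w0=0.0)   # converges toward 3.0
```

The same update rule, applied to the loss function of a neural network with gradients computed by backpropagation, is the learning procedure studied in the first half of the course.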

Student learning outcomes

Students learn typical machine learning techniques, such as (1) multi-layer neural networks, (2) convolutional neural networks (CNN), and (3) other popular non-neural machine learning techniques, together with their mathematical backgrounds, and implement them by programming.

Keywords

Neural networks, deep learning, convolutional neural network (CNN), backpropagation, gradient descent, minimization of loss functions, Bayes estimation, principal component analysis, boosting, support vector machine, k-means method.

Competencies that will be developed

  • Specialist skills
  • Intercultural skills
  • Communication skills
  • Critical thinking skills
  • Practical and/or problem-solving skills

Class flow

In the first half of the course (7 classes), the basic principles of multi-layer neural networks and machine learning are studied, along with the basic mathematics needed to understand them. The convolutional neural network (CNN), the core technique for deep neural networks, and its learning mechanism are explained theoretically. Programming techniques for deep learning are practiced. Exercises and an examination are held in the eighth class.
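As a small illustration of the kind of programming practiced in these classes, the softmax function, which appears in the output layer of the networks studied here, can be written in a few lines of plain Python. This is an illustrative sketch, not code from the course materials.

```python
import math

def softmax(logits):
    """Map raw scores to a probability distribution over classes."""
    # Subtract the maximum before exponentiating for numerical stability;
    # this does not change the result because the constant cancels.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])   # probabilities summing to 1
```

The outputs sum to 1 and preserve the ordering of the input scores, which is why softmax is paired with cross-entropy loss and logistic regression in the schedule below.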

Course schedule/Required learning

  Course schedule / Required learning
Class 1 Background and a summary of the first half of the course. Biological neural networks and their modelling for engineering. Computation and programming of the models. / Background and basic knowledge
Class 2 Basic mathematics needed for computation, learning, and programming of multi-layer neural networks, part 1 (activation functions, softmax, logistic regression, gradient descent, chain rule). / Study basic mathematical techniques to analyze multi-layer neural networks
Class 3 Basic mathematics needed for computation, learning, and programming of multi-layer neural networks, part 2 (backpropagation and its recursive computation). / Study basic mathematical techniques for backpropagation
Class 4 Programming techniques for multi-layer neural networks and their learning. / Study programming techniques for multi-layer neural networks and their learning
Class 5 Computation of the convolutional neural network (convolution, pooling, softmax, and their roles) and techniques to improve its performance (generalization capability and avoiding overfitting). / Study computation techniques of the convolutional neural network
Class 6 Mathematics for learning of the convolutional neural network (gradient descent and backpropagation). / Study basic mathematics for the convolutional neural network
Class 7 Programming and implementation of learning of the convolutional neural network. / Study programming techniques of the convolutional neural network
Class 8 Exercises on the first half of the course; programming exercise. / Exercise to check understanding of the first half and programming
Class 9 Introduction to machine learning, and Python programming for machine learning.
Class 10 Least squares method, overfitting, sparse learning, and robust learning.
Class 11 Classification programming using the scikit-learn library: logistic regression, support vector machine (SVM), and decision tree.
Class 12 Clustering including the k-means method. Post-/pre-processing for a dataset: L1 regularization, feature measurement, and missing values.
Class 13 Maximum likelihood estimation, EM algorithm, Bayesian inference, and confidence values.
Class 14 Data compression: principal component analysis (PCA), linear discriminant analysis (LDA), and kernel PCA.
Class 15 Ensemble learning: majority voting, random forest, bagging, the bootstrap method, and AdaBoost.
Class 16 Exercises and examination of the course, and a programming exercise.
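As an illustration of one second-half topic, the k-means clustering of Class 12 can be sketched in plain Python. This is an illustrative example, not course code; the deterministic initialization (first k points as centroids) is a simplification, and all names are made up for the sketch.

```python
def kmeans(points, k, iters=50):
    """Lloyd's algorithm for k-means on a list of (x, y) tuples."""
    # Simplified deterministic init: take the first k points as centroids.
    centroids = list(points[:k])
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids

# Two well-separated blobs: the centroids settle near each blob's mean.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers = kmeans(pts, k=2)
```

Real libraries such as scikit-learn add refinements (random restarts, smarter initialization), but the assignment/update loop above is the core of the method.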


Textbook(s)

A textbook covering the mathematical bases of deep learning: Itsuo Kumazawa, "Learning and Neural Networks," Morikita Publishing Co.

Reference books, course materials, etc.

Books for practicing deep learning programming:
  • Takashi Kanemal, "Machine Learning by Raspberry Pi," Kodansha Publishing Co.
  • S. Raschka and V. Mirjalili, "Python Machine Learning (Second Ed.)," Packt Publishing.
  • C. M. Bishop, "Pattern Recognition and Machine Learning," Springer.

Assessment criteria and methods

Grading is based on the final examination and the programming exercise.

Related courses

  • LAS.M102 : Linear Algebra I / Recitation
  • LAS.M101 : Calculus I / Recitation
  • LAS.M105 : Calculus II

Prerequisites (i.e., required knowledge, skills, courses, etc.)

Basic knowledge of differential and integral calculus.
