2016 Statistical Theories for Brain and Parallel Computing

Academic unit or major
Graduate major in Information and Communications Engineering
Kumazawa Itsuo 
Day/Period(Room No.)
Tue1-2(G224)  Fri1-2(G224)  

Course description and aims

Summary: This course introduces attempts to analyze and understand the principles behind brain function and massively parallel computation. Methods from statistical physics and probabilistic computation are presented, together with programming exercises that confirm the behavior of parallel systems based on these methods.
Purpose: To apply theories from statistical mechanics to the analysis and design of highly parallel systems, and to write programs for neural networks and apply them to various problems.

Student learning outcomes

Students acquire basic knowledge of brain computation and models of its massively parallel computation, and master the mathematical and statistical theories used to understand highly parallel systems. Statistical and probabilistic computation techniques are learned in order to program highly parallel computation systems. Through programming exercises on parallel computation systems, the effectiveness of the theories is confirmed and programming skills are improved.


Brain, Neural Network, Parallel Computation, Statistical Mechanics, Optimization Problem, Learning

Competencies that will be developed

Specialist skills Intercultural skills Communication skills Critical thinking skills Practical and/or problem-solving skills

Class flow

Lectures and programming exercises are organized to study the theories and to examine their performance and effectiveness on practical problems.

Course schedule/Required learning

  Course schedule / Required learning
Class 1  Introduction to biological neural networks (neurons and neural networks)  Study biological neural networks and how they work.
Class 2  Artificial models of the neuron (deterministic and probabilistic models; binary and continuous models)  By analogy with biological neural networks, engineering models of neural networks are introduced.
Class 3  Artificial models of neural networks (recurrent and feed-forward models)  Recurrent models and feed-forward models are studied as typical neural network architectures.
Class 4  Introduction to statistical mechanics (magnetic systems and spin glass models)  Statistical mechanics is studied as a theoretical foundation for understanding recurrent models.
Class 5  How to understand the behavior of a highly parallel system like the brain: analogy between neural networks and spin glass models  By analogy with spin glass models, parallel computation in recurrent models is studied. The concept of energy is introduced into the parallel computation model of the brain.
Class 6  Energy minimization by deterministic models of recurrent neural networks  It is proven that the energy is reduced by the deterministic models of recurrent neural networks.
Class 7  Analysis of neural computation by Boltzmann's theory  Theoretical analysis of the probabilistic computation model of the brain is conducted using methods developed in statistical mechanics.
Class 8  Computer simulation of deterministic and probabilistic models of recurrent neural networks  Computer simulation of deterministic and probabilistic models is conducted to confirm the capability of recurrent neural networks.
Class 9  Application of a recurrent neural network to solving simultaneous equations and the Four Queens problem  Simultaneous equations and optimization problems are implemented on recurrent neural networks and shown to be solved by their energy minimization capabilities.
Class 10  Tips for efficient computation: ergodicity and automatic determination of connection weights  Use of the ergodicity property is studied to improve the computational efficiency of recurrent neural networks. Ways of determining the weights and thresholds needed to solve a given problem are studied.
Class 11  Mathematical basis for learning in feed-forward neural networks: gradient algorithms  Learning methods for feed-forward neural networks are introduced and their mathematical bases in gradient algorithms are studied.
Class 12  Learning of a single neuron  As a linear classifier, a single neuron works as an element of a support vector machine or of an ensemble learning system, and is trained by various learning methods.
Class 13  Learning of a multi-layer neural network  Learning methods for multi-layer neural networks are studied as a basis of deep learning.
Class 14  Learning by the back-propagation algorithm  The traditional learning method for feed-forward neural networks, back-propagation, and its mathematics are studied.
Class 15  Computer simulation of the back-propagation algorithm  Computer simulation of the back-propagation algorithm is conducted to confirm its capability.
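The energy-descent property covered in Classes 5-8 can be sketched with a small simulation. The network size, random weights, and variable names below are illustrative choices, not from the course materials; the update rule is the standard asynchronous deterministic rule for a recurrent (Hopfield-style) network with symmetric weights and zero diagonal, under which the energy never increases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small recurrent network (sizes and weights are illustrative).
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2            # symmetric connection weights
np.fill_diagonal(W, 0.0)     # no self-connections
theta = rng.normal(size=n)   # thresholds
s = rng.choice([-1.0, 1.0], size=n)  # binary unit states

def energy(s):
    # E = -1/2 s^T W s + theta . s
    return -0.5 * s @ W @ s + theta @ s

energies = [energy(s)]
for sweep in range(10):
    for i in range(n):               # asynchronous deterministic update
        h = W[i] @ s - theta[i]      # local field at unit i
        s[i] = 1.0 if h >= 0 else -1.0
    energies.append(energy(s))

# With symmetric W and zero diagonal, each single flip cannot raise the energy,
# so the recorded energies form a non-increasing sequence.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

Each flip changes the energy by -(s_new - s_old) * h_i, which is never positive when the unit is set to the sign of its local field; this is the proof sketched in Class 6.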
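The probabilistic (Boltzmann) computation of Class 7 replaces the deterministic rule with a stochastic flip whose probability follows the Boltzmann distribution at a temperature T; cooling T slowly (simulated annealing) lets the network escape shallow minima before settling. The energy, cooling schedule, and sizes below are illustrative assumptions, not the course's exercise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative spin-glass-style energy with random symmetric weights.
n = 10
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
s = rng.choice([-1.0, 1.0], size=n)

def energy(s):
    return -0.5 * s @ W @ s

# Stochastic updates: P(s_i = +1) is a sigmoid of the local field scaled by 1/T.
for T in np.geomspace(5.0, 0.01, 200):   # slow cooling schedule
    for i in range(n):
        h = W[i] @ s                      # local field
        arg = np.clip(2.0 * h / T, -500, 500)  # avoid overflow at low T
        p_up = 1.0 / (1.0 + np.exp(-arg))
        s[i] = 1.0 if rng.random() < p_up else -1.0

# Finish with deterministic sweeps until the state stops changing.
changed = True
while changed:
    changed = False
    for i in range(n):
        new = 1.0 if W[i] @ s >= 0 else -1.0
        if new != s[i]:
            s[i], changed = new, True

# At the resulting fixed point, no single flip can lower the energy further.
assert all(s[i] == (1.0 if W[i] @ s >= 0 else -1.0) for i in range(n))
```

As T approaches zero the sigmoid sharpens into the deterministic sign rule, which is why the annealed network ends in a local minimum of the energy.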


"Learnig and Neural Netwoks" written by Itsuo Kumazawa and published by Morikita Publishing Company is used as an text book. As the book is written in Japanese, it is used partially and as an assist for the lecture that is given in English.

Reference books, course materials, etc.

"Introduction to the Theory of Neural Computation", written by J. Hertz, A. Krogh and R.G. Palmer and published by Westview Press.

Assessment criteria and methods

Assignments on the implementation of what is studied in the course. Submitted reports on computer simulations, and the discussion of the simulation results, are evaluated.

Related courses

  • ICT.M202 : Probability and Statistics (ICT)
  • ICT.S302 : Functional Analysis and Inverse Problems
  • ICT.H318 : Foundations of Artificial Intelligence (ICT)
  • ICT.P204 : Basic Computer Programming (ICT)
  • ICT.M306 : Concrete Mathematics

Prerequisites (i.e., required knowledge, skills, courses, etc.)

Programming experience in any computer language, or with applications such as MATLAB or Mathematica, is required to conduct computer simulations of the various methods taught in this course.
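As a taste of the kind of simulation exercise expected, here is a minimal sketch of gradient-descent learning for a single sigmoid neuron (the topic of Classes 11-12), trained on the AND function. The data, learning rate, and step count are illustrative assumptions; AND is linearly separable, so a single neuron can classify it correctly.

```python
import numpy as np

# Train a single sigmoid neuron on the AND function by gradient descent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0., 0., 0., 1.])   # AND targets

rng = np.random.default_rng(1)
w = rng.normal(scale=0.1, size=2)  # weights
b = 0.0                            # bias
lr = 0.5                           # learning rate (illustrative)

def forward(X):
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

for step in range(5000):
    y = forward(X)
    grad = y - t                   # dL/dz for cross-entropy loss
    w -= lr * X.T @ grad / len(X)
    b -= lr * grad.mean()

pred = (forward(X) > 0.5).astype(int)  # expect [0 0 0 1]
```

The same loop, applied layer by layer with the chain rule, is the back-propagation algorithm studied in Classes 14-15.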

Contact information (e-mail and phone)    Notice: please replace "[at]" with "@" (half-width character).


Office hours

10:00-19:00 on weekdays
