This course introduces approaches to analyzing and understanding the principles behind brain function and massively parallel computation. Methods from statistical physics and probabilistic computation are taught, together with programming exercises that confirm the behavior of parallel systems based on these methods.
This course provides basic knowledge of brain computation, its models for engineering applications, and statistical theories for understanding their behavior. Topics include biological neural networks, artificial neural networks, statistical theories of highly parallel computation systems, and programming exercises on parallel computation systems. Basic programming skills are required, as the theories are learned through computer simulation.
01.Introduction to biological neural networks (Neurons and Neural Networks).
02.Introduction to statistical mechanics (Magnetic Systems and Spin Glass Models).
03.How to understand the behavior of highly parallel systems such as the brain (Analogy between Neural Networks and Spin Glass Models).
04.Models of neurons and computer simulation of their behavior.
05.Deterministic models of recurrent neural networks.
06.Computer simulation of deterministic models of recurrent neural networks.
07.Probabilistic models of recurrent neural networks.
08.Computer simulation of probabilistic models of recurrent neural networks.
09.Theoretical analysis of probabilistic models of recurrent neural networks.
10.Application of a recurrent neural network for solving simultaneous equations.
11.Application of a recurrent neural network for solving combinatorial problems (Part 1).
12.Application of a recurrent neural network for solving combinatorial problems (Part 2).
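To give a flavor of the simulation exercises in lectures 05–06, the following is a minimal sketch (an illustrative example, not the course's actual assignment) of a deterministic recurrent neural network of the Hopfield type: a pattern is stored with the Hebbian rule, two units are corrupted, and deterministic asynchronous updates recover the stored pattern.

```python
import numpy as np

# Stored pattern of +/-1 units (hypothetical example data)
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
N = pattern.size

# Hebbian weights: W_ij = x_i * x_j / N, with zero self-connections
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0)

# Corrupt two units of the stored pattern
state = pattern.copy()
state[0] *= -1
state[3] *= -1

# Deterministic asynchronous updates: each unit takes the sign
# of its local field; a few sweeps suffice for this small network
for _ in range(5):
    for i in range(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(np.array_equal(state, pattern))  # the stored pattern is recovered
```

Because each asynchronous sign update never increases the network's energy, the dynamics settle into a fixed point; with only two corrupted units, that fixed point is the stored pattern.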
Introduction to the Theory of Neural Computation, by J. Hertz, A. Krogh, and R.G. Palmer (Westview Press).
Knowledge of basic computer programming
Assignments on computer simulation
This course provides practical techniques in both theoretical analysis and programming