In the first half, this course reviews the fundamentals of information theory learned in undergraduate courses, and teaches several information measures and their properties. Then, taking up the general information source and the general channel, which are the most general models of a source and a channel assuming neither stationarity nor ergodicity, the source and channel coding theorems are demonstrated.
In the second half, after reviewing the fundamentals of coding theory, students compare the computational complexity required for encoding and decoding in order to understand the importance of efficient error correction. Students learn the definition and analysis of low-density parity-check codes and the sum-product algorithm, and how the algorithm is derived. As applications, students learn the design method of capacity-achieving codes, coding for channels with memory, and data compression.
By the end of this course, students will be able to
1) Understand various information measures and their properties, and use mathematical models for information communication networks.
2) Understand the coding theorems for the general source and the general channel, and acquire the basic methods to apply them.
3) Understand the theory of error-correcting codes that can be implemented with low computational complexity, and acquire their design methods.
information theory, general source, general channel, source coding theorem, channel coding theorem, random number generation, rate-distortion theory, multi-terminal information theory, sum-product algorithm, low-density parity-check code, performance analysis method, coding for channels with memory, data compression
|✔ Specialist skills||Intercultural skills||Communication skills||Critical thinking skills||✔ Practical and/or problem-solving skills|
The instructor explains certain topics in each class. Towards the end of each class, students are given exercises or report problems related to that day's material.
|Course schedule||Required learning|
|Class 1||Introduction of information theory||Review information theory.|
|Class 2||Coding problems for general information sources||Explain various information measures and their properties.|
|Class 3||ε-coding problem for general information sources||Explain the general source and the source coding theorem.|
|Class 4||Coding problem for memoryless channels||Explain the ε-coding problem and its fundamental theorem.|
|Class 5||Coding problem for general channels||Explain the memoryless channel and corresponding channel coding theorem.|
|Class 6||Random coding exponent for channel coding||Explain the general channel and corresponding coding theorem.|
|Class 7||Mid-term examination||Review all the contents covered so far.|
|Class 8||Introduction of coding theory||Explain why a bounded-distance decoder does not achieve the capacity.|
|Class 9||Decoding and computational complexity, sum-product algorithm||Explain decoding, computational complexity, and the sum-product algorithm.|
|Class 10||An application of the sum-product algorithm to the decoding problem for linear codes||Explain an application of the sum-product algorithm to the decoding problem for linear codes.|
|Class 11||Definition of low-density parity-check codes||Explain the definition of low-density parity-check codes.|
|Class 12||Properties of low-density parity-check codes||Explain the properties of low-density parity-check codes.|
|Class 13||Design method of capacity-achieving codes||Explain the design method of capacity-achieving codes.|
|Class 14||Coding for channels with memory and compression||Explain coding for channels with memory and data compression.|
To enhance effective learning, students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterwards (including assignments) for each class.
They should do so by referring to textbooks and other course material.
Classes 1-7: All materials used in class can be found on OCW-i.
Classes 8-15: Course materials are provided during class.
T. S. Han, Information Spectrum Method in Information Theory, Springer, 2003.
T. Richardson and R. Urbanke, Modern Coding Theory, Cambridge University Press, 2008.
Mid-term exam: 50%
Final exam: 30%
Homework assignments: 20%