Although it may be so obvious that we are rarely aware of it, substances in the physical world have attributes that humans can quantify, such as weight and length. The academic disciplines of physics, chemistry, and biology have been developed to discuss nature quantitatively and objectively by focusing on these attributes. Can a comparable discipline be developed for "information", which exists in the abstract world? One answer is "information theory": information can be discussed quantitatively and objectively by focusing on the "code length" required to record and transmit it. Specifically, lecture topics will include information source modeling, self-information and entropy, source coding, and channel coding.
Attainment target: By the end of the course, students will be able to handle "information" quantitatively using notions of information quantity.
Theme: The purpose of this course is for students to grasp the following three topics: 1) notions of information quantities, such as self-information, entropy, joint entropy, conditional entropy, and mutual information, 2) elements of source coding, and 3) elements of channel coding.
self-information, entropy, mutual information, source coding, channel coding
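As a concrete illustration of the information quantities listed above, here is a minimal Python sketch (not part of the course materials) that computes self-information and entropy in bits; the function names and example probabilities are illustrative assumptions, not notation from the course.

```python
import math

def self_information(p):
    """Self-information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p) of a source, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(self_information(0.5))   # a fair coin flip carries exactly 1 bit
print(entropy([0.9, 0.1]))     # biased source: less than 1 bit, it is predictable
print(entropy([0.25] * 4))     # uniform over 4 symbols: exactly 2 bits
```

Note how the entropy of the biased source falls below that of a fair coin: the more predictable a source is, the fewer bits per symbol are needed on average, which is the idea behind source coding.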
Intercultural skills | Communication skills | ✔ Specialist skills | Critical thinking skills | ✔ Practical and/or problem-solving skills |
We use both blackboards and a PC projector.
Class | Course schedule | Required learning
---|---|---
Class 1 | What is information theory? | Know overview of information theory |
Class 2 | Models of information source | Understand statistical properties of information sources and representative source models |
Class 3 | Entropy (1): Derivation of entropy, properties of entropy | Understand definition of entropy, derivation of related equations and inequalities. |
Class 4 | Entropy (2): Extension of entropy, entropy of Markov information source | Understand entropy of extended source, joint entropy, conditional entropy, entropy of Markov information source |
Class 5 | Data compression/source coding | Understand examples of coding and decoding, desired properties, code tree and prefix condition |
Class 6 | Kraft inequality and lower bound of average code length | Understand the Kraft inequality and the lower bound of average code length |
Class 7 | Source coding theorem | Understand Shannon code and Fano code, source coding theorem, Huffman code |
Class 8 | Exercise and summary of the first part of the course | Solve exercise problems covering the contents of classes 1–7. |
Class 9 | Models of communication channel. Mutual information | Know statistical properties of communication channels and representative channel models. Understand the relationship between mutual information and communication. |
Class 10 | Channel capacity. Channel coding/error correcting codes | Understand evaluation of communication capacity for representative models. Understand channel coding and its relevant parameters, decoding and error rates |
Class 11 | Asymptotic equipartition property and typical sequences | Understand asymptotic equipartition property and typical set, typical set and source coding theorem, jointly typical set |
Class 12 | Channel coding theorem (I) | Understand the channel coding theorem |
Class 13 | Channel coding theorem (II) | Understand the converse theorem to the channel coding theorem |
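The source-coding topics of classes 5–8 can be illustrated together in a short Python sketch (again illustrative, not course material): it builds binary Huffman codeword lengths for an assumed dyadic source, checks the Kraft inequality, and compares the average code length with the entropy lower bound.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, tiebreaker, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = itertools.count(len(probs))
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least likely subtrees;
        p2, _, s2 = heapq.heappop(heap)   # every merge adds one bit to their codewords
        for sym in s1 + s2:
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]          # an assumed dyadic source
lengths = huffman_lengths(probs)
kraft = sum(2 ** -l for l in lengths)      # Kraft inequality: sum of 2^-l <= 1
avg = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(lengths, kraft, avg, H)              # for a dyadic source, avg equals H
```

Because the probabilities here are powers of 1/2, the Huffman code meets the entropy bound exactly; for a general source, the source coding theorem guarantees H ≤ average length < H + 1.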
Slides for lectures will be distributed via OCW-i.
Shigeichi Hirasawa, Introduction to Information Theory, Baifukan publishing (Japanese) ISBN: 978-4-563-01486-5
Thomas M. Cover and Joy A. Thomas, Elements of information theory (2nd Edition), John Wiley & Sons, Inc ISBN: 978-0-471-24195-9
Students' knowledge of information quantities, skills for handling them, and understanding of applications such as data compression and channel coding will be assessed. Midterm and final exams 70%, exercise problems 30%.
Students must have successfully completed both Fundamentals of Probability I (MCS.T212) and Mathematical Statistics (MCS.T223) or have equivalent knowledge.