Though it may be so obvious that we rarely notice it, objects in the physical world naturally have attributes, such as weight and length, that can be measured quantitatively. The academic disciplines of physics, chemistry, and biology were developed to discuss nature quantitatively and objectively by focusing on these attributes. Can a similar discipline be developed for "information", which exists only in the abstract world? One answer is "information theory": information can be discussed quantitatively and objectively by focusing on the "code length" required to record and transmit it. Specifically, lecture topics include information source modeling, self-information and entropy, source coding, and channel coding.
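As a brief preview of the idea that information is measured via code length, the sketch below computes the self-information of an outcome and the entropy of a simple source; the specific distributions are hypothetical examples, not part of the course materials.

```python
import math

def self_information(p):
    """Self-information, in bits, of an outcome with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy (bits per symbol) of a discrete distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit; a biased coin carries less,
# which is why its outcomes can be compressed to fewer bits on average.
print(self_information(0.5))   # 1.0 bit
print(entropy([0.5, 0.5]))     # 1.0 bit/symbol
print(entropy([0.9, 0.1]))     # about 0.469 bits/symbol
```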
Attainment target: By the end of the course, students will be able to handle "information" quantitatively by using notions of information quantity.
Theme: The purpose of this course is to cover the following three topics: 1) notions of information quantities, such as self-information, entropy, joint entropy, conditional entropy, and mutual information, 2) elements of source coding, and 3) elements of channel coding.
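To illustrate how these information quantities relate to one another, the following sketch computes mutual information from a joint distribution via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); the joint distribution used here is a hypothetical example.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X, Y in {0, 1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y)
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

H_X  = entropy(px.values())
H_Y  = entropy(py.values())
H_XY = entropy(joint.values())

# Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y)
I = H_X + H_Y - H_XY

# Equivalent form via conditional entropy: I(X;Y) = H(X) - H(X|Y),
# where H(X|Y) = H(X,Y) - H(Y)
assert abs(I - (H_X - (H_XY - H_Y))) < 1e-12
print(I)   # about 0.278 bits
```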
self-information, entropy, mutual information, source coding, channel coding
✔ Specialist skills | Intercultural skills | Communication skills | Critical thinking skills | ✔ Practical and/or problem-solving skills |
Towards the end of each class, students are given exercise problems to solve that relate to that day's lecture.
Class | Course schedule | Required learning |
---|---|---|
Class 1 | What is information theory? | Overview of information theory |
Class 2 | Models of information source | Probabilistic modeling of information source, definition of information quantity, redundancy and coding. |
Class 3 | Entropy (1): Derivation of entropy, properties of entropy | Definition of entropy, derivation of related equations and inequalities |
Class 4 | Entropy (2): Extension of entropy, entropy of Markov information source | Entropy of extended source, joint entropy, conditional entropy, entropy of Markov information source |
Class 5 | Data compression/source coding | Properties of codes, prefix property, Kraft's inequality |
Class 6 | Examples of source coding | Shannon-Fano code, Huffman code |
Class 7 | Source coding theorem | Derivation of source coding theorem |
Class 8 | Test of the level of understanding with exercise problems, and summary of the first part of the course: solve exercise problems covering the contents of classes 1–7 | Test the level of understanding and self-evaluate achievement for classes 1–7 |
Class 9 | Models of communication channel | Probabilistic modeling of communication channels, classification of communication channels |
Class 10 | Mutual information and channel capacity | Definitions of mutual information and channel capacity, evaluation for stationary memoryless channels |
Class 11 | Channel coding/error correcting codes | Definition of channel coding, linear codes |
Class 12 | Channel coding theorem | Derivation of channel coding theorem |
Class 13 | Evaluation of probability of decoding failure | Evaluation of reliability function based on random coding |
Class 14 | Asymptotic equipartition property (1): Asymptotic equipartition property (AEP), typical sequence | Definition of typical sequences, evaluation of the number of typical sequences |
Class 15 | Asymptotic equipartition property (2): Derivation of source/channel coding theorems from AEP | Derivation of source/channel coding theorems from AEP |
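As a preview of the source coding material in classes 5–7, the sketch below builds a binary Huffman code and checks two facts covered in the schedule: any prefix code satisfies Kraft's inequality, and the Huffman code's average length lies between H(X) and H(X) + 1 (the source coding theorem bound). The source distribution is a hypothetical example.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy (bits per symbol) of a discrete distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code (illustrative sketch)."""
    # Heap entries: (probability, unique tiebreaker, merged symbol indices)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # them moves one level deeper, so its codeword grows by 1 bit.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
        count += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]          # hypothetical source distribution
L = huffman_lengths(probs)            # codeword lengths [1, 2, 3, 3]

# Kraft's inequality: sum of 2^(-l_i) <= 1 for any prefix code
assert sum(2.0 ** -l for l in L) <= 1.0

# Source coding theorem bound: H(X) <= average length < H(X) + 1
avg = sum(p * l for p, l in zip(probs, L))
assert entropy(probs) <= avg < entropy(probs) + 1
print(avg)   # 1.9 bits/symbol versus H(X) of about 1.846
```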
Shigeichi Hirasawa, Introduction to Information Theory, Baifukan (in Japanese), ISBN: 978-4-563-01486-5
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory (2nd Edition), John Wiley & Sons, ISBN: 978-0-471-24195-9
Students' knowledge of information quantities, their skills in handling them, and their understanding of applications such as data compression and channel coding will be assessed. Midterm and final exams 80%, exercise problems 20%.
Students must have successfully completed both Fundamentals of Probability I (MCS.T212) and Mathematical Statistics (MCS.T223), or have equivalent knowledge.