### 2017 Information Theory


Undergraduate major in Mathematical and Computing Science

Instructor(s): Kabashima Yoshiyuki, Watanabe Sumio
Class format: Lecture (media-enhanced course)
Day/Period (Room): Tue 5-6 (W834), Fri 5-6 (W834)
Group: -
Course number: MCS.T333
Credits: 2
Academic year: 2017
Offered quarter: 4Q
Syllabus updated: 2017/11/15
Lecture notes updated: -
Language used: Japanese

### Course description and aims

Though it may be so obvious that we are normally not even aware of it, objects in the real world naturally have attributes, such as weight and length, that can be measured physically. Disciplines such as physics, chemistry, and biology were developed to discuss nature quantitatively and objectively by focusing on these attributes. Can a similar discipline be developed for "information", which exists only in the abstract world? One answer is "information theory". By focusing on the "code length" required to record and transmit information, information can be discussed quantitatively and objectively. Specifically, lecture topics include information source modeling, self-information and entropy, source coding, and channel coding.

### Student learning outcomes

Attainment target: By the end of the course, students will be able to handle "information" quantitatively using notions of information quantity.
Theme: The purpose of this course is for students to grasp the following three topics: 1) notions of information quantities, such as self-information, entropy, joint entropy, conditional entropy, and mutual information; 2) the elements of source coding; and 3) the elements of channel coding.
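The information quantities named above can be illustrated with a short Python sketch. The joint distribution below is a made-up example for illustration only, not course material; it computes entropy, joint entropy, and mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a distribution given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution of (X, Y) as {(x, y): probability} -- an illustrative example.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())        # H(X)
h_y = entropy(py.values())        # H(Y)
h_xy = entropy(joint.values())    # joint entropy H(X,Y)
mi = h_x + h_y - h_xy             # mutual information I(X;Y)

print(f"H(X)   = {h_x:.4f} bits")
print(f"H(X,Y) = {h_xy:.4f} bits")
print(f"I(X;Y) = {mi:.4f} bits")
```

Because the example distribution is symmetric, both marginals are uniform, so H(X) = H(Y) = 1 bit, while the positive mutual information reflects the correlation between X and Y.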

### Keywords

self-information, entropy, mutual information, source coding, channel coding

### Competencies that will be developed

• ✔ Specialist skills
• Intercultural skills
• Communication skills
• Critical thinking skills
• ✔ Practical and/or problem-solving skills

### Class flow

We use both blackboards and a PC projector.

### Course schedule/Required learning

| Class | Topic | Required learning |
|---|---|---|
| 1 | What is information theory? | Overview of information theory |
| 2 | Models of information sources | Statistical properties of information sources, representative models |
| 3 | Entropy (1): derivation and properties of entropy | Definition of entropy, derivation of related equations and inequalities |
| 4 | Entropy (2): extensions of entropy, entropy of Markov information sources | Entropy of extended sources, joint entropy, conditional entropy, entropy of Markov information sources |
| 5 | Data compression / source coding | Examples of coding and decoding, desired properties, code trees and the prefix condition |
| 6 | Kraft inequality and the lower bound on average code length | Kraft inequality, lower bound on average code length |
| 7 | Source coding theorem | Shannon codes and Fano codes, source coding theorem, Huffman codes |
| 8 | Exercise problems and summary of the first part of the course | Solve exercise problems covering the contents of classes 1–7; test level of understanding and self-evaluate achievement |
| 9 | Models of communication channels | Statistical properties of channel models, representative channel models |
| 10 | Mutual information and channel capacity | Relationship between mutual information and communication, evaluation of channel capacity for representative models |
| 11 | Channel coding / error-correcting codes | Channel coding and its relevant parameters, decoding and error rates |
| 12 | Asymptotic equipartition property and typical sequences | Asymptotic equipartition property and typical sets, typical sets and the source coding theorem, jointly typical sets |
| 13 | Channel coding theorem (I) | Derivation of the channel coding theorem |
| 14 | Channel coding theorem (II) | Proof of the converse to the channel coding theorem |
| 15 | Linear codes | Mechanism of error detection/correction, linear codes |
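Classes 5–7 build up from the Kraft inequality to the source coding theorem. As an illustrative sketch (not course material; the source distribution is made up), the following Python code constructs the code lengths of a binary Huffman code and checks both the Kraft inequality and the bound H(X) ≤ L < H(X) + 1 on the average code length L:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Code lengths of a binary Huffman code for the given symbol probabilities."""
    counter = itertools.count()  # tie-breaker so heapq never compares symbol lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside them
        # moves one level deeper, so its code length grows by one.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]  # illustrative source distribution
lengths = huffman_lengths(probs)

kraft = sum(2.0 ** -l for l in lengths)              # Kraft sum: must be <= 1
avg_len = sum(p * l for p, l in zip(probs, lengths)) # average code length L
h = -sum(p * math.log2(p) for p in probs)            # source entropy H(X)

print("code lengths:", lengths)
print(f"Kraft sum = {kraft:.4f}, L = {avg_len:.4f}, H(X) = {h:.4f}")
```

For this distribution the Huffman code lengths are (1, 2, 3, 3), the Kraft sum is exactly 1 (the code tree is full), and the average length 1.9 bits sits between the entropy and entropy plus one, as the source coding theorem guarantees.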

### Textbook(s)

Slides for lectures will be distributed via OCW-i.

### Reference books, course materials, etc.

• Shigeichi Hirasawa, Introduction to Information Theory, Baifukan (in Japanese). ISBN: 978-4-563-01486-5
• Thomas M. Cover and Joy A. Thomas, Elements of Information Theory (2nd edition), John Wiley & Sons. ISBN: 978-0-471-24195-9

### Assessment criteria and methods

Students will be assessed on their knowledge of information quantities, their skills in handling them, and their understanding of applications such as data compression and channel coding. Midterm and final exams: 80%; exercise problems: 20%.

### Related courses

• MCS.T212: Fundamentals of Probability
• MCS.T223: Mathematical Statistics
• MCS.T312: Markov Analysis
• MCS.T332: Data Analysis

### Prerequisites (i.e., required knowledge, skills, courses, etc.)

Students must have successfully completed both Fundamentals of Probability (MCS.T212) and Mathematical Statistics (MCS.T223), or have equivalent knowledge.