2022 Information Theory


Academic unit or major
Undergraduate major in Mathematical and Computing Science
Instructor(s)
Takabe Satoshi 
Class Format
Lecture    (Face-to-face)
Media-enhanced courses
Day/Period(Room No.)
Tue1-2(S224)  Fri1-2(S224)  
Group
-
Course number
MCS.T333
Credits
2
Academic year
2022
Offered quarter
3Q
Syllabus updated
2022/4/20
Lecture notes updated
-
Language used
Japanese

Course description and aims

The “information” we handle in daily life is not usually a physical entity. Information theory nevertheless treats information mathematically by quantifying it. Moreover, it establishes the mathematically achievable bounds for data compression and data transmission using the notions of entropy and mutual information. These results have had a large influence on modern information processing and transmission technologies.
The aim of this lecture is to understand the relation of information theory to these technologies by learning three basic elements of information theory: (1) basic concepts such as entropy, (2) source coding for data compression, and (3) channel coding for data transmission.
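As a quick illustration of how information is quantified (a minimal sketch, not part of the official course materials), the Shannon entropy of a discrete distribution can be computed in a few lines of Python:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of information per toss;
# a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

Entropy is maximized by the uniform distribution, which matches the intuition that the most unpredictable source carries the most information per symbol.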

Student learning outcomes

Students will learn to treat information mathematically through the following:
(1) basic concepts such as entropy and mutual information
(2) source coding for data compression
(3) channel coding for data transmission

Keywords

self-information, entropy, mutual information, source coding, channel coding

Competencies that will be developed

Specialist skills, Intercultural skills, Communication skills, Critical thinking skills, Practical and/or problem-solving skills

Class flow

Students will learn each idea, theorem, and proof from the lecture slides. They will also practice solving problems.

Course schedule/Required learning

  Course schedule | Required learning
Class 1: What is information theory? | Learn the basics of information theory and the communication model.
Class 2: Entropy and its properties | Understand entropy and its various relationships.
Class 3: Mutual information | Learn mutual information and the KL divergence.
Class 4: Markov chains and the entropy rate | Learn Markov chains and the related entropy rate.
Class 5: Data compression/source coding | Learn the basic concepts of source coding for data compression.
Class 6: Source coding theorem | Understand the source coding theorem as the mathematical achievable bound of data compression.
Class 7: Huffman coding | Learn Huffman coding as an example of source coding.
Class 8: Channels and capacity | Learn the channel as a model of data transmission and its capacity.
Class 9: Channel coding theorem | Understand the channel coding theorem as the mathematical achievable bound of data transmission.
Class 10: Typical sequences and proof of the channel coding theorem | Understand the proof of the channel coding theorem based on typical sequences.
Class 11: Linear codes | Learn linear codes as basic examples of error-correcting codes.
Class 12: Maximum likelihood decoding | Learn maximum likelihood decoding for linear codes.
Class 13: Information theory for continuous variables | Learn information theory for continuous random variables.
Class 14: Summary and outlook | Summarize the concepts covered in this lecture and preview some concepts of advanced information theory.
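To preview the kind of construction covered in Class 7, the following is a minimal Huffman coding sketch in Python (an illustrative assumption by way of example, not a handout from the course; the function name and data layout are this sketch's own choices):

```python
import heapq

def huffman_code(freqs):
    """Build a prefix code from a dict mapping symbols to weights."""
    # Each heap entry: (weight, tiebreak index, {symbol: codeword}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lightest subtrees.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# For these dyadic probabilities the codeword lengths match -log2(p):
# 1, 2, 3, and 3 bits respectively.
```

For dyadic source distributions like this one, the average codeword length exactly equals the source entropy, which is the bound given by the source coding theorem of Class 6.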

Out-of-Class Study Time (Preparation and Review)

Students are encouraged to spend approximately 100 minutes preparing for class and another 100 minutes reviewing class content afterwards (including assignments) for each class by referring to textbooks and other course material.

Textbook(s)

Slides for lectures will be distributed.

Reference books, course materials, etc.

Jun-ichi Inoue, Beginners Guide: Information Theory, Pleiades publishing (Japanese), ISBN: 978-4-903814-17-9
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory (2nd Edition), John Wiley & Sons, Inc., ISBN: 978-0-471-24195-9

Assessment criteria and methods

Students' knowledge of information quantities, their skills in handling them, and their understanding of applications such as data compression and channel coding will be assessed through report assignments.

Related courses

  • MCS.T212 : Fundamentals of Probability
  • MCS.T223 : Mathematical Statistics
  • MCS.T312 : Markov Analysis
  • MCS.T332 : Data Analysis

Prerequisites (i.e., required knowledge, skills, courses, etc.)

Students must have successfully completed both Fundamentals of Probability (MCS.T212) and Mathematical Statistics (MCS.T223), or have equivalent knowledge.
