This course is a graduate- and senior-undergraduate-level introduction to information theory. Information theory is one of the most elegant mathematical theories, with some of the most direct and significant engineering impacts on our lives in the information age. Starting from C. E. Shannon's seminal 1948 paper, information theory has found applications in many areas, including statistics, computer science, biology, and economics. Although the course will attempt to cover as many aspects of information theory as possible, the focus will be on its direct applications in digital communications.

Arguably the most important part of learning information theory is learning a new way of thinking about engineering problems. In this sense, the course benefits not only students majoring in communications but also students in other engineering disciplines.

As an advanced course, the lectures proceed with a combination of intuitive thinking and rigorous mathematical treatment. It is hoped that the methodology taught in this course will prove helpful in the research of all attendees.

Prerequisite: As an advanced course, it is assumed that students have sufficient training in rigorous mathematical reasoning. Basic knowledge of probability theory, at the level of ENG 200, is an absolute must. You are encouraged to review and assess your fluency in probability and linear algebra using the following textbook; the short sketch after it gives a flavor of the computations you should find routine.
D. P. Bertsekas and J. N. Tsitsiklis, Introduction to Probability, 2nd Edition, Athena Scientific, 2008.
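The following self-check is an illustrative sketch only (the numbers are made up, and it is not part of the official course material): if computations like these feel routine, your probability and linear algebra background should be adequate.

import numpy as np

# Bayes' rule: P(disease | positive test) for a hypothetical test with
# a 1% prior, 95% sensitivity, and 90% specificity.
prior = 0.01
sensitivity = 0.95
specificity = 0.90
p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive) = {posterior:.4f}")  # approximately 0.0876

# Linear algebra: eigenvalues of a symmetric matrix.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
print("eigenvalues:", np.linalg.eigvalsh(A))  # [1. 3.]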
Lecture Hours: Tuesdays and Thursdays, 13:40-16:30, Room 207
Office Hours: Tuesdays and Thursdays, 10:00-12:00
Textbook:
Elements of Information Theory, Second Edition, Thomas M. Cover and Joy A. Thomas, Wiley, 2006.
Reference Texts:
1. Information Theory and Reliable Communication, Robert G. Gallager, Wiley, 1968.
2. Information Theory, Inference and Learning Algorithms, David J. C. MacKay, Cambridge University Press, 2003.

Tentative Calendar
Lecture 1. Introduction, Entropy, Mutual Information
Lecture 2. Jensen’s Inequality, Log-Sum Inequality, Data Processing Inequality, Fano’s Inequality
Lecture 3. Markov Chains
Lecture 5. Entropy Rate, Hidden Markov Models
Lecture 6. Asymptotic Equipartition Property
Lecture 7. Data Compression, Kraft Inequality, Optimal Codes
Lecture 8. Huffman Codes, Shannon-Fano-Elias Coding, Arithmetic Coding
Lecture 9. Channel Capacity, Symmetric Channels
Lecture 10. Channel Coding Theorem
Lecture 11. Hamming Codes, Feedback Capacity, Joint Source Channel Coding Theorem
Lecture 12. Differential Entropy
Lecture 13. Gaussian Channel, Band-Limited Channels
Lecture 14. Parallel Gaussian Channels, Channels with Colored Gaussian Noise
Lecture 15. Gaussian Channels with Feedback
Lecture 16. Maximum Entropy Distributions, Spectrum Estimation, Entropy Rates of a Gaussian Process, Burg’s Theorem
Lecture 17. Quantization, Rate Distortion Function
Lecture 18. Rate Distortion Theorem
Lecture 19. Gaussian Multiple User Channels, Multiple Access Channel
Lecture 20. Encoding of Correlated Sources, Slepian-Wolf Encoding and Multiple Access Channels
Lecture 21. Broadcast Channel
Lecture 22. Relay Channel
Lecture 23. Source Coding and Rate Distortion with Side Information
Lecture 24. General Multi-Terminal Networks
Lecture 25. Law of Large Numbers, Universal Source Coding
Lecture 26. Large Deviation Theory, Sanov’s Theorem, Conditional Limit Theorem
Lecture 27. Hypothesis Testing, Stein’s Lemma, Chernoff Bound
Lecture 28. Lempel-Ziv Coding, Cramér-Rao Inequality
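As a small taste of the quantities introduced in Lecture 1, the following sketch (illustrative only; the joint distribution is made up) computes the entropy and mutual information of a pair of binary random variables, in bits.

import numpy as np

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A made-up joint pmf p(x, y) over two binary random variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())
print(f"H(X) = {entropy(p_x):.3f} bits")  # 1.000
print(f"I(X;Y) = {mi:.3f} bits")          # approximately 0.278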