Information theory is the mathematical study of operations on data such as compression, storage, and communication. The goal of this course is to introduce the principles of information and coding theory, including a fundamental understanding of data compression and of reliable communication over noisy channels. The course introduces entropy, mutual information, and channel capacity: the information measures that form the foundation of the mathematical theory of communication.
In this course you will learn about:
- Properties of information measures: entropy and typical sequences
- Lossless source coding: uniquely decodable codes, the lossless source coding theorem, Huffman codes, arithmetic codes, universal lossless compression, and Lempel-Ziv codes
- Channel coding: mutual information and channel capacity, the noisy channel coding theorem, Gaussian channel capacity and water-filling, and principles of error-correcting codes
- Basics of multi-terminal communication: jointly typical sequences
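To give a concrete feel for the quantities the course is built on, here is a minimal sketch in Python of Shannon entropy and of the capacity of a binary symmetric channel, C = 1 - H(p), where H is the binary entropy function and p is the crossover probability. The function names are illustrative, not part of any course material.

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    C = 1 - H(p), in bits per channel use."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin carries one bit of information per flip.
print(entropy([0.5, 0.5]))      # → 1.0
# A channel that flips every bit (p = 1) is as useful as a noiseless one:
# the receiver can simply invert its output.
print(bsc_capacity(1.0))        # → 1.0
# A channel that flips bits half the time carries no information.
print(bsc_capacity(0.5))        # → 0.0
```

Note that the p = 0.5 case makes precise the intuition that a maximally noisy binary channel is useless: its output is statistically independent of its input.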
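As a taste of the lossless source coding topic, the following is a sketch (not course-provided code) of Huffman's algorithm: repeatedly merge the two least probable subtrees, prepending a 0 bit to one side and a 1 bit to the other. For a dyadic source the resulting average codeword length meets the entropy exactly.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code mapping each symbol to a bitstring.

    freqs: dict mapping symbols to their probabilities (or counts).
    Assumes at least two distinct symbols.
    """
    # Heap entries: (weight, tiebreaker, partial code table for the subtree).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Merge the two lightest subtrees; extend codes by one bit.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

# Dyadic example: codeword lengths 1, 2, 3, 3 match -log2 of each probability,
# so the average length equals the source entropy (1.75 bits/symbol).
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(sorted(len(c) for c in code.values()))  # → [1, 2, 3, 3]
```

The resulting code is prefix-free by construction, which is what makes it uniquely decodable without symbol boundaries.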