Information Theory (in English)

Theoretical limits of communications (Winter Semester)

Synopsis

Information theory is the mathematical science of operations on data such as compression, storage, and communication. The goal of this course is to introduce the principles of information and coding theory, including a fundamental understanding of data compression and of reliable communication over noisy channels. The course introduces information measures such as entropy, mutual information, and channel capacity, which form the foundation of the mathematical theory of communication.
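
As a small illustration of these quantities, the following Python sketch (a minimal example, not part of the course material; the probability values are arbitrary assumptions) computes the entropy of a discrete source and the capacity C = 1 - H(eps) of a binary symmetric channel with crossover probability eps.

    import math

    def entropy(p):
        # Shannon entropy in bits of a discrete distribution p (probabilities summing to 1).
        return -sum(x * math.log2(x) for x in p if x > 0)

    # Binary source with P(0) = 0.9, P(1) = 0.1: well below 1 bit of information per symbol.
    print(entropy([0.9, 0.1]))          # ~0.469 bits

    # Capacity of a binary symmetric channel with crossover probability eps: C = 1 - H(eps).
    eps = 0.11
    print(1 - entropy([eps, 1 - eps]))  # ~0.5 bits per channel use

For a crossover probability of 0.11, the capacity is roughly 0.5 bits per channel use, meaning reliable communication is possible at any rate below about one half bit per use in the limit of long block lengths.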

Educational Objectives

In this course you will learn about:

- Properties of information measures: entropy and typical sequences
- Lossless source coding: uniquely decodable codes, the lossless source coding theorem, Huffman codes (see the sketch after this list), arithmetic codes, universal lossless compression, Lempel-Ziv codes
- Channel coding: mutual information and channel capacity, the noisy channel coding theorem, Gaussian channel capacity and waterfilling, principles of error-correcting codes
- Basics of multi-terminal communication: jointly typical sequences
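
As a concrete illustration of the source coding topics above, the following Python sketch builds a binary Huffman code from a known symbol distribution. It is a minimal illustrative example rather than course code, and the symbol probabilities are made up.

    import heapq
    from itertools import count

    def huffman_code(freqs):
        # Build a binary Huffman code for a symbol -> probability mapping.
        # Returns a dict mapping each symbol to its codeword (a string of bits).
        tiebreak = count()  # breaks ties so dicts are never compared inside the heap
        heap = [(p, next(tiebreak), {sym: ""}) for sym, p in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, code1 = heapq.heappop(heap)   # two least probable subtrees
            p2, _, code2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in code1.items()}        # prepend 0 in one subtree
            merged.update({s: "1" + c for s, c in code2.items()})  # and 1 in the other
            heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
        return heap[0][2]

    # Hypothetical source distribution (dyadic, so Huffman coding is exactly optimal here).
    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    print(huffman_code(probs))  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

For this dyadic distribution the codeword lengths equal -log2 p(x), so the average code length matches the source entropy of 1.75 bits per symbol; for general distributions, Huffman coding comes within one bit of the entropy.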

Course Information

ECTS Credits: 6

Lecturer: Dr. Vahid Aref
Time Slot: Wednesday, 14:00-15:30
Lecture Hall: 2.348 (ETI2)
Weekly Credit Hours: 2

Lecturer: Moustafa Ebada and Ahmed Elkelesh
Time Slot: Thursday, 14:00-15:30
Lecture Hall: 2.348 (ETI2)
Weekly Credit Hours: 2