Information Theory

              Lecture                              Exercises
Lecturer      Dr.-Ing. Vahid Aref                  Moustafa Ebada, Sebastian Cammerer
Dates         Wed 2:00 pm - 3:30 pm                Tue 2:00 pm - 3:30 pm
Lecture hall  Pfaffenwaldring 47, room 2.348       Pfaffenwaldring 47, room 2.348
Extent 4 credit hours, 6 credit points
Language English
Frequency Every second semester (summer)
Learning Outcome Information theory is the science of operations on data, such as compression, storage, and communication. The goal of this course is to introduce the principles of information and coding theory, including a fundamental understanding of data compression and of reliable communication over noisy channels. The course introduces the concepts of information measures, entropy, mutual information, and channel capacity, which form the mathematical basis of the theory of communication.
Contents
  • Properties of information measures: entropy and typical sequences
  • Lossless source coding: uniquely decodable codes, the lossless source coding theorem, Huffman codes, arithmetic codes, universal lossless compression, Lempel-Ziv codes
  • Channel coding: mutual information and channel capacity, the noisy channel coding theorem, Gaussian channel capacity and waterfilling, principles of error correction codes
  • Basics of multi-terminal communication: jointly typical sequences
Literature
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006
  • R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968
  • D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003
Materials ILIAS
Example Figure [Figure: Shannon's famous channel capacity limit explained using the sphere packing bound]
Module description in LSF