Synopsis
Information theory is the science of operations on data, such as compression, storage, and communication. The goal of this course is to introduce the principles of information and coding theory, including a fundamental understanding of data compression and of reliable communication over noisy channels. The course introduces the information measures entropy and mutual information as well as the notion of channel capacity, which together form the basis of the mathematical theory of communication.
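For reference, these central quantities have the standard definitions (stated here in bits; the notation is ours and is fixed precisely in the lectures):

$$
H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad
I(X;Y) = H(X) - H(X \mid Y), \qquad
C = \max_{p(x)} I(X;Y).
$$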
Contents and Educational Objectives
At the heart of today's digital infrastructure lies a fundamental question: How do we represent, transmit, and extract information in the presence of uncertainty and noise?
With the increasing prevalence of artificial intelligence and machine learning systems, a second, equally pressing question follows immediately: What does this mathematical foundation tell us about modern learning algorithms?
This course develops the mathematical framework needed to answer both questions. It introduces three tightly connected perspectives:
- Information theory provides the mathematical language to quantify uncertainty, information, and randomness. It tells us what is theoretically possible: What are the ultimate limits of data compression? How reliably can information be transmitted across a noisy channel? These limits form the cornerstone of all modern communication, storage, and learning systems (a small numerical sketch of the key quantities follows this list).
- Coding (here: mathematical error-control codes, not programming) turns possibility into engineering practice. It shows how to design explicit algorithms and codes that approach (or sometimes even achieve) these fundamental limits. Here, abstract quantities become concrete constructions: error-correcting codes, encoding/decoding strategies, and the mathematics behind their performance. Coding reveals not only how close we can get to Shannon's limits, but also why approaching them is computationally hard.
- Learning builds a bridge from classical information and coding theory to today's data-driven methods. Learning can be viewed as extracting, compressing, and representing information from data under uncertainty. Concepts such as entropy, mutual information, and generalization bounds provide deep insights into the behavior of modern machine learning algorithms.
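As a minimal illustration of the first point (our own sketch, not official course material): for a binary symmetric channel that flips each bit independently with probability eps, the capacity is C = 1 - H(eps), where H is the binary entropy function. The Python snippet below evaluates both quantities; the function names are ours.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy H(p) of a Bernoulli(p) source in bits, with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps: float) -> float:
    """Capacity C = 1 - H(eps) of a binary symmetric channel with crossover eps."""
    return 1.0 - binary_entropy(eps)

# A fair coin carries exactly one bit per toss; a biased one carries less.
print(f"H(0.5)  = {binary_entropy(0.5):.4f} bit")     # 1.0000
print(f"H(0.11) = {binary_entropy(0.11):.4f} bit")    # ~0.4999
# A channel flipping 11% of its bits retains only about half a bit per use.
print(f"C(0.11) = {bsc_capacity(0.11):.4f} bit/use")  # ~0.5001
```

Shannon's noisy-channel coding theorem, a centerpiece of the course, states that any rate below this capacity is achievable with vanishing error probability, while rates above it are not.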
Course Information
6 ECTS Credits
Lectures
Lecturer: Prof. Dr.-Ing. Stephan ten Brink and PD Dr.-Ing. Christian Senger
Time Slot: Thursday, 14:00-15:30
Lecture Hall: 2.348 (ETI2)
Weekly Credit Hours: 2
Exercises
Lecturer: Jannis Clausius, Simon Obermüller, Daniel Tandler
Time Slot: Wednesday, 14:00-15:30
Lecture Hall: 2.348 (ETI2)
Weekly Credit Hours: 2
Stephan ten Brink
Prof. Dr.-Ing., Director of the Institute

Christian Senger
PD Dr.-Ing., Deputy Director