Channel Coding: Spatially Coupled LDPC Codes and Polar Codes
In order to simulate such low error probabilities, we are currently developing a simulation cluster based on graphics cards (GPUs) and the Nvidia CUDA programming language. This setup allows us to simulate an arbitrary LDPC or polar code down to bit error rates (BER) of 10^-9 to 10^-10 within a feasible period of time.
Deep learning-based channel decoding is doomed by the curse of dimensionality: for a short code of length N = 100 and rate r = 0.5, 2^50 different codewords exist, far too many to fully train any neural network (NN) on in practice. The only way a NN can be trained for practical blocklengths is if it learns some form of decoding algorithm that can infer the full codebook from training on a small fraction of the codewords. However, to be able to learn a decoding algorithm, the code itself must have some structure based on a simple encoding rule, as in the case of convolutional or algebraic codes. The goal of our work is to shed some light on the questions of whether structured codes are easier to "learn" than random codes, and whether a NN can decode codewords that it has never seen during training.
Always looking for students interested in
- Channel Coding (LDPC codes, Polar codes)
- Machine learning/Deep Learning for Communications
- GPU Programming (C/C++ and CUDA)
Winter term 2015/2016: Exercises Modern Error Correction
Summer term 2016: Exercises Nachrichtentechnik 2 (Communications Engineering 2)
Winter term 2016/2017: Exercises Modern Error Correction