Course module: 201700080
Information Theory and Statistics 
Course info
Course module: 201700080
Credits (ECTS): 5
Course type: Course
Language of instruction: English
Contact: J. Goseling
Contact person for the course: J. Goseling
Examiner: J. Goseling
Academic year: 2020
Starting block:
Application procedure: You apply via OSIRIS Student
Registration using OSIRIS: Yes
After passing the assessment of this course, a student:
  1. knows Shannon entropy and mutual information, is able to perform computations involving these information measures and is able to estimate these measures from a data set,
  2. knows the Huffman, Lempel-Ziv and CTW data compression algorithms, and understands the limits on data compression,
  3. understands the connection between machine learning and data compression and is able to quantify performance limits on machine learning algorithms,
  4. is able to quantify the optimal performance that can be expected from a classification system in terms of information measures,
  5. knows how linear block codes can be used to achieve reliable communication over a noisy channel, understands encoding and decoding of graph-based codes, and understands the limits on reliable communication.
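As an illustration of outcome 1, estimating entropy from a data set can be done with the plug-in estimator, which substitutes empirical symbol frequencies into Shannon's formula. The sketch below is illustrative only (the function name and example data are not course material):

```python
from collections import Counter
from math import log2

def empirical_entropy(data):
    """Plug-in estimate of Shannon entropy H(X) = -sum_x p(x) log2 p(x),
    with p(x) replaced by the empirical frequency of symbol x in the data."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A balanced binary sequence has an empirical entropy of 1 bit per symbol.
print(empirical_entropy("01010101"))  # 1.0
```

Note that the plug-in estimator is biased downward for small samples, which is one reason the course treats estimation of information measures as a topic in its own right.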
Information theory is a mathematical theory dealing with the fundamental principles of storing, processing and transmitting data. The first half of this course covers the core concepts of information theory, including entropy and mutual information. These are then used to derive fundamental limits on data compression and communication. The second half of this course focuses on applications of information theory to statistics and machine learning. In particular, information theory will be used to develop performance limits on machine learning algorithms.
Assumed previous knowledge
Basic probability theory, for instance from:
• B-TI M4,
• B-EE M8,
• B-TW M3 or
• 201400221 Probability Theory and Statistics (premaster M-CSC).
Participating study
• Master Computer Science
• Master Applied Mathematics
• Master Electrical Engineering
Required materials
MacKay, David J. C. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003. ISBN-13: 978-0521642989 (also available online).
Recommended materials
Instructional modes



Assessment
1) Written exam (65%)
2) Homework (15%)
3) Report, based on a project or reading assignment (20%)
