
After passing the assessment of this course a student:
- knows Shannon entropy and mutual information, is able to perform computations involving these information measures, and is able to estimate these measures from a data set;
- knows the Huffman, Lempel-Ziv and CTW data compression algorithms, and understands the limits on data compression;
- understands the connection between machine learning and data compression, and is able to quantify performance limits on machine learning algorithms;
- is able to quantify the optimal performance that can be expected from a classification system in terms of information measures.
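The first outcome — computing and estimating entropy and mutual information from data — can be sketched with a plug-in (empirical frequency) estimator. This is an illustrative example, not course material; the function names and the identity I(X;Y) = H(X) + H(Y) - H(X,Y) used here are standard, but the code itself is an assumption about how one might do this in practice.

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in estimate of Shannon entropy H(X) in bits,
    using empirical symbol frequencies."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """Estimate I(X;Y) = H(X) + H(Y) - H(X,Y) from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# A fair coin carries 1 bit of entropy; a variable shares
# all of its entropy with itself.
coin = [0, 1] * 500
print(entropy(coin))                   # 1.0
print(mutual_information(coin, coin))  # 1.0
```

Note that the plug-in estimator is biased downward for small samples, which is one reason estimation of information measures from data is a course topic in its own right.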


Information theory is a mathematical theory dealing with the fundamental principles of storing, processing, and transmitting data. The first half of this course covers the core concepts of information theory, including entropy and mutual information. These are then used to derive fundamental limits on data compression and communication. The second half of this course focuses on applications of information theory to statistics and machine learning. In particular, information theory will be used to develop performance limits on machine learning algorithms.
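To make the compression side concrete, the Huffman algorithm mentioned above builds a prefix-free code in which frequent symbols receive short codewords. The following is a minimal sketch (not the course's reference implementation), merging the two lowest-weight subtrees repeatedly with a heap:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix-free Huffman code from empirical symbol
    frequencies. Returns {symbol: bitstring}."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak id, partial code table).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        # Pop the two lightest subtrees and merge them,
        # prefixing '0' to one side and '1' to the other.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

code = huffman_code("abracadabra")
# The frequent symbol 'a' gets a shorter codeword than the rare 'c'.
print(code)
```

The fundamental limit referred to in the text is that the expected codeword length of any prefix-free code is at least the source entropy H(X), and Huffman coding comes within one bit of it.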
Assessment
1) Written exam (60%, grade must be at least 5.5)
2) Homework (20%)
3) Report, based on a project or reading assignment (20%)



Assumed previous knowledge: basic probability theory, for instance from:
- BAM M4,
- BEE M8,
- BCS/BIT M4, or
- 202001177 Probability Theory and Statistics (pre-master MCS).
Master Applied Mathematics 
Master Electrical Engineering 
Required materials: Handouts


 