Course module: 202001281
Signals with Information
 Course info
Course module: 202001281
Credits (ECTS): 5
Course type: Honours
Language of instruction: English
Contact person: dr.ir. G. Meinsma
E-mail: g.meinsma@utwente.nl
Lecturer(s)
Lecturer: dr. P. van Adrichem - Rotteveel
Contact person for the course: dr.ir. G. Meinsma
Lecturer: dr.ir. G. Meinsma
Lecturer: dr. F.L. Schwenninger
Starting block
 1B
Remarks: This course is part of the Bachelor Honours programme
Application procedure: You apply via OSIRIS Student
Registration using OSIRIS: Yes
 Aims
After the course, the student is able to:
- prove the efficiency of the FFT and show how it can be used to speed up multiplication of long integers;
- explain the use of linear algebra and the FFT in JPEG and wavelet theory;
- model "information" and prove basic theorems from the field of information theory;
- prove and apply the Buckingham pi theorem.
 Content
Fast Fourier Transform (FFT)
The impact of this linear operation, invented in 1965, is hard to overstate. A laptop computes, for example, 250 thousand FFTs per second! And if you want to multiply two large numbers, you also use the FFT. JPEG uses the FFT as well.

Wavelets
With Fourier analysis, you write a signal as a sum of everlasting harmonic functions. That is rather strange when the signal (e.g. a piece of music) has finite length. You can view wavelets as an extension of Fourier analysis, but one that is closer to musical notation: the building blocks have finite length, yet frequencies remain localized. We illustrate wavelets on images, and we shall see that spectacular compression ratios can be achieved.

Information theory
This is perhaps the most beautiful example of the power of mathematical modelling. A file containing only zeros is easy to compress, and if the zeros and ones alternate frequently, compression becomes more difficult. But how can we understand this mathematically? In this part, we lay the foundations of Claude Shannon's information theory. We will see that there is a natural measure for the lack of structure, called entropy, and that this entropy equals the optimal compression rate. You may also want to send files, say from your router to a laptop, while only a limited amount of information can be sent per unit of time. That limit is what we call the (channel) capacity. How do you model and optimize it? We cover this too, and we make the connection with entropy.

Dimensional analysis
Mathematical models often simplify considerably if we exploit physical dimensions (such as "length" and "mass"). The central result in this field is the Buckingham pi theorem. We prove this result and apply it to a number of problems, such as crowd modelling and walking dinosaurs. Understanding dimensional analysis greatly improves our understanding of mathematical models.
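To give a flavour of the FFT topic above, here is a minimal pure-Python sketch (not part of the course materials; the function names are our own) of how the FFT speeds up multiplication of long integers: the digit sequences are convolved via the FFT in O(n log n) operations, instead of the O(n^2) of schoolbook multiplication.

```python
import cmath

def fft(a, invert=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0] * n
    for k in range(n // 2):
        t = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def multiply(x, y):
    """Multiply non-negative integers by FFT convolution of their digits."""
    a = [int(d) for d in str(x)[::-1]]   # least-significant digit first
    b = [int(d) for d in str(y)[::-1]]
    n = 1
    while n < len(a) + len(b):           # pad to a power of two
        n *= 2
    fa = fft(a + [0] * (n - len(a)))
    fb = fft(b + [0] * (n - len(b)))
    prod = [fa[i] * fb[i] for i in range(n)]  # pointwise product = convolution
    c = [round(v.real / n) for v in fft(prod, invert=True)]
    # summing digit * 10^i as Python integers resolves all carries at once
    return sum(d * 10 ** i for i, d in enumerate(c))

print(multiply(12345, 6789))  # prints 83810205
```

The same convolution trick, with careful error control, underlies the fast multiplication routines in arbitrary-precision arithmetic libraries.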
Required materials
-
Recommended materials
-
Instructional modes
Other
Presence duty: Yes

Tests
 Exam