Fast Fourier Transform (FFT)
The impact of this linear operation, whose fast algorithm was published in 1965 by Cooley and Tukey, is hard to overstate: a typical laptop computes, for example, some 250 thousand FFTs per second. If you want to multiply two very large numbers, you also use the FFT, and the JPEG image format is built on a closely related transform (the discrete cosine transform).
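As a small illustration of the large-number multiplication mentioned above, here is a minimal sketch (not from the text, and simplified: real arbitrary-precision libraries use more careful number-theoretic transforms). Multiplying two integers amounts to convolving their digit sequences, and the FFT turns that convolution into a pointwise product:

```python
import numpy as np

def fft_multiply(a, b):
    """Multiply two non-negative integers via FFT-based convolution of their digits."""
    da = [int(d) for d in str(a)][::-1]   # digits, least-significant first
    db = [int(d) for d in str(b)][::-1]
    n = 1
    while n < len(da) + len(db):          # pad to a power of two
        n *= 2
    # Convolution theorem: convolution of digits = inverse FFT of the
    # pointwise product of the two FFTs.
    conv = np.fft.ifft(np.fft.fft(da, n) * np.fft.fft(db, n)).real
    digits = np.rint(conv).astype(int)    # rounding removes tiny floating-point errors
    # Summing digit * 10^position also takes care of the carries.
    return sum(int(c) * 10**i for i, c in enumerate(digits))

print(fft_multiply(31415926, 27182818))   # same as 31415926 * 27182818
```

For numbers of this size ordinary multiplication is of course faster; the FFT approach pays off only for numbers with many thousands of digits.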
With Fourier analysis, you write a signal as a sum of everlasting harmonic functions. That is somewhat strange when the signal (say, a piece of music) has finite length. You can view wavelets as an extension of Fourier analysis, but one that is closer to musical notation: the building blocks have finite length, yet they still carry frequency information. We illustrate wavelets on images and will see that spectacular compression ratios can be achieved.
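To make the wavelet idea concrete, here is a minimal numpy sketch (an assumption on our part, using the simplest wavelet of all, the Haar wavelet, on a 1-D signal rather than an image): one transform step splits the signal into local averages and local differences, and compression amounts to discarding the small difference coefficients.

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: local averages and differences."""
    x = np.asarray(x, dtype=float)
    avg  = (x[0::2] + x[1::2]) / np.sqrt(2)
    diff = (x[0::2] - x[1::2]) / np.sqrt(2)
    return avg, diff

def haar_inverse(avg, diff):
    """Exact inverse of haar_step."""
    x = np.empty(2 * len(avg))
    x[0::2] = (avg + diff) / np.sqrt(2)
    x[1::2] = (avg - diff) / np.sqrt(2)
    return x

signal = np.array([4.0, 4.0, 4.1, 3.9, 8.0, 8.0, 7.9, 8.1])
avg, diff = haar_step(signal)
diff[np.abs(diff) < 0.2] = 0.0        # "compress": drop the small detail coefficients
approx = haar_inverse(avg, diff)      # reconstruction stays close to the signal
```

Because the building blocks (the difference coefficients) are local, throwing one away only blurs the signal near that position; for images the same step is applied to rows and columns, and most detail coefficients turn out to be negligibly small.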
This is perhaps the most beautiful example of the power of mathematical modelling. A file containing only zeros is easy to compress, while a file in which zeros and ones alternate in an irregular pattern is much harder to compress. But how can we understand this mathematically? In this part, we provide the basics of the information theory of Claude Shannon. We will see that there is a natural measure for the lack of structure, called entropy, and that this entropy equals the optimal compression rate.
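The connection between structure, entropy and compressibility can already be seen in a few lines of Python (a sketch of our own, using the standard-library compressor `zlib` as a stand-in for an optimal code): a structured file has entropy near zero and compresses extremely well, while random-looking bytes have entropy near 8 bits per byte and barely compress at all.

```python
import math
import random
import zlib
from collections import Counter

def entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte: -sum p log2 p."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

random.seed(0)                                            # reproducible example
structured = bytes(10000)                                 # ten thousand zero bytes
random_ish = bytes(random.getrandbits(8) for _ in range(10000))

print(entropy(structured), len(zlib.compress(structured)))  # ~0 bits, tiny file
print(entropy(random_ish), len(zlib.compress(random_ish)))  # ~8 bits, hardly shrinks
```

The compressed sizes track the entropies, in line with Shannon's source coding theorem: no lossless code can do better, on average, than the entropy of the source.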
You would also like to send files, say from your router to a laptop, but there is a limit to the amount of information you can send per unit of time. That limit is what we call the (channel) capacity. How do you model and optimize it? We cover this as well and make the connection with entropy.
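The connection with entropy shows up already in the textbook example of the binary symmetric channel, which flips each transmitted bit with probability p. Its capacity is C = 1 - H(p), where H is the binary entropy function; a short sketch (our own illustration, not taken from the text):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) in bits: the entropy of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity (bits per channel use) of a binary symmetric channel with flip probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # a noiseless channel carries 1 bit per use
print(bsc_capacity(0.5))   # pure noise: the channel carries nothing
print(bsc_capacity(0.11))  # about half a bit per use survives the noise
```

Note the trade-off: the noisier the channel, the more entropy the noise injects, and the less room is left for the message.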
Mathematical models often simplify considerably if we exploit physical dimensions (such as "length" and "mass"). The central result in this field is the Buckingham pi theorem. We prove this result and apply it to a number of applications, such as crowd modelling and the walking speed of dinosaurs. Understanding dimensional analysis really helps to improve our understanding of mathematical models.
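The mechanics behind the Buckingham pi theorem can be sketched numerically (our own illustration, for the walking-speed example: speed v, gravitational acceleration g and leg length l). Dimensionless products correspond to vectors in the null space of the dimension matrix, which we can find with a singular value decomposition:

```python
import numpy as np

# Dimension matrix for the variables (v, g, l).
# Rows: exponents of mass, length, time; columns: v = L/T, g = L/T^2, l = L.
D = np.array([[ 0,  0,  0],   # mass
              [ 1,  1,  1],   # length
              [-1, -2,  0]],  # time
             dtype=float)

# A product v^a g^b l^c is dimensionless exactly when D @ (a, b, c) = 0,
# so the exponents span the null space of D.
_, _, vt = np.linalg.svd(D)
exps = vt[-1]                  # null-space direction (smallest singular value)
exps = 2 * exps / exps[0]      # normalize so that v gets exponent 2
print(exps)                    # exponents (2, -1, -1): v^2 / (g l) is dimensionless
```

The resulting group v^2/(g l) is the Froude number; the pi theorem then says that any relation between v, g and l can only involve this single dimensionless combination, which is what makes comparing walking animals of very different sizes possible.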