After completing this course, students are expected to:
- Be able to formulate a deep learning problem mathematically by choosing an adequate deep network architecture and operators (Modeling).
- Be able to derive the backpropagation principle for deep learning networks and to verify its relation to stochastic gradient descent (Optimization); a minimal sketch follows this list.
- Be able to use and extend a state-of-the-art software package for deep learning in Python and Matlab to solve a real-world example (Programming).
- Be able to derive and apply a variational auto-encoder and a generative adversarial network for a specific problem of data interpolation (Data Compression).
- Be able to validate a given deep learning model based on robustness and generalization properties (Understanding).
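As a minimal sketch of the Optimization outcome above, the code below performs one stochastic gradient descent update with hand-derived backpropagation for a two-layer network. The choice of plain NumPy, a ReLU hidden layer and a mean-squared-error loss are illustrative assumptions, not course material.

```python
# Illustrative sketch only (assumptions: plain NumPy, two-layer ReLU network,
# mean-squared-error loss); one SGD update with hand-derived backpropagation.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 3))     # mini-batch of 16 inputs with 3 features
y = rng.standard_normal((16, 1))     # regression targets
W1 = rng.standard_normal((3, 8))     # first-layer weights
W2 = rng.standard_normal((8, 1))     # second-layer weights
lr = 0.01                            # learning rate

# Forward pass
h = np.maximum(0.0, x @ W1)          # ReLU hidden activations
y_hat = h @ W2                       # network output
loss = np.mean((y_hat - y) ** 2)     # mean squared error on the mini-batch

# Backward pass (chain rule, i.e. backpropagation)
grad_y_hat = 2.0 * (y_hat - y) / x.shape[0]
grad_W2 = h.T @ grad_y_hat
grad_h = grad_y_hat @ W2.T
grad_W1 = x.T @ (grad_h * (h > 0))   # ReLU derivative is the indicator h > 0

# One stochastic gradient descent update
W1 -= lr * grad_W1
W2 -= lr * grad_W2
```

Deriving the gradient expressions in the backward pass via the chain rule, and relating the single update to stochastic gradient descent over mini-batches, is the kind of calculation the Optimization outcome refers to.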
In recent years, deep artificial neural networks have won numerous contests in big data science and machine learning, e.g. in computer vision, pattern recognition or medical imaging, often reaching human-level performance. Although many excellent practical toolboxes exist, e.g. for convolutional neural networks, many people still treat deep learning as a black box. This course provides students with the mathematical background needed to understand, analyze and apply the models and algorithms at the heart of deep learning. The course will equip students with new insights into deep learning theory that have a direct impact on deep learning applications.
The main topics include:
- Introduction to deep learning and convolutional neural networks
- Backpropagation and basic stochastic optimization
- Basic auto-encoders and data compression (see the sketch after this list)
- Deep architectures with dropout and sparsity
- Advanced optimization methods
- Sequential learning and recurrent neural networks
- Variational auto-encoders and generative adversarial networks
- Deep reinforcement learning
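To make the auto-encoder topic concrete, the following sketch trains a small fully connected auto-encoder by stochastic gradient descent. The use of PyTorch, the layer sizes and the random placeholder batch are illustrative assumptions; the course does not prescribe a particular toolbox.

```python
# Illustrative sketch only (assumptions: PyTorch, a 784-dimensional input such
# as a flattened image, a 32-dimensional code); a basic auto-encoder trained with SGD.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, dim_in=784, dim_code=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, dim_code), nn.ReLU())
        self.decoder = nn.Linear(dim_code, dim_in)

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

x = torch.rand(64, 784)              # placeholder batch; real data would come from a loader
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), x)      # reconstruction error
    loss.backward()                  # backpropagation through encoder and decoder
    optimizer.step()                 # one stochastic gradient descent update
```

Replacing the random batch with a real data loader and the linear decoder with a deeper network leads to the data-compression models treated later in the course.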
Mandatory previous knowledge:
Students need a solid background in multivariate calculus and linear algebra, as well as basic programming experience in Python. Basic knowledge of numerical methods for differential equations and of optimization methods is also mandatory, for instance from:
B-AM M6: Dynamical Systems
B-AM, B-TCS M7: Discrete Structures and Efficient Algorithms.
Recommended previous knowledge:
Basic knowledge of machine learning, e.g. from the courses Machine Learning I (201600070) or Statistical Learning (201900115), is recommended and facilitates comprehension of the introductory concepts on deep learning practice. Students with a theoretical focus will find the skills obtained from the MSc mathematics courses on scientific computing, information theory or complex networks helpful.