Course module: 201800177
Deep Learning - From Theory to Practice
Course info
Course module: 201800177
Credits (ECTS): 5
Course type: Course
Language of instruction: English
Contact person: prof.dr. C. Brune
E-mail: c.brune@utwente.nl
Lecturer(s)
Contact person for the course
prof.dr. C. Brune
Examiner
prof.dr. C. Brune
Academic year: 2018
Starting block: 1B
Application procedure: You apply via OSIRIS Student
Registration using OSIRIS: Yes
Aims
After completing this course, students are expected to:
  1. Be able to formulate a deep learning problem mathematically by choosing an adequate deep network architecture and operators (Modeling).
  2. Be able to derive the backpropagation principle for deep learning networks and to verify its relation to stochastic gradient descent (Optimization); a minimal numerical sketch follows this list.
  3. Be able to use and extend a state-of-the-art software package for deep learning in Python and Matlab to solve a real-world example (Programming).
  4. Be able to derive and apply a variational auto-encoder and a generative adversarial network for a specific problem of data interpolation (Data Compression).
  5. Be able to validate a given deep learning model based on robustness and generalization properties (Understanding).
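As a rough illustration of aim 2, the sketch below computes the backpropagation gradients of a one-hidden-layer network by hand (NumPy only) and performs a single stochastic gradient descent step. The architecture, toy data and learning rate are illustrative assumptions and not part of the course material.

import numpy as np

# Illustrative only: backpropagation through a one-hidden-layer network,
# followed by one stochastic gradient descent (SGD) update on a mini-batch.
rng = np.random.default_rng(0)

# Toy mini-batch: 4 samples, 3 input features, scalar targets.
X = rng.standard_normal((4, 3))
y = rng.standard_normal((4, 1))

# Parameters of the network  y_hat = tanh(X W1 + b1) W2 + b2
W1, b1 = rng.standard_normal((3, 5)), np.zeros(5)
W2, b2 = rng.standard_normal((5, 1)), np.zeros(1)

# Forward pass
h_pre = X @ W1 + b1                       # hidden pre-activations
h = np.tanh(h_pre)                        # hidden activations
y_hat = h @ W2 + b2                       # network output
loss = 0.5 * np.mean((y_hat - y) ** 2)    # mean squared error

# Backward pass: chain rule applied layer by layer (backpropagation)
n = X.shape[0]
d_yhat = (y_hat - y) / n                  # dL/dy_hat
dW2 = h.T @ d_yhat
db2 = d_yhat.sum(axis=0)
d_h = d_yhat @ W2.T
d_hpre = d_h * (1.0 - h ** 2)             # tanh'(x) = 1 - tanh(x)^2
dW1 = X.T @ d_hpre
db1 = d_hpre.sum(axis=0)

# One SGD step: move each parameter against its gradient.
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
print(f"loss before update: {loss:.4f}")

Repeating this update over randomly drawn mini-batches is stochastic gradient descent; deep learning frameworks replace the hand-written backward pass with automatic differentiation.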
Content
In recent years, deep artificial neural networks have won numerous contests in big data science and machine learning, e.g. in computer vision, pattern recognition and medical imaging, often reaching human-level performance. Although many excellent practical toolboxes exist, e.g. for convolutional neural networks, many people still see deep learning as a black box. This course provides students with the mathematical background needed to better understand, analyze and apply the models and algorithms at the heart of deep learning. The course will equip students with new insights into deep learning theory with direct impact on deep learning applications.
The main topics include:
  • Introduction to deep learning and convolutional neural networks
  • Backpropagation and basic stochastic optimization
  • Basic auto-encoders and data compression (a minimal sketch follows this list)
  • Deep architectures with dropout and sparsity
  • Advanced optimization methods
  • Sequential learning and recurrent neural networks
  • Variational auto-encoders and generative adversarial networks
  • Deep reinforcement learning
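To make the auto-encoder and data-compression topic concrete, here is a minimal, hypothetical sketch (not course material): a linear auto-encoder that compresses 8-dimensional samples to a 2-dimensional code and is trained by plain gradient descent on the reconstruction error. The data, layer sizes and learning rate are illustrative assumptions.

import numpy as np

# Illustrative only: a linear auto-encoder, 8-dimensional input -> 2-dimensional
# code -> 8-dimensional reconstruction, trained on the mean squared
# reconstruction error with plain gradient descent.
rng = np.random.default_rng(1)

# Synthetic data that lies close to a 2-D subspace of R^8.
Z_true = rng.standard_normal((200, 2))
X = Z_true @ rng.standard_normal((2, 8)) + 0.05 * rng.standard_normal((200, 8))

W_enc = 0.5 * rng.standard_normal((8, 2))   # encoder weights (compression)
W_dec = 0.5 * rng.standard_normal((2, 8))   # decoder weights (reconstruction)
lr = 0.02

for step in range(2000):
    code = X @ W_enc                            # encode: 8 -> 2
    X_rec = code @ W_dec                        # decode: 2 -> 8
    err = X_rec - X
    loss = np.mean(np.sum(err ** 2, axis=1))    # per-sample reconstruction error
    g = 2.0 * err / X.shape[0]                  # dL/dX_rec
    grad_dec = code.T @ g                       # dL/dW_dec
    grad_enc = X.T @ (g @ W_dec.T)              # dL/dW_enc
    W_enc -= lr * grad_enc
    W_dec -= lr * grad_dec

print(f"reconstruction error after training: {loss:.4f}")

The same principle, with deep nonlinear encoders and decoders and a probabilistic latent space, leads to the variational auto-encoders treated later in the course.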
Mandatory previous knowledge:
Students need a solid background in multivariate calculus and linear algebra, as well as basic programming experience in Matlab or Python. Basic knowledge of numerical methods for differential equations and of optimization methods is mandatory, for instance from:
B-TW M6: Dynamical systems
B-TW, B-TCS M7: Discrete structures and efficient algorithms.

Recommended previous knowledge:
Basic knowledge of machine learning, e.g. via the data mining topic in the Data Science course (201400174) or the Basic Machine Learning course (201600070), is recommended and facilitates students' comprehension of the introductory concepts addressing deep learning practice. Students with a theoretical focus will find the skills obtained from the MSc mathematics courses on scientific computing, information theory or complex networks helpful here.
Participating study
Master Applied Mathematics
Required materials
Course material
Slides and lecture notes (available digitally)
Course material
Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2016 (available digitally)
Course material
Neural Networks and Deep Learning by Michael Nielsen, Online book, 2016 (available digitally)
Course material
Learning Deep Architectures for AI by Yoshua Bengio, NOW Publishers, 2009 (available digitally)
Recommended materials
-
Instructional modes
Lecture

Tutorial

Tests
Test

Remark
1) Oral exam (60%), 2) homework (20%), 3) report and presentation based on a project assignment (20%)
