MSc in Artificial Intelligence at the University of Amsterdam.
Deep learning is primarily the study of multilayered neural networks, spanning a great range of model architectures. This course is taught in the MSc program in Artificial Intelligence at the University of Amsterdam. In this course we study the theory of deep learning, namely of modern, multilayered neural networks trained on big data. The course is taught by Assistant Professor Xiantong Zhen. The teaching assistants are Christos Athanasiadis, Ilze Amanda Auzina, Leonard Bereska, Jim Boelrijk, Natasha Butt, Mohammad Mahdi Derakhshani, Winfried van den Dool, Yingjun Du, Alex Gabel, Mariya Hendriksen, Tom Lieberum, Phillip Lippe, Yongtuo Liu, Jie Liu, Ben Miller, Ivona Najdenkoska, Sarah Rastegar, Nadja Rutsch, Mohammadreza Salehi, Jiayi Shen, Tom van Sonsbeek, Riccardo Valperga, Haochen Wang, Zehao Xiao.
November 4, 2021  11.00–13.00  Lecture
This lecture introduces the structure of the Deep Learning course, and gives a short overview of the history and motivation of Deep Learning.
November 4, 2021  13.00–15.00  Tutorial session
This tutorial introduces the practical sessions, the TA organizer team, etc. Afterwards, we will discuss the PyTorch machine learning framework and introduce you to the basic concepts of tensors, computation graphs, and GPU computation. We will continue with a small hands-on tutorial in which you build your first neural network in PyTorch.
We also provide a crash course on working with the Lisa cluster and on how to set up your Lisa account.
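The computation-graph idea that PyTorch's autograd automates can be previewed in plain Python. The sketch below (an illustration, not course material) runs a forward pass through y = w·x + b, applies the chain rule by hand, and takes one gradient-descent step:

```python
# Forward pass: y = w * x + b, loss = (y - target)^2
w, b = 2.0, 1.0
x, target = 3.0, 10.0

y = w * x + b             # 7.0
loss = (y - target) ** 2  # 9.0

# Backward pass: chain rule applied by hand.
dloss_dy = 2 * (y - target)  # -6.0
dloss_dw = dloss_dy * x      # -18.0  (dy/dw = x)
dloss_db = dloss_dy * 1.0    # -6.0   (dy/db = 1)

# One gradient-descent step reduces the loss.
lr = 0.01
w -= lr * dloss_dw
b -= lr * dloss_db
new_loss = (w * x + b - target) ** 2
print(new_loss < loss)  # True
```

PyTorch records exactly this graph of operations during the forward pass and replays it backwards when you call `loss.backward()`.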
November 5, 2021  11.00–13.00  Lecture
This lecture introduces basic concepts of Deep Feedforward Networks, such as linear and non-linear modules, gradient-based learning, and the backpropagation algorithm.
November 11, 2021  11.00–13.00  Lecture
This lecture series discusses advanced optimizers, initialization, normalization and hyperparameter tuning.
November 11, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will discuss the role of activation functions in a neural network, and take a closer look at the optimization issues a poorly designed activation function can cause.
After the presentation, there will be a TA Q&A session on assignment 1, the lecture content, and more.
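One of the classic issues the tutorial examines can be sketched in plain Python (an illustration, not the tutorial's notebook): the sigmoid saturates for large inputs, so its gradient vanishes, while ReLU passes the gradient through unchanged for positive inputs.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)   # maximum is 0.25, at z = 0

def relu_grad(z):
    return 1.0 if z > 0 else 0.0

print(round(sigmoid_grad(0.0), 4))  # 0.25
print(sigmoid_grad(10.0) < 1e-4)    # True: saturated, gradient vanishes
print(relu_grad(10.0))              # 1.0: gradient passes through
```

Stacking many sigmoid layers multiplies these sub-0.25 factors together, which is one reason deep sigmoid networks were hard to train before ReLU-style activations became standard.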
November 12, 2021  11.00–13.00  Lecture
This lecture series discusses advanced optimizers, initialization, normalization and hyperparameter tuning.
November 18, 2021  11.00–13.00  Lecture
This lecture series covers convolutional neural networks for image processing.
November 18, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will discuss the importance of proper parameter initialization in deep neural networks, and how we can find a suitable one for our network. In addition, we will review the optimizers SGD and Adam, and compare them on complex loss surfaces.
After the presentation, there will be a TA Q&A session on assignment 1, the lecture content, and more.
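The update rules being compared can be written down in a few lines of plain Python. The sketch below (an illustration under simplified assumptions, not the tutorial's code) minimizes the 1-D quadratic f(w) = (w − 3)² with plain SGD and with Adam's bias-corrected running moments:

```python
# Minimize f(w) = (w - 3)^2, gradient f'(w) = 2(w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

# SGD: w <- w - lr * g
w_sgd, lr = 0.0, 0.1
for _ in range(100):
    w_sgd -= lr * grad(w_sgd)

# Adam: exponential moving averages of the gradient (m) and its
# square (v), with bias correction for the zero-initialized moments.
w_adam, m, v = 0.0, 0.0, 0.0
beta1, beta2, eps, lr = 0.9, 0.999, 1e-8, 0.1
for t in range(1, 101):
    g = grad(w_adam)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w_adam -= lr * m_hat / (v_hat ** 0.5 + eps)

print(abs(w_sgd - 3.0) < 1e-3)  # True
```

On this well-conditioned toy problem SGD converges cleanly; Adam's per-parameter scaling pays off on the complex, badly scaled loss surfaces the tutorial visualizes.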
November 19, 2021  11.00–13.00  Lecture
This lecture series covers modern ConvNet architectures.
November 25, 2021  11.00–13.00  Lecture
This lecture series covers Recurrent Neural Networks.
November 25, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will implement three popular, modern ConvNet architectures: GoogLeNet, ResNet, and DenseNet. We will compare them on the CIFAR-10 dataset, and discuss the advantages that made them popular and successful across many tasks.
After the presentation, there will be a TA Q&A session on assignment 2, the lecture content, and more.
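A key ingredient shared by ResNet and DenseNet is worth previewing: the shortcut connection, which adds (or concatenates) the block input back to its output so gradients have a direct path through very deep stacks. A minimal NumPy sketch of a residual block (an illustration, not the tutorial's PyTorch implementation):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """out = x + f(x): the identity shortcut lets gradients bypass f."""
    return x + relu(x @ W1) @ W2

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # batch of 4, 8 features
W1 = rng.standard_normal((8, 8)) * 0.01  # near-zero init
W2 = rng.standard_normal((8, 8)) * 0.01

out = residual_block(x, W1, W2)
# With near-zero weights the block is close to the identity map,
# which is why very deep stacks of residual blocks remain trainable.
print(np.allclose(out, x, atol=1e-2))  # True
```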
November 26, 2021  13.00–15.00  Online Lecture
Petar Veličković's lecture on Graph Neural Networks.
December 2, 2021  11.00–13.00  Lecture
Ivona Najdenkoska's lecture on Attention and Transformers.
December 2, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will discuss the implementation of Graph Neural Networks. In the first part of the tutorial, we will implement the GCN and GAT layer ourselves. In the second part, we will use PyTorch Geometric to look at node-level, edge-level, and graph-level tasks.
After the presentation, there will be a TA Q&A session on assignment 2, the lecture content, and more.
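The GCN layer the tutorial builds boils down to one matrix equation: H′ = σ(D̂^(−1/2) (A + I) D̂^(−1/2) H W), i.e. each node averages its neighbors' (and its own) features with symmetric degree normalization before a shared linear map. A NumPy sketch on a tiny toy graph (an illustration, not the tutorial's PyTorch code):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = relu(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # symmetric normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)   # aggregate, transform, ReLU

# Toy graph: 3 nodes, edges 0-1 and 1-2; 4-dim features -> 2-dim output.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
rng = np.random.default_rng(0)
H = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))

out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

GAT replaces the fixed normalized weights in `A_norm` with learned attention coefficients over each node's neighborhood.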
December 3, 2021  15.00–17.00  Lecture
This lecture series discusses Generative Adversarial Networks.
December 9, 2021  11.00–13.00  Lecture
This lecture series introduces the framework of variational inference and Variational Autoencoders (VAEs).
December 9, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will discuss the relatively new breakthrough architecture: the Transformer. We will start from the basics of attention and multi-head attention, and build our own Transformer. We will perform experiments on sequence-to-sequence tasks and set anomaly detection.
After the presentation, there will be a TA Q&A session on assignment 3, the lecture content, and more.
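The attention basics the tutorial starts from fit in one formula: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A NumPy sketch of scaled dot-product attention (an illustration, not the tutorial's PyTorch code; multi-head attention runs several of these in parallel on learned projections):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V, weights                     # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 8))  # 5 query positions, d_k = 8
K = rng.standard_normal((5, 8))
V = rng.standard_normal((5, 8))

out, weights = scaled_dot_product_attention(Q, K, V)
print(out.shape)                               # (5, 8)
print(np.allclose(weights.sum(axis=-1), 1.0))  # True
```

The 1/√d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into its saturated, small-gradient regime.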
December 10, 2021  15.00–17.00  Online Lecture
Deep Learning Generalization.
December 16, 2021  11.00–13.00  Lecture
Geometry and Structure Regularized Deep Learning.
December 16, 2021  13.00–15.00  Tutorial session + TA Q&A
In this tutorial, we will discuss deep convolutional autoencoders and their applications. In the practical and lecture, you will see variational autoencoders (VAEs), which add a stochastic part to vanilla autoencoders. Both have their advantages and applications; for the vanilla autoencoder, we visit image retrieval and compression.
After the presentation, there will be a TA Q&A session on assignment 3, the lecture content, and more.
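The stochastic part that VAEs add is usually implemented via the reparameterization trick: instead of sampling z ~ N(μ, σ²) directly, one samples ε ~ N(0, 1) and sets z = μ + σ·ε, so gradients can flow through μ and σ. A NumPy sketch (an illustration, not the tutorial's PyTorch code):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """z = mu + sigma * eps with eps ~ N(0, 1): the sampling noise is
    external, so mu and log_var stay on a differentiable path."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Encoder outputs for a batch: mean (1, -1), log-variance 0 (sigma = 1).
mu = np.zeros((10000, 2)) + np.array([1.0, -1.0])
log_var = np.zeros((10000, 2))

z = reparameterize(mu, log_var)
# Empirically, the samples have the requested mean.
print(np.allclose(z.mean(axis=0), [1.0, -1.0], atol=0.1))  # True
```

In a VAE the decoder reconstructs the input from z, and the loss combines reconstruction error with a KL term pulling N(μ, σ²) towards the standard normal prior.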
December 17, 2021  11.00–13.00  Lecture
Details will follow soon.
Deadline: November 19, 2021 Multilayer perceptrons and backpropagation Documents: 

Deadline: December 3, 2021 CNNs, RNNs & Graph CNNs Documents: 

Deadline: December 17, 2021 Generative Models Documents: 

Some useful links for the course are the following:
If you have any questions or recommendations for the website or the course, you can always drop us a line! Knowledge should be free, so feel free to use any of the material provided here (but please be so kind as to cite us). In case you are a course instructor and you want the solutions, please send us an email.