Lectures


Week 1

This lecture introduces the structure of the Deep Learning course, and gives a short overview of the history and motivation of Deep Learning.

Documents:

Lecture recordings:

This tutorial introduces the practical sessions, the TA organizer team, etc. Afterwards, we will discuss the PyTorch machine learning framework, and introduce you to the basic concepts of tensors, computation graphs, and GPU computation. We will continue with a small hands-on tutorial on building your own first neural network in PyTorch.
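As a small taste of the session, the three concepts above can be sketched in a few lines of PyTorch (a minimal sketch, assuming PyTorch is installed; the tutorial notebook covers these steps in much more detail):

```python
import torch

# Tensors track the operations applied to them, building a computation graph
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()      # y = 1 + 4 + 9 = 14
y.backward()            # autograd traverses the graph backwards
grad = x.grad           # dy/dx = 2x -> tensor([2., 4., 6.])

# GPU computation: move tensors to the GPU when one is available
device = "cuda" if torch.cuda.is_available() else "cpu"
x_dev = x.detach().to(device)
```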

Documents:

This lecture introduces basic concepts of deep feedforward networks, such as linear and nonlinear modules, gradient-based learning, and the backpropagation algorithm.

Documents:

Lecture recordings:

Week 2

This lecture series discusses advanced optimizers, initialization, normalization and hyperparameter tuning.

Documents:

Lecture recordings:

In this tutorial, we will discuss the role of activation functions in a neural network, and take a closer look at the optimization issues a poorly designed activation function can have.
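To preview the kind of issue we will look at: saturating activation functions like the sigmoid produce near-zero gradients for large inputs, while ReLU passes the gradient straight through (a minimal sketch, assuming PyTorch is installed):

```python
import torch

# Compare gradients of sigmoid vs. ReLU at a large input value
x = torch.tensor(5.0, requires_grad=True)

torch.sigmoid(x).backward()
sig_grad = x.grad.item()    # sigmoid'(5) ~ 0.0066: a vanishing gradient

x.grad = None               # reset the gradient before the second backward pass
torch.relu(x).backward()
relu_grad = x.grad.item()   # ReLU passes the gradient through unchanged: 1.0
```

Stacking many saturating layers multiplies these small gradients together, which is exactly the optimization issue the tutorial examines.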

After the presentation, there will be a TA session for Q&A on assignment 1, lecture content, and more.

Documents:

This lecture series discusses advanced optimizers, initialization, normalization and hyperparameter tuning.

Documents:

Lecture recordings:

Week 3

This lecture series covers convolutional neural networks for image processing.

Documents:

Lecture recordings:

In this tutorial, we will discuss the importance of proper parameter initialization in deep neural networks, and how we can find a suitable one for our network. In addition, we will review the optimizers SGD and Adam, and compare them on complex loss surfaces.
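As a preview, the two ingredients can be sketched in a few lines of PyTorch: Kaiming (He) initialization for ReLU networks, and a single update step of SGD versus Adam on the same quadratic loss (a minimal sketch, assuming PyTorch is installed; the hyperparameters are illustrative):

```python
import torch
import torch.nn as nn

# Kaiming initialization keeps activation variance stable in deep ReLU networks
layer = nn.Linear(256, 256)
nn.init.kaiming_normal_(layer.weight, nonlinearity="relu")

# SGD and Adam take different steps given the same gradient
w_sgd = torch.zeros(1, requires_grad=True)
w_adam = torch.zeros(1, requires_grad=True)
opt_sgd = torch.optim.SGD([w_sgd], lr=0.1)
opt_adam = torch.optim.Adam([w_adam], lr=0.1)

for opt, w in [(opt_sgd, w_sgd), (opt_adam, w_adam)]:
    loss = (w - 1.0).pow(2).sum()   # simple quadratic loss, minimum at w = 1
    loss.backward()
    opt.step()

# SGD steps by lr * gradient (0.1 * 2 = 0.2); Adam's first step is
# roughly lr * sign(gradient) (0.1), independent of the gradient's scale.
```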

After the presentation, there will be a TA session for Q&A on assignment 1, lecture content, and more.

Documents:

No documents.

This lecture series covers modern ConvNet architectures.

Documents:

Lecture recordings:

Week 4

This lecture series covers Transformers.

Documents:

Lecture recordings:

In this tutorial, we will implement three popular, modern ConvNet architectures: GoogleNet, ResNet, and DenseNet. We will compare them on the CIFAR10 dataset, and discuss the advantages that made them popular and successful across many tasks.
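The key idea behind ResNet, the skip connection, fits in a few lines (a minimal sketch, assuming PyTorch is installed; the tutorial builds the full architectures):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal ResNet basic block: output = ReLU(F(x) + x)."""

    def __init__(self, channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.net(x) + x)   # skip connection: add the input back

block = ResidualBlock(16)
out = block(torch.randn(1, 16, 32, 32))    # input shape is preserved
```

The skip connection lets gradients flow directly to earlier layers, which is what makes very deep ConvNets trainable.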

After the presentation, there will be a TA session for Q&A on assignment 2, lecture content, and more.

Documents:

No documents.

Lecture on Graph Neural Networks.

Documents:

Lecture recordings:

Week 5

Lecture on deep variational autoencoders.

Documents:

Lecture recordings:

In this tutorial, we will discuss the relatively new breakthrough architecture: Transformers. We will start from the basics of attention and multi-head attention, and build our own Transformer. We will perform experiments on sequence-to-sequence tasks and set anomaly detection.
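The core operation we will build on is scaled dot-product attention, softmax(QKᵀ/√d_k)V (a minimal sketch, assuming PyTorch is installed; the tutorial extends this to multi-head attention and the full Transformer):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # pairwise similarities
    weights = scores.softmax(dim=-1)                   # each row sums to 1
    return weights @ v, weights

q = torch.randn(1, 4, 8)   # (batch, sequence length, d_k)
k = torch.randn(1, 4, 8)
v = torch.randn(1, 4, 8)
out, attn = scaled_dot_product_attention(q, k, v)
```

Scaling by √d_k keeps the dot products from growing with the feature dimension, which would otherwise push the softmax into saturation.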

After the presentation, there will be a TA session for Q&A on assignment 3, lecture content, and more.

Documents:

No documents.

Deep Learning & The Natural Sciences

Documents:

Lecture recordings:

Week 6

Generative Adversarial Networks and diffusion models.

Documents:

Lecture recordings:

In this tutorial, we will discuss the implementation of Graph Neural Networks. In the first part of the tutorial, we will implement the GCN and GAT layer ourselves. In the second part, we use PyTorch Geometric to look at node-level, edge-level and graph-level tasks.
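The GCN layer from the first part boils down to averaging each node's features with those of its neighbours before a linear projection (a minimal sketch, assuming PyTorch is installed and using a dense adjacency matrix; the tutorial implements this more carefully and then moves to PyTorch Geometric):

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """Minimal graph convolution: mean over neighbours (incl. self) of W x."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        a_hat = adj + torch.eye(adj.size(0))   # add self-loops
        deg = a_hat.sum(dim=-1, keepdim=True)  # node degrees for normalization
        return a_hat @ self.proj(x) / deg      # average projected neighbour features

adj = torch.tensor([[0.0, 1.0], [1.0, 0.0]])   # two connected nodes
x = torch.randn(2, 4)                          # node feature matrix
out = GCNLayer(4, 8)(x, adj)
```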

After the presentation, there will be a TA session for Q&A on assignment 2, lecture content, and more.

Documents:

Guest Lecture: Deep Learning for 3D (Christian Rupprecht)

Documents:

Lecture recordings:

Week 7

Self-supervised learning part I

Documents:

No documents.

Lecture recordings:

No recordings.

We will discuss Tutorial 17: Self-Supervised Learning, and have a short introduction to Causal Representation Learning.

Documents:

No documents.

Self-supervised learning part II

Documents:

No documents.

Lecture recordings:

No recordings.