The tutorials for the course Theoretical Foundations of Neural Networks are held on even weeks on Mondays from 8:00 to 9:40 in room T-115, or as announced in lectures and tutorials.
## Study materials
The exercise materials are in the form of Jupyter notebooks using Python. For easy installation of Python, Jupyter, and the required packages, we will use uv. The first exercise is intended to familiarize you with this technology. To replicate the exact environment in which the notebooks were created, you can use the files pyproject.toml and uv.lock.

| # | name | topics | assignment | solution |
|---|---|---|---|---|
| 1 | introduction to neural networks | basics of machine learning in Python, automatic differentiation, simple neural networks, their architecture, gradient descent, activation functions, hyperparameters and their selection | notebook | notebook |
| 2 | optimization | optimization of neural networks, SGD, Momentum, Nesterov, Adagrad, Adadelta, Adam, AdamW | notebook | notebook |
| 3 | regularization | data splitting into training and test sets, overfitting, bias-variance trade-off, L1, L2 regularization, early stopping, dropout, batching, cross-validation, its variants and usage, double descent | | |
| 4 | recurrent networks | recurrent networks, natural language processing, transformers | | |
| 5 | graph neural networks | learning on graphs, recurrent neural networks on graphs, networks based on random walks, analogies with text processing, convolutions on graphs | | |
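The environment setup described above can be sketched as a short shell session. This is a minimal sketch, not official course instructions; it assumes pyproject.toml and uv.lock sit in the same directory as the notebooks, as the paragraph above describes.

```shell
# Minimal sketch, assuming pyproject.toml and uv.lock are in the
# current directory (the folder with the course notebooks).
uv sync              # recreate the exact locked environment in .venv
uv run jupyter lab   # launch Jupyter inside that environment
```

`uv sync` reads uv.lock so every student gets the same package versions the notebooks were created with, which is the point of shipping the lock file alongside pyproject.toml.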