my courses

neural networks, machine learning and randomness [FIT]
theoretical fundamentals of neural networks [FJFI]

past courses

theory of neural networks [FIT]
linear algebra 1 [FJFI]
linear algebra 2 [FJFI]
mathematics 3 [FJFI]

theoretical fundamentals of neural networks

The tutorials for the course Theoretical Fundamentals of Neural Networks take place in even weeks on Mondays from 8:00 to 9:40 in room T-115, or as announced in the lectures and tutorials.

study materials

The exercise materials are Jupyter notebooks written in Python. For easy installation of Python, Jupyter, and the required packages we use uv; the first exercise is intended to familiarize you with this tool. To replicate the exact environment in which the notebooks were created, you can use the files pyproject.toml and uv.lock.
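A sketch of the intended workflow, assuming uv is installed and the current directory contains the downloaded notebooks together with the course pyproject.toml and uv.lock (the guard makes the script a no-op elsewhere):

```shell
# recreate and use the locked environment; skipped when uv or the
# lock file is unavailable in the current directory
if command -v uv >/dev/null 2>&1 && [ -f uv.lock ]; then
    uv sync             # recreate the exact environment recorded in uv.lock
    uv run jupyter lab  # launch Jupyter inside that environment
else
    echo "uv or uv.lock not found; see the uv documentation for installation"
fi
```

`uv sync` installs exactly the versions pinned in uv.lock, so every student runs the notebooks against the same package set.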
1. introduction to neural networks
   topics: basics of machine learning in Python, automatic differentiation, simple neural networks and their architecture, gradient descent, activation functions, hyperparameters and their selection
   assignment: notebook, solution: notebook
2. optimization
   topics: optimization of neural networks, SGD, Momentum, Nesterov, Adagrad, Adadelta, Adam, AdamW
   assignment: notebook, solution: notebook
3. regularization
   topics: splitting data into training and test sets, overfitting, bias-variance trade-off, L1 and L2 regularization, early stopping, dropout, batching, cross-validation with its variants and usage, double descent
4. recurrent networks
   topics: recurrent networks, natural language processing, transformers
5. graph neural networks
   topics: learning on graphs, recurrent neural networks on graphs, networks based on random walks, analogies with text processing, convolutions on graphs
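As a small taste of the first exercise's topics (automatic differentiation, activation functions, gradient descent), here is a minimal pure-Python sketch, independent of the course notebooks; the `Dual` class, the one-neuron model, and the target value 0.9 are illustrative choices, not course material:

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0: forward-mode autodiff.

    `val` carries the function value, `dot` the derivative w.r.t. the
    seeded input variable.
    """
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.dot - other.dot)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sigmoid(x):
    """Logistic activation lifted to dual numbers via the chain rule."""
    s = 1.0 / (1.0 + math.exp(-x.val))
    return Dual(s, s * (1.0 - s) * x.dot)

def loss(w):
    """Squared error of a one-neuron 'network' sigmoid(2w) against target 0.9."""
    e = sigmoid(w * 2.0) - 0.9
    return e * e

# gradient descent: the derivative falls out of the dual part automatically
w = 0.0
for _ in range(2000):
    g = loss(Dual(w, 1.0)).dot  # d(loss)/dw at the current w
    w -= 0.5 * g                # learning rate 0.5, a hyperparameter

pred = 1.0 / (1.0 + math.exp(-2.0 * w))
print(f"trained w = {w:.4f}, prediction = {pred:.4f}")  # prediction approaches 0.9
```

The same idea, generalized to reverse mode and many parameters, is what autodiff frameworks used in the notebooks provide out of the box.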