Course detail

Neural Networks and Machine Learning

FSI-VSC Acad. year: 2025/2026 Summer semester

The course provides an introduction to the theory and methods of machine learning, focusing on their application in solving classification, regression, and clustering tasks.

Learning outcomes of the course unit

Prerequisites

Basic knowledge of statistics, optimization, and programming is expected.

Planned learning activities and teaching methods

Assessment methods and criteria linked to learning outcomes

Knowledge and skills are verified by the course-unit credit and the examination. Credit requirement: completion of the assigned tasks. Attendance at lectures is recommended, while attendance at practical sessions is mandatory. Practical sessions that a student is unable to attend at the regular time can be made up at a substitute time. The exam is oral and covers the entire course material.

Language of instruction

Czech

Aims

The aim of the course is to familiarize students with machine learning methods and their applications in classification, regression, and clustering. Students will learn about both parametric and non-parametric classification and regression models, as well as key concepts such as error metrics, regularization, cross-validation, gradient descent, and modern approaches, including boosting and Gaussian mixture models. The course bridges theory and practice, focusing on the design and implementation of machine learning models.

Specification of controlled education, way of implementation and compensation for absences

The study programmes with the given course

Programme N-AIŘ-P: Applied Computer Science and Control, Master's
branch ---: no specialisation, 5 credits, compulsory

Type of course unit

 

Lecture

26 hours, optional

Syllabus


  1. Introduction, supervised and unsupervised learning, regression vs. classification, dataset splitting, error metrics, loss functions, cross-validation, overfitting, regularization.

  2. Linear regression, least squares method, gradient descent, regularized least squares method (a minimal code sketch follows this syllabus).

  3. Linear classification, logistic regression, regularized logistic regression.

  4. Non-parametric models, nearest neighbors method.

  5. Decision trees for classification and regression problems.

  6. Generative models for classification, Bayes classifier, Gaussian discriminant analysis.

  7. Naive Bayes classifier.

  8. Support vector machines, kernel functions.

  9. Clustering, k-means clustering.

  10. Gaussian mixture models.

  11. Dimensionality reduction, boosting.

  12. Mathematical model of a neuron, activation functions, multilayer perceptron, forward and backward propagation.

  13. Feedforward and multilayer neural networks, recurrent networks, topologically organized neural networks.
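
To make topic 2 concrete, the following is a minimal NumPy sketch of the regularized least squares method fitted by gradient descent, with the closed-form solution included only as a cross-check. The synthetic data, learning rate, and regularization strength are illustrative assumptions for this sketch, not values or an implementation prescribed by the course.

    import numpy as np

    # Illustrative synthetic regression data (assumption, not course-provided data).
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(100, 3))
    w_true = np.array([2.0, -1.0, 0.5])
    y = X @ w_true + 0.1 * rng.standard_normal(100)

    def ridge_gradient_descent(X, y, lam=0.1, lr=0.1, n_iter=2000):
        """Minimize the regularized least-squares loss
        L(w) = (1/2n)||Xw - y||^2 + (lam/2)||w||^2 by gradient descent."""
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(n_iter):
            grad = X.T @ (X @ w - y) / n + lam * w  # gradient of L(w)
            w -= lr * grad                          # gradient descent step
        return w

    w_gd = ridge_gradient_descent(X, y)

    # Closed-form regularized least squares for comparison:
    # w = (X^T X / n + lam I)^{-1} X^T y / n
    n, d = X.shape
    w_cf = np.linalg.solve(X.T @ X / n + 0.1 * np.eye(d), X.T @ y / n)

    print("gradient descent:", w_gd)
    print("closed form:     ", w_cf)

With a small regularization strength, both estimates land close to the generating weights; raising lam shrinks the coefficients toward zero, which is the trade-off discussed under topic 1 (regularization vs. overfitting).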

Computer-assisted exercise

26 hours, compulsory

Syllabus


  1. Introduction to the programming environment.

  2. Least squares method and regularized least squares method.

  3. Linear classification, logistic regression, regularized logistic regression.

  4. Nearest neighbors method.

  5. Decision trees for classification and regression problems.

  6. Bayes classifier, Gaussian discriminant analysis.

  7. Naive Bayes classifier.

  8. Support vector machines.

  9. K-means clustering (see the sketch at the end of this syllabus).

  10. Gaussian mixture models.

  11. Dimensionality reduction, boosting.

  12. Multilayer perceptron.

  13. Final assessment.
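
As one illustration of exercise 9, the following is a plain NumPy sketch of k-means clustering, alternating the assignment and centroid-update steps. The two-cluster test data, number of clusters, and iteration cap are assumptions made for this example; the exercises may use a different environment or library implementation.

    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Plain k-means: alternate assignment and centroid-update steps."""
        rng = np.random.default_rng(seed)
        # Initialize centroids as k randomly chosen data points.
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Assignment step: each point goes to its nearest centroid.
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each centroid becomes the mean of its assigned points.
            new_centroids = np.array([
                X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                for j in range(k)
            ])
            if np.allclose(new_centroids, centroids):
                break  # converged: assignments no longer change the centroids
            centroids = new_centroids
        return centroids, labels

    # Hypothetical two-cluster data for a quick check.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    centroids, labels = kmeans(X, k=2)
    print(centroids)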