Study guide URL
https://tue.osiris-student.nl/onderwijscatalogus/extern/cursus?cursuscode=2MMS80&collegejaar=2025&taal=en

Description
- Probabilistic framework of supervised learning
- Complexity regularization: overview of strategies
- Denoising smooth functions and the regressogram
- Concentration of measure, and simple applications to learning settings
- PAC bounds
- Analysis of leave-one-out cross validation in the context of nearest-neighbor classification
- Construction and performance analysis of decision trees
- Vapnik-Chervonenkis (VC) Theory
- Rademacher Complexity and generalization bounds
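As a concrete instance of one topic above, the leave-one-out analysis of nearest-neighbor classification can be sketched in a few lines. This is an illustrative sketch only, not course material: the synthetic two-cluster data and all parameters below are assumptions chosen for the example.

```python
import random

def one_nn_predict(train, x):
    # Predict the label of x as the label of its nearest training point
    # (squared Euclidean distance; no square root needed for argmin).
    nearest = min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return nearest[1]

def loocv_error(data):
    # Leave-one-out cross validation: classify each point using the
    # remaining n-1 points, and report the fraction of mistakes.
    errors = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        if one_nn_predict(rest, x) != y:
            errors += 1
    return errors / len(data)

random.seed(0)
# Two well-separated synthetic clusters, so the LOOCV error should be small.
data = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(50)] + \
       [((random.gauss(4, 1), random.gauss(4, 1)), 1) for _ in range(50)]
print(loocv_error(data))
```

The course studies how well such a leave-one-out estimate tracks the true risk of the nearest-neighbor rule; the sketch only shows what the estimator computes.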
Prerequisites
Successful completion of a probability theory course with content and depth comparable to 2MBS10 or 2DDR10 is effectively necessary, as well as a reasonable level of mathematical maturity and familiarity with mathematical proofs. Knowledge of statistics is not necessary.
Specifically, it is assumed that students have successfully completed a course covering at least Chapters 1 to 5 of:
Baron, M., “Probability and Statistics for Computer Scientists”, ISBN 9781439875919, CRC Press, 2013
at the depth at which the topics are covered there, as well as selected topics from Chapter 6 of the same book.
Relevant probability theory topics: independence, conditional probability and distributions, conditional expectation, law of total expectation, law of large numbers, central limit theorem.
Important calculus knowledge: integration and basic series results.
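For instance, the law of large numbers listed among the prerequisite topics, which underpins the concentration arguments used in the course, can be checked numerically. This is a small illustrative sketch; the coin-flip experiment and sample sizes are assumptions, not taken from the course.

```python
import random

random.seed(1)

def empirical_mean(n):
    # Average of n fair coin flips; by the law of large numbers this
    # concentrates around the true mean 0.5 as n grows.
    return sum(random.random() < 0.5 for _ in range(n)) / n

# The deviation from the true mean shrinks as the sample size grows.
for n in (10, 1000, 100000):
    print(n, abs(empirical_mean(n) - 0.5))
```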
Objectives
- Get acquainted with the probabilistic framework of learning theory
- Use the tools presented in class to study the performance of several machine learning algorithms
- Understand the trade-off between approximation and estimation errors, and their relation to the generalization capabilities of learning algorithms
- Be equipped with the background knowledge needed to meaningfully read and critique research articles in learning theory