Courses
Statistical analysis of graph-based learning on manifolds (Giovanni Prodi Lecture, 4+2)
Lecturer: Prof. Dr. Nicolás García Trillos
Abstract: This course is concerned with the statistical understanding of the use of graphs to solve certain learning tasks. The fundamental high-level questions that we will be investigating are the following: In what sense are graph-based learning procedures statistically optimal for the underlying estimation problems that they attempt to tackle? Are there any alternative ways to process data that could outperform existing graph-based methodologies? To answer these questions theoretically, we will explore some of the existing frameworks to study the optimality of estimators in a general statistical setting. We will focus on the concepts of minimaxity and (classical) asymptotic efficiency of estimators, but toward the end of the course we will also discuss a notion of efficiency in the sense of sensitivity to data perturbations. We will use these frameworks to explore graph-based learning on manifolds (i.e., a setting where data points are sampled from a manifold) from a well-defined statistical perspective and provide some answers to the high-level questions posited earlier. While the emphasis will be on analyzing graph Laplacians and their eigenpairs (in supervised and unsupervised settings), a lot of the discussion throughout the course is intended to spark new ideas and motivate new problems that combine PDEs, PDEs on graphs, and statistical analysis.
The course combines statistical theory and mathematical analysis. Familiarity with some probability theory, especially concentration inequalities, as covered, for example, in the courses “Mathematical Data Science and Machine Learning” or “PDEs on Graphs”, is required.
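To make the setting concrete, the following is a minimal sketch (not part of the course materials) of the central object of study: a graph Laplacian built from points sampled on a manifold, here the unit circle, together with its low-lying eigenpairs. The sample size and connectivity radius are illustrative choices.

```python
# Minimal sketch (illustrative, not course material): eigenpairs of a graph
# Laplacian built from points sampled on a manifold (the unit circle in R^2).
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
n = 500                                # number of sample points (illustrative)
theta = rng.uniform(0, 2 * np.pi, n)   # uniform samples on the circle
X = np.column_stack([np.cos(theta), np.sin(theta)])

eps = 0.3                              # connectivity radius (illustrative)
D = squareform(pdist(X))               # pairwise Euclidean distances
W = (D < eps).astype(float) * (D > 0)  # epsilon-graph adjacency, no self-loops

L = np.diag(W.sum(axis=1)) - W         # unnormalized graph Laplacian
vals, vecs = np.linalg.eigh(L)         # eigenpairs, ascending eigenvalues

# As n grows and eps shrinks suitably, the low-lying eigenpairs approximate
# (up to scaling) those of the Laplace-Beltrami operator on the circle.
print(vals[:5])
```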
4 h Wed, Thu 8-10, SE 30
Exercise to Statistical analysis of graph-based learning on manifolds (Giovanni Prodi Lecture)
2 h Thu 10-12, SE 30
Lecturer: Prof. Dr. Leon Bungert
Research Seminar: Mathematics of Machine Learning and Applied Analysis (2 h)
by arrangement
Mathematical Data Science and Machine Learning (Master of Mathematical Data Science)
Lecturer: Prof. Leon Bungert
4 h Tue 10-12, SE 30 and Thu 10-12, 00.102 (Library & Seminar Centre)
Exercise to Mathematical Data Science and Machine Learning (Master)
2 h Thu 16-18, 00.102 (Library & Seminar Centre)
Discrete Mathematics (PDEs on Graphs: Theory and Applications in Learning) (Master)
Lecturer: Dr. Eloi Martinet
This course will be devoted to the mathematical study of well-known unsupervised and semi-supervised learning algorithms such as Spectral Clustering and Google's PageRank. We will see that in the large-data limit, these algorithms approximate certain PDEs.
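As a concrete illustration of one of the algorithms mentioned above, here is a minimal sketch (not course material) of PageRank computed by power iteration; the toy graph and the damping factor are illustrative choices.

```python
# Minimal sketch (illustrative): PageRank by power iteration on a small
# directed graph given as an adjacency matrix.
import numpy as np

A = np.array([[0, 1, 1, 0],   # edge i -> j means A[i, j] = 1
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
alpha = 0.85                          # damping factor (classical choice)
n = A.shape[0]

r = np.full(n, 1.0 / n)               # start from the uniform distribution
for _ in range(100):                  # power iteration with damping
    r = alpha * r @ P + (1 - alpha) / n

print(r)                              # stationary PageRank scores
```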
3 h Mon 14-16 and Thu 08-09, 01.101 (Library & Seminar Centre)
Exercise to Discrete Mathematics: PDEs on Graphs (Master)
1 h Thu 9-10, 01.101 (Library & Seminar Centre)
Lecturer: Prof. Dr. Leon Bungert
Research Seminar: Mathematics of Machine Learning and Applied Analysis (2 h)
by arrangement
Mathematical Foundations of Data Science II (Bachelor)
Lecturer: Prof. Leon Bungert
The lecture “Mathematical Foundations of Data Science” covers the mathematical concepts that are essential for understanding and applying data science and machine learning. It begins with introductions to basic notions from linear algebra and statistics, and then presents specific mathematical techniques and methods used in data analysis and machine learning, including optimization, numerical methods, linear regression, cluster analysis, dimensionality reduction, artificial neural networks, and deep learning. The lecture focuses on teaching students the basic methods and concepts of data science and how to use them in applications.
with Exercise
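As a small taste of the methods named in the description above, the following is a minimal sketch (illustrative only, not course material) of linear regression solved by ordinary least squares; the synthetic data and coefficients are made up for the example.

```python
# Minimal sketch (illustrative only): linear regression via least squares,
# one of the basic methods named in the course description.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 50)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(50)  # noisy line y = 2x + 0.5

X = np.column_stack([x, np.ones_like(x)])          # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, y, rcond=None)       # solve min ||X b - y||^2
print(coef)                                        # approximately [2.0, 0.5]
```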
Machine Learning and Numerics Lab (Bachelor practicum)
Lecturer: Prof. Leon Bungert
Work Group: Finite Element Methods and Physics-Informed Neural Networks
Lecturer: Dr. Eloi Martinet
Seminar Machine Learning (Bachelor seminar)
Lecturer: Prof. Dr. Leon Bungert and Dr. Eloi Martinet
Mathematical Foundations of Data Science (Mathematik 3 for KIDS)
(2+1 in the Bachelor Mathematical Data Science)
Lecturer: Dr. Eloi Martinet
The lecture “Mathematical Foundations of Data Science” covers the mathematical concepts that are essential for understanding and applying data science and machine learning. It begins with introductions to basic notions from linear algebra and statistics, and then presents specific mathematical techniques and methods used in data analysis and machine learning, including optimization, numerical methods, linear regression, cluster analysis, dimensionality reduction, artificial neural networks, and deep learning. The lecture focuses on teaching students the basic methods and concepts of data science and how to use them in applications.
with Exercise
Discrete Mathematics (PDEs on Graphs: Theory and Applications in Learning) (Master)
Lecturer: Prof. Dr. Leon Bungert
with Exercise
Machine Learning with Graphs (Master Seminar)
Lecturer: Prof. Dr. Leon Bungert, Dr. Eloi Martinet
In this seminar we will explore machine learning methods that involve graphs. These include partial differential equations on graphs and their use for semi-supervised learning, as well as graph neural networks for supervised learning on graph data. The seminar will cover theoretical and numerical aspects and can lead to a master's thesis on this topic.
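As a concrete taste of the first seminar topic, the following is a minimal sketch (not seminar material) of Laplace learning, a classical baseline for the graph-based semi-supervised methods studied in [1]-[3] below: the unlabeled values are the harmonic extension of the labels with respect to the graph Laplacian. The toy graph and labels are illustrative choices.

```python
# Minimal sketch (illustrative): Laplace learning, i.e. graph-based
# semi-supervised label propagation by solving the graph Laplace equation
# with the labeled nodes as boundary values.
import numpy as np

# Small weighted graph: W[i, j] = edge weight between nodes i and j.
W = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian

labeled = np.array([0, 4])                # indices with known labels
unlabeled = np.array([1, 2, 3])
g = np.array([1.0, -1.0])                 # labels at the labeled nodes

# Solve L[uu] u = -L[ul] g for the unlabeled values (harmonic extension).
u = np.linalg.solve(L[np.ix_(unlabeled, unlabeled)],
                    -L[np.ix_(unlabeled, labeled)] @ g)
print(u)                                  # values in (-1, 1); sign gives the class
```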
References:
[1] Calder, J., Cook, B., Thorpe, M., & Slepčev, D. (2020). Poisson learning: Graph based semi-supervised learning at very low label rates. In International Conference on Machine Learning (pp. 1306-1316). PMLR.
[2] Calder, J. (2018). The game theoretic p-Laplacian and semi-supervised learning with few labels. Nonlinearity, 32(1), 301.
[3] Bungert, L., Calder, J., & Roith, T. (2023). Uniform convergence rates for Lipschitz learning on graphs. IMA Journal of Numerical Analysis, 43(4), 2445-2495.
[4] Bronstein, M. M., Bruna, J., LeCun, Y., Szlam, A., & Vandergheynst, P. (2017). Geometric deep learning: going beyond euclidean data. IEEE Signal Processing Magazine, 34(4), 18-42.
[5] Bronstein, M. M., Bruna, J., Cohen, T., & Veličković, P. (2021). Geometric deep learning: Grids, groups, graphs, geodesics, and gauges. arXiv preprint arXiv:2104.13478.
[6] Xia, F., Sun, K., Yu, S., Aziz, A., Wan, L., Pan, S., & Liu, H. (2021). Graph learning: A survey. IEEE Transactions on Artificial Intelligence, 2(2), 109-127.
[7] Song, Z., Yang, X., Xu, Z., & King, I. (2022). Graph-based semi-supervised learning: A comprehensive review. IEEE Transactions on Neural Networks and Learning Systems.
Mathematical Foundations of Data Science II (2+1 in the Bachelor Mathematical Data Science)
Lecturer: Prof. Leon Bungert
The lecture “Mathematical Foundations of Data Science” covers the mathematical concepts that are essential for understanding and applying data science and machine learning. It begins with introductions to basic notions from linear algebra and statistics, and then presents specific mathematical techniques and methods used in data analysis and machine learning, including optimization, numerical methods, linear regression, cluster analysis, dimensionality reduction, artificial neural networks, and deep learning. The lecture focuses on teaching students the basic methods and concepts of data science and how to use them in applications.
with Exercise
Numerical Mathematics and Applied Analysis (Work group)
Lecturer: Dr. Eloi Martinet
In the vast majority of theoretical and real-world applications, the solution to a partial differential equation cannot be computed analytically. The aim of this course is to explore two methods for computing approximate solutions. The first, “traditional” one is the Finite Element Method. The second makes use of recent developments in Deep Learning leading to the so-called “Physics-Informed Neural Networks”. In the first part, we will study the fundamental tools needed to solve elliptic partial differential equations, such as weak derivatives and Sobolev spaces. We will show how to approximate PDEs using basic finite element tools. In the second part, we present the definition of a Neural Network and derive a proof of the so-called “Universal Approximation Theorem”. Using backpropagation, we show how a network can be trained to solve some partial differential equations.
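To illustrate the second approach, here is a minimal Physics-Informed Neural Network sketch in PyTorch (an illustrative toy, not the course implementation): a small network is trained so that its second derivative, computed by automatic differentiation, matches the right-hand side of a simple one-dimensional Poisson problem.

```python
# Minimal PINN sketch (illustrative): train a small network u(x) to solve
# u''(x) = -pi^2 sin(pi x) on (0, 1) with u(0) = u(1) = 0,
# whose exact solution is u(x) = sin(pi x).
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(128, 1, requires_grad=True)   # collocation points in (0, 1)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    residual = d2u + torch.pi**2 * torch.sin(torch.pi * x)
    bc = net(torch.zeros(1, 1))**2 + net(torch.ones(1, 1))**2
    loss = residual.pow(2).mean() + bc.mean()    # PDE residual + boundary loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(net(torch.tensor([[0.5]]))))         # close to sin(pi/2) = 1
```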