Oberseminar "Mathematik des Maschinellen Lernens und Angewandte Analysis" - M.Sc. Sara-Viola Kuntz
Expressivity of Neural ODEs and Neural DDEs
| Date: | 12/03/2025, 2:15 PM - 3:15 PM |
| Category: | event |
| Location: | Hubland Nord, Geb. 40, 01.003 |
| Organizer: | Lehrstuhl für Mathematik III (Maschinelles Lernen) |
| Speaker: | Sara-Viola Kuntz, Technische Universität München |
Besides classical feed-forward neural networks, neural ordinary differential equations (neural ODEs) have gained particular interest in recent years. Neural ODEs can be interpreted as the infinite-depth limit of residual neural networks (ResNets). Similarly, neural delay differential equations (neural DDEs) can be interpreted as the infinite-depth limit of densely connected residual neural networks (DenseResNets). In contrast to traditional ResNet architectures, DenseResNets are feed-forward neural networks that allow for shortcut connections across all layers, introducing additional memory into the network.
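The ResNet-to-neural-ODE correspondence can be made concrete with a small numerical sketch (not part of the talk): a ResNet block with shared weights and step size 1/depth is exactly one explicit Euler step of the ODE h'(t) = f(h(t)), so deepening the network approximates the time-1 flow. The tanh vector field and the weight scaling below are illustrative assumptions.

```python
import numpy as np

def f(h, W, b):
    """Vector field of the neural ODE (a single tanh layer, for illustration)."""
    return np.tanh(W @ h + b)

def resnet_forward(h0, W, b, depth):
    """Weight-tied ResNet: h_{k+1} = h_k + (1/depth) * f(h_k).
    Each block is one explicit Euler step of h'(t) = f(h(t)) on [0, 1]."""
    h = h0.copy()
    for _ in range(depth):
        h = h + (1.0 / depth) * f(h, W, b)
    return h

rng = np.random.default_rng(0)
W = 0.5 * rng.standard_normal((3, 3))
b = 0.1 * rng.standard_normal(3)
h0 = rng.standard_normal(3)

# As depth grows, the ResNet output converges to the neural ODE's time-1 map.
coarse = resnet_forward(h0, W, b, depth=100)
fine = resnet_forward(h0, W, b, depth=10000)
print(np.linalg.norm(coarse - fine))  # shrinks as depth increases
```

Since the input and output live in the same phase space (no augmentation), this map is a flow of an ODE, which is exactly the setting whose critical-point structure the first part of the talk analyzes.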
In the first part of this talk, we study the input-output dynamics of neural ODEs via Morse functions. Depending on the dimension of the phase space, the input-output map has different properties regarding the existence and regularity of critical points. The established theorems allow us to formulate results on the universal embedding and universal approximation property of neural ODEs.
In the second part of the presentation, we explore how the memory capacity of neural DDEs, given by the product Kτ of the Lipschitz constant K of the vector field and the delay τ, influences their expressivity. Non-augmented neural DDEs with a small memory capacity Kτ behave similarly to neural ODEs and likewise lack the universal approximation property. In contrast, if the memory capacity Kτ is sufficiently large or the phase space is augmented, we can establish the universal approximation property for neural DDEs.
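To make the role of the delay tangible, here is a minimal sketch (again not from the talk) of a neural DDE h'(t) = f(h(t), h(t - τ)) integrated by explicit Euler with a constant history; the delayed argument is what the dense shortcut connections of a DenseResNet supply in the discrete setting. The tanh field, weight matrices, and constant-history convention are illustrative assumptions.

```python
import numpy as np

def dde_forward(h0, W1, W2, b, tau, T, n_steps):
    """Explicit Euler scheme for the neural DDE
        h'(t) = tanh(W1 @ h(t) + W2 @ h(t - tau) + b),
    with constant history h(t) = h0 for t <= 0.
    The delayed state h(t - tau) acts as the network's memory."""
    dt = T / n_steps
    delay_steps = max(1, int(round(tau / dt)))
    traj = [h0] * (delay_steps + 1)      # constant history on [-tau, 0]
    for _ in range(n_steps):
        h_now = traj[-1]
        h_delayed = traj[-1 - delay_steps]
        traj.append(h_now + dt * np.tanh(W1 @ h_now + W2 @ h_delayed + b))
    return traj[-1]

rng = np.random.default_rng(1)
W1 = 0.5 * rng.standard_normal((2, 2))
W2 = 0.5 * rng.standard_normal((2, 2))  # weight on the delayed state
b = np.zeros(2)

# tau = 0.5 over the time horizon T = 1: the output depends on past states,
# not only on the current one, unlike a neural ODE.
h1 = dde_forward(np.ones(2), W1, W2, b, tau=0.5, T=1.0, n_steps=200)
print(h1)
```

In this notation, K would be (a bound on) the Lipschitz constant of the tanh field in its arguments, so the product Kτ grows with both the stiffness of the dynamics and the length of the memory window, matching the dichotomy described above.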
