Mini-Workshop: Probabilistic Perspectives in Neural Network-Based Machine Learning
Steffen Dereich
Universität Münster, Germany
Aymeric Dieuleveut
École Polytechnique, Palaiseau, France
Sebastian Kassing
Technische Universität Berlin, Germany
Sophie Langer
Ruhr-Universität Bochum, Germany

Abstract
Artificial neural networks (ANNs) have emerged as a powerful tool in modern machine learning, yet their mathematical foundations remain only partially understood. A key challenge is the inherently stochastic nature of ANN training: optimization occurs in high-dimensional parameter spaces with complex loss landscapes, influenced by stochastic initialization and noisy gradient updates. Understanding these dynamics requires probabilistic methods and asymptotic frameworks. This workshop explored recent advances in stochastic training dynamics, emphasizing probabilistic techniques and limit theorems. By bringing together researchers from probability, optimization, and deep learning theory, it laid the groundwork for new directions in understanding neural network training from a stochastic perspective.
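The two noise sources named in the abstract, random initialization and noisy gradient updates, can be illustrated with a minimal stochastic gradient descent sketch (not taken from the report; the quadratic loss, step size, and noise level are illustrative assumptions):

```python
# Minimal sketch, assuming a one-dimensional quadratic loss f(theta) = theta^2 / 2,
# so the exact gradient is theta; we perturb it to mimic a noisy gradient estimate.
import random

def sgd_quadratic(steps=1000, lr=0.05, noise=0.1, seed=0):
    """Run SGD with Gaussian gradient noise; returns the final iterate."""
    rng = random.Random(seed)
    theta = rng.gauss(0.0, 1.0)               # stochastic initialization
    for _ in range(steps):
        grad = theta + rng.gauss(0.0, noise)  # noisy gradient estimate
        theta -= lr * grad                    # SGD update step
    return theta
```

Despite the per-step noise, the iterate contracts toward the minimizer at 0 and fluctuates around it at a scale set by the step size and noise level, which is the kind of long-run behavior that limit theorems for stochastic training dynamics make precise.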
Cite this article
Steffen Dereich, Aymeric Dieuleveut, Sebastian Kassing, Sophie Langer, Mini-Workshop: Probabilistic Perspectives in Neural Network-Based Machine Learning. Oberwolfach Rep. 22 (2025), no. 4, pp. 2673–2700
DOI 10.4171/OWR/2025/50