Mathematical Foundations of Machine Learning

  • Peter L. Bartlett

    University of California, Berkeley, USA
  • Cristina Butucea

    CREST - ENSAE, Palaiseau, France
  • Johannes Schmidt-Hieber

    University of Twente, Enschede, Netherlands

Abstract

Machine learning has achieved remarkable successes in various applications, but there is wide agreement that a mathematical theory of deep learning is still missing. Recently, the first mathematical results have been derived in areas such as mathematical statistics and statistical learning. Any mathematical theory of machine learning will have to combine tools from different fields, including nonparametric statistics, high-dimensional statistics, empirical process theory, and approximation theory. The main objective of the workshop was to bring together leading researchers contributing to the mathematics of machine learning.

A focus of the workshop was the theory of deep neural networks. Mathematically speaking, neural networks define function classes with a rich mathematical structure that are extremely difficult to analyze because of the non-linearity in the parameters. Until very recently, most existing theoretical results could not cope with many of the distinctive characteristics of deep networks, such as multiple hidden layers or the ReLU activation function. Other topics of the workshop were procedures for quantifying the uncertainty of machine learning methods and the mathematics of data privacy.

Cite this article

Peter L. Bartlett, Cristina Butucea, Johannes Schmidt-Hieber, Mathematical Foundations of Machine Learning. Oberwolfach Rep. 18 (2021), no. 1, pp. 853–894

DOI 10.4171/OWR/2021/15