A rigorous framework for the mean field limit of multilayer neural networks

  • Phan-Minh Nguyen

    Stanford University, Stanford, USA; The Voleon Group, Berkeley, USA
  • Huy Tuan Pham

    Stanford University, Stanford, USA

This article is published open access under our Subscribe to Open model.

Abstract

We develop a mathematically rigorous framework for multilayer neural networks in the mean field regime. As the network's widths increase, the network's learning trajectory is shown to be well captured by a meaningful and dynamically nonlinear limit (the mean field limit), which is characterized by a system of ODEs. Our framework applies to a broad range of network architectures, learning dynamics, and network initializations. Central to the framework is the new idea of a neuronal embedding, which comprises a non-evolving probability space that allows one to embed neural networks of arbitrary widths. Using our framework, we prove several properties of large-width multilayer neural networks. First, we show that independent and identically distributed initializations cause strong degeneracy effects on the network's learning trajectory when the network's depth is at least four. Second, we obtain several global convergence guarantees for feedforward multilayer networks under a number of different setups. These include two-layer and three-layer networks with independent and identically distributed initializations, and multilayer networks of arbitrary depths with a special type of correlated initialization that is motivated by the new concept of bidirectional diversity. Unlike previous works that rely on convexity, our results admit non-convex losses and hinge on a certain universal approximation property, which is a distinctive feature of infinite-width neural networks and is shown to hold throughout the training process. Aside from being the first known results for global convergence of multilayer networks in the mean field regime, they demonstrate the flexibility of our framework and incorporate several new ideas and insights that depart from conventional convex optimization wisdom.
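To give a flavor of the kind of limiting dynamics the abstract refers to, the following is a schematic (not the paper's exact equations) of the mean field ODE system in the simplest two-layer case. Here neurons are indexed by a point $\omega$ of a fixed probability space $(\Omega, \mathbb{P})$ (the neuronal embedding), $w_t(\omega)$ denotes the weight of neuron $\omega$ at training time $t$, and $\mathcal{L}$ is a (possibly non-convex) loss; the symbols $\sigma$, $\mathcal{L}$, and the data law $\mathcal{D}$ are generic placeholders.

```latex
% Schematic mean field dynamics for a two-layer network:
% infinite-width prediction as an expectation over the neuronal embedding,
\[
  \hat{y}_t(x) \;=\; \mathbb{E}_{\omega \sim \mathbb{P}}\bigl[\, \sigma\bigl(x;\, w_t(\omega)\bigr) \,\bigr],
\]
% and each neuron's weight evolves by an ODE driven by the population loss:
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, w_t(\omega)
  \;=\; -\,\mathbb{E}_{(x,y) \sim \mathcal{D}}
  \Bigl[\, \partial_2 \mathcal{L}\bigl(y,\, \hat{y}_t(x)\bigr)\,
  \nabla_{w}\, \sigma\bigl(x;\, w_t(\omega)\bigr) \,\Bigr].
\]
```

The coupling between the two displays is what makes the limit "dynamically nonlinear": each weight's ODE depends on the prediction $\hat{y}_t$, which is itself an average over all weights. The multilayer framework in the paper extends this picture with one weight function per layer, all defined on the same non-evolving probability space.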

Cite this article

Phan-Minh Nguyen, Huy Tuan Pham, A rigorous framework for the mean field limit of multilayer neural networks. Math. Stat. Learn. 6 (2023), no. 3/4, pp. 201–357

DOI 10.4171/MSL/42