Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in $O(\sqrt{n})$ iterations

  • Yihong Wu

    Yale University, New Haven, USA
  • Harrison H. Zhou

    Yale University, New Haven, USA


Abstract

We analyze the classical EM algorithm for parameter estimation in the symmetric two-component Gaussian mixture in $d$ dimensions. We show that, even in the absence of any separation between the components, provided that the sample size satisfies $n = \Omega(d \log^3 d)$, the randomly initialized EM algorithm converges to an estimate in at most $O(\sqrt{n})$ iterations with high probability, which is at most $O\big(\big(\tfrac{d}{n}\log^3 n\big)^{1/4}\big)$ in Euclidean distance from the true parameter and within logarithmic factors of the minimax rate of $\big(\tfrac{d}{n}\big)^{1/4}$. Both the nonparametric statistical rate and the sublinear convergence rate are direct consequences of the zero Fisher information in the worst case. Refined pointwise guarantees beyond worst-case analysis and convergence to the MLE are also shown under mild conditions.

This improves the previous result of Balakrishnan, Wainwright, and Yu (2017), which requires strong conditions on both the separation of the components and the quality of the initialization, and that of Daskalakis, Tzamos, and Zampetakis (2017), which requires sample splitting and restarting the EM iteration.
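For this symmetric mixture $\tfrac{1}{2}N(\theta, I_d) + \tfrac{1}{2}N(-\theta, I_d)$ with known identity covariance, the E- and M-steps collapse into the single fixed-point update $\theta_{t+1} = \tfrac{1}{n}\sum_{i=1}^n \tanh(\langle \theta_t, X_i\rangle)\, X_i$. The NumPy sketch below runs this update from a random initialization for $\lceil\sqrt{n}\rceil$ steps; the initialization scale, stopping rule, and synthetic data are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def em_symmetric_gmm(X, num_iters=None, seed=None):
    """Randomly initialized EM for (1/2) N(theta, I_d) + (1/2) N(-theta, I_d).

    For this model the E- and M-steps collapse into the single update
        theta_{t+1} = (1/n) * sum_i tanh(<theta_t, x_i>) * x_i.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if num_iters is None:
        num_iters = int(np.ceil(np.sqrt(n)))  # O(sqrt(n)) iterations, as in the title
    theta = rng.standard_normal(d) / np.sqrt(d)  # illustrative random initialization
    for _ in range(num_iters):
        theta = (np.tanh(X @ theta)[:, None] * X).mean(axis=0)
    return theta

# Synthetic sanity check (hypothetical parameters, not from the paper).
rng = np.random.default_rng(0)
n, d = 20000, 5
theta_star = np.full(d, 0.5)
signs = rng.choice([-1.0, 1.0], size=n)
X = signs[:, None] * theta_star + rng.standard_normal((n, d))
theta_hat = em_symmetric_gmm(X, seed=1)
# theta is identifiable only up to a global sign flip, so compare both signs.
err = min(np.linalg.norm(theta_hat - theta_star),
          np.linalg.norm(theta_hat + theta_star))
print(f"estimation error: {err:.3f}")
```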

Cite this article

Yihong Wu, Harrison H. Zhou, Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in $O(\sqrt{n})$ iterations. Math. Stat. Learn. 4 (2021), no. 3/4, pp. 143–220

DOI 10.4171/MSL/29