This book chapter is published open access.
This paper presents second- and higher-order Gaussian anticoncentration inequalities in high dimensions, together with error bounds in Slepian’s comparison theorem for the distribution functions of the maxima of two Gaussian vectors. The anticoncentration theorems are stated as upper bounds on the sum of the absolute values of the partial derivatives of a given order of the joint distribution function of a Gaussian vector, or on weighted sums of such absolute values. In contrast to existing results, which require the covariance matrix of the entire Gaussian vector to be invertible, the bounds for the higher-order derivatives developed in this paper require only the invertibility of the covariance matrices of all subsets of the random variables. The second-order anticoncentration inequality is used to develop comparison theorems for the joint distribution functions of Gaussian vectors or, equivalently, for the univariate distribution functions of their maxima, via Slepian’s interpolation. The third- and higher-order anticoncentration inequalities are motivated by recent advances in the central limit theorem and the consistency of the bootstrap for the maximum component of a sum of independent random vectors in high dimensions, and by related applications in statistical inference and machine learning.
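To fix ideas, the following is a schematic sketch of the quantities described above; the symbols $F$, $X$, $Y$, $\Sigma$, $d$, and the order $m$ are illustrative notation, not necessarily the paper's own. For a Gaussian vector $X=(X_1,\dots,X_d)$ with joint distribution function $F(x)=\mathbb{P}(X_1\le x_1,\dots,X_d\le x_d)$, the $m$th-order anticoncentration bounds control expressions of the form

```latex
\sum_{1\le j_1,\dots,j_m\le d}
\left|\frac{\partial^m F(x)}{\partial x_{j_1}\cdots\partial x_{j_m}}\right|
\quad\text{(or weighted versions of this sum),}
```

while the Slepian-type comparison results bound, for two Gaussian vectors $X$ and $Y$ with covariance matrices $\Sigma^X$ and $\Sigma^Y$,

```latex
\sup_{x\in\mathbb{R}^d}
\bigl|\mathbb{P}(X\le x)-\mathbb{P}(Y\le x)\bigr|
=\sup_{x\in\mathbb{R}^d}
\Bigl|\mathbb{P}\bigl(\max_{j\le d}X_j\le x\bigr)
      -\mathbb{P}\bigl(\max_{j\le d}Y_j\le x\bigr)\Bigr|
\;\text{(in the max-of-equal-coordinates case)}
```

in terms of the entrywise covariance difference $\max_{j,k}\,|\Sigma^X_{jk}-\Sigma^Y_{jk}|$; the precise rates and constants are the subject of the paper.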