Convex optimization of graph Laplacian eigenvalues
Stephen Boyd
Stanford University, USA
Abstract
We consider the problem of choosing the edge weights of an undirected graph so as to maximize or minimize some function of the eigenvalues of the associated Laplacian matrix, subject to some constraints on the weights, such as nonnegativity, or a given total value. In many interesting cases this problem is convex, i.e., it involves minimizing a convex function (or maximizing a concave function) over a convex set. This allows us to give simple necessary and sufficient optimality conditions, derive interesting dual problems, find analytical solutions in some cases, and efficiently compute numerical solutions in all cases.
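To make the convex formulation concrete, here is a minimal numerical sketch of one such problem: maximizing the algebraic connectivity (the second-smallest Laplacian eigenvalue) over nonnegative edge weights with a fixed total. The small graph, the weight budget, and the use of CVXPY are illustrative assumptions, not material from the chapter; the sketch relies on the standard fact that lambda_2(L(w)) >= t exactly when L(w) - t(I - (1/n)11^T) is positive semidefinite, since L(w) annihilates the all-ones vector.

```python
import numpy as np
import cvxpy as cp

# Small example graph, assumed for illustration: n nodes, edge list E.
n = 5
E = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
m = len(E)

# Incidence matrix: column k is e_i - e_j for edge k = (i, j), so the
# weighted Laplacian is L(w) = sum_k w_k (e_i - e_j)(e_i - e_j)^T.
A = np.zeros((n, m))
for k, (i, j) in enumerate(E):
    A[i, k], A[j, k] = 1.0, -1.0

w = cp.Variable(m, nonneg=True)   # edge weights
t = cp.Variable()                 # lower bound on the algebraic connectivity
L = A @ cp.diag(w) @ A.T          # weighted Laplacian, affine in w

# lambda_2(L(w)) >= t  is equivalent to  L(w) >> t*(I - (1/n) 11^T),
# because both sides vanish on the all-ones vector.
P = np.eye(n) - np.ones((n, n)) / n
prob = cp.Problem(cp.Maximize(t), [L >> t * P, cp.sum(w) == m])
prob.solve()

print("maximum algebraic connectivity:", t.value)
print("optimal edge weights:", np.round(w.value, 4))
```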
In this overview we briefly describe some more specific cases of this general problem, which have been addressed in a series of recent papers.
- Fastest mixing Markov chain. Find edge transition probabilities that give the fastest mixing (symmetric, discrete-time) Markov chain on the graph (a numerical sketch of this problem appears after this list).
- Fastest mixing Markov process. Find the edge transition rates that give the fastest mixing (symmetric, continuous-time) Markov process on the graph.
- Absolute algebraic connectivity. Find edge weights that maximize the algebraic connectivity of the graph (i.e., the smallest positive eigenvalue of its Laplacian matrix). The optimal value is called the absolute algebraic connectivity by Fiedler. This is the instance sketched numerically above.
- Minimum total effective resistance. Find edge weights that minimize the total effective resistance of the graph. This is the same as minimizing the average commute time from any node to any other in the associated Markov chain (see the effective resistance sketch after this list).
- Fastest linear averaging. Find weights in a distributed averaging network that yield fastest convergence.
- Least steady-state mean-square deviation. Find weights in a distributed averaging network, driven by random noise, that minimize the steady-state mean-square deviation of the node values.
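As noted in the first item above, the fastest mixing Markov chain problem can be written as minimizing the second-largest eigenvalue modulus (SLEM) of the transition matrix, which for a symmetric chain equals the spectral norm of P - (1/n)11^T. The following is a hedged sketch of that formulation in CVXPY on the same illustrative graph used earlier, not code from the chapter or the cited papers.

```python
import numpy as np
import cvxpy as cp

# Same illustrative graph and incidence matrix as in the earlier sketch.
n = 5
E = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
m = len(E)
A = np.zeros((n, m))
for k, (i, j) in enumerate(E):
    A[i, k], A[j, k] = 1.0, -1.0

p = cp.Variable(m, nonneg=True)        # transition probability on each edge
P = np.eye(n) - A @ cp.diag(p) @ A.T   # symmetric transition matrix, rows sum to 1

# The SLEM of P equals the spectral norm of P - (1/n) 11^T; minimizing it
# gives the fastest mixing chain.  The only extra constraint is that each
# node's holding probability (the diagonal of P) stays nonnegative.
J = np.ones((n, n)) / n
slem = cp.sigma_max(P - J)
prob = cp.Problem(cp.Minimize(slem), [cp.diag(P) >= 0])
prob.solve()

print("optimal SLEM:", prob.value)
print("edge transition probabilities:", np.round(p.value, 4))
```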
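Similarly, a common way to handle the minimum total effective resistance problem numerically uses the identity R_tot = n * tr((L + (1/n)11^T)^{-1}) - n together with a Schur complement, which turns it into a semidefinite program. The sketch below follows that route; the graph and weight budget are again illustrative assumptions rather than the chapter's own example.

```python
import numpy as np
import cvxpy as cp

# Same illustrative graph and incidence matrix as in the earlier sketches.
n = 5
E = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
m = len(E)
A = np.zeros((n, m))
for k, (i, j) in enumerate(E):
    A[i, k], A[j, k] = 1.0, -1.0

w = cp.Variable(m, nonneg=True)            # edge weights
Y = cp.Variable((n, n), symmetric=True)    # epigraph variable for (L + J)^{-1}
L = A @ cp.diag(w) @ A.T
J = np.ones((n, n)) / n

# R_tot = n * tr((L + (1/n) 11^T)^{-1}) - n, since adding (1/n) 11^T moves
# only the zero eigenvalue of L to 1.  By a Schur complement (with L + J
# positive definite), the block matrix below is PSD exactly when
# Y >= (L + J)^{-1}, so minimizing tr(Y) minimizes the total resistance.
constraints = [cp.bmat([[L + J, np.eye(n)], [np.eye(n), Y]]) >> 0,
               cp.sum(w) == m]
prob = cp.Problem(cp.Minimize(n * cp.trace(Y) - n), constraints)
prob.solve()

print("minimum total effective resistance:", prob.value)
print("optimal edge weights:", np.round(w.value, 4))
```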