The problem of optimizing the spectral radius over a convex set of matrices is notoriously hard: the spectral radius is neither convex nor concave, and it may even fail to be Lipschitz, which makes standard gradient-type methods inapplicable. We consider the case of nonnegative matrices, for which the spectral radius becomes the Perron eigenvalue. For matrix sets with product structure (row or column uncertainty), we consider the spectral simplex method and the greedy method [1,2], which demonstrate surprising efficiency even in high dimensions (several thousand). For polyhedra of nonnegative matrices, we present and discuss an alternating algorithm. Applications to graph theory, neural search, and population dynamics are discussed.
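To illustrate the flavor of these methods, here is a minimal sketch of a greedy iteration for a product family with row uncertainty: each row of the matrix is picked from its own finite set, and at every step each row is re-chosen to maximize its scalar product with the current Perron eigenvector. This is only an illustrative sketch (the function names, the tie-breaking, and the use of a dense eigensolver are my assumptions, not the authors' implementation), and it presumes the iterates stay irreducible so the Perron eigenvector is well defined.

```python
import numpy as np

def perron(A):
    """Leading eigenvalue and a nonnegative eigenvector of a nonnegative matrix."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    v = np.abs(vecs[:, k].real)
    return vals[k].real, v / v.sum()

def greedy_max_spectral_radius(row_sets, max_iter=100):
    """Greedy sketch: row i of the matrix is chosen from the finite list
    row_sets[i]; each step re-picks every row to maximize its scalar
    product with the current Perron eigenvector, and stops when no row
    changes (a fixed point of the greedy update)."""
    n = len(row_sets)
    choice = [0] * n            # start from an arbitrary selection of rows
    for _ in range(max_iter):
        A = np.array([row_sets[i][choice[i]] for i in range(n)])
        lam, v = perron(A)
        new_choice = [max(range(len(row_sets[i])),
                          key=lambda j, i=i: float(np.dot(row_sets[i][j], v)))
                      for i in range(n)]
        if new_choice == choice:  # no row can improve against v: stop
            return lam, A
        choice = new_choice
    return lam, A
```

On small random instances the fixed point typically attains the maximal spectral radius over the whole product family, which can be checked by brute-force enumeration of all row combinations.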
[1] Protasov, V.Yu., Spectral simplex method, Mathematical Programming, 2016, 156 (1-2), pp. 485–511.
[2] Cvetković, A., Protasov, V.Yu., The greedy strategy for optimizing the Perron eigenvalue, Mathematical Programming, 2022, 193 (1).