Description
In this talk I will consider the approximate computation of the von Neumann entropy of a large density matrix $A$ (i.e., a symmetric positive semidefinite matrix with unit trace), defined as $\text{trace}(f(A))$ with $f(z) = -z \log(z)$. This is an important task in several fields, such as quantum information theory and network science.
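For concreteness, the following minimal sketch (Python/NumPy, not part of the talk material; the function name von_neumann_entropy is mine) computes this quantity directly from a full eigendecomposition, which is feasible only for moderately sized matrices and is precisely what the methods discussed below avoid:

    import numpy as np

    def von_neumann_entropy(A):
        # Reference computation of trace(f(A)) with f(z) = -z*log(z) via a full
        # eigendecomposition; feasible only for moderately sized matrices.
        eigvals = np.linalg.eigvalsh(A)      # A is symmetric PSD with unit trace
        eigvals = eigvals[eigvals > 0]       # convention: 0*log(0) = 0
        return float(-np.sum(eigvals * np.log(eigvals)))

    # Sanity check: the maximally mixed state A = I/n has entropy log(n).
    n = 4
    print(von_neumann_entropy(np.eye(n) / n), np.log(n))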
The approximation of $\text{trace}(f(A))$ can be reduced to the computation of a certain number of quadratic forms $x^T f(A) x$ and matrix-vector products $f(A) x$, using either a probing method based on graph colorings [1] or a stochastic trace estimator [2]; in turn, these quantities can be efficiently approximated using polynomial and rational Krylov subspace methods [3].
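As an illustration of this reduction, here is a simplified sketch (Python/NumPy; the function names and default parameters are my own assumptions, not the algorithm of the talk) that combines a plain Hutchinson estimator with the standard Lanczos approximation of the quadratic forms $x^T f(A) x$; Hutch++ [2] and the rational Krylov methods of [3] refine both ingredients:

    import numpy as np

    def lanczos_quadratic_form(A, x, k):
        # k steps of Lanczos with start vector x/||x|| yield a tridiagonal matrix
        # T_k, and x^T f(A) x is approximated by ||x||^2 * e_1^T f(T_k) e_1.
        n = A.shape[0]
        alpha, beta = [], []
        q_prev, b_prev = np.zeros(n), 0.0
        q = x / np.linalg.norm(x)
        for _ in range(k):
            w = A @ q - b_prev * q_prev
            a = q @ w
            w = w - a * q
            alpha.append(a)
            b = np.linalg.norm(w)
            if b < 1e-12:                      # invariant subspace found
                break
            beta.append(b)
            q_prev, q, b_prev = q, w / b, b
        m = len(alpha)
        T = np.diag(alpha) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
        theta, U = np.linalg.eigh(T)           # Ritz values and vectors
        theta = np.maximum(theta, 0.0)         # clip tiny negative Ritz values
        f_theta = np.zeros_like(theta)
        pos = theta > 0
        f_theta[pos] = -theta[pos] * np.log(theta[pos])
        # e_1^T f(T_k) e_1 = sum_j f(theta_j) * U[0, j]^2
        return np.linalg.norm(x) ** 2 * float(U[0, :] ** 2 @ f_theta)

    def entropy_estimate(A, num_samples=50, lanczos_steps=30, seed=None):
        # Plain Hutchinson estimator: trace(f(A)) is approximated by the average
        # of x^T f(A) x over Rademacher probe vectors x (Hutch++ adds a low-rank
        # sketch on top); each quadratic form uses the Lanczos rule above.
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        samples = [lanczos_quadratic_form(A, rng.choice([-1.0, 1.0], size=n),
                                          lanczos_steps)
                   for _ in range(num_samples)]
        return float(np.mean(samples))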
After introducing these techniques, I will present some error bounds and heuristics for the special case of the von Neumann entropy, obtained by exploiting an integral representation of the entropy function. The analysis leads to algorithms that, given an input tolerance $\epsilon$, aim to compute an approximation of the entropy with relative accuracy $\epsilon$, using either theoretical bounds or heuristic estimates as stopping criteria.
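The actual bounds and heuristics are derived in [4]; purely as an illustration of how a tolerance can drive a stopping criterion, one generic heuristic (not the one from the talk) is to enlarge the Krylov subspace until two successive approximations agree to within a relative difference of $\epsilon$. The sketch below reuses lanczos_quadratic_form from the previous example:

    def adaptive_quadratic_form(A, x, eps, k_max=200):
        # Generic stagnation-based stopping heuristic (illustrative only): enlarge
        # the Krylov subspace until the relative change between two successive
        # Lanczos approximations of x^T f(A) x drops below eps. Restarting from
        # scratch at each k is wasteful; a practical code extends the recurrence.
        prev = None
        for k in range(2, k_max + 1):
            current = lanczos_quadratic_form(A, x, k)
            if prev is not None and abs(current - prev) <= eps * abs(current):
                return current, k
            prev = current
        return prev, k_max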
The methods are tested on several density matrices from network theory to demonstrate their effectiveness.
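As a usage example (my own assumption about the data, not the test set of the talk), one common way to obtain a density matrix from a network is to rescale its graph Laplacian to unit trace; such a matrix can then be passed to the estimator sketched earlier:

    import numpy as np
    import scipy.sparse as sp

    def laplacian_density_matrix(adjacency):
        # The graph Laplacian L = D - A is symmetric positive semidefinite;
        # dividing by trace(L) = sum of the degrees makes it a density matrix.
        degrees = np.asarray(adjacency.sum(axis=1)).ravel()
        L = sp.diags(degrees) - adjacency
        return L / degrees.sum()

    # Small random sparse graph as a stand-in for a real network data set.
    n = 2000
    G = sp.random(n, n, density=0.005, random_state=0, format="csr")
    G = sp.triu(G, k=1)            # keep the strict upper triangle (no self-loops)
    G.data[:] = 1.0                # unweighted 0/1 edges
    adjacency = G + G.T            # symmetric adjacency matrix

    rho = laplacian_density_matrix(adjacency)
    # entropy_estimate is the Hutchinson/Lanczos sketch given earlier.
    print(entropy_estimate(rho, num_samples=20, lanczos_steps=40, seed=1))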
[1] A. Frommer, C. Schimmel, and M. Schweitzer, Analysis of probing techniques for sparse approximation and trace estimation of decaying matrix functions, SIAM Journal on Matrix Analysis and Applications, 2021.
[2] R. A. Meyer, C. Musco, C. Musco, and D. P. Woodruff, Hutch++: optimal stochastic trace estimation, Symposium on Simplicity in Algorithms (SOSA), 2021.
[3] S. Güttel, Rational Krylov approximation of matrix functions: numerical methods and optimal pole selection, GAMM-Mitteilungen, 2013.
[4] M. Benzi, M. Rinelli, and I. Simunec, Computation of the von Neumann entropy of large matrices via trace estimation and rational Krylov methods, arXiv preprint arXiv:2212.09642, 2022.