It’s natural to combine Krylov subspace methods with variance reduction techniques for trace estimation. This can lead to improvements over Stochastic Lanczos Quadrature, which is based on the Girard-Hutchinson estimator. Krylov-aware methods take this one step further by exploiting the interaction between Krylov subspace methods and randomized low-rank approximation techniques to produce low-rank approximations to matrix functions more efficiently (Chen & Hallman, 2023; Persson et al., 2025).
Let’s consider a randomized-SVD-type approach to obtaining a low-rank approximation to $f(A)$, where products with $f(A)$ are approximated via Krylov subspace methods.
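To make the setup concrete, here is a minimal NumPy sketch (all function names are illustrative, not from any particular library): products $f(A)\mathbf{v}$ are approximated by Lanczos, and the resulting sketch of $f(A)\Omega$ is orthonormalized to obtain the basis used in a randomized-SVD-style approximation.

```python
import numpy as np

def lanczos_fA(A, v, f, q):
    """Approximate f(A) @ v using q steps of Lanczos with full reorthogonalization."""
    Q = np.zeros((len(v), q))
    alpha, beta = np.zeros(q), np.zeros(q - 1)
    Q[:, 0] = v / np.linalg.norm(v)
    for j in range(q):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)  # full reorthogonalization
        if j < q - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    # f(T) e_1 via an eigendecomposition of the tridiagonal matrix T
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    theta, S = np.linalg.eigh(T)
    return np.linalg.norm(v) * (Q @ (S @ (f(theta) * S[0, :])))

def randomized_range_fA(A, f, k, q, rng):
    """Randomized-SVD-style basis P so that f(A) is approximated by P @ P.T @ f(A)."""
    Omega = rng.standard_normal((A.shape[0], k))
    Y = np.column_stack([lanczos_fA(A, Omega[:, i], f, q) for i in range(k)])
    P, _ = np.linalg.qr(Y)
    return P
```

Each column of the sketch costs $q$ matrix-vector products with $A$, so the whole basis costs $kq$ products.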
The first key observation is that the Krylov approximation to $f(A)\Omega$ is contained in the block Krylov subspace
$$\mathcal{K}_q(A, \Omega) := \operatorname{span}\{\Omega, A\Omega, \ldots, A^{q-1}\Omega\},$$
where $q$ is the number of steps taken by the KSM.
Let $Q$ be an orthonormal basis for $\mathcal{K}_q(A, \Omega)$. We can compute $Q$ using the same number of matrix-vector products with $A$ as were used to compute the approximation to $f(A)\Omega$. However, the range of this approximation to $f(A)\Omega$ is contained in $\operatorname{range}(Q)$, so we may expect that $QQ^\mathsf{T} f(A)$ is a better approximation to $f(A)$ than the projection onto the approximation to $f(A)\Omega$ alone.
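One simple way to form such a basis (a sketch using the monomial definition of $\mathcal{K}_q(A, \Omega)$ directly; in practice a block Lanczos recurrence with reorthogonalization is preferable for stability):

```python
import numpy as np

def block_krylov_basis(A, Omega, q):
    """Orthonormal basis Q for K_q(A, Omega) = span{Omega, A Omega, ..., A^{q-1} Omega}."""
    blocks = [Omega]
    for _ in range(q - 1):
        blocks.append(A @ blocks[-1])  # next block of Krylov iterates
    Q, _ = np.linalg.qr(np.column_stack(blocks))
    return Q
```

If $\Omega$ has $b$ columns, building $Q$ costs $(q-1)b$ matrix-vector products with $A$, and $Q$ has $qb$ columns.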
However, since $Q$ has many more columns than $\Omega$, at first glance computing an approximation to $QQ^\mathsf{T} f(A)$ seems to require many more matrix-vector products than computing an approximation to $f(A)\Omega$.
The second key observation is that, because $Q$ is a basis for a Krylov subspace, we can efficiently compute an approximation to $f(A)Q$. In particular, we have the following: for any $s \geq 1$,
$$\mathcal{K}_s(A, Q) \subseteq \mathcal{K}_{q+s-1}(A, \Omega).$$

Proof
Since $\operatorname{range}(Q) = \mathcal{K}_q(A, \Omega)$, every column of $A^j Q$ lies in $A^j \mathcal{K}_q(A, \Omega) \subseteq \mathcal{K}_{q+j}(A, \Omega)$ for $j = 0, 1, \ldots, s-1$. Taking the span of all these columns gives the containment.
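Numerically, one can check that the Krylov subspace generated by $Q$ sits inside an extension of the original space. The snippet below (with an illustrative helper, not an optimized implementation) projects the generators of $\mathcal{K}_s(A, Q)$ onto a basis for $\mathcal{K}_{q+s-1}(A, \Omega)$ and measures the residual:

```python
import numpy as np

def krylov_basis(A, B, depth):
    """Orthonormal basis for span{B, A B, ..., A^{depth-1} B}."""
    blocks = [B]
    for _ in range(depth - 1):
        blocks.append(A @ blocks[-1])
    W, _ = np.linalg.qr(np.column_stack(blocks))
    return W

rng = np.random.default_rng(1)
n, b, q, s = 40, 2, 3, 3
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
A = A / np.linalg.norm(A, 2)              # normalize so powers stay tame

Omega = rng.standard_normal((n, b))
Q = krylov_basis(A, Omega, q)             # basis for K_q(A, Omega)
W = krylov_basis(A, Omega, q + s - 1)     # basis for K_{q+s-1}(A, Omega)

# Stack the raw generators of K_s(A, Q) and project onto range(W);
# the residual should vanish up to rounding error.
gens = [Q]
for _ in range(s - 1):
    gens.append(A @ gens[-1])
V = np.column_stack(gens)
residual = np.linalg.norm(V - W @ (W.T @ V)) / np.linalg.norm(V)
```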
This means that quantities such as $Q^\mathsf{T} f(A) Q$ can be approximated from $\mathcal{K}_{q+s-1}(A, \Omega)$. If $\Omega$ has $b$ columns, this requires just $(s-1)b$ additional matrix-vector products with $A$, rather than the roughly $(s-1)qb$ products needed to build a fresh Krylov subspace with starting block $Q$.
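As a sketch of how this plays out (helper and function names are illustrative): one can approximate $Q^\mathsf{T} f(A) Q$ by projecting onto an orthonormal basis $W$ of the extended space $\mathcal{K}_{q+s-1}(A, \Omega)$, using $Q^\mathsf{T} f(A) Q \approx (W^\mathsf{T} Q)^\mathsf{T}\, f(W^\mathsf{T} A W)\, (W^\mathsf{T} Q)$:

```python
import numpy as np

def monomial_krylov_basis(A, B, depth):
    """Orthonormal basis for span{B, A B, ..., A^{depth-1} B}."""
    blocks = [B]
    for _ in range(depth - 1):
        blocks.append(A @ blocks[-1])
    W, _ = np.linalg.qr(np.column_stack(blocks))
    return W

def fun_sym(M, f):
    """f of a symmetric matrix via eigendecomposition."""
    theta, S = np.linalg.eigh(M)
    return (S * f(theta)) @ S.T

def krylov_aware_QtfAQ(A, Omega, f, q, s):
    """Approximate Q^T f(A) Q from the extended space K_{q+s-1}(A, Omega)."""
    Q = monomial_krylov_basis(A, Omega, q)
    W = monomial_krylov_basis(A, Omega, q + s - 1)  # extends K_q by s-1 block steps
    H = W.T @ (A @ W)                # projection of A onto the extended space
    C = W.T @ Q                      # coordinates of Q in the basis W
    return C.T @ fun_sym(H, f) @ C
```

Only $f$ of the small projected matrix $H$ is ever computed; no products with $f(A)$ itself are needed.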
While this idea is simple, our algorithm can often perform much better than if matrix-vector products with $f(A)$ are treated as a black box. This is particularly true for functions like $f(x) = \sqrt{x}$, where we may require $q$ to be quite large to obtain a good approximation to $f(A)\Omega$, but even a basis whose span contains $\operatorname{range}(A\Omega)$ (which is basically like sketching $A$) works well.
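As a hypothetical illustration (the matrix and parameters here are chosen for the demo, not taken from the text): for $f(x) = \sqrt{x}$ and a matrix with polynomially decaying spectrum, projecting onto the basis built from $[\Omega, A\Omega]$ can be expected to give an error within a small factor of the best rank-$2k$ error for $\sqrt{A}$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 200, 10
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = (1.0 + np.arange(n)) ** -2.0                 # decaying spectrum
A = (U * lam) @ U.T
sqrtA = (U * np.sqrt(lam)) @ U.T

Omega = rng.standard_normal((n, k))
Q, _ = np.linalg.qr(np.column_stack([Omega, A @ Omega]))   # span contains range(A @ Omega)

err = np.linalg.norm(sqrtA - Q @ (Q.T @ sqrtA))            # projection error, Frobenius norm
best = np.sqrt(np.sum(np.sqrt(lam)[2 * k:] ** 2))          # best rank-2k error for sqrt(A)
```

Note that no products with $\sqrt{A}$ were used to build $Q$, only two block products with $A$.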
Similar ideas are explored in (Persson & Kressner, 2023; Persson et al., 2025), where it is shown that sketching $A$ instead of $f(A)$ is a good idea for operator monotone functions.
- Chen, T., & Hallman, E. (2023). Krylov-Aware Stochastic Trace Estimation. SIAM Journal on Matrix Analysis and Applications, 44(3), 1218–1244. 10.1137/22m1494257
- Persson, D., Chen, T., & Musco, C. (2025). Randomized block-Krylov subspace methods for low-rank approximation of matrix functions. https://arxiv.org/abs/2502.01888
- Persson, D., & Kressner, D. (2023). Randomized Low-Rank Approximation of Monotone Matrix Functions. SIAM Journal on Matrix Analysis and Applications, 44(2), 894–918. 10.1137/22m1523923
- Persson, D., Meyer, R. A., & Musco, C. (2025). Algorithm-Agnostic Low-Rank Approximation of Operator Monotone Matrix Functions. SIAM Journal on Matrix Analysis and Applications, 46(1), 1–21. 10.1137/23m1619435