
A Gaussian matrix can be applied to $\vec{A}$ in $O(ndk)$ operations. In a number of applications we require $k \geq d$, so the cost of applying $\vec{S}$ to $\vec{A}$ will exceed the cost of the downstream application. However, even in such cases, Gaussian sketching matrices play an important role in RandNLA.
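As a concrete illustration of this cost, here is a minimal NumPy sketch (the sizes are hypothetical) of applying a Gaussian sketching matrix to a tall matrix; the dense product $\vec{S}\vec{A}$ is the $O(ndk)$ step:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 2000, 50, 200  # hypothetical sizes with k >= d

A = rng.standard_normal((n, d))
# Gaussian sketching matrix with entry variance 1/k, so that
# ||S x|| approximates ||x|| in expectation
S = rng.standard_normal((k, n)) / np.sqrt(k)

SA = S @ A  # dense k x n times n x d multiply: O(ndk) operations
print(SA.shape)  # (k, d) = (200, 50)
```

Since $k \geq d$ here, forming $\vec{S}\vec{A}$ already costs more than, say, a QR factorization of the $k \times d$ sketch that a downstream algorithm might compute.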

In many cases, the behavior of algorithms using other sketching distributions closely mimics the Gaussian case. However, owing to the many nice properties of Gaussian matrices, it is often easier to analyze the performance of algorithms that use Gaussian sketches.

Orthogonal Invariance
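Orthogonal invariance means that for any fixed orthogonal matrix $\vec{Q}$, the matrix $\vec{Q}\vec{G}$ has the same distribution as $\vec{G}$ itself. A quick numerical sanity check (my own illustration, not from the text): after multiplying many independent Gaussian samples by a fixed orthogonal $\vec{Q}$, the entries should still look like independent standard normals.

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 5, 5

# fixed random orthogonal matrix via QR of a Gaussian matrix
Q, _ = np.linalg.qr(rng.standard_normal((k, k)))

# many independent Gaussian matrices, each multiplied by the same Q
samples = np.array([Q @ rng.standard_normal((k, n)) for _ in range(20000)])

# by orthogonal invariance, entries of Q G are still mean 0, variance 1
print(samples.mean(), samples.var())  # both close to (0, 1)
```

This is exactly the property that makes Gaussian sketches so analyzable: quantities like $\vec{S}\vec{U}$ for an orthonormal $\vec{U}$ are again Gaussian.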

Spectral Properties

For a large Gaussian matrix $\vec{G} \sim \operatorname{Gaussian}(k,n)$ with $k/n$ fixed, the eigenvalues of $\vec{G}^\T\vec{G}$ follow the Marchenko–Pastur distribution.

Non-asymptotic bounds are also available; see e.g. Vershynin, 2010.
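One well-known non-asymptotic statement of the kind surveyed there is that the singular values of a $k \times n$ standard Gaussian matrix concentrate in the interval $[\sqrt{n} - \sqrt{k},\, \sqrt{n} + \sqrt{k}]$. A quick empirical check (sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
k, n = 200, 2000

G = rng.standard_normal((k, n))
s = np.linalg.svd(G, compute_uv=False)

# non-asymptotic theory: singular values concentrate in
# [sqrt(n) - sqrt(k), sqrt(n) + sqrt(k)], up to O(1) fluctuations
lo, hi = np.sqrt(n) - np.sqrt(k), np.sqrt(n) + np.sqrt(k)
print(s.min(), s.max(), (lo, hi))
```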

Using the equivalent characterization of subspace embeddings in terms of singular values from Theorem 2.1, it’s clear that Gaussian matrices can be used as subspace embeddings.

When $n$ and $d$ are large, Theorem 2.3 suggests that we actually get the subspace embedding property for $k \approx d/\varepsilon^2$; i.e., we can take the constant in the $O(\cdot)$ notation to be 1. Since other mixing sketches often behave very similarly to Gaussian sketches with respect to the embedding dimension, this observation is useful for getting a handle on the exact embedding dimension required in practice for a number of applications.
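This heuristic can be checked numerically via the singular-value characterization: if $\vec{U}$ is an orthonormal basis for the range of $\vec{A}$, the embedding property amounts to the singular values of $\vec{S}\vec{U}$ lying in roughly $[1-\varepsilon, 1+\varepsilon]$. A minimal sketch, with hypothetical sizes, setting $k = d/\varepsilon^2$ exactly:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, eps = 5000, 20, 0.5
k = int(d / eps**2)  # k = d / eps^2, i.e. constant 1 in the O(.)

A = rng.standard_normal((n, d))
U, _ = np.linalg.qr(A)                      # orthonormal basis for range(A)
S = rng.standard_normal((k, n)) / np.sqrt(k)  # Gaussian sketch, variance 1/k

# embedding quality: singular values of S U should lie near [1-eps, 1+eps]
s = np.linalg.svd(S @ U, compute_uv=False)
print(s.min(), s.max())
```

Here $\sqrt{d/k} = \varepsilon$, so up to lower-order fluctuations the extreme singular values of $\vec{S}\vec{U}$ land at about $1 \pm \varepsilon$, matching the heuristic.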

Other Properties

Gaussian matrices also satisfy a number of other useful properties. A particularly useful one is the following:

A proof can be found in Tropp & Webber, 2023.

References
  1. Vershynin, R. (2010). Introduction to the non-asymptotic analysis of random matrices.
  2. Tropp, J. A., & Webber, R. J. (2023). Randomized algorithms for low-rank matrix approximation: Design, analysis, and applications. https://arxiv.org/abs/2306.12418