Many applications require only a subset of the singular values/vectors of a given matrix. Three distinct approaches exist for this problem:
Compute the full SVD and truncate the parts that are not needed. This is wasteful and often prohibitively expensive.
Transform the singular value problem into a symmetric eigenvalue problem, apply an iterative eigensolver to compute the relevant part of the spectrum, back-transform the eigenvalue solutions into the desired singular value solutions. This is the most popular approach for large SVD problems, especially where the data matrix is sparse.
Attack the singular value problem directly via a specialized solver:
JD-SVD method of Hochstenbach. This is analogous to the Jacobi-Davidson eigensolver. It operates by using a Newton method to compute an update to the current
approximation that attempts to set the SVD problem residuals to zero. This is wrapped by a Davidson-type two-sided subspace acceleration strategy, which serves
to improve the rate of convergence and globalize the Newton method. JD-SVD is best applied to finding either the largest or smallest singular values.
The computation of the dominant singular vectors can be characterized as the maximization of a particular function over a Riemannian manifold. This allows the application of
a number of Riemannian optimization techniques. See the GenRTR page for more information. There is currently no analogous method for computing the
smallest singular values.
There exist a number of efforts which compute the dominant singular vectors via a neural network.
A family of so-called low-rank incremental SVD methods allow the approximation of the dominant or dominated singular subspaces in a pass-efficient manner. These methods are the focus of this webpage.
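The core step of such a method can be sketched as follows. This is a minimal NumPy sketch in the spirit of the Brand/Sequential Karhunen-Loeve column-block update; the function name, block size, and truncation rule are illustrative, not IncPACK's API. Each block of new columns is split into its component inside the current left subspace and an orthogonal residual, a small "core" matrix is SVD'd, and the factorization is rotated and truncated back to rank k, so the full matrix is touched one block at a time:

```python
import numpy as np

def incremental_svd(A, k, block=5):
    """Low-rank incremental SVD sketch (Brand/SKL-style column updates).

    Streams the columns of A in blocks, maintaining a rank-k
    approximation U @ diag(s) @ Vt of the columns seen so far.
    Names and defaults are illustrative, not IncPACK's interface.
    """
    m, n = A.shape
    # Initialize the factorization from the first block of columns.
    U, s, Vt = np.linalg.svd(A[:, :block], full_matrices=False)
    U, s, Vt = U[:, :k], s[:k], Vt[:k, :]
    for j0 in range(block, n, block):
        C = A[:, j0:j0 + block]           # next block of columns
        UtC = U.T @ C                     # component inside span(U)
        R = C - U @ UtC                   # residual outside span(U)
        Q, Rr = np.linalg.qr(R)           # orthonormal residual basis
        # Small core matrix whose SVD updates the factorization.
        K = np.block([[np.diag(s), UtC],
                      [np.zeros((Q.shape[1], s.size)), Rr]])
        Uk, sk, Vtk = np.linalg.svd(K, full_matrices=False)
        r = min(k, sk.size)
        # Rotate the enlarged bases and truncate back to rank k.
        U = np.hstack([U, Q]) @ Uk[:, :r]
        Vt = Vtk[:r] @ np.block(
            [[Vt, np.zeros((Vt.shape[0], C.shape[1]))],
             [np.zeros((C.shape[1], Vt.shape[1])), np.eye(C.shape[1])]])
        s = sk[:r]
    return U, s, Vt
```

When the matrix has rank at most k, the update incurs no truncation error and the streamed factorization reproduces the exact truncated SVD; otherwise it returns a pass-efficient approximation of the dominant subspaces.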
Incremental SVD Publications
A recent talk by Baker provides a good introduction to the family of incremental/streaming SVD methods:
M. Brand. "Incremental singular value decomposition of uncertain data with missing values". In Proceedings of the 2002 European Conference on Computer Vision, 2002.
IncPACK contains implementations of the low-rank incremental SVD methods for
the approximation of either the largest or smallest singular values of a
matrix, along with the associated singular vectors.
Currently, the package provides a single solver, an implementation of the
Sequential Karhunen-Loeve method (Levy and Lindenbaum, 2000). This solver, as
described by the authors, computes approximations for the left singular
subspace. It has been modified to also compute the right singular subspace, and
to improve on these approximations via multiple passes through the data matrix.
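The multi-pass idea can be illustrated with a simple alternating scheme. This is a hedged NumPy sketch, not IncPACK's actual algorithm: given an orthonormal approximation of the dominant right singular subspace (e.g. from an incremental first pass), each extra pair of passes through the data — one multiplication with the matrix and one with its transpose — amounts to orthogonal (subspace) iteration and sharpens both subspaces:

```python
import numpy as np

def refine_subspaces(A, V, passes=3):
    """Refine approximate singular subspaces with extra passes over A.

    V holds an orthonormal approximation of the dominant right singular
    subspace. Each iteration performs one pass with A and one with A.T,
    i.e. orthogonal iteration on A.T @ A. Illustrative only; this is
    not the refinement used by IncPACK.
    """
    for _ in range(passes):
        U, _ = np.linalg.qr(A @ V)      # improve the left subspace
        V, _ = np.linalg.qr(A.T @ U)    # improve the right subspace
    s = np.linalg.norm(A @ V, axis=0)   # singular value estimates
    return U, s, V
```

The convergence factor per pass is governed by the gap between the wanted and unwanted singular values, so a handful of additional passes typically suffices when the dominant values are well separated.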
IncPACK currently provides implementations in MATLAB, released under an open-source modified BSD license. An MPI-based implementation in C++
is available in Trilinos
in the RBGen package (development branch only). A beta implementation for NVIDIA GPUs using CUDA is available below.