Patrick Wolfe (with Mohamed-Ali Belabbas)
Title: The Nyström Extension and Spectral Methods in Learning: A New
Algorithm for Low-Rank Approximation of Quadratic Forms
Abstract:
Spectral methods are of fundamental importance in statistical
learning, as they underlie algorithms from classical principal
components analysis to more recent approaches that exploit manifold
structure. In most cases, the core technical problem can be
reduced to computing a low-rank approximation to a positive-definite
kernel. Using traditional methods, such an approximation can be
obtained with computational complexity that scales as the cube of the
number of training examples. For the growing number of applications
dealing with very large or high-dimensional data sets, however, these
techniques are too costly. A known alternative is the Nyström extension
from finite element methods. While its application to machine
learning has previously been suggested in the literature, we introduce
here what is, to the best of our knowledge, the first randomized
algorithm of this type to yield a relative
approximation error bound. Our results follow from a new class of
algorithms for the approximation of matrix products, which reveal
connections between classical linear algebraic quantities such as Schur
complements and techniques from theoretical computer science such as
the notion of volume sampling.
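The Nyström extension approximates a positive-definite kernel matrix from a small subset of its columns: if C holds c sampled columns of K and W is the corresponding c-by-c submatrix, then K is approximated by C W⁺ Cᵀ, and the residual is exactly the Schur complement of W in K. A minimal NumPy sketch of this construction, with uniform column sampling and an RBF kernel chosen purely for illustration (the algorithm the abstract refers to uses a specific data-dependent sampling distribution to obtain its relative error bound; none of the names or parameters below come from the paper):

```python
import numpy as np

def rbf_kernel(X, gamma=0.1):
    # Gaussian (RBF) kernel matrix on the rows of X; an illustrative
    # choice of positive-definite kernel, not one fixed by the paper.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(X**2, axis=1)[None, :] - 2 * X @ X.T
    return np.exp(-gamma * sq)

def nystrom(K, c, rng):
    # Sample c landmark columns uniformly at random. (The abstract's
    # algorithm instead samples from a distribution related to volume
    # sampling; uniform sampling is used here only as a placeholder.)
    n = K.shape[0]
    idx = rng.choice(n, size=c, replace=False)
    C = K[:, idx]               # n x c block of sampled columns
    W = K[np.ix_(idx, idx)]     # c x c submatrix on the landmarks
    # Rank-at-most-c Nystrom approximation K ~ C W^+ C^T.
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
K = rbf_kernel(X)
K_hat = nystrom(K, 40, rng)
# Relative Frobenius-norm error of the approximation.
err = np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro")
```

Forming C and W costs O(nc) kernel evaluations and the pseudoinverse costs O(c³), so for c much smaller than n this avoids the O(n³) cost of a full eigendecomposition.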