Data-Driven Linear Complexity Low-Rank Approximation of General Kernel Matrices: A Geometric Approach

12/24/2022
by Difeng Cai, et al.

A general rectangular kernel matrix may be defined as K_{ij} = κ(x_i, y_j), where κ(x, y) is a kernel function and X = {x_i}_{i=1}^m and Y = {y_j}_{j=1}^n are two sets of points. In this paper, we seek a low-rank approximation to a kernel matrix where the point sets X and Y are large and are not well-separated (e.g., the points in X and Y may be "intermingled"). Such rectangular kernel matrices arise, for example, in Gaussian process regression, where X corresponds to the training data and Y corresponds to the test data; in this setting the points are often high-dimensional. Because the point sets are large, we must exploit the fact that the matrix arises from a kernel function and avoid forming the matrix explicitly, which rules out most algebraic techniques. In particular, we seek methods that scale linearly, i.e., with computational complexity O(m) or O(n) for a fixed accuracy or rank. The main idea in this paper is to geometrically select appropriate subsets of points to construct the low-rank approximation. An analysis in this paper guides how this selection should be performed.
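The abstract leaves the construction to the paper itself, but the flavor of a geometric, linear-complexity scheme can be sketched with a Nyström-style cross approximation: pick a small set of k geometrically well-spread landmark points Z from X ∪ Y, then use K ≈ κ(X, Z) κ(Z, Z)^+ κ(Z, Y), which needs only O((m + n)k) kernel evaluations. The Python sketch below uses farthest-point sampling and a Gaussian kernel purely as illustrative stand-ins; the function names and the selection rule are assumptions for this example, not the authors' algorithm.

import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2)), evaluated pairwise.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))

def farthest_point_sampling(P, k, seed=0):
    # Greedily pick k spread-out landmarks: each new point maximizes its
    # distance to the landmarks chosen so far (a stand-in geometric rule).
    rng = np.random.default_rng(seed)
    idx = [int(rng.integers(len(P)))]
    dist = np.linalg.norm(P - P[idx[0]], axis=1)
    for _ in range(k - 1):
        idx.append(int(np.argmax(dist)))
        dist = np.minimum(dist, np.linalg.norm(P - P[idx[-1]], axis=1))
    return np.array(idx)

def nystrom_low_rank(X, Y, k, kernel=gaussian_kernel):
    # Return factors (A, B) with K ~= A @ B, never forming the m-by-n K.
    # Landmarks are drawn from X and Y jointly, so the scheme can cope
    # with intermingled (not well-separated) point sets.
    P = np.vstack([X, Y])
    Z = P[farthest_point_sampling(P, k)]
    A = kernel(X, Z)                                  # m x k
    B = np.linalg.pinv(kernel(Z, Z)) @ kernel(Z, Y)   # k x n; pinv guards against ill-conditioning
    return A, B

# Small sanity check against the exact matrix (feasible only for small m, n).
X = np.random.default_rng(1).normal(size=(500, 3))
Y = np.random.default_rng(2).normal(size=(400, 3))
A, B = nystrom_low_rank(X, Y, k=50)
K = gaussian_kernel(X, Y)
print("relative error:", np.linalg.norm(A @ B - K) / np.linalg.norm(K))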
