Subspace Learning with Partial Information
The goal of subspace learning is to find a k-dimensional subspace of R^d such that the expected squared distance between instance vectors and the subspace is as small as possible. In this paper we study subspace learning in a partial-information setting, in which the learner observes only r < d attributes of each instance vector. We propose several efficient algorithms for this task and analyze their sample complexity.
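To make the setting concrete, the sketch below simulates the partial-observation process and recovers a k-dimensional subspace by building an unbiased estimate of the second-moment matrix from the r observed attributes and taking its top-k eigenvectors. This is a minimal illustration of the problem setup, not the paper's algorithm; the function name, the rescaling constants, and the use of a full eigendecomposition are assumptions made for the example.

```python
import numpy as np

def partial_pca_sketch(X, r, k, seed=None):
    """Illustrative sketch (hypothetical helper, not the paper's method):
    learn a k-dimensional subspace of R^d when only r of d attributes
    of each instance are observed."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    C = np.zeros((d, d))                      # accumulated second-moment estimate
    # Probability that one attribute / a pair of distinct attributes is
    # observed when r of the d indices are drawn uniformly without replacement.
    p1 = r / d
    p2 = r * (r - 1) / (d * (d - 1))
    for x in X:
        obs = rng.choice(d, size=r, replace=False)   # observed attribute indices
        v = np.zeros(d)
        v[obs] = x[obs]                              # zero out unobserved attributes
        outer = np.outer(v, v)
        # Rescale entries so the expectation equals x x^T (unbiased estimate).
        outer[np.diag_indices(d)] /= p1
        off_diag = ~np.eye(d, dtype=bool)
        outer[off_diag] /= p2
        C += outer
    C /= n
    # Top-k eigenvectors of the estimated second-moment matrix span the subspace.
    eigvals, eigvecs = np.linalg.eigh(C)
    return eigvecs[:, -k:]

# Usage: given full data X (n x d), simulate seeing r attributes per instance
# X = np.random.default_rng(0).normal(size=(500, 20))
# U = partial_pca_sketch(X, r=5, k=2)
```

The rescaling by the observation probabilities p1 and p2 is one standard way to correct the bias introduced by seeing only a subset of coordinates; the sample complexity analysis in the paper addresses how many such partially observed instances are needed for the estimated subspace to be accurate.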