Adaptive Metric Dimensionality Reduction

02/12/2013
by Lee-Ad Gottlieb, et al.

We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely, an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverages the dual benefits of low dimensionality: (1) more efficient algorithms, e.g., for proximity search, and (2) more optimistic generalization bounds.

