Batch Sparse Recovery, or How to Leverage the Average Sparsity

07/23/2018
by Alexandr Andoni, et al.

We introduce a batch version of sparse recovery, where the goal is to report a sequence of vectors A_1',...,A_m' ∈ R^n that estimate unknown signals A_1,...,A_m ∈ R^n using a few linear measurements, each involving exactly one signal vector, under an assumption of average sparsity. More precisely, we want to have (1) ∑_{j ∈ [m]} ‖A_j − A_j'‖_p^p < C · min ∑_{j ∈ [m]} ‖A_j − A_j^*‖_p^p for predetermined constants C > 1 and p, where the minimum is over all A_1^*,...,A_m^* ∈ R^n that are k-sparse on average. We assume k is given as input, and ask for the minimal number of measurements required to satisfy (1). The special case m = 1 is known as stable sparse recovery and has been studied extensively. We resolve the question for p = 1 up to polylogarithmic factors, by presenting a randomized adaptive scheme that performs Õ(km) measurements and, with high probability, produces output satisfying (1), for arbitrarily small C > 1. Finally, we show that adaptivity is necessary for every non-trivial scheme.
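To make the benchmark on the right-hand side of (1) concrete, here is a small illustrative sketch (ours, not from the paper). "k-sparse on average" allows the individual A_j^* to have uneven sparsities as long as they total at most k·m nonzeros, so for any ℓ_p error the optimal approximation simply keeps the k·m largest-magnitude entries across the whole batch. The function name batch_benchmark and the example data below are hypothetical, chosen only for illustration.

```python
# Illustrative sketch (not from the paper): the benchmark in (1),
#   min over A* that are k-sparse on average of  sum_j ||A_j - A_j*||_p^p.
# Because the sparsity budget is shared across the batch, the optimum keeps
# the k*m entries of largest magnitude over all signals and zeroes the rest.
import numpy as np

def batch_benchmark(A, k, p=1):
    """Best achievable batch error with an average sparsity budget of k.

    A : array of shape (m, n), the m signals stacked as rows.
    k : average number of nonzeros allowed per signal (k*m in total).
    """
    m, n = A.shape
    budget = k * m
    flat = np.abs(A).ravel()
    if budget >= flat.size:
        return 0.0
    # Indices of the budget largest-magnitude entries; everything else is
    # dropped, and the dropped entries' p-th powers sum to the error.
    kept = np.argpartition(flat, -budget)[-budget:]
    mask = np.ones(flat.size, dtype=bool)
    mask[kept] = False
    return float(np.sum(flat[mask] ** p))

# Example: m = 3 signals of length n = 5 with average sparsity k = 2,
# i.e. at most 6 nonzeros in total across the batch.
A = np.array([[5.0, 0.1, 0.0, 0.0, 0.0],
              [3.0, 2.0, 1.0, 0.2, 0.0],
              [0.3, 0.0, 0.0, 0.0, 4.0]])
print(batch_benchmark(A, k=2, p=1))  # 0.3: the two smallest nonzeros are dropped
```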
