Robust One-Bit Recovery via ReLU Generative Networks: Improved Statistical Rates and Global Landscape Analysis

08/14/2019
by   Shuang Qiu, et al.

We study the robust one-bit compressed sensing problem, whose goal is to design an algorithm that faithfully recovers any sparse target vector θ_0 ∈ R^d uniformly from m quantized noisy measurements. Under the assumption that the measurements are sub-Gaussian random vectors, to recover any k-sparse θ_0 (k ≪ d) uniformly up to an error ε with high probability, the best known computationally tractable algorithm requires m ≥ Õ(k log d / ε^4) measurements. In this paper, we consider a new framework for the one-bit sensing problem in which sparsity is implicitly enforced by mapping a low-dimensional representation x_0 ∈ R^k through a known n-layer ReLU generative network G: R^k → R^d. Such a framework places low-dimensional priors on θ_0 without requiring a known sparsifying basis. We propose to recover the target G(x_0) via an unconstrained empirical risk minimization (ERM) problem under a much weaker sub-exponential measurement assumption. For this problem, we establish a joint statistical and computational analysis. In particular, we prove that the ERM estimator in this new framework achieves an improved statistical rate of m = Õ(kn log d / ε^2) for recovering any G(x_0) uniformly up to an error ε. Moreover, from the lens of computation, despite non-convexity, we prove that the objective of our ERM problem has no spurious stationary point; that is, any stationary point is equally good for recovering the true target up to scaling, with a certain accuracy. Furthermore, our analysis also sheds light on the possibility of inverting a deep generative model under partial and quantized measurements, complementing the recent success of using deep generative models for inverse problems.
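To make the setup concrete, below is a minimal sketch of the measurement model and an ERM-style recovery loop over the latent code. It assumes a fixed bias-free two-layer ReLU network for G, sign measurements with a small fraction of random bit flips, and a Plan–Vershynin-style correlation surrogate as the loss; the paper's exact ERM objective and network assumptions may differ, and all dimensions and hyperparameters here are illustrative.

# Hypothetical sketch: one-bit measurements of G(x0) and gradient-based recovery
# of the latent code x. The loss is a correlation surrogate, not necessarily the
# paper's exact ERM objective.
import torch

torch.manual_seed(0)
d, k, m = 100, 5, 500

# Known, fixed two-layer ReLU generative network G: R^k -> R^d (weights frozen).
G = torch.nn.Sequential(
    torch.nn.Linear(k, 50, bias=False), torch.nn.ReLU(),
    torch.nn.Linear(50, d, bias=False),
)
for p in G.parameters():
    p.requires_grad_(False)

x0 = torch.randn(k)
theta0 = G(x0)                      # target signal in the range of G

# One-bit (sign) measurements with Gaussian sensing vectors and ~5% bit flips.
A = torch.randn(m, d)
flips = torch.where(torch.rand(m) < 0.05, -1.0, 1.0)
y = torch.sign(A @ theta0) * flips

# Unconstrained minimization over the latent code x.
x = torch.zeros(k, requires_grad=True)
opt = torch.optim.Adam([x], lr=0.05)
for _ in range(2000):
    opt.zero_grad()
    Gx = G(x)
    # Surrogate loss: 0.5 * ||G(x)||^2 - (1/m) * sum_i y_i <a_i, G(x)>.
    loss = 0.5 * Gx.pow(2).sum() - (y * (A @ Gx)).mean()
    loss.backward()
    opt.step()

with torch.no_grad():
    est = G(x)
    # One-bit measurements discard the norm, so compare directions only.
    cos = torch.dot(est, theta0) / (est.norm() * theta0.norm())
    print(f"cosine similarity to G(x0): {cos.item():.3f}")

Because one-bit measurements carry no magnitude information, recovery is only possible up to scaling, which is why the sketch reports a cosine similarity rather than an absolute error.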
