Multi-Layer Bilinear Generalized Approximate Message Passing

07/01/2020
by Qiuyun Zou, et al.

In this paper, we extend the bilinear generalized approximate message passing (BiG-AMP) approach, originally proposed for high-dimensional generalized bilinear regression, to the multi-layer case in order to handle the cascaded matrix-factorization problems that arise in relay communications, among other applications. Assuming statistically independent matrix entries with known priors, we derive a new algorithm called ML-BiGAMP, which approximates general sum-product loopy belief propagation (LBP) in the high-dimensional limit with a substantial reduction in computational complexity. We demonstrate that, in the large-system limit, the mean squared error (MSE) of the proposed algorithm can be fully characterized by a set of simple one-dimensional equations termed state evolution (SE), which further reveals that the fixed-point equations of this algorithm coincide with those of the exact MMSE estimator as predicted by the replica method. The exact MMSE estimator is known to be Bayes-optimal; however, its exponential complexity makes it infeasible for high-dimensional applications. In contrast, the proposed (approximate) algorithm has low complexity and attains the same optimal MSE performance as the exact estimator in the asymptotic regime. As an illustration of the new algorithm, we then consider the problem of signal detection in two-hop amplify-and-forward relay communications. In that case, the proposed algorithm specializes to a joint channel and data detection method capable of recovering the signal with high precision even in the absence of instantaneous channel state information.
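For readers unfamiliar with state evolution, the following is a minimal sketch of the classical single-layer recursion for a linear model y = Ax + w with i.i.d. Gaussian A (measurement ratio \delta = M/N, noise variance \sigma_w^2, posterior-mean denoiser \eta_t); it is not the paper's multi-layer bilinear SE, only the standard single-layer analogue that the ML-BiGAMP analysis generalizes:

\tau_{t+1}^2 \;=\; \sigma_w^2 \;+\; \frac{1}{\delta}\,\mathbb{E}\!\left[\big(\eta_t(X + \tau_t Z) - X\big)^2\right],
\qquad X \sim p_X,\quad Z \sim \mathcal{N}(0,1)\ \text{independent of}\ X.

A fixed point \tau_\star of such a scalar recursion predicts the asymptotic MSE of the iterative algorithm; the paper's result is that the fixed points of the ML-BiGAMP state evolution match those of the exact MMSE estimator as predicted by the replica method.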
