Learning High-Dimensional Gaussian Graphical Models under Total Positivity without Tuning Parameters

06/12/2019
by Yuhao Wang, et al.

We consider the problem of estimating an undirected Gaussian graphical model when the underlying distribution is multivariate totally positive of order 2 (MTP2), a strong form of positive dependence. Such distributions are relevant, for example, in portfolio selection, since assets are usually positively dependent. A large body of methods has been proposed for learning undirected graphical models without the MTP2 constraint. A major limitation of these methods is that their consistency guarantees in the high-dimensional setting usually require a particular choice of a tuning parameter, which is unknown a priori in real-world applications. Here we show that an undirected graphical model under MTP2 can be learned consistently without any tuning parameters. This is achieved by a constraint-based estimator that infers the structure of the underlying graphical model by testing the signs of the empirical partial correlation coefficients. We evaluate the performance of our estimator in simulations and on financial data.
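The idea of deciding edges from the signs of empirical partial correlations can be illustrated with a minimal sketch. The code below is not the paper's estimator (which tests signs over conditioning subsets and handles the high-dimensional regime); it assumes n > p so that the sample covariance is invertible, and simply keeps edges whose empirical partial correlation given all remaining variables is positive. The function name sign_based_graph and the synthetic data are hypothetical.

```python
import numpy as np

def sign_based_graph(X):
    """Illustrative sketch: keep edge (i, j) when the empirical partial
    correlation of variables i and j given all others is positive.
    Assumes n > p; this is not the paper's exact subset-based procedure."""
    S = np.cov(X, rowvar=False)              # empirical covariance matrix
    Theta = np.linalg.inv(S)                 # empirical precision matrix
    d = np.sqrt(np.diag(Theta))
    partial_corr = -Theta / np.outer(d, d)   # rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj)
    np.fill_diagonal(partial_corr, 1.0)
    # Under MTP2, all partial correlations are nonnegative, so a negative
    # empirical value suggests conditional independence (no edge).
    adjacency = partial_corr > 0
    np.fill_diagonal(adjacency, False)
    return adjacency

# Example usage on synthetic positively dependent Gaussian data
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.5, 0.25],
                [0.5, 1.0, 0.5],
                [0.25, 0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(3), cov, size=500)
print(sign_based_graph(X))
```

Because the decision rule is a sign test rather than a thresholding step, no tuning parameter (such as a regularization strength) appears in this sketch, which mirrors the tuning-free property highlighted in the abstract.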
