Learning Topic Models and Latent Bayesian Networks Under Expansion Constraints

09/24/2012
by   Daniel Hsu, et al.

Unsupervised estimation of latent variable models is a fundamental problem central to numerous applications of machine learning and statistics. This work presents a principled approach for estimating broad classes of such models, including probabilistic topic models and latent linear Bayesian networks, using only second-order observed moments. The sufficient conditions for identifiability of these models are primarily based on weak expansion constraints on the topic-word matrix, for topic models, and on the directed acyclic graph, for Bayesian networks. Because no assumptions are made on the joint distribution of the latent variables, the approach can handle arbitrary correlations among the topics or latent factors. In addition, a tractable learning method via ℓ_1 optimization is proposed and studied in numerical experiments.
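The abstract refers to a tractable learning method based on ℓ_1 optimization. As a rough illustration only, and not the paper's actual estimator, the sketch below shows the standard way an ℓ_1-minimization step of this kind can be cast as a linear program (splitting the variable into positive and negative parts); the matrix M, vector b, and helper name l1_min_row are hypothetical placeholders, not objects defined in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def l1_min_row(M, b):
    """Solve min ||z||_1 subject to M z = b via linear programming.

    Standard reformulation: write z = z_plus - z_minus with z_plus, z_minus >= 0,
    so ||z||_1 = sum(z_plus + z_minus) and the constraint stays linear.
    This is a generic sparse-recovery sketch, not the paper's specific program.
    """
    n = M.shape[1]
    c = np.ones(2 * n)                         # objective: sum of positive and negative parts
    A_eq = np.hstack([M, -M])                  # encodes M (z_plus - z_minus) = b
    res = linprog(c, A_eq=A_eq, b_eq=b,
                  bounds=[(0, None)] * (2 * n), method="highs")
    return res.x[:n] - res.x[n:]

# Toy usage: recover a sparse vector from a few linear measurements.
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 20))
z_true = np.zeros(20)
z_true[[2, 7, 15]] = [1.0, -2.0, 0.5]          # sparse ground truth
b = M @ z_true
z_hat = l1_min_row(M, b)
print(np.round(z_hat, 3))
```

In the paper's setting, programs of this general form are used to exploit sparsity and expansion structure when recovering model parameters from second-order moments; the details of the actual constraints differ from this simplified sketch.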
