Propositional Knowledge Representation in Restricted Boltzmann Machines

05/31/2017
by Son N. Tran, et al.

Representing symbolic knowledge in a connectionist network is a key element in integrating scalable learning with sound reasoning. Most previous studies focus on discriminative neural networks, which require an often unnecessary separation of input and output variables. Recent developments in generative neural networks such as restricted Boltzmann machines (RBMs) have shown the capability to learn semantic abstractions directly from data, holding promise for general symbolic learning and reasoning. Previous work on Penalty logic showed a link between propositional logic and symmetric connectionist networks; however, it is not applicable to RBMs. This paper proposes a novel method for representing propositional formulas in RBMs, or stacks of RBMs, such that Gibbs sampling can be seen as maximising satisfiability. It also shows a promising use of RBMs for learning symbolic knowledge through maximum likelihood estimation.
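The core idea, encoding a formula so that low RBM energy coincides with satisfying assignments, can be illustrated with a small sketch. The construction below (one hidden unit per conjunct of a DNF formula, weights +c or -c per literal, hidden bias -c(p - eps) where p counts the conjunct's positive literals) is an illustrative instance of this kind of encoding, not necessarily the paper's exact mapping; the XOR formula, the constants c and eps, and all function names are assumptions made for the example.

```python
# Minimal sketch (assumed construction, not the authors' verbatim method):
# encode the DNF formula  phi = (x AND NOT y) OR (NOT x AND y)  (i.e. XOR)
# in an RBM so that satisfying assignments have the lowest free energy,
# and Gibbs sampling therefore concentrates on models of phi.
import numpy as np

rng = np.random.default_rng(0)
c, eps = 5.0, 0.5  # illustrative constants

# One hidden unit per conjunct; columns of W are conjuncts.
# Conjunct 1: x AND NOT y  -> weights ( c, -c), one positive literal (p = 1)
# Conjunct 2: NOT x AND y  -> weights (-c,  c), one positive literal (p = 1)
W = np.array([[ c, -c],
              [-c,  c]])                      # shape (visible=2, hidden=2)
b_h = np.array([-c * (1 - eps), -c * (1 - eps)])  # hidden biases -c*(p - eps)
b_v = np.zeros(2)                             # visible biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def free_energy(v):
    """F(v) = -b_v.v - sum_j softplus(b_h_j + (v W)_j); lower is 'more satisfying'."""
    return -(b_v @ v) - np.sum(np.logaddexp(0.0, b_h + v @ W))

# Satisfying assignments of XOR, (0,1) and (1,0), get the lowest free energy.
for v in [np.array(a, dtype=float) for a in [(0, 0), (0, 1), (1, 0), (1, 1)]]:
    print(v.astype(int), round(free_energy(v), 3))

# Gibbs sampling: alternate h ~ p(h|v) and v ~ p(v|h). The chain spends
# most of its time in the low-energy, i.e. satisfying, visible states.
v = rng.integers(0, 2, size=2).astype(float)
counts = {}
for step in range(20000):
    h = (sigmoid(b_h + v @ W) > rng.random(2)).astype(float)
    v = (sigmoid(b_v + W @ h) > rng.random(2)).astype(float)
    if step > 1000:  # discard burn-in
        key = tuple(v.astype(int))
        counts[key] = counts.get(key, 0) + 1

print(counts)  # mass concentrates on (0, 1) and (1, 0)
```

Run as-is, the printout ranks (0,1) and (1,0) strictly lowest in free energy, and the sample counts pile up on those two satisfying assignments. The abstract's second claim runs in the opposite direction: rather than hand-coding W from a formula, maximum likelihood estimation (e.g. contrastive divergence) would fit the RBM to data, from which symbolic knowledge can then be read off.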
