Maximum Probability Principle and Black-Box Priors

10/21/2019
by Amir Emad Marvasti, et al.

We present an axiomatic way of assigning probabilities to black-box models. In particular, we quantify an upper bound on the probability of a model, or, in information-theoretic terms, a lower bound on the amount of information stored in a model. In our setup, maximizing the probability of a model is equivalent to removing assumptions, i.e., information stored in the model. Furthermore, we recast the problem of learning from an alternative view in which the underlying probability space is considered directly. In this perspective, both the true underlying model and the model at hand are events. Consequently, learning becomes the problem of minimizing the probability of the symmetric difference of the model and the true underlying model.
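The symmetric-difference view of learning can be illustrated on a toy finite probability space. The sketch below is illustrative only and not from the paper: both the true model and a candidate model are represented as events (sets of outcomes), and the learning objective is the probability mass of their symmetric difference, i.e., the outcomes on which the two models disagree.

```python
from itertools import product

# Toy sample space: all binary strings of length 3, with uniform probability.
# (Hypothetical example; the paper works with general probability spaces.)
omega = [''.join(bits) for bits in product('01', repeat=3)]
p = {w: 1.0 / len(omega) for w in omega}

def prob(event):
    """Probability of an event, given as a set of outcomes."""
    return sum(p[w] for w in event)

# True underlying model M* and a candidate model M, both events in the space.
m_true = {w for w in omega if w.startswith('1')}  # outcomes of the form '1**'
m_hat = {w for w in omega if w.endswith('1')}     # a learned candidate '**1'

# Symmetric difference M △ M*: outcomes where the two models disagree.
disagreement = m_true ^ m_hat
loss = prob(disagreement)
print(loss)  # 0.5 on this toy space
```

Driving `loss` to zero forces the candidate event to coincide with the true event up to a set of measure zero, which is the sense in which minimizing the symmetric difference captures learning in this framing.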
