On the relative asymptotic expressivity of inference frameworks

04/20/2022
by Vera Koponen, et al.

Let σ be a first-order signature and let 𝐖_n be the set of all σ-structures with domain [n] = {1, …, n}. By an inference framework we mean a class 𝐅 of pairs (ℙ, L), where ℙ = (ℙ_n : n = 1, 2, 3, …) and each ℙ_n is a probability distribution on 𝐖_n, and L is a logic with truth values in the unit interval [0, 1]. The inference frameworks we consider contain pairs (ℙ, L) where ℙ is determined by a probabilistic graphical model and L expresses statements about, for example, (conditional) probabilities or (arithmetic or geometric) averages. We define asymptotic expressivity of inference frameworks: 𝐅' is asymptotically at least as expressive as 𝐅 if for every (ℙ, L) ∈ 𝐅 there is (ℙ', L') ∈ 𝐅' such that ℙ is asymptotically total-variation-equivalent to ℙ' and for every φ(x̅) ∈ L there is φ'(x̅) ∈ L' such that φ'(x̅) is asymptotically equivalent to φ(x̅) with respect to ℙ. This relation is a preorder, and we describe a (strict) partial order on the equivalence classes of some inference frameworks that seem natural in the context of machine learning and artificial intelligence. Our analysis includes Conditional Probability Logic (CPL) and Probability Logic with Aggregation functions (PLA) introduced in earlier work. We also define a sublogic coPLA of PLA in which the aggregation functions satisfy additional continuity constraints and show that coPLA gives rise to asymptotically strictly less expressive inference frameworks than PLA.
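As a rough guide, the ordering described above can be written out as follows. This is only a sketch of the definition as stated in the abstract; the precise meanings of "asymptotically total-variation-equivalent" and "asymptotically equivalent with respect to ℙ" are fixed in the paper, and reading the former as convergence of the total variation distance to 0 as n → ∞ is an assumption made here for illustration.

% Sketch of the expressivity preorder described in the abstract (not verbatim from the paper).
% The total-variation condition below is an assumed reading: d_TV(P_n, P'_n) -> 0 as n -> infinity.
\[
  \mathbf{F} \preceq \mathbf{F}' \;\Longleftrightarrow\;
  \forall (\mathbb{P}, L) \in \mathbf{F} \;\; \exists (\mathbb{P}', L') \in \mathbf{F}' :
\]
\[
  \lim_{n \to \infty} \bigl\| \mathbb{P}_n - \mathbb{P}'_n \bigr\|_{\mathrm{TV}} = 0
  \quad\text{and}\quad
  \forall \varphi(\bar{x}) \in L \;\; \exists \varphi'(\bar{x}) \in L' \text{ such that }
  \varphi'(\bar{x}) \text{ is asymptotically equivalent to } \varphi(\bar{x}) \text{ w.r.t. } \mathbb{P}.
\]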
