Water from Two Rocks: Maximizing the Mutual Information

02/24/2018
by Yuqing Kong, et al.

Our goal is to forecast a ground truth Y using two sources of information, X_A and X_B, without access to any data labeled with the ground truth. That is, we aim to learn two predictors/hypotheses P_A^*, P_B^* such that P_A^*(X_A) and P_B^*(X_B) provide high-quality forecasts for Y, without labeled data. We also want to elicit high-quality forecasts for Y from crowds and pay the crowds immediately, without access to Y. We draw a natural connection between the learning question and the mechanism design question and address both with the same information-theoretic approach.

Learning: Under a natural assumption---conditioning on Y, X_A and X_B are independent---we reduce the learning question to the optimization problem max_{P_A, P_B} MIG^f(P_A, P_B), so that solving the learning question is equivalent to picking the P_A^*, P_B^* that maximize MIG^f(P_A, P_B)---the f-mutual information gain between P_A and P_B. Moreover, we apply our results to the "learning with noisy labels" problem to learn a predictor that forecasts the ground truth label rather than the noisy label, given some side information, without pre-estimating the relationship between the ground truth labels and the noisy labels.

Mechanism design: Assuming agents' information is independent conditioning on Y, we design mechanisms that elicit high-quality forecasts without verification and reward agents instantly. In the single-task setting, we propose a forecast elicitation mechanism in which truth-telling is a strict equilibrium; in the multi-task setting, we propose a family of forecast elicitation mechanisms in which truth-telling is a strict equilibrium and pays better than any other equilibrium.
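To make the objective concrete, here is a minimal sketch of the idea of scoring a pair of predictors by the information their outputs share, using a plain plug-in estimate of (Shannon) mutual information between two predictors' discrete outputs on unlabeled examples. This is an illustrative stand-in, not the paper's MIG^f: the predictor outputs `preds_a`/`preds_b` are toy data, and using the KL-based mutual information (rather than a general f-divergence) is an assumption made here for simplicity.

```python
import math
from collections import Counter

def empirical_mutual_information(a, b):
    """Plug-in estimate (in nats) of I(A; B) from paired discrete samples.

    Computes sum over (x, y) of p(x, y) * log(p(x, y) / (p(x) * p(y)))
    using empirical frequencies.
    """
    n = len(a)
    count_a = Counter(a)
    count_b = Counter(b)
    count_ab = Counter(zip(a, b))
    mi = 0.0
    for (x, y), c in count_ab.items():
        p_xy = c / n
        # p_xy * n * n / (count_a[x] * count_b[y]) == p_xy / (p_x * p_y)
        mi += p_xy * math.log(p_xy * n * n / (count_a[x] * count_b[y]))
    return mi

# Toy outputs of two hypothetical predictors on the same unlabeled examples.
preds_a = [0, 0, 1, 1, 0, 1, 0, 1]
preds_b = [0, 0, 1, 1, 0, 1, 1, 0]  # agrees with preds_a on 6 of 8

# Under the conditional-independence assumption, a predictor pair whose
# outputs share more information is a better candidate; the learning step
# would search for the pair maximizing such a score, without labels.
score = empirical_mutual_information(preds_a, preds_b)
```

Two predictors with perfectly aligned outputs score log 2 (for balanced binary outputs), statistically independent outputs score 0, and the noisy agreement above lands in between, which is the ordering the maximization exploits.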
