An information upper bound for probability sensitivity

06/05/2022
by Jiannan Yang, et al.

Uncertain input of a mathematical model induces uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold, and the probability sensitivity depends on this threshold, which is often uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special version of the Cauchy-Schwarz inequality known as Titu's lemma. Although various inequalities exist for probabilities, little is known about probability sensitivity bounds, and the bound proposed here is, to the present authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and the output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
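The abstract does not reproduce the bound itself, so the following is a minimal sketch, in our own notation rather than the paper's, of how Titu's lemma can yield an information bound of this kind. We assume the input density f(x|θ) depends on a parameter θ, that P(θ) is the probability that the output exceeds the threshold, and that F(θ) is the Fisher information of the input. Titu's lemma (the Cauchy-Schwarz inequality in Engel form) states that for real a_i and positive b_i,

\[
\sum_{i=1}^{n} \frac{a_i^{2}}{b_i} \;\ge\; \frac{\left(\sum_{i=1}^{n} a_i\right)^{2}}{\sum_{i=1}^{n} b_i}.
\]

Splitting the Fisher information integral over the exceedance event and its complement, the integral form of the lemma gives

\[
F(\theta) \;=\; \int \frac{\left(\partial_\theta f(x\mid\theta)\right)^{2}}{f(x\mid\theta)}\,\mathrm{d}x
\;\ge\; \frac{\left(\partial_\theta P\right)^{2}}{P} + \frac{\left(\partial_\theta (1-P)\right)^{2}}{1-P}
\;=\; \frac{\left(\partial_\theta P\right)^{2}}{P(1-P)},
\]

so that \((\partial_\theta P)^{2} \le P(1-P)\,F(\theta)\): the squared probability sensitivity is bounded by the Fisher information, scaled by a threshold-dependent factor that never exceeds 1/4. This is a sketch consistent with the abstract's description, not necessarily the exact statement proved in the paper.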
