An information theoretic necessary condition for perfect reconstruction
This article proposes a new information theoretic necessary condition for reconstructing a discrete random variable X from the knowledge of a set of discrete functions of X. The reconstruction condition is derived from Shannon's Lattice of Information (LoI) <cit.> and from two entropic metrics proposed by Shannon and Rajski, respectively. Since this theoretical material is relatively little known and dispersed across different references, we provide a complete and concise description of the LoI concepts, such as the total, common and complementary information, with full proofs. The definitions and properties of the two entropic metrics are also fully detailed and shown to be compatible with the LoI structure. A new geometric interpretation of the lattice structure is then investigated, leading to a new necessary condition for reconstructing the discrete random variable X given a set {X_0, ..., X_{n-1}} of elements of the lattice generated by X. Finally, this condition is derived in five specific examples of reconstructing X from a set of deterministic functions of X: the reconstruction of a symmetric random variable from its sign and its absolute value, the reconstruction of a binary word from a set of binary linear combinations, the reconstruction of an integer from its prime signature (fundamental theorem of arithmetic) and from its remainders modulo a set of coprime integers (Chinese remainder theorem), and the reconstruction of the sorting permutation of a list from a set of pairwise comparisons. In each case, the necessary condition is shown to be consistent with the corresponding well-known results.
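To convey the flavor of such a condition, the minimal Python sketch below (not taken from the paper) checks the classical entropy bound H(X) <= H(X_0) + ... + H(X_{n-1}), which any perfect reconstruction scheme must satisfy: since each X_i is a deterministic function of X and X is recoverable from (X_0, ..., X_{n-1}), we have H(X) = H(X_0, ..., X_{n-1}) <= sum_i H(X_i). The example is the sign/absolute-value case, with X uniform on {-2, -1, 1, 2}; the variable and function names are illustrative only.

from collections import Counter
from math import log2

def entropy(values):
    # Shannon entropy (in bits) of the empirical distribution of `values`.
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * log2(c / n) for c in counts.values())

# X uniform on {-2, -1, 1, 2}: a symmetric discrete random variable,
# listed as one sample per equally likely outcome.
X = [-2, -1, 1, 2]
X0 = [1 if x > 0 else -1 for x in X]  # sign(X)
X1 = [abs(x) for x in X]              # |X|

print(entropy(X), entropy(X0), entropy(X1))  # 2.0, 1.0, 1.0 bits
# Necessary condition for perfect reconstruction: H(X) <= H(X0) + H(X1).
assert entropy(X) <= entropy(X0) + entropy(X1)
# Here reconstruction indeed succeeds, since X = sign(X) * |X|.
assert X == [s * a for s, a in zip(X0, X1)]

Note that this sum-of-entropies bound is only a coarse necessary condition; the condition derived in the paper is the sharper one built on the LoI structure and the Shannon and Rajski metrics.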