Applications of Common Information to Computing Functions

03/30/2021
by Derya Malak, et al.

We design a low-complexity distributed compression scheme for computing arbitrary functions of sources with discrete alphabets. We use a helper-based method that extends the definition of the Gács-Körner-Witsenhausen (GKW) common information to functional common information. The helper relaxes the combinatorial structure of GKW by partitioning the joint source distribution into nests imposed by the function, which enables hierarchical cooperation between the sources for effective distributed computing. By contrasting our approach's performance with existing efficient techniques, we demonstrate the rate savings achieved in recovering the function and the source data.

Permutation-invariant functions are prevalent in learning and combinatorial optimization, and have most recently been applied to graph neural networks. We consider efficient compression for computing permutation-invariant functions in a network with two sources and one decoder. We use a bipartite graph representation, in which the two disjoint sets of vertices (parts) denote the individual source graphs and the edge weights capture the joint source distribution. We compress the bipartite graph by creating connected components determined by the function's distribution, accounting for the edge symmetries, and eliminating the low-probability edges. Those edges are encoded separately and sent as refinements. Our approach can substitute for high-complexity joint decoding techniques and inform neural networks to reduce computation time and complexity.
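Below is a minimal, illustrative Python sketch of the kind of pipeline the abstract describes: the edges of a bipartite source graph are weighted by a joint pmf, low-probability edges are set aside as refinements, and the remaining edges are grouped into connected components according to the value of a permutation-invariant function. This is a sketch under assumptions, not the authors' scheme; the function compress_bipartite, the toy pmf, the example function f(x, y) = x + y, and the threshold eps are hypothetical placeholders.

# Illustrative sketch only: toy bipartite-graph compression for a
# permutation-invariant function. All names (compress_bipartite, p_xy, eps)
# are hypothetical and not taken from the paper.

from collections import defaultdict
from itertools import product


def compress_bipartite(p_xy, f, eps=0.01):
    """Split edges into high-probability components (grouped by the value of
    the permutation-invariant function f) and low-probability refinements."""
    # Keep high-probability edges; low-probability edges are sent separately.
    kept, refinements = [], []
    for (x, y), p in p_xy.items():
        (kept if p >= eps else refinements).append((x, y))

    # Group kept edges by function value; since f(x, y) == f(y, x),
    # symmetric edges land in the same group automatically.
    by_value = defaultdict(list)
    for (x, y) in kept:
        by_value[f(x, y)].append((x, y))

    # Within each group, form connected components of the bipartite subgraph
    # via a small union-find over the ('X', x) and ('Y', y) vertices.
    components = []
    for edges in by_value.values():
        parent = {}

        def find(v):
            parent.setdefault(v, v)
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        for (x, y) in edges:
            parent[find(('X', x))] = find(('Y', y))  # union the endpoints

        comp = defaultdict(set)
        for (x, y) in edges:
            comp[find(('X', x))].add((x, y))
        components.extend(comp.values())

    return components, refinements


if __name__ == "__main__":
    # Toy example: f(x, y) = x + y on a roughly uniform joint pmf over
    # {0, 1, 2} x {0, 1, 2}, with one edge made rare so it becomes a refinement.
    alphabet = [0, 1, 2]
    p_xy = {(x, y): 1.0 / 9 for x, y in product(alphabet, alphabet)}
    p_xy[(2, 2)] = 0.001
    comps, refs = compress_bipartite(p_xy, lambda x, y: x + y)
    print(len(comps), "components;", refs, "edges sent separately as refinements")

In this toy run the rare edge (2, 2) is pruned and reported as a refinement, while the remaining edges form per-function-value components, mimicking the coarse-plus-refinement structure the abstract outlines.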
