The Group Theoretic Roots of Information I: permutations, symmetry, and entropy

08/23/2019
by David J. Galas, et al.

We propose a new interpretation of measures of information and disorder by connecting these concepts to group theory. Entropy and group theory are linked here through their common relation to sets of permutations. We introduce a combinatorial measure of information and disorder, defined in terms of integers and discrete functions, which we call the integer entropy. The Shannon measure of information is the limiting case of this richer, more general conceptual structure, which reveals relations among finite groups, information, and symmetries. We show that the integer entropy converges uniformly to the Shannon entropy when the group is the symmetric group, which contains all permutations, and the number of objects increases without bound. The harmonic numbers have a well-known combinatorial meaning as the expected number of disjoint, non-empty cycles in a permutation of n objects; since the integer entropy is defined as the expected number of cycles over the set of permutations, it too has a clear combinatorial meaning. Since every finite group is isomorphic to a subgroup of a symmetric group, every finite group has a corresponding information functional, analogous to the Shannon entropy, and a corresponding number series, analogous to the harmonic numbers. The Cameron-Semeraro cycle polynomial is used to analyze the integer entropy for finite groups and to characterize these harmonic-number analogues. Broken symmetries and conserved quantities are linked through the cycle properties of the groups, and we define an entropy functional for every finite group.
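
The abstract rests on a concrete combinatorial fact: the expected number of disjoint cycles in a uniformly random permutation of n objects is the harmonic number H_n = 1 + 1/2 + ... + 1/n, and averaging the cycle count over any permutation group (equivalently, evaluating the derivative of the cycle polynomial P_G(x) = Σ_{g∈G} x^{c(g)} at x = 1 and dividing by |G|) generalizes this expectation beyond the symmetric group. The sketch below is illustrative only, not code from the paper; the names cycle_count, expected_cycles, and cyclic_group are our own. It verifies the H_n identity for the symmetric group by brute force and tabulates the analogous series for the cyclic group C_n, one example of the group-specific number series the abstract describes.

```python
# Minimal sketch (illustrative, not from the paper): check that the expected
# number of disjoint cycles in a uniformly random permutation of n objects
# equals the harmonic number H_n, and compute the analogous series for the
# cyclic subgroup C_n of S_n via the same cycle-count average.
from fractions import Fraction
from itertools import permutations

def cycle_count(perm):
    # Number of disjoint cycles in perm, given as a tuple with perm[i] = image of i.
    seen = [False] * len(perm)
    cycles = 0
    for start in range(len(perm)):
        if not seen[start]:
            cycles += 1
            i = start
            while not seen[i]:
                seen[i] = True
                i = perm[i]
    return cycles

def expected_cycles(group):
    # Average cycle count over the group: P_G'(1) / |G|, where
    # P_G(x) = sum over g in G of x^(cycle count of g) is the
    # (unnormalized) Cameron-Semeraro cycle polynomial.
    return Fraction(sum(cycle_count(g) for g in group), len(group))

def harmonic(n):
    # H_n = 1 + 1/2 + ... + 1/n, kept exact with Fraction.
    return sum(Fraction(1, k) for k in range(1, n + 1))

def cyclic_group(n):
    # The cyclic subgroup C_n of S_n generated by the n-cycle i -> i+1 (mod n).
    gen = tuple((i + 1) % n for i in range(n))
    g, elements = tuple(range(n)), []
    for _ in range(n):
        elements.append(g)
        g = tuple(gen[i] for i in g)  # compose with the generator
    return elements

for n in range(1, 8):
    sym = list(permutations(range(n)))   # the full symmetric group S_n
    h_n = expected_cycles(sym)
    assert h_n == harmonic(n)            # S_n reproduces the harmonic numbers
    c_n = expected_cycles(cyclic_group(n))
    print(f"n={n}: H_n = {h_n} (~{float(h_n):.4f}), C_n analogue = {c_n}")
```

For C_n the series works out to (1/n) Σ_{k=0}^{n-1} gcd(n, k), with gcd(n, 0) = n for the identity element, illustrating how the choice of group reshapes the number series and hence the associated entropy functional.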
