Organized Complexity: Is Big History a Big Computation?

09/16/2016
by Jean-Paul Delahaye, et al.

The concept of "logical depth" introduced by Charles H. Bennett (1988) seems to capture, at least partially, the notion of organized complexity, so central in big history. More precisely, the increase in organized complexity refers here to the wealth, variety and intricacy of structures, and should not be confused with the increase of random complexity, formalized by Kolmogorov (1965). If Bennett is right in proposing to assimilate organized complexity with "computational content", then the fundamental cause of the increase of complexity in the universe is the existence of computing mechanisms with memory, and able to cumulatively create and preserve computational contents. In this view, the universe computes, remembers its calculations, and reuses them to conduct further computations. Evolutionary mechanisms are such forms of cumulative computation with memory and we owe them the organized complexity of life. Language, writing, culture, science and technology can also be analyzed as computation mechanisms generating, preserving and accelerating the increase in organized complexity. The main unifying theme for big history is the energy rate density, a metric based on thermodynamics. However useful, this metric does not provide much insight into the role that information and computation play in our universe. The concept of "logical depth" provides a new lens to examine the increase of organized complexity. We argue in this paper that organized complexity is a valid and useful way to make sense of big history. Additionally, logical depth has a rigorous formal definition in theoretical computer science that hints at a broader research program to quantify complexity in the universe.
