Learning to Predict Requires Integrated Information
Embodied agents are regularly faced with the challenge of learning new tasks. To do so, they need to be able to predict their next sensory state by forming an internal world model. We theorize that agents require a high degree of information integration to update that world model in light of new information. This can be seen in the context of Integrated Information Theory, which provides a quantitative approach to consciousness and can be applied to neural networks. We use the sensorimotor loop to model the interactions among the agent's brain, body, and environment. This allows us to calculate various information-theoretic measures that quantify different information flows in the system, one of which corresponds to Integrated Information. Additionally, we can measure the interaction between the body and the environment, which leads to the concept of Morphological Computation. Previous research reveals an antagonistic relationship between Integrated Information and Morphological Computation: a morphology that is well adapted to a task can significantly reduce the need for Integrated Information. This poses a problem, as it implies that embodied intelligence is correlated with reduced conscious experience. Here we propose a solution to this problem, namely that agents need Integrated Information in order to learn. We support our hypothesis with results from a simple experimental setup in which the agents learn using the EM algorithm.
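To make the kind of quantity discussed here concrete, the following is a minimal illustrative sketch, not taken from the paper: it computes a conditional mutual information I(X; Y | Z) from a discrete joint distribution, the general form of the information-theoretic measures used to quantify information flows in a sensorimotor loop. The variable names W, S, A and the toy distribution are assumptions chosen for illustration, not the paper's definitions of Integrated Information or Morphological Computation.

```python
# Illustrative sketch (assumed, not the paper's method): conditional mutual
# information I(X; Y | Z) for a discrete joint distribution p[x, y, z].
import numpy as np


def conditional_mutual_information(p_xyz):
    """Return I(X; Y | Z) in bits for a 3-D joint probability table p[x, y, z]."""
    p_xyz = p_xyz / p_xyz.sum()          # normalize, just in case
    p_z = p_xyz.sum(axis=(0, 1))         # p(z)
    p_xz = p_xyz.sum(axis=1)             # p(x, z)
    p_yz = p_xyz.sum(axis=0)             # p(y, z)
    mi = 0.0
    for x in range(p_xyz.shape[0]):
        for y in range(p_xyz.shape[1]):
            for z in range(p_xyz.shape[2]):
                pxyz = p_xyz[x, y, z]
                if pxyz > 0:
                    mi += pxyz * np.log2(pxyz * p_z[z] / (p_xz[x, z] * p_yz[y, z]))
    return mi


# Toy example: a binary "world" W, "sensor" S and "actuator" A with a random
# joint distribution p(W, S, A); I(W; S | A) then quantifies how much the
# sensor reveals about the world beyond what the chosen action already implies.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 2))
p /= p.sum()
print(conditional_mutual_information(p))
```

In the actual sensorimotor-loop setting, the joint distribution would be estimated from the dynamics of the agent's brain, body, and environment rather than drawn at random, and different choices of X, Y, and Z yield the different flow measures mentioned above.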