A Metamodel and Framework for AGI

08/28/2020
by Hugo Latapie, et al.

Can artificial intelligence systems exhibit superhuman general intelligence yet, in critical ways, lack the “intelligence” (i.e., the ability to adapt to the environment under limited knowledge and resources [27, 30]) of a simple worm or a single-celled organism? The answer is clearly yes for narrow AI systems. Moreover, artificial general intelligence (“AGI”) systems designed to address a wide variety of problems can also learn from the simplest living organisms. For example, worms [ref1] rapidly learn to reliably avoid electrical shocks and to move towards food. Single-celled organisms and even plants [ref2] exhibit similar behaviors. It has been shown that all living organisms, and even some lifeless cytoplasm-like electro-colloidal substances, naturally implement a physical metamodel of the world [18]. The DFRE metamodel exhibits fundamental knowledge-preserving properties, such as a clear distinction between symmetric and antisymmetric relations and the ability to differentiate and store knowledge at different levels of abstraction. In this paper, we introduce the DFRE metamodel, which incorporates these capabilities, and demonstrate how this approach benefits AGI in specific ways, such as managing combinatorial explosion and enabling cumulative, distributed, and federated learning. We posit that preserving the structure of knowledge is critical for higher intelligences, human or artificial, that manage increasingly higher levels of abstraction. This is the key lesson learned from applying AGI subsystems to complex real-world problems that require continuous learning and adaptation. This work is inspired by state-of-the-art approaches to AGI championed by Pei Wang, Kristinn Thórisson, and Ben Goertzel, by the granular computing community, and by Alfred Korzybski’s general semantics [18].
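
The knowledge-preserving properties named in the abstract can be made concrete with a small sketch. The Python fragment below is purely illustrative and is not the authors’ DFRE implementation; the names Relation, KnowledgeStore, add_fact, and holds are assumptions introduced for the example. It shows how a store might treat symmetric relations (which imply no hierarchy) differently from antisymmetric ones (which build “is-a” hierarchies), and tag each fact with an abstraction level so that structure is preserved as knowledge accumulates.

```python
# Illustrative sketch only -- not the DFRE metamodel itself. Assumed names:
# Relation, KnowledgeStore, add_fact, holds.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class Relation:
    name: str
    symmetric: bool  # True: R(a, b) implies R(b, a); False: antisymmetric (e.g. is-a)

@dataclass
class KnowledgeStore:
    # facts[level] is a set of (relation, subject, object) triples at that abstraction level
    facts: dict = field(default_factory=dict)

    def add_fact(self, relation: Relation, a: str, b: str, level: int) -> None:
        triples = self.facts.setdefault(level, set())
        triples.add((relation.name, a, b))
        if relation.symmetric:
            # Symmetric relations are stored in both directions so queries need no special casing.
            triples.add((relation.name, b, a))

    def holds(self, relation: Relation, a: str, b: str, level: int) -> bool:
        return (relation.name, a, b) in self.facts.get(level, set())


if __name__ == "__main__":
    is_a = Relation("is-a", symmetric=False)        # antisymmetric: hierarchy-building
    near = Relation("near", symmetric=True)         # symmetric: no hierarchy implied

    kb = KnowledgeStore()
    kb.add_fact(is_a, "worm", "organism", level=1)  # level 1: more abstract, categorical fact
    kb.add_fact(near, "worm", "food", level=0)      # level 0: concrete, situated fact

    print(kb.holds(near, "food", "worm", level=0))      # True: symmetry preserved
    print(kb.holds(is_a, "organism", "worm", level=1))  # False: antisymmetry preserved
```

Storing symmetric facts in both directions trades a little space for simpler queries; a full system would also relate facts across abstraction levels, which this sketch omits.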
