Emergence of Numeric Concepts in Multi-Agent Autonomous Communication

11/04/2019
by   Shangmin Guo, et al.

With the rapid development of deep learning, most current state-of-the-art techniques in natural language processing are based on deep learning models trained on large-scale static textual corpora. However, we human beings learn and understand language in a different way. Grounded language learning therefore argues that models need to learn and understand language through the experience and perceptions obtained by interacting with environments, as humans do. With the help of deep reinforcement learning techniques, many works already focus on facilitating the emergence of communication protocols with compositionality, like natural languages, among populations of computational agents. Unlike these works, we focus on numeric concepts, which correspond to abstractions in cognition and function words in natural language. Based on a specifically designed language game, we verify that computational agents are capable of transmitting numeric concepts during autonomous communication, and that the emergent communication protocols can reflect the underlying structure of the meaning space. Although their encoding method is not compositional like natural languages from a human perspective, the emergent languages can be generalised to unseen inputs and, more importantly, are easier for models to learn. Besides, iterated learning helps to further improve the compositionality of the emergent languages, as measured by topological similarity. Furthermore, we experiment with another representation method, i.e. directly encoding numerals as concatenations of one-hot vectors, and find that the emergent languages then become compositional like human natural languages. Thus, we argue that there are two important factors for the emergence of compositional languages.
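The topological similarity mentioned above is commonly defined as the correlation between pairwise distances in the meaning space and pairwise distances between the corresponding messages. The following is a minimal sketch of that measure; the function names, the use of Hamming distance, and the use of plain Pearson correlation (rather than the rank correlation often used in the literature) are illustrative assumptions, not the paper's actual code.

```python
# Sketch of topological (topographic) similarity: correlate distances
# between meanings with distances between the messages that name them.
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length symbol sequences."""
    return sum(x != y for x, y in zip(a, b))

def pearson(xs, ys):
    """Plain Pearson correlation (an illustrative simplification)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def topological_similarity(meanings, messages):
    """Correlate meaning-space distances with message-space distances."""
    m_dists, s_dists = [], []
    for i, j in combinations(range(len(meanings)), 2):
        m_dists.append(hamming(meanings[i], meanings[j]))
        s_dists.append(hamming(messages[i], messages[j]))
    return pearson(m_dists, s_dists)

# A perfectly compositional toy language: each meaning attribute maps to
# exactly one message symbol, so the two distance lists match and the
# correlation is close to 1.0.
meanings = [(0, 0), (0, 1), (1, 0), (1, 1)]
messages = ["aa", "ab", "ba", "bb"]
print(topological_similarity(meanings, messages))
```

A language that names meanings arbitrarily (e.g. a random permutation of the same messages) would score near zero under this measure, which is what makes it a usable proxy for compositionality.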
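The alternative input representation described above, encoding a numeral as a concatenation of one-hot vectors, can be sketched as follows; the function names, the digit base, and the fixed width are illustrative assumptions rather than the paper's actual setup.

```python
# Sketch: represent a numeral digit-by-digit, one one-hot vector per
# digit, concatenated most-significant digit first.

def one_hot(index, size):
    """Return a one-hot list of length `size` with a 1 at `index`."""
    v = [0] * size
    v[index] = 1
    return v

def encode_numeral(n, base=10, width=2):
    """Concatenate one one-hot vector per digit of `n`."""
    digits = []
    for _ in range(width):
        digits.append(n % base)
        n //= base
    vec = []
    for d in reversed(digits):  # most significant digit first
        vec.extend(one_hot(d, base))
    return vec

# 42 -> one-hot for digit 4, followed by one-hot for digit 2
print(encode_numeral(42))
```

Because each digit occupies its own one-hot slot, the representation itself is factored into reusable parts, which is plausibly why this encoding encourages compositional messages.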
