Holarchic Structures for Decentralized Deep Learning - A Performance Analysis

05/07/2018

by Evangelos Pournaras, et al.

Structure plays a key role in learning performance. In centralized computational systems, hyperparameter optimization and regularization techniques such as dropout are computational means to enhance learning performance by adjusting the deep hierarchical structure. However, in decentralized deep learning in the Internet of Things, the structure is an actual network of autonomous interconnected devices, such as smartphones, that interact via complex network protocols, so adjustments to the learning structure are a challenge. Uncertainties such as network latency, node and link failures, or bottlenecks caused by limited processing capacity and energy availability can significantly degrade learning performance. Network self-organization and self-management are complex and require additional computational and network resources that hinder the feasibility of decentralized deep learning. In contrast, this paper introduces reusable holarchic learning structures for exploring, mitigating, and boosting learning performance in distributed environments with uncertainties. A large-scale performance analysis with 864,000 experiments, fed with synthetic and real-world data from smart grid and smart city pilot projects, confirms the cost-effectiveness of holarchic structures for decentralized deep learning.
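The core idea of a holarchy is that each unit (a "holon") is simultaneously a whole, aggregating its nested sub-holons, and a part, reporting upward to a parent. The sketch below illustrates this part-whole recursion with a toy bottom-up parameter-averaging rule; the class names, nesting, and averaging are illustrative assumptions, not the paper's actual protocol.

```python
# Illustrative sketch of a holarchic structure: each Holon aggregates
# its children recursively (a whole) and exposes the result to its
# parent (a part). The averaging rule here is a hypothetical stand-in
# for the paper's decentralized learning protocol.

class Holon:
    def __init__(self, name, params=None):
        self.name = name
        self.params = params or []   # local model parameters (leaf devices)
        self.children = []           # nested sub-holons

    def add(self, child):
        self.children.append(child)
        return self

    def aggregate(self):
        """Recursively average parameters bottom-up through the holarchy."""
        if not self.children:
            return self.params
        child_params = [c.aggregate() for c in self.children]
        n = len(child_params)
        self.params = [sum(vals) / n for vals in zip(*child_params)]
        return self.params


# Two device-level holons nested under one neighbourhood-level holon.
root = Holon("neighbourhood")
root.add(Holon("phone-a", [1.0, 2.0])).add(Holon("phone-b", [3.0, 4.0]))
print(root.aggregate())  # -> [2.0, 3.0]
```

Because every level exposes the same interface, holarchies can be nested to arbitrary depth and reorganized without changing the aggregation logic, which is what makes the structure reusable across deployments.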
