Locally Self-Adjusting Hypercubic Networks
In a prior work (ICDCS 2017), we presented a distributed self-adjusting algorithm, DSG, for skip graphs. DSG adapts the topology to the communication pattern in order to minimize the average routing cost between communicating nodes. In this work, we present a distributed self-adjusting algorithm (referred to as DyHypes) for topological adaptation in hypercubic networks. One of the major differences between hypercubes and skip graphs is that hypercubes are more rigid in structure than skip graphs. This property makes self-adjustment significantly different in hypercubic networks than in skip graphs. Upon a communication between an arbitrary pair of nodes, DyHypes transforms the network to place frequently communicating nodes closer to each other to maximize communication efficiency, and uses randomization in the transformation process to speed up the transformation and reduce message complexity. We show that, compared to DSG, DyHypes reduces the transformation cost by a factor of O(log n), where n is the number of nodes involved in the transformation. Moreover, despite achieving faster transformation with lower message complexity, the combined cost (routing and transformation) of DyHypes is at most a log n factor more than that of any algorithm that conforms to the computational model adopted for this work. Similar to DSG, DyHypes is fully decentralized, conforms to the CONGEST model, and requires O(log n) bits of memory per node, where n is the total number of nodes.
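To make the routing-cost setting concrete: in a hypercube, nodes are labeled by d-bit strings, neighbors differ in exactly one bit, and the routing distance between two nodes equals the Hamming distance of their labels. The sketch below (illustrative only; not part of DyHypes itself) shows standard greedy bit-fixing routing, which is why placing frequently communicating nodes at labels with small Hamming distance reduces routing cost:

```python
def hypercube_route(src: int, dst: int) -> list[int]:
    """Greedy bit-fixing route from src to dst in a hypercube.

    Nodes are labeled by d-bit integers; each hop flips one bit in
    which the current label and dst still differ, so the number of
    hops equals the Hamming distance between src and dst.
    """
    path = [src]
    cur = src
    diff = cur ^ dst          # bits still to be fixed
    bit = 0
    while diff:
        if diff & 1:          # fix the lowest differing bit
            cur ^= 1 << bit
            path.append(cur)
        diff >>= 1
        bit += 1
    return path

# Example: 000 -> 101 flips bit 0, then bit 2 (2 hops = Hamming distance).
route = hypercube_route(0b000, 0b101)
```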