Engineering and Experimentally Benchmarking a Container-based Edge Computing System
While edge computing is envisioned to serve latency-sensitive applications particularly well, implementation-based studies benchmarking its performance remain scarce. To address this gap, we engineer a modular edge cloud computing system architecture built on recent advances in containerization techniques, using Kafka for data streaming, Docker as the application platform, and Firebase Cloud as the real-time database system. We benchmark the performance of the system in terms of scalability, resource utilization, and latency by comparing three scenarios: cloud-only, edge-only, and combined edge-cloud. The measurements show that the edge-only solution outperforms the other scenarios only when the data is located at a single edge, i.e., without edge-wide data synchronization. For applications requiring data synchronization through the cloud, edge-cloud scales roughly a factor of 10 better than cloud-only up to a certain number of concurrent users in the system; above this point, cloud-only scales better. In terms of resource utilization, we observe that whereas the mean utilization increases linearly with the number of user requests, the maximum values for memory and network I/O increase sharply with an increasing amount of data.
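To make the described data path concrete, the sketch below shows one plausible shape of such a pipeline: a containerized edge service that publishes incoming readings to a Kafka topic for edge-local streaming and mirrors them to Firebase for cloud-wide synchronization. The topic name, database URL, credential path, and overall structure are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical edge-side pipeline (a minimal sketch, not the authors' code):
# publish readings to a local Kafka broker and mirror them to Firebase
# so that other edges can synchronize through the cloud.
import json
import time

import firebase_admin
from firebase_admin import credentials, db
from kafka import KafkaProducer  # pip install kafka-python

# Kafka producer for the edge-local broker; values are JSON-serialized.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Firebase Realtime Database as the cloud-side synchronization point.
# Service-account path and database URL are placeholders.
cred = credentials.Certificate("serviceAccount.json")
firebase_admin.initialize_app(
    cred, {"databaseURL": "https://example-edge-bench.firebaseio.com"}
)
readings_ref = db.reference("readings")


def handle_reading(reading: dict) -> None:
    """Stream a reading at the edge and sync it to the cloud."""
    producer.send("sensor-readings", value=reading)  # edge-local streaming
    readings_ref.push(reading)                       # cloud-wide synchronization


if __name__ == "__main__":
    # Emit a few synthetic readings to exercise the pipeline.
    for i in range(3):
        handle_reading({"sensor": "s1", "value": 20 + i, "ts": time.time()})
    producer.flush()
```

In a combined edge-cloud deployment of this kind, the Kafka step keeps latency-sensitive processing local, while the Firebase write carries the cloud synchronization cost that the abstract identifies as the scalability bottleneck at high user counts.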