Caching in the Clouds: Optimized Dynamic Cache Instantiation in Content Delivery Systems

03/11/2018
by Niklas Carlsson, et al.

By caching content at geographically distributed servers, content delivery applications can achieve scalability and reduce wide-area network traffic. However, each deployed cache has an associated cost. When the request rate from the local region is sufficiently high this cost will be justified, but as the request rate varies, for example according to a daily cycle, there may be long periods when the benefit of the cache does not justify the cost. Cloud computing offers a solution to problems of this kind, by supporting the dynamic allocation and release of resources according to need. In this paper, we analyze the potential benefits from dynamically instantiating caches using resources from cloud service providers. We develop novel analytic caching models that accommodate time-varying request rates, transient behavior as a cache fills following instantiation, and selective cache insertion policies. Using these models, within the context of a simple cost model, we then develop bounds and compare policies with optimized parameter selections to obtain insights into key cost/performance tradeoffs. We find that dynamic cache instantiation has the potential to provide substantial cost reductions in some cases, but that this potential is strongly dependent on the object popularity skew. We also find that selective "Cache on k-th request" cache insertion policies can be even more beneficial in this context than with conventional edge caches.
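The selective "Cache on k-th request" insertion policy mentioned above can be made concrete with a short sketch. The class below is a hypothetical illustration, not code from the paper: it pairs an LRU cache with a bounded shadow table of request counters and only admits an object once it has been requested k times, filtering out objects that are requested only once or twice while the cache is filling. All names (CacheOnKthRequest, fetch_from_origin, shadow_capacity) are assumptions for the sketch.

```python
from collections import OrderedDict


class CacheOnKthRequest:
    """LRU cache with a 'Cache on k-th request' insertion policy.

    Objects are admitted into the cache only after they have been
    requested k times; request counts are tracked in a bounded
    shadow table so the filter itself uses limited memory.
    """

    def __init__(self, capacity, k, shadow_capacity=10_000):
        self.capacity = capacity
        self.k = k
        self.cache = OrderedDict()           # object_id -> object data (LRU order)
        self.request_counts = OrderedDict()  # object_id -> requests seen so far
        self.shadow_capacity = shadow_capacity

    def get(self, object_id, fetch_from_origin):
        """Return (data, hit). fetch_from_origin is called on a miss."""
        # Cache hit: refresh LRU position and serve the cached copy.
        if object_id in self.cache:
            self.cache.move_to_end(object_id)
            return self.cache[object_id], True

        # Cache miss: fetch from the origin and update the request counter.
        data = fetch_from_origin(object_id)
        count = self.request_counts.pop(object_id, 0) + 1
        self.request_counts[object_id] = count
        if len(self.request_counts) > self.shadow_capacity:
            self.request_counts.popitem(last=False)  # drop the oldest counter

        # Selective insertion: admit only on the k-th request.
        if count >= self.k:
            if len(self.cache) >= self.capacity:
                self.cache.popitem(last=False)       # evict the LRU object
            self.cache[object_id] = data
            del self.request_counts[object_id]
        return data, False
```

With k = 1 this reduces to a conventional insert-on-first-request LRU cache; larger k trades a longer warm-up for keeping low-popularity objects out of a freshly instantiated cache, which is where the paper finds the policy especially useful.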
