Abstract
Everyone loves a large caching tier in their multi-tier cloud-based web service because it both alleviates database load and lowers request latencies. Yet even when load drops severely, administrators are reluctant to scale down their caching tier. This paper makes the case that (i) scaling down the caching tier is viable with respect to performance, and (ii) the savings are potentially huge; e.g., a 4x drop in load can yield a 90% reduction in caching-tier size.
Proceedings of the USENIX Workshop on Hot Topics in Cloud Computing (HotCloud), 2012.