Randomized competitive algorithms for generalized caching

N. Bansal, N. Buchbinder, J. Naor

    Research output: Conference contribution (chapter in book/report/conference proceeding) › Academic › peer-reviewed

    23 Citations (Scopus)

    Abstract

    We consider online algorithms for the generalized caching problem. Here we are given a cache of size k and pages with arbitrary sizes and fetching costs. Given a request sequence of pages, the goal is to minimize the total cost of fetching the pages into the cache. We give an online algorithm with competitive ratio O(log² k), which is the first algorithm for the problem with competitive ratio sublinear in k. We also give improved O(log k)-competitive algorithms for the special cases of the Bit Model and the Fault Model. In the Bit Model, the fetching cost is proportional to the size of the page, and in the Fault Model all fetching costs are uniform. Previously, an O(log² k)-competitive algorithm due to Irani [14] was known for both of these models. Our algorithms are based on an extension of the primal-dual framework for online algorithms which was developed by Buchbinder and Naor [7]. We first generate an O(log k)-competitive fractional algorithm for the problem. This is done by using a strengthened LP formulation with knapsack-cover constraints, where exponentially many constraints are added upon arrival of a new request. Second, we round the fractional solution online and obtain a randomized online algorithm. Our techniques provide a unified framework for caching algorithms and are substantially simpler than those previously used.
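    The fractional step described in the abstract can be illustrated with a toy sketch. The code below is not the paper's algorithm; it is a minimal, assumption-laden illustration of the general primal-dual pattern for the special case of uniform page sizes: each page p keeps an eviction fraction x[p] ∈ [0, 1], a request pays the fetching cost times the missing fraction, and the cache constraint (at most k fractional pages resident) is restored by multiplicatively raising the eviction fractions of the other pages. The step size `eps` and the update rule `x[q] + eps * (x[q] + 1/k) / cost[q]` are illustrative choices, not taken from the paper.

    ```python
    def fractional_caching(requests, cost, k, eps=0.01):
        """Toy fractional caching sketch (uniform page sizes).

        x[p] in [0, 1] is the fraction of page p currently *evicted*;
        pages never requested are treated as fully evicted. After each
        request the invariant sum_q (1 - x[q]) <= k holds, i.e. the
        fractional cache holds at most k pages. Returns the total
        fractional fetching cost paid.
        """
        x = {}          # eviction fractions for pages seen so far
        total = 0.0
        for p in requests:
            if p not in x:
                x[p] = 1.0              # unseen pages start fully evicted
            total += cost[p] * x[p]     # pay to fetch the missing fraction
            x[p] = 0.0                  # requested page is now fully cached
            # Restore the cache constraint: multiplicatively raise the
            # eviction fractions of the other pages until at most k
            # fractional pages remain resident.
            while sum(1.0 - v for v in x.values()) > k:
                for q in x:
                    if q != p and x[q] < 1.0:
                        x[q] = min(1.0, x[q] + eps * (x[q] + 1.0 / k) / cost[q])
        return total
    ```

    The additive 1/k term lets fully cached pages (x[q] = 0) start moving, while the multiplicative x[q] term makes cheap or already-partly-evicted pages absorb evictions faster; a randomized rounding of the fractions x would then yield an actual cache state, which is the second step the abstract describes.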
    Original language: English
    Title of host publication: Proceedings of the 40th Annual ACM Symposium on Theory of Computing (STOC'08, Victoria BC, Canada, May 17-20, 2008)
    Place of publication: New York
    Publisher: Association for Computing Machinery, Inc.
    Pages: 235-244
    ISBN (Print): 978-1-60558-047-0
    Publication status: Published - 2008

