Integrated prefetching and caching in single and parallel systems

Cited by: 6
Authors
Albers, S [1 ]
Büttner, M [1 ]
Affiliations
[1] Univ Freiburg, Inst Informat, D-79110 Freiburg, Germany
Keywords
magnetic disks; prefetching; caching; approximation algorithms; linear program
DOI
10.1016/j.ic.2005.01.003
CLC Number
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
We study integrated prefetching and caching in single and parallel disk systems. In the first part of the paper, we investigate approximation algorithms for the single disk problem. There exist two very popular approximation algorithms called Aggressive and Conservative for minimizing the total elapsed time. We give a refined analysis of the Aggressive algorithm, improving the original analysis by Cao et al. We prove that our new bound is tight. Additionally, we present a new family of prefetching and caching strategies and give algorithms that perform better than Aggressive and Conservative. In the second part of the paper, we investigate the problem of minimizing stall time in parallel disk systems. We present a polynomial time algorithm for computing a prefetching/caching schedule whose stall time is bounded by that of an optimal solution. The schedule uses at most 2(D - 1) extra memory locations in cache. This is the first polynomial time algorithm that, using a small amount of extra resources, computes schedules whose stall times are bounded by that of optimal schedules not using extra resources. Our algorithm is based on the linear programming approach of [Journal of the ACM 47 (2000) 969]. However, in order to achieve minimum stall times, we introduce the new concept of synchronized schedules in which fetches on the D disks are performed completely in parallel. (c) 2005 Elsevier Inc. All rights reserved.
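The Aggressive strategy mentioned in the abstract is commonly described (following Cao et al.) as: initiate a fetch as early as possible, always fetch the first missing block in the remaining request sequence, and evict the cached block whose next reference is furthest in the future, provided the fetched block is needed before the victim. The sketch below is an illustrative simulation of that rule under a simplified single-disk model (unit service time per request, a fixed fetch latency, one fetch in flight at a time, non-empty initial cache); the function names `aggressive` and `next_use` and the exact model parameters are assumptions for illustration, not code from the paper.

```python
def next_use(requests, start, block):
    """Index of the next reference to `block` at or after `start`
    (len(requests) if it is never referenced again)."""
    for j in range(start, len(requests)):
        if requests[j] == block:
            return j
    return len(requests)

def aggressive(requests, initial_cache, fetch_time):
    """Total elapsed time of the Aggressive rule on one disk:
    serving a request costs 1 time unit, a fetch costs `fetch_time`
    units, and at most one fetch is in flight at a time.
    Assumes a non-empty initial cache."""
    cache = set(initial_cache)
    time, i = 0, 0
    fetch = None                              # (completion_time, block)
    while i < len(requests):
        if fetch and time >= fetch[0]:        # fetch finished: block arrives
            cache.add(fetch[1])
            fetch = None
        if fetch is None:
            # fetch the first missing block in the remaining sequence ...
            target = next((b for b in requests[i:] if b not in cache), None)
            if target is not None:
                # ... evicting the block whose next reference is furthest,
                # but only if the target is needed before the victim
                victim = max(cache, key=lambda b: next_use(requests, i, b))
                if next_use(requests, i, target) < next_use(requests, i, victim):
                    cache.discard(victim)
                    fetch = (time + fetch_time, target)
        if requests[i] in cache:
            time += 1                         # cache hit: one service unit
            i += 1
        else:
            time = fetch[0]                   # stall until the fetch lands
    return time
```

For example, with requests `['a', 'b', 'c', 'a']`, an initial cache `{'a', 'b'}`, and fetch latency 2, the simulation serves four requests with two units of stall, for a total elapsed time of 6.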
Pages: 24-39
Number of pages: 16
Related Papers
50 records in total
  • [41] A trace-driven analysis of parallel prefetching algorithms for parallel and distributed systems
    Bin, Cai
    Hu, Diqing
    Xie, Changsheng
    EIGHTH INTERNATIONAL CONFERENCE ON HIGH-PERFORMANCE COMPUTING IN ASIA-PACIFIC REGION, PROCEEDINGS, 2005, : 273 - 280
  • [42] A Novel Adaptive Data Prefetching Scheme in Satellite-ground Integrated Networks with Edge Caching
    Liu, Song
    Liang, Chengchao
    2022 27TH ASIA PACIFIC CONFERENCE ON COMMUNICATIONS (APCC 2022): CREATING INNOVATIVE COMMUNICATION TECHNOLOGIES FOR POST-PANDEMIC ERA, 2022, : 615 - 620
  • [43] Object Caching and Prefetching in Distributed Virtual Walkthrough
    Rynson W. H. Lau
    Jimmy H. P. Chim
    Mark Green
    Hong Va Leong
    Antonio Si
    Real-Time Systems, 2001, 21 : 143 - 164
  • [44] Opportunities and Challenges for Caching and Prefetching on Mobile Devices
    Cao, Pei
    2015 THIRD IEEE WORKSHOP ON HOT TOPICS IN WEB SYSTEMS AND TECHNOLOGIES (HOTWEB), 2015, : 49 - 53
  • [45] Signature caching in parallel object database systems
    Norvåg, K
    INFORMATION AND SOFTWARE TECHNOLOGY, 2002, 44 (06) : 331 - 341
  • [46] Mining Web logs for Prediction in Prefetching and Caching
    Songwattana, Areerat
    THIRD 2008 INTERNATIONAL CONFERENCE ON CONVERGENCE AND HYBRID INFORMATION TECHNOLOGY, VOL 2, PROCEEDINGS, 2008, : 1006 - 1011
  • [47] Proxy caching based on patching scheme and prefetching
    Park, YW
    Baek, KH
    Chung, KD
    ADVANCES IN MUTLIMEDIA INFORMATION PROCESSING - PCM 2001, PROCEEDINGS, 2001, 2195 : 558 - 565
  • [49] On the Computational Aspect of Coded Caching With Uncoded Prefetching
    Michos, Sotirios K.
    Diamantoulakis, Panagiotis D.
    Georgiadis, Leonidas
    Karagiannidis, George K.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2023, 69 (03) : 1486 - 1508
  • [50] Web Caching and Prefetching: What, Why, and How?
    Sulaiman, Sarina
    Shamsuddin, Siti Mariyam
    Abraham, Ajith
    Sulaiman, Shahida
    INTERNATIONAL SYMPOSIUM OF INFORMATION TECHNOLOGY 2008, VOLS 1-4, PROCEEDINGS: COGNITIVE INFORMATICS: BRIDGING NATURAL AND ARTIFICIAL KNOWLEDGE, 2008, : 2805 - +