Integrated prefetching and caching in single and parallel systems

Cited by: 6
Authors
Albers, S [1]
Büttner, M [1]
Affiliations
[1] Univ Freiburg, Inst Informat, D-79110 Freiburg, Germany
Keywords
magnetic disks; prefetching; caching; approximation algorithms; linear program;
DOI
10.1016/j.ic.2005.01.003
CLC classification number
TP301 [Theory and Methods];
Subject classification code
081202;
Abstract
We study integrated prefetching and caching in single and parallel disk systems. In the first part of the paper, we investigate approximation algorithms for the single disk problem. There exist two very popular approximation algorithms called Aggressive and Conservative for minimizing the total elapsed time. We give a refined analysis of the Aggressive algorithm, improving the original analysis by Cao et al. We prove that our new bound is tight. Additionally, we present a new family of prefetching and caching strategies and give algorithms that perform better than Aggressive and Conservative. In the second part of the paper, we investigate the problem of minimizing stall time in parallel disk systems. We present a polynomial-time algorithm for computing a prefetching/caching schedule whose stall time is bounded by that of an optimal solution. The schedule uses at most 2(D - 1) extra memory locations in cache. This is the first polynomial-time algorithm that, using a small amount of extra resources, computes schedules whose stall times are bounded by those of optimal schedules not using extra resources. Our algorithm is based on the linear programming approach of [Journal of the ACM 47 (2000) 969]. However, in order to achieve minimum stall times, we introduce the new concept of synchronized schedules in which fetches on the D disks are performed completely in parallel. (c) 2005 Elsevier Inc. All rights reserved.
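For readers unfamiliar with the single-disk model, the following Python sketch illustrates the Aggressive rule whose analysis the paper refines, as originally described by Cao et al.: start the next fetch at the first opportunity, fetch the missing block that is referenced soonest, evict the cached block whose next reference is furthest in the future, and never evict a block that is needed before the block being fetched. This is not the authors' code; the function name, the per-request simulation structure, and the discrete time accounting are illustrative assumptions on top of the standard model (serving a cached block takes one time unit, a fetch occupies the single disk for F time units, starting a fetch evicts one cached block).

def aggressive_elapsed_time(requests, cache, fetch_time):
    """Sketch of Aggressive on a single disk; returns the total elapsed time.

    requests   : list of block ids (the reference string)
    cache      : block ids initially in cache (the model starts with a full cache)
    fetch_time : F, the number of time units a disk fetch takes
    """
    cache = set(cache)
    time = 0
    in_flight = None                       # (completion_time, block) of the fetch in progress

    def next_use(block, start):
        # Index of the next reference to `block` at or after `start` (inf if none).
        for j in range(start, len(requests)):
            if requests[j] == block:
                return j
        return float("inf")

    def try_start_fetch(i, now):
        # Optimal prefetching: fetch the missing block that is referenced soonest.
        missing = next((b for b in requests[i:] if b not in cache), None)
        if missing is None:
            return None
        # Optimal replacement: evict the cached block referenced furthest in the future.
        victim = max(cache, key=lambda b: next_use(b, i)) if cache else None
        # Do-no-harm rule: never evict a block that is needed before the fetched block.
        if victim is not None and next_use(victim, i) <= next_use(missing, i):
            return None
        if victim is not None:
            cache.discard(victim)
        return (now + fetch_time, missing)

    for i, block in enumerate(requests):
        if in_flight is None:              # first-opportunity rule: start a fetch as early as possible
            in_flight = try_start_fetch(i, time)
        if block not in cache:             # stall until the fetch for this block completes
            done_at, fetched = in_flight   # under Aggressive this is the fetch for `block`
            time = max(time, done_at)
            cache.add(fetched)
            in_flight = try_start_fetch(i, time)   # the disk is free again immediately
        time += 1                          # serving the request itself takes one time unit
        if in_flight is not None and in_flight[0] <= time:
            cache.add(in_flight[1])        # a fetch completed while requests were being served
            in_flight = None
    return time

For instance, aggressive_elapsed_time(['a', 'a', 'a', 'b'], {'a'}, 2) returns 6: the four requests take one time unit each plus a two-unit stall on b, because the do-no-harm rule forbids evicting a while it is still needed, so the fetch of b cannot start before time 3. The LP-based minimum-stall algorithm for D parallel disks described in the second part of the abstract is substantially more involved and is not sketched here.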
Pages: 24-39
Number of pages: 16
Related papers (50 in total)
  • [21] Energy efficient prefetching and caching
    Papathanasiou, AE
    Scott, ML
    USENIX ASSOCIATION PROCEEDINGS OF THE GENERAL TRACK 2004 USENIX ANNUAL TECHNICAL CONFERENCE, 2004, : 255 - 268
  • [22] An Automatic Prefetching and Caching System
    Lewis, Joshua
    Alghamdi, Mohammed
    Al Assaf, Maen
    Ruan, Xiaojun
    Ding, Zhiyang
    Qin, Xiao
    2010 IEEE 29TH INTERNATIONAL PERFORMANCE COMPUTING AND COMMUNICATIONS CONFERENCE (IPCCC), 2010, : 180 - 187
  • [23] A session key caching and prefetching scheme for secure communication in cluster systems
    Lee, Manhee
    An, Baik Song
    Kim, Eun Jung
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2010, 70 (07) : 732 - 742
  • [24] Parallel I/O Prefetching Using MPI File Caching and I/O Signatures
    Byna, Surendra
    Chen, Yong
    Sun, Xian-He
    Thakur, Rajeev
    Gropp, William
    INTERNATIONAL CONFERENCE FOR HIGH PERFORMANCE COMPUTING, NETWORKING, STORAGE AND ANALYSIS, 2008, : 350 - +
  • [25] Implementation and performance of integrated application-controlled file caching, prefetching, and disk scheduling
    Cao, P
    Felten, EW
    Karlin, AR
    Li, K
    ACM TRANSACTIONS ON COMPUTER SYSTEMS, 1996, 14 (04): : 311 - 343
  • [26] Optimal Model of Web Caching and Prefetching
    Shi, Lei
    Zhang, Yan
    Lin, Wei
    PROCEEDINGS OF INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND COMPUTATIONAL TECHNOLOGY (ISCSCT 2009), 2009, : 250 - 253
  • [27] Caching and Prefetching Strategies for SPARQL Queries
    Lorey, Johannes
    Naumann, Felix
    SEMANTIC WEB: ESWC 2013 SATELLITE EVENTS, 2013, 7955 : 46 - 65
  • [28] Advanced prefetching and caching of models with PrefetchML
    Daniel, Gwendal
    Sunye, Gerson
    Cabot, Jordi
    SOFTWARE AND SYSTEMS MODELING, 2019, 18 (03): : 1773 - 1794
  • [29] Software caching vs. prefetching
    Aggarwal, A
    ACM SIGPLAN NOTICES, 2003, 38 (02) : 263 - 268
  • [30] Integrated document caching and prefetching in storage hierarchies based on Markov-chain predictions
    Kraiss, A
    Weikum, G
    The VLDB Journal, 1998, 7 : 141 - 162