Benchmarking Heterogeneous Cloud Functions

Cited by: 10
Authors
Malawski, Maciej [1 ]
Figiela, Kamil [1 ]
Gajek, Adam [1 ]
Zima, Adam [1 ]
Affiliations
[1] AGH Univ Sci & Technol, Dept Comp Sci, Krakow, Poland
Keywords
Cloud computing; FaaS; Cloud functions; Performance evaluation
DOI
10.1007/978-3-319-75178-8_34
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Cloud Functions, often called Function-as-a-Service (FaaS), pioneered by AWS Lambda, are an increasingly popular method of running distributed applications. As in other cloud offerings, cloud functions are heterogeneous, due to different underlying hardware, run-time systems, and resource management and billing models. In this paper, we focus on performance evaluation of cloud functions, taking these heterogeneity aspects into account. We developed a cloud function benchmarking framework consisting of two suites: one based on the Serverless Framework and one based on HyperFlow. We deployed the CPU-intensive benchmarks Mersenne Twister and Linpack, and evaluated all the major cloud function providers: AWS Lambda, Azure Functions, Google Cloud Functions, and IBM OpenWhisk. We make our results available online and update them continuously. We report on the initial results of the performance evaluation and discuss the insights gained into the providers' resource allocation policies.
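For illustration, the following is a minimal sketch (not the authors' actual benchmark code) of the kind of CPU-intensive cloud function the paper evaluates: an AWS Lambda-style Python handler that times a Mersenne Twister workload (Python's random module uses the MT19937 generator) and returns the measurement so an external harness can correlate it with the configured memory size and billed duration. The handler name and payload fields are hypothetical.

```python
# Minimal sketch of a Mersenne Twister CPU benchmark as a cloud function.
# Not the paper's implementation; handler name and payload fields are assumptions.
import json
import random
import time

def handler(event, context):
    # Number of pseudo-random draws to generate; hypothetical payload field.
    iterations = int(event.get("iterations", 10_000_000))

    rng = random.Random(42)          # MT19937 generator with a fixed seed
    start = time.perf_counter()
    acc = 0.0
    for _ in range(iterations):
        acc += rng.random()          # CPU-bound work dominated by the PRNG
    elapsed = time.perf_counter() - start

    # Return the timing so the benchmarking harness can collect it alongside
    # the provider-reported billed duration and memory configuration.
    return {
        "statusCode": 200,
        "body": json.dumps({
            "iterations": iterations,
            "elapsed_seconds": elapsed,
            "checksum": acc,
        }),
    }
```

The same handler body could be deployed unchanged to the other providers (behind their respective HTTP triggers), which is what makes a fixed CPU-bound workload useful for comparing heterogeneous FaaS platforms.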
Pages: 415-426
Page count: 12
Related Papers
(50 items in total)
  • [31] Graph database benchmarking on cloud environments with XGDBench
    Dayarathna, Miyuru
    Suzumura, Toyotaro
    AUTOMATED SOFTWARE ENGINEERING, 2014, 21 (04) : 509 - 533
  • [32] Benchmarking Scalability of Cloud-Native Applications
    Henning, Sören
    Hasselbring, Wilhelm
    LECTURE NOTES IN INFORMATICS (LNI), PROCEEDINGS - SERIES OF THE GESELLSCHAFT FÜR INFORMATIK (GI), 2023, P-332 : 59 - 60
  • [33] A Data Generator for Cloud-Scale Benchmarking
    Rabl, Tilmann
    Frank, Michael
    Sergieh, Hatem Mousselly
    Kosch, Harald
    PERFORMANCE EVALUATION, MEASUREMENT AND CHARACTERIZATION OF COMPLEX SYSTEMS, 2011, 6417 : 41 - 56
  • [35] Automated Performance Benchmarking Platform of IaaS Cloud
    Liu, Xu
    Fang, Dongxu
    Xu, Peng
    2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 1402 - 1405
  • [36] A Case Study on Benchmarking IoT Cloud Services
    Gruenberg, Kevin
    Schenck, Wolfram
    CLOUD COMPUTING - CLOUD 2018, 2018, 10967 : 398 - 406
  • [37] Duet Benchmarking: Improving Measurement Accuracy in the Cloud
    Bulej, Lubomir
    Horky, Vojtech
    Tuma, Petr
    Farquet, Francois
    Prokopec, Aleksandar
    PROCEEDINGS OF THE ACM/SPEC INTERNATIONAL CONFERENCE ON PERFORMANCE ENGINEERING (ICPE'20), 2020, : 100 - 107
  • [38] Demystifying Cloud Benchmarking Paradigm - An In-Depth View
    Vedam, Venu
    Vemulapati, Jayanti
    2012 IEEE 36TH ANNUAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE (COMPSAC), 2012, : 416 - 421
  • [39] BIM Cloud Score: Benchmarking BIM Performance
    Du, Jing
    Liu, Rui
    Issa, Raja R. A.
    JOURNAL OF CONSTRUCTION ENGINEERING AND MANAGEMENT, 2014, 140 (11)
  • [40] High Availability Benchmarking for Cloud Management Infrastructure
    Liu, Xiao Xi
    Qiu, Jian
    Zhang, Jian Ming
    PROCEEDINGS 2014 INTERNATIONAL CONFERENCE ON SERVICE SCIENCES (ICSS 2014), 2014, : 163 - 168