Automated Usability Evaluation of Parallel Programming Constructs (NIER Track)

Cited by: 0
Author
Pankratius, Victor [1 ]
Affiliation
[1] Karlsruhe Inst Technol, D-76128 Karlsruhe, Germany
Keywords
Empirical software engineering; parallel programming; usability; tools and environments
DOI
Not available
CLC Classification
TP31 [Computer software]
Subject Classification
081202; 0835
Abstract
Multicore computers are ubiquitous, and proposals to extend existing languages with parallel constructs are mushrooming. While everyone claims to make parallel programming easier and less error-prone, empirical language usability evaluations are rarely done in the field with many users and real programs. Key obstacles are cost and a lack of appropriate environments for gathering enough data to draw representative conclusions. This paper discusses the idea of automating the usability evaluation of parallel language constructs by gathering subjective and objective data directly in every software engineer's IDE. The paper presents an Eclipse prototype suite that can aggregate such data from potentially hundreds of thousands of programmers. Detecting mismatches between subjective and objective feedback, together with mining construct usage, can improve language design at an early stage, thus reducing the risk of developing and maintaining inappropriate constructs. New research directions arising from this idea are outlined for software repository mining, debugging, and software economics.
Pages: 936-939
Page count: 4