Understanding peer review of software engineering papers

Cited by: 7
Authors
Ernst, Neil A. [1 ]
Carver, Jeffrey C. [2 ]
Mendez, Daniel [3 ,4 ]
Torchiano, Marco [5 ]
Affiliations
[1] Univ Victoria, Victoria, BC, Canada
[2] Univ Alabama, Tuscaloosa, AL USA
[3] Blekinge Inst Technol, Karlskrona, Sweden
[4] Fortiss GmbH, Munich, Germany
[5] Politecn Torino, Turin, Italy
Keywords
Peer review; Interview; Survey
DOI
10.1007/s10664-021-10005-5
CLC Classification Number
TP31 [Computer software]
Discipline Classification Codes
081202; 0835
Abstract
Context: Peer review is a key activity intended to preserve the quality and integrity of scientific publications. In practice, however, it is far from perfect.
Objective: We aim to understand how reviewers, including those who have won awards for reviewing, perform their reviews of software engineering papers, in order to identify both what makes a good reviewing approach and what makes a good paper.
Method: We first conducted a series of interviews with recognised reviewers in the software engineering field. We then used the results of those interviews to develop a questionnaire for an online survey, which we sent to reviewers from well-respected venues covering a number of software engineering disciplines, some of whom had won awards for their reviewing efforts.
Results: We analyzed the responses from the interviews and from 175 reviewers who completed the online survey (both reviewers who had won awards and those who had not). We report several descriptive results: nearly half of award-winners (45%) review 20+ conference papers a year, compared with 28% of non-award-winners, and the majority of reviewers (88%) spend more than two hours on a journal review. We also report qualitative results. Our findings suggest that the most important criterion for a good review is that it be factual and helpful, which ranked above others such as being detailed or kind. The features of a paper most likely to result in a positive review are a clear and well-supported validation, an interesting problem, and novelty. Conversely, negative reviews tend to result from papers with a mismatch between the method and the claims and from papers with overly grandiose claims. Further insights include that reviewers view data availability and its consistency as important, and that authors need to make the contribution of their work very clear in the paper.
Conclusions: Based on the insights gained through our study, we conclude our work by compiling a proto-guideline for reviewing. We hope that our work contributes to the ongoing debate and contemporary efforts to further improve peer review models in the future.
Pages: 29