Underreporting in Psychology Experiments: Evidence From a Study Registry

Cited: 75
Authors
Franco, Annie [1 ]
Malhotra, Neil [2 ]
Simonovits, Gabor [1 ]
Affiliations
[1] Stanford Univ, Dept Polit Sci, Stanford, CA 94305 USA
[2] Stanford Univ, Grad Sch Business, Stanford, CA 94305 USA
Keywords
research methods; multiple comparisons; experiments; disclosure; research transparency; file drawer; science; incentives; truth
DOI
10.1177/1948550615598377
Chinese Library Classification
B84 [Psychology]
Discipline codes
04; 0402
Abstract
Many scholars have raised concerns about the credibility of empirical findings in psychology, arguing that the proportion of false positives reported in the published literature dramatically exceeds the rate implied by standard significance levels. A major contributor to false positives is the practice of reporting only a subset of the potentially relevant statistical analyses pertaining to a research project. This study is the first to provide direct evidence of selective underreporting in psychology experiments. To overcome the problem that the complete experimental design and full set of measured variables are not accessible for most published research, we identify a population of published psychology experiments from a competitive grant program for which questionnaires and data are made publicly available because of an institutional rule. We find that about 40% of studies fail to fully report all experimental conditions and about 70% of studies do not report all outcome variables included in the questionnaire. Reported effect sizes are about twice as large as unreported effect sizes and are about 3 times more likely to be statistically significant.
Pages: 8-12 (5 pages)