Revisiting the file drawer problem in meta-analysis: An assessment of published and nonpublished correlation matrices

Abstract

The file drawer problem rests on the assumption that statistically non-significant results are less likely to be published in primary-level studies and less likely to be included in meta-analytic reviews, thereby resulting in upwardly biased meta-analytically derived effect sizes. We conducted 5 studies to assess the extent of the file drawer problem in nonexperimental research. In Study 1, we examined 37,970 correlations included in 403 matrices published in Academy of Management Journal (AMJ), Journal of Applied Psychology (JAP), and Personnel Psychology (PPsych) between 1985 and 2009 and found that 46.81% of those correlations are not statistically significant. In Study 2, we examined 6,935 correlations used as input in 51 meta-analyses published in AMJ, JAP, PPsych, and elsewhere between 1982 and 2009 and found that 44.31% of those correlations are not statistically significant. In Study 3, we examined 13,943 correlations reported in 167 matrices in nonpublished manuscripts and found that 45.45% of those correlations are not statistically significant. In Study 4, we examined 20,860 correlations reported in 217 matrices in doctoral dissertations and found that 50.78% of those correlations are not statistically significant. In Study 5, we compared the average magnitude of a sample of 1,002 correlations from Study 1 (published articles) versus 1,224 from Study 4 (dissertations) and found that they were virtually identical (i.e., .2270 and .2279, respectively). In sum, our 5 studies provide consistent empirical evidence that the file drawer problem does not produce an inflation bias and does not pose a serious threat to the validity of meta-analytically derived conclusions, contrary to what is currently believed. © 2012 Wiley Periodicals, Inc.
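The abstract does not specify how correlations were classified as statistically significant or not. A minimal sketch follows, assuming the standard two-tailed t-test for a Pearson correlation at alpha = .05; the function names, sample sizes, and example values are hypothetical and only illustrate the kind of tally reported in Studies 1 through 4.

    import math
    from scipy import stats

    def correlation_p_value(r: float, n: int) -> float:
        """Two-tailed p-value for H0: rho = 0, via the t-transform
        t = r * sqrt((n - 2) / (1 - r^2)) with n - 2 degrees of freedom."""
        t = r * math.sqrt((n - 2) / (1 - r ** 2))
        return 2 * stats.t.sf(abs(t), df=n - 2)

    def share_nonsignificant(correlations, n, alpha=0.05):
        """Fraction of a matrix's correlations that fail to reach
        significance at the given alpha (hypothetical tallying helper)."""
        flags = [correlation_p_value(r, n) > alpha for r in correlations]
        return sum(flags) / len(flags)

    # Hypothetical illustration: with n = 100, r = .15 is non-significant
    # at .05, while r = .25 is significant.
    print(correlation_p_value(0.15, 100))  # ~0.136
    print(correlation_p_value(0.25, 100))  # ~0.012

Under this assumed procedure, the percentages reported in the abstract (e.g., 46.81% in Study 1) would correspond to the output of share_nonsignificant aggregated across all matrices.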

Publication Title

Personnel Psychology
