Revisiting the file drawer problem in meta-analysis


The file drawer problem is considered one of the biggest threats to the validity of meta-analytic conclusions. The assumption is that null results are less likely to be published in primary-level studies and, hence, less likely to be included in meta-analytic reviews, thereby producing an upwardly biased sample of primary-level effect size estimates and upwardly biased meta-analytically derived effect sizes. We conducted three studies to assess the extent of the file drawer problem. In Study 1, we examined correlation matrices including 50,841 effect sizes published in Journal of Applied Psychology (JAP) and Personnel Psychology (PPsych) between 1985 and 2009 and found that 47.76% of those correlations are not statistically significant. In Study 2, we examined 6,935 correlations used as input in 51 meta-analyses published in Academy of Management Journal, Journal of Management, JAP, PPsych, and Strategic Management Journal between 1982 and 2009 and found that 45.25% are not statistically significant. In Study 3, we examined 167 unpublished correlation matrices including 27,886 effect sizes from unpublished manuscripts written by management and applied psychology scholars and found that 47.12% of those correlations are not statistically significant. These results challenge the assumption that meta-analytically derived effect sizes are upwardly biased because they rely on primary-level effect sizes that are mostly statistically significant. Thus, our results have important implications for substantive theory and practice because they indicate that, contrary to a long-held belief, the file drawer problem may not pose a serious threat to the validity of meta-analytically derived conclusions.
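The abstract does not specify which significance test was applied to each correlation. As a minimal sketch only, the following shows one conventional way a correlation could be flagged as statistically significant at a two-tailed α = .05, using the Fisher z approximation; the function name and threshold are illustrative assumptions, not the authors' actual procedure.

```python
import math

def is_significant(r, n, z_crit=1.96):
    """Two-tailed test of H0: rho = 0 at alpha = .05 via the
    Fisher z approximation: z = atanh(r) * sqrt(n - 3).
    Illustrative sketch; requires n > 3 and |r| < 1."""
    z = math.atanh(r) * math.sqrt(n - 3)
    return abs(z) > z_crit

# A correlation of .20 is significant with n = 100 but not with n = 60,
# illustrating how the same effect size can land on either side of the cutoff.
print(is_significant(0.20, 100))  # True
print(is_significant(0.20, 60))   # False
```

Under this kind of rule, the percentages reported above (roughly 45–48% non-significant) would reflect both the magnitudes of the primary-level correlations and the sample sizes of the studies that produced them.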

Publication Title

Academy of Management 2011 Annual Meeting - West Meets East: Enlightening. Balancing. Transcending, AOM 2011