Methods for addressing publication bias in school psychology journals: A descriptive review of meta-analyses from 1980 to 2019


Although meta-analyses are often used to inform practitioners and researchers, the resulting effect sizes can be artificially inflated by publication bias. A number of methods exist to protect against, detect, and correct for publication bias. It is currently unknown to what extent scholars publishing meta-analyses in school psychology journals use these methods, or whether more recently published meta-analyses use them more frequently. A historical review of every meta-analysis published to date in the most prominent school psychology journals (N = 10) identified 88 meta-analyses published from 1980 to early 2019. Exactly half of them included grey literature, and 60% used methods to detect and correct for publication bias. The most common methods were visual analysis of a funnel plot, Orwin's failsafe N, Egger's regression, and the trim-and-fill procedure, although none of these was used in more than 20% of the studies. About half of the studies incorporated one method, 20% incorporated two methods, 7% incorporated three methods, and none incorporated all four. These methods were most common among recently published studies. As in other fields, the true estimates of effects from meta-analyses published in school psychology journals may be inflated, and practitioners may be using interventions that are, in fact, weaker than believed. Practitioners, researchers employing meta-analytic techniques, education programs, and editors and peer reviewers in school psychology should continue to guard against publication bias using these methods.
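The four detection methods named above are standard meta-analytic diagnostics rather than techniques developed in this review. As an illustration only, a minimal sketch of one of them, Egger's regression test, might look like the following in Python, assuming per-study effect sizes and standard errors are available; the toy data and the helper name `eggers_test` are this sketch's own inventions, not anything drawn from the reviewed studies:

```python
import numpy as np
from scipy import stats

def eggers_test(effects, std_errors):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses each study's standardized effect (effect / SE) on its
    precision (1 / SE); an intercept significantly different from zero
    suggests small-study effects, commonly read as publication bias.
    Returns the intercept estimate and its two-sided p-value.
    """
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(std_errors, dtype=float)
    z = effects / ses                        # standardized effects
    precision = 1.0 / ses
    n = len(z)
    X = np.column_stack([np.ones(n), precision])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    df = n - 2
    sigma2 = resid @ resid / df              # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)    # OLS coefficient covariance
    se_intercept = np.sqrt(cov[0, 0])
    t_stat = beta[0] / se_intercept
    p_value = 2 * stats.t.sf(abs(t_stat), df)
    return beta[0], p_value

# Hypothetical example: ten studies in which the small (high-SE) studies
# report larger effects -- the classic asymmetric funnel pattern.
rng = np.random.default_rng(0)
ses = np.linspace(0.05, 0.5, 10)
effects = 0.3 + 1.5 * ses + rng.normal(0, 0.02, size=10)
intercept, p = eggers_test(effects, ses)
```

The same effect-size and standard-error inputs drive the funnel plot itself: Egger's test is essentially a formal version of the visual asymmetry check, which is why the two methods are often reported together.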

Publication Title

Journal of School Psychology