This study assessed the transparency and reproducibility of psychological meta-analyses through a meta-review of 150 reviews sampled from Psychological Bulletin, extracting information about each review's transparent and reproducible reporting practices.
Systematic review and meta-analysis are viable research techniques only because primary research is reported transparently; one might therefore expect meta-analysts to model best practice in their own reporting and to achieve a degree of transparency that makes their work reproducible. This assumption has yet to be fully tested in the psychological sciences. The current study found that authors reported, on average, 55 percent of the assessed criteria and that transparent reporting practices increased over the three decades studied (b = 1.09, SE = 0.24, t = 4.519, p < .001). Review authors consistently reported eligibility criteria, effect-size information, and synthesis techniques. On average, however, they did not report specific search results, screening and extraction procedures, or, most importantly, effect-size and moderator information for each individual study. Far fewer studies provided the statistical code required for complete analytical replication. This study argues that psychology, and research synthesis more generally, should require review authors to report these elements in a transparent and reproducible manner.