Registered Replication Reports are open for submissions!
Science is broken; let’s fix it. This has been my mantra for some years now, and today we are launching an initiative aimed squarely at one of science’s biggest problems. The problem is called publication bias, or the file-drawer problem, and it has resulted in what some have called a replicability crisis.
When researchers do a study and get negative or inconclusive results, those results usually end up in file drawers rather than being published. When this happens to studies attempting to replicate already-published findings, we end up with a replicability crisis, where people don’t know which published findings can be trusted.
To address the problem, Dan Simons and I are introducing a new article format at the journal Perspectives on Psychological Science (PoPS). The new article format is called the Registered Replication Report (RRR). The process will begin with a psychological scientist interested in replicating an already-published finding. They will explain to us, the editors, why they think replicating the study would be worthwhile (perhaps it has been widely influential but has had few or no published replications). If we agree, they will be invited to submit a methods section and analysis plan. The submission will be sent to reviewers, preferably the authors of the original article proposed for replication. These reviewers will be asked to help the replicating authors ensure their method is nearly identical to the original study’s. The submission will then be accepted or rejected, and the authors will be asked to report back when the data come in. The methods will also be made public, and other laboratories will be invited to join the replication attempt. In the end, all the results will be posted, together with a meta-analytic estimate of the effect size combining all the data sets (including the original study’s data, if available). Some of this material will be posted on the Open Science Framework website. The press release is here, and the details can be found at the PoPS website.
Professor Daniel J. Simons (University of Illinois) and I are co-editors for the RRRs. The chief editor of Perspectives on Psychological Science is Barbara A. Spellman (University of Virginia), and leadership and staff at the Association for Psychological Science, especially Eric Eich and Aime Ballard, have also played an important role (see their press release).
Three features make RRRs very different from the usual way that science gets published:
1. Preregistration of the replication study’s design, analysis plan, and statistical tests BEFORE the data are collected.
- Normally researchers have a disincentive to do replication studies because such studies are usually difficult to publish. Here we circumvent the usual obstacles to replications by guaranteeing researchers, before they run the study, that their replication will be published (provided they meet the conditions agreed during the review process).
- There will be no experimenter degrees of freedom to analyse the data in multiple ways until a significant but likely spurious result is found. This is particularly important for complex designs or studies with multiple outcome variables, where those degrees of freedom allow one to almost always achieve a significant result. Not here.
2. The study is sent to the original author for review on the basis of the plan, BEFORE the data come in.
- Unlike standard replication attempts, in which the author of the original study sees the replication only after the results come in, we will involve the original author at an early stage. Many will provide constructive feedback to help perfect the planned protocol so that it has the best chance of replicating the already-published target effect.
3. The results will not be presented as a “successful replication” or a “failed replication”. Rarely is any one data set definitive by itself, so we will concentrate on making a cumulative estimate of the relevant effect’s size, together with a confidence interval or credible interval.
- This will encourage people to make more quantitative theories aimed at predicting a certain effect size, rather than only worrying about whether the null hypothesis can be rejected (as we know, the null hypothesis is almost never true, so it can almost always be rejected given enough data).
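To make the idea of a cumulative estimate concrete, here is a minimal sketch of a fixed-effect (inverse-variance-weighted) meta-analysis, one standard way to combine effect sizes across labs into a pooled estimate with a 95% confidence interval. The lab data are hypothetical, and the RRR analysis plans may of course specify different models (e.g. random-effects); this is only an illustration of the general approach.

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance-weighted (fixed-effect) meta-analysis.

    effects   -- per-lab effect size estimates (e.g. Cohen's d)
    variances -- sampling variance of each estimate
    Returns (pooled_effect, ci_low, ci_high) with a 95% CI.
    """
    weights = [1.0 / v for v in variances]           # precision of each lab
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))               # SE of pooled estimate
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical effect sizes from three replicating labs
effects = [0.45, 0.20, 0.32]
variances = [0.02, 0.03, 0.025]
pooled, lo, hi = fixed_effect_meta(effects, variances)
print(round(pooled, 3), round(lo, 3), round(hi, 3))
```

Note how the pooled estimate weights each lab by the precision of its estimate, so larger studies count for more, and the combined confidence interval is narrower than any single lab’s.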
This initiative is the latest step in a long journey for me. Ten years ago, thinking that allowing comments on published papers would bring flaws and missed connections to light much earlier, David Eagleman and I published a letter to that effect in Nature and campaigned (unsuccessfully) for commenting to be allowed on PubMed abstracts.
Since then, we’ve seen that even where comments are allowed, few scientists make them, probably because there is little incentive to do so and doing it would risk antagonising their colleagues. In 2007 I became an academic editor and advisory board member for PLoS ONE, which poses fewer obstacles to publishing replication studies than do most journals. I’m lucky to have gone along on the ride as PLoS ONE rapidly became the largest journal in the world (I resigned my positions at PLoS ONE to make time for the gig at PoPS). But despite the general success of PLoS ONE, replication studies were still few and far between.
In 2011, Hal Pashler, Bobbie Spellman, Sean Kang and I started PsychFileDrawer, a website where researchers can post notices about replication studies. It has enjoyed some success, but it seems that without the carrot of a published journal article, few researchers will upload results, or perhaps even conduct replication studies.
Finally, with this Perspectives on Psychological Science initiative, a number of things have come together to overcome the main obstacles to replication studies: fear of antagonising other researchers and the uphill battle of getting the study published. Some other worthy efforts to encourage replication studies are happening at Cortex and BMC Psychology.
If you’re interested in proposing to conduct a replication study for eventual publication, check out the instructions and then drop us a line at replicationseditor @ psychologicalscience.org!