Recently, some press has been generated following the publication of a journal article authored by the Open Science Collaboration. The Collaboration consisted of hundreds of contributors and volunteers who replicated 98 [*] studies that had been published in top-tier psychology journals. The key outcome of the Collaboration’s paper was that the majority of studies could not be replicated.
So, what does this say about the quality and practice of psychological research? For some, it represents a harbinger of the apocalypse. For others, it is merely a benign demonstration of the ultimate self-correcting nature of science.
The Open Science Collaboration was a signature project developed by the Center for Open Science. The Center’s mission is to promote openness, transparency, and reproducibility in science. Its aim is to encourage the sharing of data and research protocols and ideally improve the validity of empirical research.
To be sure, it is disappointing that many psychological studies failed to replicate. Yet, in a broader sense, the Open Science Collaboration is amazing in the sheer scale of its participation and the generosity (and courage) of the original researchers who shared their research data and protocols.
However, an irony lurks in how the Open Science Collaboration published its results.
The Open Science Collaboration results were published in the journal Science. Science is the prestigious flagship journal of the American Association for the Advancement of Science (AAAS). In its own words, AAAS is dedicated to advancing science for the benefit of all people.
A brief summary of the Collaboration’s results and a link to the full text can be found on the AAAS website. It looks like this:
No problem, I understand. This is an important journal and publishing in it promises attention and a wide dissemination of your results (sort of). And it would certainly be a major coup to have your manuscript accepted to Science. Now, access to this article is no big deal if you are a member of the American Association for the Advancement of Science or belong to an institution that subscribes to the journal.
But not everyone fits those categories. And it is a little hard to swallow a lecture on openness and transparency when the authors themselves (all 270 of them) place their work behind a paywall.
I suspect some of the authors of the Collaboration also saw the irony of a restricted article that promoted openness and transparency. I say this because you can find the full text of this article with a little navigation on ResearchGate.
Let me save you the trouble. I have included the full paper for download below. I am confident you will only use it for personal or educational purposes.
What else would you do with it?
- Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.
[*] I understand that the abstract and summary for this article indicate that 100 studies were replicated. However, from my read of the paper, 98 distinct studies were replicated, with two of them replicated twice. (See second page of the paper, first column, last paragraph.) So that makes 100 replications of 98 studies, not 100 studies replicated.