The Reproducers: Heroes of Science

The headlines might have been grim (Gizmodo's read "A Lot of Published Psychology Results are Bullshit"), but I want to see these lemons for the lemonade that they are.

This impressive achievement is signed "Open Science Collaboration", which reminded me of that other famous letter signed "The Breakfast Club". This time around, they're all "brains". Their website describes this Collaboration as "a loose network of researchers, professionals, citizen scientists, and others with an interest in open science, metascience, and good scientific practices." This fantastic initiative is supported by the Center for Open Science which aims to "foster the openness, integrity, and reproducibility of scientific research", which may sound like a given in science, but it really, really isn't.

Biases are rampant in scientific research: scientists tend to select the best experiments ("best of three... no, best of four! Best of five?") and only publish positive results. Moreover, scientists are human. It is easy to be lured by prestige, power, and money, even in academia, and to nudge your research away from the truth and down the path that will lead to assured glory. So how are you, citizen, to know which scientific result is worth heralding and which is worth ignoring?

Replication.

A replication is the act of taking a scientist's recipe and redoing their experiment, as close to the original as possible, to see if the result is the same. Put this way, it seems like a no-brainer, but there is, tragically, very little money and no prestige to be made from replications, which is why the scientific literature is full of false positives that are never corrected.

What the Open Science Collaboration did was take 100 published studies in psychology and, over the course of many years, try to replicate the main experiment at the centre of each of them. They played fair, contacting the original scientists and sending them the replication protocol for approval, and they registered the protocol publicly to make the whole endeavour as transparent as possible.

And what did they find?

When looking at whether the replication showed the same positive effect ("we see an association between X and Y" or "this intervention works") in a way that was statistically significant, they found that this happened for only 36.1% of studies.
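
To make that "yes or no" criterion concrete, here is a minimal Python sketch of the tally, using made-up p-values rather than the study's actual data: a replication counts as a success only if its result is statistically significant (conventionally, p below 0.05) and points in the same direction as the original.

```python
# Illustrative only: hypothetical p-values and effect directions,
# not the Open Science Collaboration's actual data.
replications = [
    {"p": 0.003, "same_direction": True},   # significant, same sign -> success
    {"p": 0.210, "same_direction": True},   # not significant -> failure
    {"p": 0.048, "same_direction": True},   # just under 0.05 -> success
    {"p": 0.030, "same_direction": False},  # significant but reversed -> failure
]

ALPHA = 0.05  # conventional significance threshold

successes = sum(
    1 for r in replications
    if r["p"] < ALPHA and r["same_direction"]
)
rate = successes / len(replications)
print(f"Replication success rate: {rate:.1%}")  # 50.0% for this toy sample
```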

Not good; but this is only one piece of the puzzle. It's a "yes" or "no" piece. They also looked at how big an effect these original studies claimed. This is called "effect size". Did they get the same magnitude of effect in the replication study as in the original paper? That success rate was only 47.4%, with the effect sizes of the replication studies tending to be smaller than the original ones. This shrinking of effect sizes was not worse in any one discipline of psychology; the first finding, however, whether the effect was there at all, was much more damning for social psychology (25% reproducibility) than for cognitive psychology (50% reproducibility).
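
For a difference between two groups, effect size is often reported as Cohen's d: the gap between the group means divided by their pooled standard deviation. A short, self-contained sketch (with hypothetical scores, not numbers from the paper) shows how a replication can find the same direction of effect yet a smaller magnitude:

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: difference of means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    var_a = statistics.variance(group_a)  # sample variance (n - 1 denominator)
    var_b = statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical scores: the original study's groups are far apart...
original_treated = [14, 16, 15, 17, 18, 16]
original_control = [10, 11, 12, 10, 11, 12]

# ...while the replication finds the same direction but a smaller gap.
replication_treated = [13, 12, 14, 13, 12, 14]
replication_control = [11, 12, 12, 11, 13, 11]

print(f"Original d:    {cohens_d(original_treated, original_control):.2f}")    # ~4.23
print(f"Replication d: {cohens_d(replication_treated, replication_control):.2f}")  # ~1.56
```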

As the authors of this study write, replications can fail too: just because an original study says "yes" and a replication study says "no", it does not follow that the original study must be wrong. Moreover, they state that "a healthy discipline will have many false starts as it confronts the limits of present understanding." But these numbers should give us pause.

Another rock star in the world of reproducibility, Dr. John Ioannidis, came to my workplace a few years ago and laid these problems out for our researchers, staff, physicians, and students. The atmosphere on the way out of the auditorium was palpably uncomfortable. But we went back to our lab benches and continued our experiments, because the show must go on.

I don't expect this paper, which you can read in its entirety here, to deliver the electric jolt it deserves. However, the more of these papers get published and publicized by the media, the better the odds of changing the climate of science.

Scientists actively engaging in reproducibility experiments should be applauded as heroes of science and need to serve as role models for young scientists. Science is not about publishing in glorious journals; it is about following a rigorous methodology to move ever closer to the truth.

I will leave you with this nugget of wisdom from the authors' discussion: "Innovation points out paths that are possible; replication points out paths that are likely; progress relies on both."