A recent study has found that last year’s meta-analysis, which suggested that psychology studies may have serious replication problems, contains glaring errors of its own.
In 2015, a group of 270 researchers known as the Open Science Collaboration (OSC) reported that more than half of the 100 studies they had attempted to replicate couldn’t be reproduced.
As a result, the OSC made international headlines and sparked worldwide concern over a “replication crisis” in psychological research. But a recently published study challenges the group’s conclusions, highlighting several blatant errors in the 2015 study.
A team of researchers from Harvard University and the University of Virginia found several flaws in the meta-analysis.
First, the 2015 study didn’t account for the possibility of statistical error in both the replication studies and the original studies. Sampling error alone is enough to explain why replication attempts sometimes fail, even when the original finding is real.
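A small simulation can make this concrete. The sketch below is illustrative only; the effect size, sample sizes, and significance test are assumptions, not the OSC’s actual data or analysis. It pairs an “original” and a “replication” study that measure the same true effect and counts how often the replication fails to reach significance purely through sampling error.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_EFFECT = 0.4   # assumed true standardized effect size (Cohen's d)
N_PER_GROUP = 50    # assumed sample size per group in each study
N_PAIRS = 10_000    # number of simulated original/replication pairs

def study_is_significant(rng, d, n):
    """Run one two-group study and test the difference at alpha = 0.05."""
    treatment = rng.normal(d, 1.0, n)
    control = rng.normal(0.0, 1.0, n)
    diff = treatment.mean() - control.mean()
    se = np.sqrt(treatment.var(ddof=1) / n + control.var(ddof=1) / n)
    return abs(diff / se) > 1.96  # approximate two-sided z-test

originals = np.array([study_is_significant(rng, TRUE_EFFECT, N_PER_GROUP)
                      for _ in range(N_PAIRS)])
replications = np.array([study_is_significant(rng, TRUE_EFFECT, N_PER_GROUP)
                         for _ in range(N_PAIRS)])

# Among pairs whose original was "successful", count failed replications.
failure_rate = (~replications[originals]).mean()
print(f"Replications that fail despite a real effect: {failure_rate:.0%}")
```

With these assumed numbers, roughly half of the replications fail even though every simulated effect is genuinely real.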
Second, the research team found that some studies appeared unreplicable simply because the replicators used methods that differed from those of the original authors. In other words, in some cases the replicators themselves caused their replications to fail.
Finally, the authors of the meta-analysis used a low-powered design, which would produce a low replication rate even for studies known to be highly replicable.
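To see why low power depresses replication rates on its own: for a genuinely real effect, the expected replication rate is simply the statistical power of the replication design. The short sketch below uses an assumed, illustrative effect size to show how the expected rate climbs with sample size; nothing in it comes from the OSC’s data.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

D = 0.4  # assumed true standardized effect size (illustrative)

for n in (20, 50, 100, 250):        # per-group sample sizes
    se = math.sqrt(2.0 / n)         # SE of mean difference (unit variances)
    power = 1 - phi(1.96 - D / se)  # approximate power at alpha = 0.05
    print(f"n per group = {n:4d}: expected replication rate ~ {power:.0%}")
```

Under these assumptions, a replication with 20 participants per group “succeeds” only about a quarter of the time even though the effect is real.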
Taken together, these three flaws compromised the OSC’s conclusions from the start. The authors of the recent study lay out the details in the journal Science.
The latest study’s authors recalled being shocked to learn last year that psychological science might be facing a replication crisis, so they set out to check whether the OSC was right. On closer inspection, however, they noticed flaws, especially in how the OSC replicators selected the 100 research papers.
Prof. Gary King of Harvard University explained that whenever a study tries to estimate something about a specific population of studies, scientists should either base their analysis on a random sample from that population or apply statistical corrections for non-random selection. Surprisingly, the OSC scientists did neither.
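A toy simulation can illustrate King’s point; all numbers below are invented for illustration. Replicators who preferentially pick certain kinds of studies get a biased estimate of the population’s replicability, while a random sample, or an inverse-probability-weighting correction applied to the biased sample, approximately recovers the true value.

```python
import numpy as np

rng = np.random.default_rng(1)

# Imagine 10,000 published studies, each with a true replication probability.
population = rng.beta(4, 2, 10_000)   # population mean is about 0.67

# Random sample of 100 studies: an unbiased estimate of the population mean.
random_sample = rng.choice(population, 100, replace=False)

# "Convenience" sample: replicators preferentially pick studies that are
# (in this toy world) less replicable, with probability proportional to 1 - p.
selection_prob = (1 - population) / (1 - population).sum()
biased_sample = rng.choice(population, 100, replace=False, p=selection_prob)

# Statistical correction: weight each selected study by the inverse of its
# selection probability (a standard inverse-probability-weighting estimator).
ipw = 1.0 / (1 - biased_sample)
corrected = (biased_sample * ipw).sum() / ipw.sum()

print(f"True population replicability: {population.mean():.2f}")
print(f"Random-sample estimate:        {random_sample.mean():.2f}")
print(f"Convenience-sample estimate:   {biased_sample.mean():.2f}")
print(f"IPW-corrected estimate:        {corrected:.2f}")
```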
Additionally, the replicators failed to include in their analysis studies that used some of the field’s most rigorous methods to reach their conclusions. What’s more, replicators were allowed to choose at will which research papers to replicate.
King added that if such flaws had turned up in a study of people rather than a study of psychological studies, no respectable peer-reviewed journal would have published the paper.