
Holding a mirror up to science

A bid to reproduce key studies is less a tale of failure than of researchers’ willingness to put their own practices to the test

3 September 2015
[Image: handheld mirror laid on fabric]

Four years ago, the academic world emitted a collective gasp when it was revealed that high-flying Dutch social psychologist Diederik Stapel had invented most of the data that had made his name.

Reflecting on how it could have happened, the Dutch investigating committee pointed to “a general neglect of fundamental scientific standards” by editors and reviewers of the journals that published his papers. There were certain aspects of social psychology, the committee added, that “should be deemed undesirable or even incorrect from the perspective of academic standards and scientific integrity”.

Other scandals and failed replications quickly followed, prompting Nobel prizewinning psychologist Daniel Kahneman to write an open letter to students studying “social priming effects”, warning them that there was a “train wreck looming” in their field that could scupper their job prospects. He also called for independent attempts to replicate five priming effects.

Kahneman came in for heavy criticism from some social psychologists for unfairly singling out their discipline. In an article for Times Higher Education, Miles Hewstone and Wolfgang Stroebe accused him of overreacting and failing to foresee the “immeasurable damage” his letter would do to social psychology (“Social psychology is primed but not suspect”, 28 February 2013).

This is the background to the landmark, rigorous collective attempt by psychologists, begun in 2011, to replicate not five but 100 prominent studies in their field, as we report in our news pages this week. The results of the Reproducibility Project: Psychology, presented last week in the journal Science, are certainly depressing: replication was possible in fewer than half of cases – and in fewer still in social psychology specifically.

But it is important to make several points. First, as our cover feature this week makes clear, concerns about reproducibility long predate the Stapel scandal, and extend far beyond psychology into most of biomedicine and quantitative social science. Second, it is not clear what we should read into replication failures because, some argue, they are an inevitable concomitant of the scientific method. Most importantly, we should be heartened that researchers are increasingly facing up to the issue; as the feature reveals, replication attempts are just one of several initiatives to have been launched recently.

If science had a head of PR, that person would warn against replication attempts. Failures inevitably play into the hands of science’s many detractors, and the psychology findings will doubtless be used by, for instance, climate change deniers to bolster their claim that the proclamations of scientists can’t be trusted.

Yet, despite these dangers, 350 psychologists were still sufficiently inspired by the scientific ideal to devote untold effort to uncovering the truth, come what may. Of course, as Brian Nosek, the initiative’s coordinator, points out, their efforts are just “a drop in the bucket” in terms of discovering the extent of reproducibility across science. But if researchers in other fields add their own drops, the resulting pool could be revealing.

One thing is clear: the drip-drip of doubt about the reliability of the scientific literature is no good to anyone. It is imperative that scientists do their utmost to probe the realities not merely of the world around them, but of their own endeavour, too.

paul.jump@tesglobal.com
