Reproducibility

Reproducing the STAP Stem Cell Method

When the stimulus-triggered acquisition of pluripotency (STAP) stem cell papers were published, there was tremendous excitement in the scientific community. The papers described a seemingly simple method for reprogramming differentiated somatic cells to pluripotency, a process that usually requires the addition of multiple transcription factors.

The controversy around the papers stems from two separate issues. The first concerns the images submitted by the authors. An image used in Dr. Obokata’s doctoral thesis may have also been used in the Nature papers, even though the thesis image came from different experiments and time periods than those reported in the Nature paper. In addition, a lane in the genomic analysis gel appears to have been spliced, and images of two different placentas look nearly identical. Questionable images are a red flag, and they may be what ultimately causes the papers to be retracted.

But the larger issue raised by these papers is reproducibility, which is much more complex. While it is terrific to see the crowdsourced replication attempts reported on the Knoepfler blog, those attempts did not use the same cells as the original studies, which limits their interpretation as replications.

This issue emphasizes that how replication studies are conducted is hugely important. For both the Reproducibility Initiative and the Reproducibility Project: Cancer Biology, we have adopted a series of best practices created by the Center for Open Science to ensure fair and direct replication. Very briefly, these are:

  • Conduct a direct replication (using the same materials and methods as closely as possible, including any additional controls as necessary)
  • Obtain input from the original author on our proposed replication protocol
  • Pre-register our protocols in the Open Science Framework
  • Use power calculations to ensure our replication sample size is sufficient to detect the reported effect with at least 80% power (see the sketch after this list)
  • Use independent labs from the Science Exchange network with extensive expertise in the techniques being replicated
  • Publish all protocols, results, and data in the Open Science Framework for review by any interested party
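For concreteness, the power calculation mentioned above can be done with off-the-shelf statistical tooling. The sketch below, assuming a simple two-group comparison and a purely illustrative effect size (it is not taken from the STAP papers), shows one way to estimate the per-group sample size needed to detect a reported effect with at least 80% power; a real replication would use the effect size and statistical test appropriate to the original study.

```python
# A minimal sketch of sizing a replication, assuming a two-group comparison.
# The effect size below is hypothetical and for illustration only.
import math

from statsmodels.stats.power import TTestIndPower

reported_effect_size = 0.8  # hypothetical Cohen's d estimated from the original report

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=reported_effect_size,
    power=0.80,              # detect the reported effect with at least 80% power
    alpha=0.05,
    alternative="two-sided",
)
print(f"Samples required per group: {math.ceil(n_per_group)}")
```

In practice this calculation would be included in the protocol pre-registered in the Open Science Framework before any data are collected.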

One final point: although it may seem that the scientific system rapidly ‘corrected’ itself here, this is a highly unusual case. I think that is because these are methods papers, not findings papers. Other researchers rushed to reproduce the method because they wanted to use it in their own research (which is also why the replications reported on the Knoepfler blog are not direct replications). It is much less common for data findings to be rapidly corrected by the research community; only 33% of researchers who could not reproduce scientific findings published their failure.

We believe that direct replication to confirm key exploratory results is a critical step in the research workflow, and one that is highly underfunded. Ultimately, funding confirmation studies, while perhaps not as ‘exciting’ as exploratory studies, will have a much greater impact on the research field because it enables researchers to build effectively upon robust experimental results.

At this point, with the retraction of the papers seeming imminent, I think it is worth performing a direct replication to determine if the method works with the same starting materials and conditions used in the original study. Even if the technique is not as robust as originally hoped, if the specific conditions reported are reproducible, this is an important finding that may allow the technique to be extended. If we can find a funder to support a direct replication, the Reproducibility Initiative will conduct a high-fidelity, independent replication of the original studies. Interested parties can contact me at [email protected] to discuss this further.

About the author

Elizabeth Iorns is the CEO and Co-Founder of Science Exchange and Director of the Reproducibility Initiative. Elizabeth conceived the idea for Science Exchange while an Assistant Professor at the University of Miami, and as CEO she drives the company’s vision, strategy and growth. She is passionate about creating a new way to foster scientific collaboration that will break down existing silos, democratize access to scientific expertise and accelerate the speed of scientific discovery. Elizabeth has a Ph.D. in Cancer Biology from the Institute of Cancer Research in London and conducted postdoctoral research at the University of Miami’s Miller School of Medicine, where her research focused on identifying mechanisms of breast cancer development and progression.

5 thoughts on “Reproducing the STAP Stem Cell Method”

  1. Dr Iorns,

    Thanks for the clear summary of the situation of the STAP Stem Cell Method.

    This is indeed a great showcase for the need to make reproducibility verification part of the standard workflow of scientific research.

    Definitely a good project to be supported by philanthropic foundations, along the lines of what the New York Times illustrated today:
    http://www.nytimes.com/2014/03/16/science/billionaires-with-big-ideas-are-privatizing-american-science.html?_r=0

    Your list of best practices is also very informative, and a great guide for those of us who appreciate the importance of reproducibility verification.

    Thanks for the great work that you and your team do at Science Exchange.

    Luis

  2. Dear Dr Iorns,
    I cannot agree more that a reproducibility initiative would be of great use in scientific research, whether or not the findings are of great impact. Such an initiative may also be useful in the peer-review process, which may itself have been called into question after these articles were published.
    Felix
