The Reproducibility Project: Cancer Biology (RP:CB) is a collaboration between Science Exchange and the Center for Open Science (COS) to independently replicate key experiments from high-impact, published cancer biology studies. Unlike other assessments of reproducibility, the RP:CB studies and their results are completely open to the public.
The RP:CB studies highlight some of the practical considerations associated with replicating an existing study. For example, the RP:CB studies tackle the questions:
How do we define “replicate”?
What are the minimum requirements for reporting to enable a replication study?
How much time do replication studies take?
How much do replication studies cost?
The preliminary results of the RP:CB project, as eloquently summarized in The Atlantic, indicate that replication studies are lengthy and difficult.
Are the resources required for replication studies worth the benefits? Undoubtedly.
High-profile reports, from researchers at Amgen, Bayer, and elsewhere, illustrate the industry’s concerns that this lack of reproducibility might be driving the low success rate of drug candidates. Despite the costs of irreproducibility, researchers have few incentives to replicate studies. Results from replication studies have reduced chances of being published in traditional journals and are rarely prioritized for grant funding. The Reproducibility Project: Cancer Biology is helping initiate a cultural shift in the research community to motivate scientists to perform independent replication.
Our mission at Science Exchange is to facilitate collaboration between the world’s best scientific labs. We hope to play a big part in that cultural shift.
Still have questions? Download our FAQ, which answers the most-asked questions about this project.
My TEDMED talk about scientific reproducibility was released today, so I wanted to take the opportunity to provide some additional thoughts about the importance of replication studies.
Every year, billions of dollars are spent funding biomedical research, resulting in more than one million new publications presenting promising new results. This research is the foundation upon which new therapies will be developed to enhance health, lengthen life, and reduce the burdens of illness and disability.
In order to build upon this foundational research, these results must be reproducible. Simply put, this means that when an experiment is repeated, similar results are observed. Over the last five years, multiple groups have raised concerns over the reproducibility of biomedical studies, with some estimates indicating only ~20% of published results may be reproducible (Scott et al. 2008, Gordon et al. 2007, Prinz et al. 2011, Steward et al. 2012, Begley and Ellis 2012). The National Institutes of Health (NIH), the largest public funder of biomedical research, has stated, “There remains a troubling frequency of published reports that claim a significant result, but fail to be reproducible. As a funding agency, the NIH is deeply concerned about this problem”.
Despite the growing concern over lack of reproducibility, funding for replication studies, the only way to determine reproducibility, is still absent. With no funding systematically allocated to such studies, scientists almost never conduct replication studies. It would be interesting to obtain the exact numbers, but it appears that last year the NIH allocated $0 to funding replication studies, out of a $30B+ budget. In the absence of replication studies, scientists end up wasting precious time and resources trying to build on a vast, unreliable body of knowledge.
It is easy to see why funders might shy away from funding replication studies. Funders want to demonstrate their “impact,” and it is tempting for them to solely focus on funding novel exploratory findings that can more easily be published in high profile journals. This is a mistake. Funders should instead focus on how to truly achieve their stated goals of enhancing health, lengthening life, and reducing the burdens of illness and disability. Although allocating a portion of funding towards replication studies would divert funds from new discoveries, it would enable scientists to efficiently determine which discoveries were robust and reproducible and which were not. This would allow more rapid advancements by allowing scientists to build upon the most promising findings and avoid wasting their time and funding pursuing non-robust results.
Some researchers find the idea of replicating previous studies unnecessary or even offensive. However, it is the responsibility of the scientific community, including funders, to work as quickly and cost effectively as possible to make progress. Introducing replication studies as part of the process provides an effective way to enable this.
If you would like to see funding specifically allocated for replication studies, please register your support. We will share this information with funders in the hope that it will encourage them to establish funding programs specifically for replication studies to improve the speed and efficiency of progress in biomedical research.
by Elizabeth Iorns, Ph.D.
CEO and Co-Founder
About Science Exchange
Science Exchange is the world’s leading marketplace for outsourced research. The Science Exchange network of 3000+ scientific service providers has run the experiments for the major replication studies conducted to date, including the largest biomedical replication study undertaken (Reproducibility Project: Cancer Biology). Additional details are available here: https://www.scienceexchange.com/applications/reproducibility
We won four grants to reanalyze four published journal articles in the field of public health. The grants cover four subjects: cash transfers and sexually transmitted diseases, the necessary training of healthcare providers, circumcision and HIV, and the effect of US government spending in Africa on mortality. The grants come from 3ie, which is funded by the Gates Foundation. The work will be performed by a lab listed on Science Exchange: the University of Nebraska’s Center for Collaboration on Research Design and Analysis.
The grants cover various aspects of healthcare in developing countries. The first grant will reanalyze a paper published in The Lancet in 2012 by Sarah Baird, Richard Garfein, Craig McIntosh, and Berk Ozler. This paper, Effect of a Cash Transfer Programme for Schooling on Prevalence of HIV and Herpes Simplex Type 2 in Malawi: A Cluster Randomized Trial, showed that direct cash transfers decreased the prevalence of HIV and herpes simplex virus 2 (HSV-2), as well as sexual activity, among the young women receiving transfers for 18 months. The second grant will reanalyze a 2012 Lancet paper, Task shifting of antiretroviral treatment from doctors to primary-care nurses in South Africa (STRETCH): a pragmatic, parallel, cluster-randomised trial. This paper examined the effect of using nurses, rather than scarce doctors, to administer antiretroviral treatment (ART) to patients with HIV. The authors found that the STRETCH approach was not inferior to standard care, supporting expansion of the pool of ART prescribers beyond doctors to nurses. The third grant will reanalyze a 2011 paper, Effect of circumcision of HIV-negative men on transmission of human papillomavirus to HIV-negative women. This paper addresses an important question about HPV prevention: it evaluates male circumcision as an HPV prevention strategy among rural African HIV-negative women who lack both resources and vaccines covering the existing high-risk HPV genotypes. The results from Wawer et al. (2011) provide strong support for male circumcision as a means of preventing HPV infection and cervical neoplasia in HIV-negative female partners. The fourth grant will examine a study published in JAMA in 2012 titled HIV Development Assistance and Adult Mortality in Africa. This study investigates the relationship between increased funding to countries receiving aid through the President’s Emergency Plan for AIDS Relief (PEPFAR) and adult mortality more generally.
PEPFAR is the initiative developed by President George W. Bush that increased funding to select countries from 2004 to 2010. The main finding of the paper is that PEPFAR countries had dramatically lower mortality than non-PEPFAR countries.
Dr. Nicole Perfito, the Science Exchange lead for these projects, says, “Using Science Exchange to gather the resources to reanalyze these experiments means it can be done faster and cheaper than would normally occur.”
The analysis of these papers will bring extra scientific rigor to the study of health in developing countries. The University of Nebraska’s Center for Collaboration on Research Design and Analysis will conduct the analyses under the direction of Dr. Nicole Perfito at Science Exchange. Once complete, the results will be published in peer-reviewed journals.
February 9, 2015 | Posted by Reproducibility Project Core Team in Reproducibility |
“Reproducibility is actually the heart of science. The fact that not everything is reproducible is not a surprise.” – Eric Lander, head of the Broad Institute of MIT and Harvard, in a recent Washington Post article.
“We’re always in a gray area between perfect truth and complete falsehood.” The best researchers can do is edge closer to truth. – Giovanni Parmigiani, Dana-Farber Cancer Institute, in a recent ScienceNews article
The Reproducibility Project, a collaboration between Science Exchange and the Center for Open Science, is independently replicating some of the most impactful studies in cancer biology. Along the way, not only will the collaboration shepherd 50 studies through the process of replication and meta-analysis, but it will also help to mature the discussion around reproducibility more generally. Where do the biggest challenges lie? What are some of the key predictors of whether experiments are reproducible? The answer to these questions will be critical as the reproducibility initiative gains traction.
Since December, experimental work has begun on four more replication studies, and three more Registered Reports have been published in eLife (with a fourth* accepted and on the way):
In addition, Sean Morrison, director of the Children’s Medical Institute at the University of Texas–Southwestern and a senior editor at eLife, has written an editorial introducing the Reproducibility Project: Cancer Biology, highlighting the role this project could play in beginning to reform scientific discovery methods to maximize reproducibility. He notes that:
“to be responsible stewards of the public’s investment in this work we have to maximize the pace of discovery and the efficiency with which discoveries get translated to the benefit of patients. By gauging the fraction of high-impact results that are not reproducible, we can consider what further steps should be taken to promote good science….[M]easuring the magnitude of the problem with efforts like the Reproducibility Project: Cancer Biology is an important step in the right direction” (2).
We are proud to announce today that we have partnered with the Prostate Cancer Foundation (PCF), with funding from the Movember Foundation, to reproduce findings that have implications for prostate cancer patients. We will be collaborating with PCF to more quickly identify high-impact biomedical findings that can improve early detection and lead to new cures.
PCF’s Chief Science Officer, Dr. Soule, stated, “This first-in-field foundation initiative is all about getting the smart stuff to patients quicker. We will see an acceleration of progress due to the mobilization of resources against the robust findings.”
Our Software Engineer Michael Kompanets with last year’s Movember mustache.
Science Exchange has been a long-time fan and supporter of PCF and the Movember Foundation (see picture to the right), so we are thrilled to be working with them to incorporate replication studies into their funding strategy. We will be applying the best practices we’ve established for our Reproducibility Project: Cancer Biology to enable confirmation of high-potential exploratory research results. Our hope is that by identifying robust, reproducible results, we can accelerate prostate cancer research.
“The Movember Foundation is committed to accelerating the translation of promising discoveries into new tests and treatments,” said Paul Villanti, Executive Director of Programs, Movember Foundation. “Through quicker validation of the science, and if the science is true, we can help find new cures and prevent prostate cancer in more men at a faster rate. The Movember Foundation is confident that this initiative will play an important role in supporting this goal.”
October 14, 2014 | Posted by Reproducibility Project Core Team in Reproducibility |
There has been growing concern in the scientific community over the last several years about a lack of reproducible results in biomedical research. Recently, two large pharmaceutical companies (Amgen and Bayer) announced that they could only reproduce a small fraction of published preclinical cancer biology studies. These results have shocked the scientific community and led to calls for an overhaul of both funding and publishing practices to address the crisis. The NIH, as well as the journals Nature and Science, are all proposing strategies to help improve the research process.
However, a major question remains: Why weren’t these experiments reproducible? Valid arguments exist suggesting scientists are falling prey to poor experimental design, flawed statistical analysis, and/or biased data interpretation, all of which can prevent their results from being replicable. However, there are many innocuous reasons why a particular experiment might fail to replicate the original results, from errors or changes in the protocol, to a lack of expertise in performing a particular technique, to unknown factors that produce variability in results. Unfortunately, it’s hard to draw conclusions from the Amgen and Bayer studies because these companies made none of their data or methods public.
The birth of the Reproducibility Project: Cancer Biology
We believe that in order to really understand the crisis in reproducibility, including its prevalence, scope and underlying causes, we need a large dataset of actual replication experiments. These replications must be conducted in a rigorously empirical fashion, using detailed protocols as close to the original study as possible, and conducted by expert scientists trained in the original techniques. Most importantly, the details of these replication datasets must be freely available to everyone.
We are excited to announce that eLife has joined our partnership with the Center for Open Science to work on the Reproducibility Project: Cancer Biology!
eLife is an open access journal co-founded by the Howard Hughes Medical Institute, the Wellcome Trust, and the Max Planck Society. We are proud to have the work of the RP:CB published through them.
Each study in the RP:CB will undergo two rounds of review and publication. The first round will present the proposed replication plan to the public in the form of a Registered Report. This Registered Report ensures that the proposed protocols have been reviewed by scientific and statistical experts before experimental work commences. The completed work and all data will then be published as a Replication Study. All data generated will be freely available to the public through eLife’s open access platform. Registered Reports are now under review by the eLife Board of Reviewing Editors and will be published in eLife as they become available.
The Reproducibility Project: Cancer Biology aims to replicate key findings from 50 high profile papers from the field of cancer biology.
“We need an objective way to evaluate reproducibility,” said Randy Schekman, Editor-in-Chief of eLife and a Nobel Prize-winning cell biologist at the University of California, Berkeley. “This project is a valuable opportunity to generate a high-quality dataset to address questions about reproducibility constructively and rigorously.”
This week we are featuring a guest post on how peer review can improve reproducibility. Check out the thoughts of Aimee Whitcroft from Publons below.
There has been much talk over the last few years about evidence that much research, particularly in the medical fields, may not be reproducible – a stunning waste of time and resources.
At Publons, we’ve been following the crisis closely, and we think improved peer review is a vital first step toward reproducibility in academic and scientific literature.
When the stimulus-triggered acquisition of pluripotency (STAP) stem cell papers were published, there was tremendous excitement in the scientific community. The papers described a seemingly simple method to reprogram differentiated somatic cells into pluripotency – a process that usually requires the addition of multiple transcription factors.
The controversy around the papers stems from two separate issues. The initial controversy concerns the images submitted by the authors. First, an image used in Dr. Obokata’s doctoral thesis may have also been used in the Nature papers, even though the thesis image came from different experiments and time periods than those reported in the Nature paper. Second, a lane in the genomic analysis gel appears to have been spliced. Third, images of two different placentas look nearly identical. Questionable images are a red flag, and they may be what ultimately causes the papers to be retracted.
But the larger issue raised by these papers is reproducibility, which is much more complex. While it is terrific to see the crowdsourced replication attempts reported on the Knoepfler blog, those attempts did not use the same cells as the original studies, which limits their interpretation as replications.