First open-access reproducibility project reveals roadblocks to performing replication studies

January 19, 2017 | Posted by Team in Reproducibility, Research |

Reproducibility has re-emerged at the forefront of public awareness this week, as the first five replication studies executed by the Reproducibility Project: Cancer Biology (RP:CB) have just been published in the open-access journal eLife.

The project is a collaboration between Science Exchange and the Center for Open Science (COS) to independently replicate key experiments from high-impact, published cancer biology studies. Unlike other assessments of reproducibility, the RP:CB studies and their results are completely open to the public.

The RP:CB studies highlight some of the practical considerations associated with replicating an existing study. For example, the RP:CB studies tackle the questions:

  • How do we define “replicate”?
  • What are the minimum requirements for reporting to enable a replication study?
  • How much time do replication studies take?
  • How much do replication studies cost?

The preliminary results of the RP:CB project, as eloquently summarized in The Atlantic, indicate that replication studies are lengthy and difficult.

Are the resources required for replication studies worth the benefits? Undoubtedly.

High-profile reports from researchers at Amgen, Bayer, and elsewhere illustrate the industry’s concern that widespread irreproducibility may be driving the low success rate of drug candidates. Despite the costs of irreproducibility, researchers have few incentives to replicate studies: results from replication studies have reduced chances of being published in traditional journals and are rarely prioritized for grant funding. The Reproducibility Project: Cancer Biology is helping initiate a cultural shift in the research community to motivate scientists to perform independent replications.

Our mission at Science Exchange is to facilitate collaboration between the world’s best scientific labs. We hope to play a big part in that cultural shift.

Still have questions? Download our FAQ, which answers the most common questions about this project.

Humanized Antibodies: Key Technology for Effective Immunotherapies

November 1, 2016 | Posted by Christina Cordova in Research |


Harnessing the power of the immune system for therapeutic use in human disease is not a new idea, but recent advances in biotechnology have brought new precision to the way physicians and researchers approach therapy development. Monoclonal antibodies (mAbs) have offered real progress toward fighting many autoimmune diseases and several forms of cancer, turning immunotherapy into a multibillion-dollar segment of the biopharmaceutical industry. An estimated 37 million people in the United States alone are afflicted with cancer or an autoimmune disease, so advances in these therapies stand to improve survival rates and quality of life for millions of patients worldwide.

As more antigens are linked to cancer, promising mAb therapies are emerging that target and block specific cancer-associated antigens. These antigens are often functional parts of the cancer cells themselves, or support cell functions that expedite cancer growth. mAbs can also be developed to target cancer cells directly: by attaching to them, they mark the cells for elimination by the body’s immune system. Conjugated mAbs go a step further, using the antibody as a homing device to deliver a deadly dose of cancer-killing agents or radioactive substances to cancerous cells. Autoimmune disorders, in turn, often manifest as a concentrated attack on a specific organ system caused by immune reactivity to particular self-antigens. Identifying these antigens as targets for mAb therapies could offer significant progress in treating diseases including multiple sclerosis, psoriasis, rheumatoid arthritis, Crohn’s disease, and ulcerative colitis.

Antibody therapy as we know it today began in 1975, when scientists César Milstein and Georges J. F. Köhler pioneered technology to produce monoclonal antibodies by creating the first hybridoma. To produce hybridoma cells, scientists inject mice with an antigen linked to the particular immune response they are interested in triggering. The mice are then screened for production of the desired antibodies, and if a sufficient level is detected, B cells (the cells that produce antibodies) are harvested from the spleen for use in the hybridoma. Spleen cells on their own have a very limited lifespan, so they must be fused with immortal myeloma cells to extend their longevity and ability to reproduce. The resulting hybrid cell can multiply indefinitely and produces antibodies at a volume large enough for therapeutic or diagnostic applications. These initial antibodies were murine, meaning both cell lines were derived from mice. However, differences between the mouse and human immune systems caused clinical failure of many murine antibody therapies due to immunogenicity. This undesired response occurs when the introduced antibody is seen as a foreign protein by the patient’s immune system and provokes a severe immune reaction. Unlike with vaccines, activating the immune system in this way can render the mAbs ineffective, trigger allergic reactions such as anaphylaxis, or cause the rapid release of proinflammatory cytokines, known as cytokine release syndrome.

To decrease the chance of immunogenicity, chimeric antibodies were developed that fuse murine antibody variable (antigen-binding) regions with human antibody constant (effector) regions. Their lower immunogenicity allows chimeric antibodies to be used in biotherapeutics, assay development, and diagnostics. As antibody engineering technology improved, the first humanized antibodies were created in the hope of fully addressing the issue of immunogenic responses in patient populations. However, immunogenicity remains an obstacle in immunotherapy, prompting the FDA to publish a guidance document for industry on immunogenicity assessment for therapeutic protein products. For biopharmaceutical companies seeking to launch new immunotherapies, the production and validation of humanized antibodies is a critical component of drug research. Several methods of humanization are employed in antibody engineering:

  • CDR grafting – Transplants the complementarity-determining regions (CDRs), the variable-region loops that determine where an antibody binds a particular antigen, onto human framework and constant regions. Antibody specificity and antigen affinity are retained by also preserving framework residues associated with antigen binding. The result is an antibody that is mostly human, with only the CDRs of nonhuman origin (see the sketch after this list).
  • Phage display – Uses simple organisms, such as bacteriophages, to display antibodies or antibody fragments that are genetically fused to a phage coat protein. A human phage display library is first constructed, and the engineered bacteriophage are then put through repeated cycles of antigen-guided selection and screened for binding affinity to a specific antigen.
  • Transgenic animals – Mice are genetically engineered to carry introduced human antibody heavy- and light-chain gene sequences, along with targeted modifications of the endogenous mouse antibody genes to suppress their expression. The result is a transgenic mouse that can produce fully human antibody repertoires.
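
To make the CDR grafting idea concrete, here is a minimal, sequence-level sketch in Python. All sequences below are hypothetical placeholders, not real antibody data; actual humanization also relies on a numbering scheme (e.g., Kabat or IMGT) and structural modeling to choose which framework residues to back-mutate.

    # Assemble a humanized heavy-chain variable domain by interleaving human
    # framework regions (FR1-FR4) with murine CDRs (H1-H3):
    # FR1-CDR1-FR2-CDR2-FR3-CDR3-FR4.

    human_frameworks = [  # hypothetical human acceptor frameworks
        "EVQLVESGGGLVQPGGSLRLSCAAS",
        "WVRQAPGKGLEWVS",
        "RFTISRDNSKNTLYLQMNSLRAEDTAVYYCAR",
        "WGQGTLVTVSS",
    ]
    mouse_cdrs = ["GYTFTSYW", "IYPGDGDT", "ARGDYRYDGGFDY"]  # hypothetical murine CDRs

    def graft_cdrs(frameworks, cdrs):
        """Interleave framework regions with CDRs into one variable-domain sequence."""
        assert len(frameworks) == len(cdrs) + 1
        segments = []
        for fr, cdr in zip(frameworks, cdrs):
            segments.extend([fr, cdr])
        segments.append(frameworks[-1])
        return "".join(segments)

    print(graft_cdrs(human_frameworks, mouse_cdrs))  # mostly human, murine CDRs only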

Antibody engineering techniques vary with the target antigen and application; in every case, however, robust characterization is an essential part of successful antibody production. Assays to determine fitness for the intended end use include screening for cross-reactivity with other protein species, confirming that affinity requirements are met, validating application-specific performance (for example, in immunohistochemistry), and including control studies at each stage. Given the complexity of antibody engineering and the rigor required in mAb production, working with knowledgeable collaborators is key to the success of humanization projects.
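
As a toy illustration of one such check, the sketch below computes the equilibrium dissociation constant K_D = k_off / k_on from kinetic fits (as produced by, say, an SPR instrument) and flags candidates against an affinity requirement. The clone names, rate constants, and 10 nM threshold are all assumptions for illustration.

    # Flag antibody candidates whose affinity meets an assumed requirement.
    candidates = {  # name: (k_on [1/(M*s)], k_off [1/s]) -- hypothetical values
        "clone_A": (1.2e5, 3.0e-4),
        "clone_B": (8.0e4, 9.5e-3),
    }
    REQUIRED_KD = 1e-8  # assume the application needs K_D <= 10 nM

    for name, (k_on, k_off) in candidates.items():
        kd = k_off / k_on  # equilibrium dissociation constant in molar
        status = "pass" if kd <= REQUIRED_KD else "fail"
        print(f"{name}: K_D = {kd:.2e} M -> {status}")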

Science Exchange offers access to experienced service providers specializing in the mAb production techniques mentioned here, as well as thousands of other experiment types.  Visit our marketplace to start your antibody engineering project today.


Mass Spec: Shedding Light on Cancer Biomarkers with Century-Old Technology

October 5, 2016 | Posted by Christina Cordova in Research, Stories, Uncategorized |

Imagine telling the inventor of the radio that the technology he discovered is now found in almost every kitchen in America, and that you used it to make your popcorn last night. He’d probably be surprised, and maybe you are, too. Sound far-fetched? Many aspects of modern life rely on technology that was first identified by 19th-century physicists and then adapted to new applications. That includes not only the microwave oven from the example above, but also state-of-the-art lab equipment that is poised to change the way researchers treat cancer. It might be hard to imagine that cutting-edge discoveries in proteomics or precision medicine are the result of technology first conceived over a hundred years ago, but that’s exactly what a new application called proteomic mass spectrometry imaging is doing for cancer diagnostic tests.

Many life scientists utilize research tools built on principles first explored and defined by physics, and mass spectrometry is a particularly impactful example. The technology we now use to measure mass-to-charge ratios of ions for the purpose of molecular analysis was first developed by J.J. Thomson on an instrument called a parabolic spectrograph in 1913. The spectrograph generated ions in gas discharge tubes, then passed the ions through parallel electric and magnetic fields. Subjecting the ions to these fields forced them to move in certain parabolic trajectories which would then be recorded on a photographic plate, as seen in the rather beautiful image below.
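
The parabola arises from simple kinematics: over a field region of length L, the electric deflection scales as 1/v² and the magnetic deflection as 1/v, so eliminating the speed v leaves y ∝ x² with a coefficient set only by the mass-to-charge ratio. A short Python sketch with arbitrary illustrative field values (SI units) confirms that ions of one species land on a single parabola regardless of speed:

    import numpy as np

    E, B, L = 1.0e4, 0.1, 0.05    # field strengths and field-region length (illustrative)
    q, m = 1.602e-19, 3.32e-26    # charge and mass of a singly charged Ne-20 ion

    v = np.linspace(1e5, 5e5, 5)          # a spread of ion speeds
    x = q * B * L**2 / (2 * m * v)        # magnetic deflection (small-angle approximation)
    y = q * E * L**2 / (2 * m * v**2)     # electric deflection

    # Every (x, y) pair satisfies y = [2mE / (q B^2 L^2)] x^2: one parabola per m/q.
    coeff = 2 * m * E / (q * B**2 * L**2)
    print(np.allclose(y, coeff * x**2))   # True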

It was Thomson’s research at the end of the 19th century that led to the discovery of the electron, work that eventually won him the Nobel Prize in Physics in 1906. To hear a 77-year-old Thomson talk about that research (and about how very small electrons are, at around the 2:50 mark), watch this video filmed in 1934.

Besides the name change (there aren’t any spectrographs in labs these days), mass spectrometry has come a long way technologically. Advances by subsequent researchers made the technology more precise and the resulting output more accurate. In 1918 the first modern mass spectrometer was developed by Arthur Dempster, of uranium-isotope fame, and by the 1970s scientists had begun experimenting with coupling liquid chromatography to the process. In 1989 the first LC-MS instrument was launched, securing its place as a ubiquitous technique now in its third decade of use. The staying power of this technology is due to its versatility: it can directly analyze any biological molecule amenable to ionization. Scientists can use LC-MS to better understand the molecular structure of everything from wastewater to skin cream, and the data collected during analysis can inform evaluations of product effectiveness, environmental toxins, or the function of a protein. For this reason it provides valuable research applications in environmental analysis, consumer products, agriculture, and, in this case, precision medicine.

Now a bona fide buzzword, the concept of precision medicine was catapulted into the social vernacular in 2015 when President Obama announced the Precision Medicine Initiative in his State of the Union address. In practice, precision medicine isn’t entirely new; physicians and researchers have long understood the importance of individualized factors in treating or diagnosing patients. Blood type matching and bone marrow donor registries are both examples of precision medicine we have accepted as standard practice. Advances in biotechnology are ushering in a new emphasis on specialized medicine and carry with them the hope of more effective diagnostics and treatments for ailments like cardiovascular disease and cancer. Much of this promise rests on discoveries being made in the field of proteomics, particularly about the role of proteins in healthy cells versus diseased cells. The form, function, and interactions of these proteins can indicate the presence of disease, identify molecular therapeutic targets, and help define molecular disease taxonomies for future research. A measurable indicator of any such biological state is called a biomarker, and finding them is the focus of many proteomics and cancer researchers.

It turns out that a very familiar technology is proving to be the best tool for unlocking the largely unknown world of proteins. In an LC-MS workflow, complicated protein structures are broken out of their three-dimensional form and then cut into even smaller units called peptides. Quantitative analysis of these peptides makes it possible for scientists to identify protein expression profiles associated with certain cancers. Clinically viable biomarker panels could greatly improve early detection and definitive disease identification in patients, both of which are known to improve survival rates. This specificity in diagnosis allows patients and physicians to make better-informed treatment decisions by understanding the disease on a molecular level. Biomarkers can also improve standard differential diagnosis descriptions, which up to now have largely relied on physical symptoms that manifest at later stages of disease development, like metastasis.

Some diseases, like malignant melanoma, present in very cryptic ways, making them difficult to diagnose even for highly trained dermatopathologists. Inconclusive biopsy results, or histological features that are also found in non-cancerous moles, complicate diagnosis and can lead to costly mistakes in the course of treatment for such a common and potentially deadly disease. According to the American Cancer Society, over 10,000 people will die this year from the disease, making it the most lethal of all skin cancers. A collaborative research project between Yale scientists and Protea Biosciences is seeking to change that with a new diagnostic technology. In April of this year they announced exclusive licensing for a method that uses unique protein expression profiles to discern the presence of cancer. The results of the first clinical study, presented in 2015, showed 99 percent accuracy in distinguishing malignant melanoma from benign melanocytic nevi.
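
To give a flavor of the first computational step in such bottom-up proteomics work, here is a minimal in-silico tryptic digest in Python. Trypsin cleaves C-terminal to lysine (K) or arginine (R) except when the next residue is proline (P), and identification pipelines use digests like this to generate candidate peptides to match against measured spectra. The input sequence is a placeholder, not a real biomarker protein.

    import re

    def trypsin_digest(protein):
        """Split a protein sequence into tryptic peptides (cut after K/R, not before P)."""
        # Requires Python 3.7+, where re.split supports zero-width matches.
        return [p for p in re.split(r"(?<=[KR])(?!P)", protein) if p]

    protein = "MKWVTFISLLFLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"  # hypothetical sequence
    for peptide in trypsin_digest(protein):
        print(peptide)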

Achievements like this highlight the benefit of partnerships between academia and industry, which are becoming more common in many sectors of biotechnology. If precision medicine is to become a reality, it will have to tackle complex disease models that have historically confounded individual pharmaceutical companies and research labs. Open innovation between researchers on both sides advances scientific discovery and expedites successful clinical implementation of potentially life-saving drugs. As scientists take on more complicated human health issues, they will need to find the collaborators best positioned to address the research objective at hand, while accessing the novel technologies best suited for the job.

Just as the concept of precision medicine has expanded with scientific discoveries in biotechnology, the technique of mass spectrometry has evolved to address new research questions with advances in bioinformatics and lab technology. Deciphering the human proteome is still a ways off, but innovative techniques and research partnerships will surely have a role to play in unlocking the power of proteomics for human health. As LC-MS capabilities continue to improve, new disease diagnostics and treatments will be added to the arsenal of options available to physicians. The next time you hear about an advancement in precision medicine (or pop a bag of popcorn), thank a physicist.

Looking for a cutting-edge collaborator like Protea to help with your research project? Visit our marketplace to find the right provider for your mass spec analysis, or any of the thousands of experiment types we offer.

The Importance of Replication Studies

July 28, 2016 | Posted by Team in Company, Reproducibility, Research, Science Exchange News, Uncategorized |

My TEDMED talk about scientific reproducibility was released today, so I wanted to take the opportunity to provide some additional thoughts about the importance of replication studies.

Every year, billions of dollars are spent funding biomedical research, resulting in more than one million new publications presenting promising new results. This research is the foundation upon which new therapies will be developed to enhance health, lengthen life, and reduce the burdens of illness and disability.

In order to build upon this foundational research, these results must be reproducible. Simply put, this means that when an experiment is repeated, similar results are observed. Over the last five years, multiple groups have raised concerns over the reproducibility of biomedical studies, with some estimates indicating only ~20% of published results may be reproducible (Scott et al. 2008, Gordon et al. 2007, Prinz et al. 2011, Steward et al. 2012, Begley and Ellis 2012). The National Institutes of Health (NIH), the largest public funder of biomedical research, has stated, “There remains a troubling frequency of published reports that claim a significant result, but fail to be reproducible. As a funding agency, the NIH is deeply concerned about this problem”.

Despite the growing concern over lack of reproducibility, funding for replication studies, the only way to determine reproducibility, is still absent. With no funding systematically allocated to such studies, scientists almost never conduct replication studies. It would be interesting to obtain the exact numbers, but it appears that last year the NIH allocated $0 to funding replication studies, out of a $30B+ budget. In the absence of replication studies, scientists end up wasting precious time and resources trying to build on a vast, unreliable body of knowledge.

It is easy to see why funders might shy away from funding replication studies. Funders want to demonstrate their “impact,” and it is tempting for them to solely focus on funding novel exploratory findings that can more easily be published in high profile journals. This is a mistake. Funders should instead focus on how to truly achieve their stated goals of enhancing health, lengthening life, and reducing the burdens of illness and disability. Although allocating a portion of funding towards replication studies would divert funds from new discoveries, it would enable scientists to efficiently determine which discoveries were robust and reproducible and which were not. This would allow more rapid advancements by allowing scientists to build upon the most promising findings and avoid wasting their time and funding pursuing non-robust results.

Some researchers find the idea of replicating previous studies unnecessary or even offensive. However, it is the responsibility of the scientific community, including funders, to work as quickly and cost effectively as possible to make progress. Introducing replication studies as part of the process provides an effective way to enable this.

If you would like to see funding specifically allocated for replication studies, please register your support. We will share this information with funders in the hope that it will encourage them to establish funding programs specifically for replication studies to improve the speed and efficiency of progress in biomedical research.

by Elizabeth Iorns, Ph.D.

CEO and Co-Founder

Science Exchange

About Science Exchange


Science Exchange is the world’s leading marketplace for outsourced research. The Science Exchange network of 3,000+ scientific service providers has run the experiments for the major replication studies conducted to date, including the largest biomedical replication study undertaken (Reproducibility Project: Cancer Biology). Additional details are available here: https://www.scienceexchange.com/applications/reproducibility


References

  1. https://www.nih.gov/about-nih/what-we-do/budget#note
  2. http://www.ncbi.nlm.nih.gov/pubmed
  3. https://www.nih.gov/about-nih/what-we-do/mission-goals
  4. Scott et al. Amyotroph Lateral Scler. 9, 4-15 (2008)
  5. Gordon et al. Lancet Neurol. 6, 1045–1053 (2007)
  6. Prinz et al. Nat Rev Drug Discov. 10, 712 (2011)
  7. Steward et al. Exp Neurol. 233, 597–605 (2012)
  8. Begley and Ellis. Nature. 483, 531-3 (2012)
  9. http://www.nature.com/news/policy-nih-plans-to-enhance-reproducibility-1.14586
  10. http://www.nature.com/news/reproducibility-the-risks-of-the-replication-drive-1.14184


Two Papers Published in the Online Journal PeerJ: First Step to Reproducing Critical Prostate Cancer Findings

September 22, 2015 | Posted by Keith Osiewicz in Research, Science Exchange News |

Science Exchange has published two papers in the online journal PeerJ, funded by the Prostate Cancer Foundation–Movember Foundation Reproducibility Initiative. This initiative seeks to address growing concerns about reproducibility in scientific research by conducting replications of recent papers in the field of prostate cancer. It is a collaboration between the Prostate Cancer Foundation, the Movember Foundation, and Science Exchange. These two papers represent the first step toward reproducing the original experiments: they report what the collaborators will do, so that the scientific community has a full understanding of the process. PeerJ will publish the final results of the replications.

The first paper, The Androgen Receptor Induces a Distinct Transcriptional Program in Castration-Resistant Prostate Cancer in Man by Sharma and colleagues, was originally published in Cancer Cell in 2013. Of the thousands of targets for the androgen receptor (AR), the authors elucidated a subset of 16 core genes that were consistently down-regulated with castration and re-emerged with castration resistance. These 16 AR binding sites were distinct from those observed in cells in culture, leading the authors to suggest that cellular context can have dramatic effects on downstream transcriptional regulation of AR binding sites. The replication study will attempt to reproduce Fig. 7C by comparing expression of the 16 core genes identified by Sharma and colleagues in xenograft tumor tissue with their expression in androgen-treated LNCaP cells in vitro.
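
As a rough illustration of what a comparison like that might look like computationally (this is not the project’s registered analysis plan), the sketch below correlates log2 expression of 16 stand-in genes between the two sample types, using random placeholder values rather than data from the study:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    genes = [f"gene_{i:02d}" for i in range(1, 17)]     # stand-ins for the 16 core genes
    xenograft = rng.normal(8.0, 2.0, size=16)           # log2 expression, xenograft tumor tissue
    lncap = xenograft + rng.normal(0.0, 1.5, size=16)   # log2 expression, LNCaP cells in vitro

    r, p = stats.pearsonr(xenograft, lncap)             # concordance of the two profiles
    print(f"Pearson r = {r:.2f} (p = {p:.3g})")
    for gene, fc in zip(genes, xenograft - lncap):
        print(f"{gene}: log2 fold difference = {fc:+.2f}")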

The second paper, Androgen Receptor Splice Variants Determine Taxane Sensitivity in Prostate Cancer by Thadani-Mulero and colleagues, was published in Cancer Research in 2014. The experiment to be replicated is reported in Fig. 6A. Thadani-Mulero and colleagues generated xenografts from two prostate cancer cell lines: LuCaP 86.2, which expresses predominantly the ARv567 splice variant of the androgen receptor (AR), and LuCaP 23.1, which expresses the full-length AR as well as the ARv7 variant. Treatment of the tumors with the taxane docetaxel inhibited growth of the LuCaP 86.2 xenografts but not of the LuCaP 23.1 xenografts, indicating that expression of AR splice variants can affect sensitivity to docetaxel.

Labs listed on Science Exchange will perform the lab work. These labs include Nobel Life Sciences, ProNovus Bioscience LLC, and the Stem Cell and Xenograft Core at the University of Pennsylvania.

NASA’s Super-Black Carbon Nanotubes Developed through Science Exchange

July 17, 2013 | Posted by Team in Company, Research, Science Exchange News, Stories |
Principal Investigator John Hagopian working with a nanotube material sample.
Image Credit: NASA Goddard/Chris Gunn

A collaboration formed on Science Exchange has led to new developments in the production of carbon nanotube forests – the blackest materials ever measured! The research resulted from a partnership between NASA and the Melbourne Centre for Nanofabrication (MCN), part of the Australian National Fabrication Facility (ANFF).

“NASA and the ANFF’s research is monumental, and we are thrilled to have been part of such an important development in nanotechnology,” said our CEO Dr. Elizabeth Iorns. “Scientists can now access the vast expertise available globally to produce powerful partnerships that lead to innovative research.”

An International Collaboration

The NASA team and MCN connected through Science Exchange when NASA was searching for a way to coat instrument components with a thin film and to continue development of its super-black material.

The NASA team, led by Principal Investigator John Hagopian, was able to submit an open RFQ and identify MCN as a capable provider based on the expertise and novel deposition platforms it offered, forming an ideal overseas partnership to develop NASA’s intricate nanotubes.

Read the rest of this entry »

Techniques Series: RNA Extraction

June 6, 2013 | Posted by Mamata Thapa in Research |


RNA extraction involves a series of steps to isolate RNA from biological samples. The final product is used for experiments such as qRT-PCR, microarrays, next-generation sequencing, or Northern blots.

This technique can aid scientists in addressing a wide range of questions. For instance: what is the level of mRNA expression of a gene involved in a developmental stage of an invasive insect species? Which steps of rRNA processing are affected in cells depleted of cell cycle proteins? What modifications in tRNAs lead to certain autoimmune diseases? The technique is applicable across a wide range of organisms, from bacteria to humans.
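
For example, when extracted RNA is used for qRT-PCR, relative expression is commonly computed with the Livak 2^(-ΔΔCt) method: the target gene is normalized to a reference gene, and the treated sample is compared to a control. A minimal sketch in Python, with hypothetical Ct values:

    def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
        """Livak 2^(-delta-delta-Ct): normalize the target gene to a reference gene,
        then compare the treated sample to the control sample."""
        delta_treated = ct_target_treated - ct_ref_treated
        delta_control = ct_target_control - ct_ref_control
        return 2 ** -(delta_treated - delta_control)

    # e.g., the target's Ct drops from 25.1 to 22.8 while the reference gene stays flat:
    print(fold_change(22.8, 17.0, 25.1, 17.1))  # ~4.6-fold up-regulation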

Read the rest of this entry »

Techniques Series: Next-Generation Sequencing Technologies

May 31, 2013 | Posted by Guest in Research |
Transcriptome SOLiD sequencer by EMSL, on Flickr (Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic License)

This is a guest post by James Hadfield, Head of the Genomics Core Facility at Cancer Research UK.

Today there are three main next-generation sequencing (NGS) technologies: Illumina, Ion Torrent, and 454. Sanger sequencing is still used by almost all research labs and remains a key tool for simple clone verification and PCR-based sequencing.

Although the DNA sequencing chemistry in each system is very different, all three technologies share many commonalities: they generally start with fragmented genomic DNA to which oligonucleotide adapters are ligated, and single adapter-ligated molecules are clonally amplified, ready for highly parallel sequencing of millions or even billions of reads. The technologies were conceived and developed as the Human Genome Project (HGP) was finishing. Sanger sequencing was a brute-force tool, requiring an international effort, billions of dollars, and 15 years to complete the single HGP genome.
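
One shared downstream step is trimming the ligated adapter sequence from the 3' end of each read before analysis. Below is a minimal, illustrative Python sketch; real pipelines use dedicated tools (e.g., cutadapt) that also tolerate sequencing errors, and the adapter shown is the commonly used Illumina adapter prefix:

    ADAPTER = "AGATCGGAAGAGC"

    def trim_adapter(read, adapter=ADAPTER, min_overlap=5):
        """Remove the adapter, or a >= min_overlap prefix of it, from the read's 3' end."""
        pos = read.find(adapter)
        if pos != -1:
            return read[:pos]                   # full adapter found within the read
        for k in range(len(adapter) - 1, min_overlap - 1, -1):
            if read.endswith(adapter[:k]):      # partial adapter at the very end
                return read[:-k]
        return read

    print(trim_adapter("ACGTACGTACGTAGATCGGAAGAGCTTTT"))  # -> ACGTACGTACGT
    print(trim_adapter("ACGTACGTACGTAGATC"))              # -> ACGTACGTACGT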

Read the rest of this entry »

Solving The Research Integrity Crisis

May 20, 2013 | Posted by Elizabeth in Research |

Earlier this month, I had the pleasure of attending the third World Conference on Research Integrity in Montreal, which brought together thought leaders on research integrity and responsible conduct in research. The Conference covered issues including the factors that contribute to fabrication and systemic dishonesty, potential solutions such as better training and support for whistleblowers, and larger incentives for changing the research culture.

Given these recurring themes, I felt it important to review the different opinions offered at the Conference. Consolidating the themes and propositions presented allows for a discussion of potential strategies to build more effective solutions to the problem of research integrity.

Read the rest of this entry »

10 Free Scientific Resources For Graduating Students

May 16, 2013 | Posted by Piper in Research |
Open_Access_PLoS.svg by PGRsOnline, on Flickr (Creative Commons Attribution-Noncommercial-Share Alike 2.0 Generic License)

Upon finishing my Ph.D. in December, I was quickly confronted with the loss of journal, publication, and general program access.

Within an academic institution, we are privileged with access to a wide array of resources and traditionally subscription-based services. And while there are far more open-access resources than ever before, with Wikipedia and PLOS as significant examples, it is important to recognize some of the other commonly available resources that can assist scientists who are set to graduate from their institutions this summer.

Below I highlight some of the references, software, and literature I am using myself, all of which are free, open access, and ready to use. And as the discussion about open-access scientific literature makes significant strides, I think it is important to start thinking about what other resources and expertise should be freely available for scientists to access and use.

Read the rest of this entry »

About Science Exchange

We are transforming scientific collaboration by creating a marketplace where scientists can order experiments from the world's top labs.

Check the Science Exchange blog for updates, opinions, guest posts and the latest happenings at Science Exchange HQ!

Visit Science Exchange →
