This summer, The New York Times published a series of articles that explore the growing world of retracted scientific studies. One in particular lists a handful of high-profile breakthroughs that have been retracted since 1980, while another harps on the fact that we are on pace for more than 500 retractions in 2015. For perspective, there were just 40 total retractions in 2001.
Dr. Richard Horton, editor-in-chief of U.K. medical journal The Lancet, recently estimated that half of all scientific data might be untrue. He cited small sample sizes, tiny effects, invalid exploratory analyses, conflicts of interest, and the pursuit of fashionable science as the main sources of this lack of validity. In the same article, Dr. Marcia Angell, former editor-in-chief of The New England Journal of Medicine, also claimed that it's no longer possible to believe a majority of the clinical research being published nowadays.
These are some very serious allegations, and it's hard not to feel discouraged by them. To some, it may seem like all research should be outright disregarded and the scientists responsible for publishing tainted studies should be publicly chastised.
But this high-drama viewpoint obscures several factors about medical science that I would like to bring to light.
Probability and Personalization
Even clean, repeated, statistically significant findings, the gold standard of scientific research, need to be taken with a grain of salt.
If a drug works on 85 percent of people in a double-blind, placebo-controlled study, it still tells me nothing about the individual walking into my clinical office. Is this person part of the 85 percent, or is he part of the 15 percent? I can only assume there's a strong likelihood the drug will work for my patient if, and that's a strong if, that patient is exactly like the 85 percent of the people in the study.
Here's the bottom line: Even in so-called pure circumstances, research merely illustrates an averaged-out statistical probability produced by thousands of unique variables. Pure or impure, it should stimulate thought and not be regarded as truth. The scientific method is simply a powerful framework for exploration, and we should tighten or loosen this framework as we see fit. Moreover, it guards against random assertions from people seeking to advance their personal agendas.
Bias Is a Factor, Not a Detractor
Bias is an unavoidable reality that exists from the moment an experimenter articulates his or her hypothesis. This fact doesn't negate the scientific method, nor does it give the green light for haphazard research. What it does suggest is that biases should be added to the factors we consider when reviewing scientific findings, the same way we should recognize the limiting roles of probability and personalization.
Let's stop trying to pretend that the world can be bias-free. Yes, we could certainly benefit from eliminating certain biases such as those around race and gender, but most research shows that removing bias is, to say the least, very challenging.
Wouldn't it be more productive if we considered alternatives to the blame-and-punishment approach of vilifying bias? Instead, we should recognize it, converse about it, and factor it into our findings. Bias doesn't invalidate research; it provides more food for thought.
The Dangers of Writing Off Research
The human brain is wired for envy, gloating, and schadenfreude. We want to see well-intentioned, prominent people fall from grace, and we're unlikely to support something that we don't agree with. We see it happen all the time on television and social media, and now we're seeing it happen in science. When we fall down this rabbit hole, we enter a realm of radicalism that can be likened to intellectual terrorism. This is, in itself, a conflict of interest. You can't objectively critique something that you want to bash for personal reasons anyway.
Scientists should be the first to admit holes in their methodologies and studies. But these same holes are what have stimulated many incredible findings throughout our history. Innovation is born from tinkering with these attitudes and thoughts, and that tinkering is only possible if we embrace science for what it is: thought-provoking.
Science is by no means an indicator of truth, yet without it, we would probably still be in the dark ages, trapped by dogmas, opinions, beliefs, and extremist power that would negate our abilities to advance. For all its faults, science has allowed us to live longer, reduce suicide in manic-depressive illness, reverse erectile dysfunction, curb the impact of AIDS, and understand more about the biology of sexual orientation than ever imaginable.
Science has undoubtedly saved lives and improved the world around us. Looking at the big picture, does it really matter if one of these great discoveries happened to occur in a biased setting or if the results weren't identical across studies?
It's not worth undoing, diminishing, or halting scientific progress in a quest to achieve an unrealistic, unattainable level of experimental purity. While the integrity of medical science is of paramount importance, it is not productive to use the holes in the scientific method as a reason to bash science. Rather, it is more important to recognize those holes, see the potential innovation that can come from them, and try to improve upon what we know instead of being carried away by an anti-science bias that does society no good.
I do believe that critiques, reexamination, and retractions may be in order, but we also need to recognize that the holes in science are an opportunity for personalization, innovation, and improvement. They do not detract from the fact that the scientific method has much to offer medicine and, more importantly, the people who are served by it.
About the Author: Dr. Srini Pillay, founder and CEO of NeuroBusiness Group, is a pioneer in brain-based executive coaching who is dedicated to collaborating with experts to help people unleash their full potential. He also serves as an assistant professor of psychiatry at Harvard Medical School and teaches in the executive education programs at Harvard Business School and Duke Corporate Education.