


Questioning Medical Research



John Ioannidis first burst onto the national scene in 2005 with a groundbreaking paper titled “Why Most Published Research Findings Are False.”1 His statistical analysis and logic are impeccable, and his paper has never been seriously refuted. Furthermore, it has had a tremendous impact: the paper has been viewed more than 2.5 million times.2 Since then, Ioannidis has gone on to show that the best scientists don't always get funded, why neuroscience is unreliable, why most clinical research is useless, and that most economic studies are exaggerated. In other words, the process by which we acquire new knowledge is fundamentally flawed, and much of what we think we know is wrong.2 These days he has aimed his attacks at nutrition research and at professional societies' guidelines and disease definition statements.

Nutrition epidemiology is in need of 'radical reform'

In a recent op-ed article in JAMA, Ioannidis bluntly states that nutrition epidemiology is in need of 'radical reform.' In a paragraph that perfectly captures the absurdity of the field, he writes, “...eating 12 hazelnuts daily (1 oz) would prolong life by 12 years (i.e., 1 year per hazelnut), drinking 3 cups of coffee daily would achieve a similar gain of 12 extra years, and eating a single mandarin orange daily (80 g) would add 5 years of life. Conversely, consuming 1 egg daily would reduce life expectancy by 6 years, and eating 2 slices of bacon (30 g) daily would shorten life by a decade, an effect worse than smoking. Could these results possibly be true?”3 The answer to his rhetorical question is obviously no. So, Alex Berezow asks, how did this garbage get published?2

Dr. Ioannidis blames residual confounding and selective reporting. Confounding means incorrectly concluding that A causes B when, in reality, some other factor X causes B. The trouble for researchers is that when A and X are related to each other, teasing out the true cause can be quite difficult. For instance, eating bacon may well be associated with a shorter lifespan. But maybe bacon eaters are also less likely to exercise, and lack of exercise (the confounding factor) is the actual cause of the shorter lifespan. Selective reporting means that any study which shows a link between bacon and early death is likelier to be published than one that doesn't show a link. Combined, Ioannidis believes, residual confounding and selective reporting have created a systemic bias in nutrition research.2

A few years ago, two researchers took the 50 most-used ingredients in a cookbook and studied how many had been linked with a cancer risk or benefit in studies published in scientific journals. Their result? Forty out of 50, including salt, flour, parsley and sugar. “Is everything we eat associated with cancer?” the researchers wondered in an article based on their findings.4
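The bacon-and-exercise scenario described above can be made concrete with a small simulation. This is an illustrative sketch, not anything from Ioannidis's paper: it invents a population in which bacon has no causal effect on lifespan at all, yet a naive comparison shows bacon eaters dying years earlier, because exercise (the confounder) drives both bacon avoidance and longevity. Stratifying by the confounder makes the spurious gap vanish.

```python
import random
import statistics

random.seed(0)

# Hypothetical population: exercise is the confounder X, bacon eating is A,
# lifespan is B. Bacon has NO causal effect on lifespan here; only exercise does.
people = []
for _ in range(20_000):
    exercises = random.random() < 0.5
    # Exercisers are less likely to eat bacon, which links A and X.
    eats_bacon = random.random() < (0.2 if exercises else 0.6)
    # Lifespan depends only on exercise (plus noise), never on bacon.
    lifespan = 75 + (5 if exercises else 0) + random.gauss(0, 3)
    people.append((eats_bacon, exercises, lifespan))

def mean_lifespan(group):
    return statistics.mean(p[2] for p in group)

bacon = [p for p in people if p[0]]
no_bacon = [p for p in people if not p[0]]

# Naive comparison: bacon eaters appear to die roughly two years earlier.
print(f"naive gap (no-bacon minus bacon): "
      f"{mean_lifespan(no_bacon) - mean_lifespan(bacon):.2f} years")

# Stratify by the confounder: within each exercise group the gap disappears.
for ex in (True, False):
    b = [p for p in people if p[0] and p[1] == ex]
    nb = [p for p in people if not p[0] and p[1] == ex]
    print(f"exercise={ex}: gap = {mean_lifespan(nb) - mean_lifespan(b):.2f} years")
```

The "residual" in residual confounding is the hard part: real studies try to adjust for confounders like exercise, but any confounder that is unmeasured or imperfectly measured leaves exactly this kind of phantom effect behind.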




Their investigation touched on a known but persistent problem in the research world: too few studies have large enough samples to support generalized conclusions. “The majority of papers that get published, even in serious journals, are pretty sloppy,” said Ioannidis. Since then, he says, only limited progress has been made. Only a third of the 100 studies published in three top psychology journals could be successfully replicated in a large 2015 test. Medicine, epidemiology, population science and nutritional studies fare no better, Ioannidis said, when attempts are made to replicate them. Too many studies are based on only a few individuals, making it difficult to draw wider conclusions because such small samples have little hope of being representative.4

So what should we take away from the food studies published every day? Ioannidis recommends asking the following questions: Is this something that has been seen just once, or in multiple studies? Is it a small or a large study? Is this a randomized experiment? Who funded it? Are the researchers transparent?4

Ioannidis is also questioning the role professional societies should or should not play in authoring professional guidelines.5 Here is a quote that provides a summary: “...these guidelines writing activities are particularly helpful in promoting the careers of specialists, in building recognizable and sustainable hierarchies of clan power, in boosting the impact factors of specialty journals, and in elevating the visibility of the sponsoring organizations and their conferences that massively promote society products to attendees.”6
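Why small samples so often fail to replicate can be shown with a short simulation. This is an illustrative sketch with made-up numbers, not data from any study: it assumes a true effect of 0.2 standard deviations and compares how wildly the estimated effect swings across 1,000 simulated studies with 20 subjects per arm versus 500 per arm.

```python
import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.2  # hypothetical small true effect, in standard-deviation units

def run_study(n):
    """Estimate the effect from one simulated two-arm study with n subjects per arm."""
    treated = [random.gauss(TRUE_EFFECT, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    return statistics.mean(treated) - statistics.mean(control)

for n in (20, 500):
    estimates = [run_study(n) for _ in range(1000)]
    # Fraction of studies whose estimate points in the WRONG direction entirely.
    wrong_sign = sum(e < 0 for e in estimates) / len(estimates)
    print(f"n={n:3d} per arm: spread of estimates = "
          f"{statistics.stdev(estimates):.2f}, wrong sign in {wrong_sign:.0%} of studies")
```

With 20 subjects per arm, the estimates scatter so widely that roughly a quarter of the simulated studies get the direction of the effect backwards; selective reporting then skims off the lucky extremes and publishes them, which is exactly how tiny studies generate dramatic, non-replicable headlines.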


His concern: “Do they improve or do they homogenize biased collective and organized ignorance?” He goes on to point out several potential conflicts of interest; the financial ones have been getting a lot of coverage lately, at least concerning individual physicians. But professional societies are also entangled with industry: he cites 20% of the American Heart Association's $912 million budget as coming from industry, and 77% of the European Society of Cardiology's budget of almost $70 million. He also describes a guideline ecology in which experts who are already consultants to industry write the guidelines and develop the appropriate-use criteria as well as the measures of performance.5

Should you want an insider's view of these guideline committees, consider the commentary by Dr. Milton Packer. Says Packer, “I have been a member of many guidelines committees, and I can confidently say that it is a messy process. Typically, each committee member is asked to write one section of the document. And at the end, the entire committee is supposed to critically review everyone else's work. But sometimes it doesn't work that way. Few have the time to examine the entire document carefully; most are happy to believe that their colleagues have carried out their tasks satisfactorily. When the committee meets face-to-face, the primary goal is to move the document to the finish line. Given the time pressures, it is often easier to accept the approach of a fellow committee member than to challenge it. One or two determined members can have an outsized influence on the committee's deliberations. Take, for example, the 2017 hypertension guidelines from the American Heart Association and the American College of Cardiology. That document challenged the criteria for the diagnosis of high blood pressure, thereby placing an enormous burden on the US population and primary care physicians. The recommendations of the document were heavily influenced by the results of one highly controversial trial, the SPRINT study. Interestingly, several key members had played an important role in that trial.”7



The key finding of the report was that, although blood pressure measurements were obtained using an automated measurement device, there were substantial differences in the methods used by different SPRINT centers. This finding confirmed revelations about these differences that first emerged in 2017.8

Clinical research papers are plentiful, with approximately 1 million published to date, along with tens of thousands of systematic reviews, but most of them are not useful. Waste across medical research (clinical and other types) has been estimated as consuming 85% of the billions spent each year.9 So all of this means there is a lot of medical misinformation. The Institute of Medicine estimates that only 4 percent of treatments and tests are backed up by strong scientific evidence; more than half have very weak evidence or none.10

The solutions that Dr. Ioannidis suggests: more access to complete data sets, more independent investigations, and more transparency.

References
    1. John P. A. Ioannidis, “Why most published research findings are false,” PLOS Medicine, August 30, 2005
    2. Alex Berezow, “John Ioannidis aims his bazooka at nutrition science,” American Council on Science and Health, August 24, 2018
    3. John P. A. Ioannidis, “The challenge of reforming nutritional epidemiological research,” JAMA, 2018; 320(10):969-970
    4. Ivan Couronne, “Beware those scientific studies—most are wrong, researcher says,” AFP, July 5, 2018
    5. Chuck Dinerstein, “Professional society guidelines: John Ioannidis tackles medicine's bubble of eminence,” American Council on Science and Health, October 18, 2018
    6. John P. A. Ioannidis, “Professional societies should abstain from authorship of guidelines and disease definition statements,” Circulation: Cardiovascular Quality and Outcomes, October 2018
    7. Milton Packer, “When did guidelines become holy writ?”, MedPage Today, October 17, 2018
    8. Larry Husten, “The survey says: BP measurement in SPRINT was all over the place,” cardiobrief.org, November 14, 2017
    9. John P. A. Ioannidis, “Why most clinical research is not useful,” PLOS Medicine, June 21, 2016
    10. Shannon Brownlee, Overtreated (New York: Bloomsbury, 2007), 92


    Jack Dini

    Jack Dini is author of Challenging Environmental Mythology.  He has also written for American Council on Science and Health, Environment & Climate News, and Hawaii Reporter.

