
Understanding Daniel Andrews' Misinterpretations of Science

Victoria's Premier, Daniel Andrews, recently caused a wave of discontent among the 6.4 million residents of his state by announcing a further two-week extension of the lockdown measures, resulting in a total of at least 10 weeks under the strict Stage 4 restrictions.

At the start of his 90-minute press conference, Andrews invoked the authority of science.

“You can’t argue with this sort of data. You can’t argue with science.”

The Fallacy of "Because Science"

Regardless of one’s stance on Victoria’s lockdown laws, invoking “because science” does not automatically validate an argument or foreclose discussion and debate.

I want to clarify that this commentary is not an attack on science itself, the forecasting model Andrews referenced, or his decision to prolong the lockdown.

I am a strong proponent of science, having built a business around emerging technology startups over the past six years. Over the last 150 years, science has significantly improved our quality of life, providing us with clean water, effective sanitation, transportation, electricity, medical advancements, and even the internet—despite its drawbacks.

As the saying goes, aside from status and privilege, an average person today enjoys a higher standard of living than kings and queens from centuries past.

However, I am equally committed to truth and reason, and I believe it is crucial not to mislead the public by pointing to science as an irrefutable conclusion, especially when it involves politicians who have authority over millions, as is currently the case in Victoria.

You Can Argue with Science

Despite Andrews’ claims, you can indeed argue with science.

The essence of the scientific method is to challenge and falsify assumptions. It aims to bring us closer to understanding, but it seldom results in absolute truths.

As the late Nobel Prize-winning physicist Richard P. Feynman famously stated, “We can never be sure we’re right; we can only ever be sure we’re wrong.”

This is due to the potential for errors in science, including known unknowns, unknown unknowns, and even manipulation.

Authors Carl Bergstrom and Jevin D. West, in their book Calling Bullshit: The Art of Skepticism in a Data-Driven World, assert that “there’s plenty of bullshit in science, some accidental and some deliberate.”

They emphasize that every scientist operates under human motivations—like reputation and power—that may overshadow their pursuit of knowledge.

The Replication Crisis

Such flawed incentives have led to a replication crisis across various scientific fields. Numerous studies are either difficult or impossible to reproduce, particularly in medicine, psychology, and economics.

Medicine

Of 49 highly cited medical studies published between 1990 and 2003, only 24% remained largely unchallenged by subsequent research, prompting John Ioannidis, a Professor of Medicine at Stanford University, to explore the question of Why Most Clinical Research Is Not Useful. Furthermore, a 2012 paper by Glenn Begley, a biotech consultant, revealed that only 11% of pre-clinical cancer studies could be replicated.

Psychology

A report by the Open Science Collaboration, published in August 2015, found that replication rates in psychology varied from just 23% in social psychology to about 50% in cognitive psychology. Even the more favorable figure means that half of those studies could not be reproduced.

Economics

Perhaps unsurprisingly, a 2016 study revealed that one-third of 18 studies from premier economics journals failed to replicate. A follow-up study suggested that “the majority of average effects in empirical economics are exaggerated by at least a factor of 2, and one-third by a factor of 4 or more.”

A single study rarely captures the broader reality. Researchers must evaluate evidence across multiple studies to form a “more likely to be correct” understanding of the world, which is subject to change over time—remember the shifting views on dietary fats.
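
To make “weighing evidence across multiple studies” concrete, here is a minimal sketch in Python of how a fixed-effect meta-analysis pools results, weighting each study by the inverse of its variance. The effect sizes below are invented purely for illustration, not taken from any real literature.

```python
# Fixed-effect meta-analysis in miniature: pool effect estimates from
# several hypothetical studies by inverse-variance weighting.
# All numbers are made up for illustration.

studies = [
    # (effect_size, standard_error)
    (0.40, 0.20),  # small, noisy study
    (0.10, 0.05),  # large, precise study
    (0.25, 0.10),  # mid-sized study
]

weights = [1 / se**2 for _, se in studies]  # precise studies count for more
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.2f} +/- {pooled_se:.2f}")
# The noisy 0.40 study barely moves the pooled estimate (~0.14), which is
# why one surprising result rarely overturns an established body of evidence.
```

Notice that the small, noisy study barely shifts the pooled estimate, which is exactly why a single headline-grabbing paper should carry far less weight than the accumulated literature.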

Science is ultimately prone to numerous failings.

Common Pitfalls in Science

  1. Garbage In, Garbage Out

    Andrews asserted, “you can’t argue with this sort of data.” However, as computer scientists know, the quality of any output depends on the quality of its input. For data to support meaningful conclusions, it must be accurately collected, free of errors and bias, and correctly interpreted.

  2. Confirmation and Selection Bias

    This bias leads individuals to seek information that validates their pre-existing beliefs while disregarding contradictory evidence. For example, if I wanted to assert that people are more productive at night, I might selectively sample night owls and ignore early risers.

  3. Cherry Picking

    This involves presenting only the results that support an argument while omitting contrary findings. If 10% of results favor a viewpoint, it can be tempting to highlight only that 10%.

  4. Confusing Correlation with Causation

    Correlation does not imply causation. For instance, if my friend Daniel Cannizzaro raised $1 million for his fintech startup while Victoria’s coronavirus cases declined, it would be erroneous to claim that the funding caused the drop in cases.

  5. P-Hacking

    A p-value estimates how likely a result at least as extreme would be if there were no real effect; by convention, p < 0.05 is deemed statistically significant. P-hacking involves slicing and re-testing data until some comparison clears that threshold, manufacturing “significant” findings from noise (see the simulation after this list).

  6. Predatory Journals

    Many predatory journals operate on a “pay to play” model, publishing questionable studies so long as the author pays a fee. The resulting “peer-reviewed” publication can then be leveraged for personal gain.

  7. Peer Review

    The peer review process, while valuable, does not guarantee that a paper is free from errors or misconduct. It often fails to catch all mistakes, as noted by Bergstrom and West.
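
To see why p-hacking is so tempting, consider a minimal simulation in Python (an illustrative sketch of the general idea, not an example drawn from Bergstrom and West): run enough tests on pure noise and “statistically significant” results appear by chance alone.

```python
# P-hacking in miniature: test enough arbitrary comparisons on pure noise
# and some of them will look "statistically significant" by chance alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(size=(20, 50))  # 20 noise variables for 50 "subjects" -- no real effect
group = np.arange(50) < 25        # an arbitrary split into two groups of 25

# Run a t-test for each of the 20 variables and collect the p-values.
p_values = [stats.ttest_ind(row[group], row[~group]).pvalue for row in data]

print(f"Smallest p-value across 20 tests: {min(p_values):.3f}")
print(f"Tests with p < 0.05: {sum(p < 0.05 for p in p_values)} of 20")
# At a 5% significance level, roughly one in twenty tests on pure noise
# crosses the threshold -- report only that one and you have "a finding".
```

Reporting the one significant comparison while staying silent about the other nineteen is precisely the kind of manipulation described above.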

How to Critically Analyze Science and Data

These points illustrate that science can indeed be flawed.

“The first thing to recognize is that any scientific paper can be wrong,” states Bergstrom.

Science can be manipulated by anyone with a vested interest in shaping a specific narrative.

Next time someone invokes “because science,” whether a politician or a friend, remember that this is just the beginning of a discussion.

Consider questioning:

  • The credibility of the publication
  • Whether the findings are supported by a substantial body of literature or just an isolated study
  • The motivations of the researchers
  • The quality of the source data and its interpretation
  • Whether correlation is being mistaken for causation
  • Whether visual data is presented in a misleading way (graphs can be manipulated; see the sketch after this list)
  • The various ways data may be distorted through biases or selective reporting
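
On the graphs point, a short sketch in Python with matplotlib shows one of the oldest tricks: truncating the y-axis so that a trivial difference looks dramatic. The numbers are invented purely for illustration.

```python
# The same data told two ways: an honest y-axis starting at zero versus
# a truncated y-axis that makes a ~2% difference look enormous.
import matplotlib.pyplot as plt

labels = ["Group A", "Group B"]
values = [98, 100]  # two nearly identical made-up measurements

fig, (honest, misleading) = plt.subplots(1, 2, figsize=(8, 3))

honest.bar(labels, values)
honest.set_ylim(0, 110)          # axis starts at zero: difference looks small
honest.set_title("Honest axis")

misleading.bar(labels, values)
misleading.set_ylim(97, 100.5)   # truncated axis: same data looks dramatic
misleading.set_title("Truncated axis")

plt.tight_layout()
plt.show()
```

Same data, two very different stories; always check where the axis starts.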

Conclusion

When equipped with the right knowledge, you can spot inaccuracies and engage others in meaningful dialogue.

In the quest to be correct and win arguments, our understanding of what is truly right often suffers.

The more informed we are about the workings of the world, including science, the better our decision-making will be.

Pick up *Calling Bullshit*, one of the most important reads of the year.

Steve Glaveski is the co-founder of Collective Campus, author of Time Rich, Employee to Entrepreneur, and the host of the Future Squared podcast. He is a lifelong learner with interests ranging from 80s metal to high-intensity workouts and even stand-up comedy.