By Ariel Bogle

Why we fall for online pseudoscience – and how to stop



I'm a sceptical person most days, but I do have my weak moments. Last weekend, Gwyneth Paltrow found me in one of them.


Halfway through an episode of the lifestyle guru's Netflix series that explored the Wim Hof breathing method, I was almost on board to have cold showers for the rest of my life.


Wim Hof is not the first person to extol the health benefits of freezing water — friends have told me the same, and if you believe the stereotype, Swedish people thrive on running from saunas to ice lakes.


So perhaps my mind was primed to fall for this exact episode of Goop.


There are many reasons why we find pseudoscience persuasive, according to Dr Micah Goldwater, a cognitive scientist at the University of Sydney.


It is often simpler for us to add knowledge than to subtract it, he said.


"It's actually much easier to add things to your mental model of how things work than to take things away."


There's also the 'illusory truth effect', where the more familiar something sounds, the more likely you are to believe it's true.


In other words, because I've heard so many anecdotes about the power of cold water, when I hear an embellishment dressed up in scientific jargon, I may be more likely to endorse it.


The power of the anecdote


As a journalist who reports on online misinformation, I've spent plenty of time in anti-vaccination Facebook groups or in internet forums that suggest herbal remedies protect against the coronavirus.


In those groups, it's easy to observe the seductive nature of personal stories. A story about a friend's nephew whose measles was supposedly cured by tea tree oil is more engaging than a dozen dry public health announcements.


We're sometimes biased to think that narrative detail is a sign of accuracy, when that's not always the case.


Plenty of engaging, specific examples can capture your emotion and empathy, according to Amanda J. Barnier, a professor of cognitive science at Macquarie University.


    "Perhaps by identifying with the details, it rips you away from a critical evaluation of the evidence underlying it," she said.


One study conducted by Dr Goldwater, which has so far been presented at a conference, attempted to understand the power of positive and negative anecdotes.


Participants in the study were assessed on how stories about the impact of medical treatments on 'Jamie' (a fictional person) affected whether they would use the same treatment.


Even though they were told that the treatment worked for most people, knowing one negative personal story — about Jamie's symptoms failing to improve — often made study participants report that they would not want to take that treatment.


"When you are affected by an anecdote, what you are potentially doing is generalising from a single case to your life," he said. "But it's possible that it just made you feel icky [about the treatment]."


We like an explanation


Humans like explanations that help them predict how the world works.


We are constantly thinking about cause and effect. That can lead us to a bias called 'illusory causation', where we perceive a causal relationship when there really isn't one.


If you take herbal medication for a cold and then get better two days later, you might assume the medicine did the trick.


"The bias people have is they don't think, 'wait, what would have happened if I didn't take that herbal medication?'," Dr Goldwater said.


"Well, you probably would have gotten better in two days just the same."


This is why scientific studies use randomised controlled trials, which give half the participants the medicine and half none, to test exactly that assumption.
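To get a feel for why that untreated comparison group matters, here is a rough simulation sketch in Python. The numbers are invented purely for illustration (a hypothetical 70 per cent chance that a cold clears up on its own within two days, and a remedy that does nothing); it is not based on any trial mentioned in this article.

    import random

    random.seed(42)

    # Assumption for illustration: most colds clear up on their own
    # within a couple of days, whether or not you take anything.
    RECOVERY_RATE = 0.7   # hypothetical chance of feeling better by day two
    N = 10_000            # people in each group

    def recovered_by_day_two() -> bool:
        """Spontaneous recovery, independent of any treatment."""
        return random.random() < RECOVERY_RATE

    # Randomised 'trial': half take the (ineffective) herbal remedy,
    # half take nothing at all.
    treated = sum(recovered_by_day_two() for _ in range(N))
    untreated = sum(recovered_by_day_two() for _ in range(N))

    print(f"Recovered with remedy:    {treated / N:.1%}")
    print(f"Recovered without remedy: {untreated / N:.1%}")

Both groups recover at roughly the same rate, so the remedy gets no credit. Without the untreated group, every recovered person who happened to take the remedy looks like evidence that it worked, which is exactly the illusory causation trap.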


We are also susceptible to what's called the naturalness bias. Put simply, "natural stuff is good and artificial stuff is bad".


"It's just this heuristic that helps you make a decision, a shortcut," Dr Goldwater said. There will always be scenarios where that rule does not hold true.


To avoid being tricked, slow down


We like explanations and rules, but our problem is that we want them fast.


"Critical thinking about information you're presented with often requires people to slow down and think about it carefully and weigh the evidence," Dr Barnier said.


She recommends taking your time when dealing with health information, and considering its source.


    "Who's saying this? And where is the information coming from? And what is it based on?"


Dr Goldwater said we must also be willing to interrogate our own beliefs and how we came to our conclusions.


"If you are constantly sceptical of your own thinking, that is potentially the best way to vet yourself," he said.


But this is a struggle, even if it's your life's work.


"If anything, being a cognitive scientist just makes me question my own thinking," he said.


"You can always, in retrospect, realise you fell prey to one of these biases, because these are biases that are so ingrained to how we think."



Article from ABC Science
