This researcher may have discovered the antidote to health bullshit

A big, new experiment shows it’s possible to train kids to detect dubious health claims.

Researchers from Uganda, Kenya, Rwanda, Norway, and England developed curricula — a cartoon-filled textbook and lesson plans — to instill critical thinking skills in schoolchildren at an early age.
Informed Health Choices

Andy Oxman is obsessed with the study of bullshit health claims and how to prevent them from spreading.

For decades, he’s been trying to find ways to get adults to think critically about the latest diet fads, vaccine rumors, or “miracle cures.” But he realized these efforts are often in vain: Adults can be stubborn old dogs — resistant to learning new things and changing their minds.

So Oxman, now the research director at the Norwegian Institute of Public Health, started to wonder whether the best hope for bullshit prevention lay with children. To put this idea to the test, back in 2000 he visited his then-10-year-old son’s class.

“I told them that some teenagers had discovered that red M&Ms gave them a good feeling in their body and helped them write and draw more quickly,” Oxman said. “But there also were some bad effects: a little pain in their stomach, and they got dizzy if they stood up quickly.”

He challenged the kids to try to find out if the teens were right. He split the class into small groups and gave each group a bag of M&Ms.

The children quickly figured out they had to try eating M&Ms of different colors to find out what happens, but that it wouldn’t be a fair test if they could see the color of the M&Ms. In other words, they intuitively understood the concept of “blinding” in a clinical trial. (This is when researchers prevent study participants and doctors from knowing who got what treatment so they’re less likely to be biased about the outcome.)

In a short time, they were running their own blinded, randomized trials — the gold standard for testing medical claims — in the classroom. By the end of their experiment, Oxman said, “They figured out that there was little if any difference in the effects of the different colors and they asked me if the teenagers who made the claim really believed that.”
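The classroom exercise maps neatly onto the logic of a blinded, randomized trial. Here is a minimal Python sketch of that logic (a toy simulation, not the researchers’ protocol; the group size and task times are invented for illustration):

```python
import random
import statistics

# Toy simulation of the classroom M&M experiment: each participant is
# randomly assigned a red or non-red M&M without seeing its color
# (blinding), then completes a timed drawing task. All numbers here
# are invented for illustration.

random.seed(42)

def run_blinded_trial(n_participants=30):
    results = {"red": [], "other": []}
    for _ in range(n_participants):
        color = random.choice(["red", "other"])  # random assignment
        # Under the null hypothesis, color has no effect, so both
        # groups draw their task times from the same distribution.
        task_time = random.gauss(60, 10)
        results[color].append(task_time)
    return results

for color, times in run_blinded_trial().items():
    print(f"{color}: n={len(times)}, mean time = {statistics.mean(times):.1f}s")
```

Because assignment is random and nobody sees the color, any real effect of the red dye would show up as a gap between the group averages; here, as in the kids’ experiment, there is essentially none.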

The little classroom visit convinced Oxman he had to start schooling people in the ways of bullshit detection early in life.

So he began working with other researchers from around the world to develop curricula — a cartoon-filled textbook and lesson plans — on critical thinking skills aimed at schoolchildren.

In 2016, Oxman tested the materials in a big trial involving more than 10,000 children from 120 primary schools in Uganda’s central region.

The results of the trial were published in the Lancet, and they showed a remarkable rate of success: Kids who were taught basic concepts about how to think critically about health claims massively outperformed children in a control group.

This means Oxman now holds the best blueprint out there for how to get young people to think critically and arm them with the tools they need to spot “alternative facts” and misinformation. His work brings us closer to answering that important question that haunted him — the one that should haunt all of us who care about evidence and facts: How do you prevent fake news and bullshit from catching on in the first place?

How researchers taught kids to spot “alternative facts” about health

One indispensable read for anyone interested in evidence-based thinking in health is Testing Treatments (downloadable for free). The basic idea behind the book, as co-author Sir Iain Chalmers put it, is that "you don’t need to be a scientist to think critically and ask good questions." In plain language, he and his co-authors explain the concepts people need to understand to sort reliable health advice from nonsense.

Building on the M&M experiment, in 2012, Oxman asked Chalmers whether they might adapt concepts from the book and try to teach them to primary school children in Uganda. (Oxman already had strong ties to Uganda, where he’d been leading a World Health Organization project to bring more research evidence to policymaking.)

With the book, the researchers had a template for the kinds of things they could teach. And they knew this exercise of inculcating skepticism in children, while uncommon in high-income settings, was even rarer in a developing country like Uganda, where pseudoscientific medical advice can spread with abandon, just as it can in the US.

The researchers, along with others from Uganda, Kenya, Rwanda, Norway, and England, worked to identify the most important ideas a person would need to grasp to think critically about health claims, including:

  1. Just because a treatment is popular or old does not mean it’s beneficial or safe.
  2. New, brand-name, or more expensive treatments may not be better than older ones.
  3. Treatments usually come with both harms and benefits.
  4. Beware of conflicts of interest — they can lead to misleading claims about treatments.
  5. Personal experiences, expert opinions, and anecdotes aren’t a reliable basis for assessing the effects of most treatments.
  6. Instead, health claims should be based on high-quality, randomized controlled trials.

They also drew up lesson plans and collaborated with teachers in Uganda to make materials that would resonate with local schoolchildren. Allen Nsangi, a Ugandan researcher and co-investigator on the trial, told me that a big part of that process involved mining local medical myths.

For example, she said, "Some people have been told to use locally available stuff like cow dung [on burns] — it’s almost the best known treatment." (Spoiler alert: It doesn’t work.)

Other medical myths come with heavy costs, she added. "Some of the immunization campaigns have been sabotaged because of claims to do with infertility for the future." Worried parents end up skipping shots for their children. Rumors have also spread that people should replace their antiretroviral therapies for HIV with herbal supplements.

Ultimately, the researchers put together a guide for teachers and cartoon-filled reading and exercise books for students.

"We are trying to teach children that stories are usually an unreliable basis for assessing the effect of treatments," Nsangi explained. The kids learned to watch out for conflicts of interest — like whether the person promoting a certain health claim has a financial stake in it — and to recognize that all treatments carry both harms and benefits and that large, dramatic effects from a treatment are really, really rare.

The researchers didn’t stop there. They also wanted to know whether their work would actually improve children's ability to assess health advice, so they designed a randomized controlled trial.

The trial ran during the second school term — from June to September 2016 — on more than 10,000 fifth-graders, mostly ages 10 to 12. Half of the kids got the lessons, and half didn’t.

At the end of the trial, students in both groups were given the same test of their ability to assess health claims. Oxman and the other researchers reported the results in the Lancet study.

More than twice as many children in the intervention schools (where kids received the lesson plans) achieved a passing score on the test compared with those in the control group. The average score in the intervention schools was 62.4 percent, compared with 43.1 percent in the control schools — a difference of about 20 percentage points.

And about one-fifth of the children in the intervention schools had test scores indicating they had mastered the key concepts (answering more than 20 of 24 questions correctly), compared with less than 1 percent of the kids in the control schools.
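For readers who want to check the arithmetic, both gaps can be recomputed directly from the figures quoted above (a quick sketch; the numbers come from this article, not from the raw trial data):

```python
# Recomputing the reported gaps from the figures quoted above.
intervention_mean = 62.4  # average test score, intervention schools (%)
control_mean = 43.1       # average test score, control schools (%)
print(f"Score gap: {intervention_mean - control_mean:.1f} percentage points")

mastery_intervention = 0.20  # about one-fifth mastered the key concepts
mastery_control = 0.01       # less than 1 percent in control schools
print(f"Mastery gap: roughly {mastery_intervention / mastery_control:.0f}x")
```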

So on every measure, the children who got the lessons outperformed the kids who didn’t.

“[The effect] was bigger than we had hoped,” said Oxman. “It shows that without any training, most kids are not able to assess claims.” It’s also the first published trial to find that it’s possible to teach children as young as 10 how to critically appraise health claims — skills Oxman believes are “widely applicable” beyond Uganda.

Separately, the researchers also created a podcast on critical thinking concepts for parents, and tested that approach in another randomized controlled trial, also published in the Lancet. They were successful here as well: Nearly twice as many parents who listened to the podcast series passed a test on their understanding of key health concepts compared with parents in the control group.

The studies are remarkable: They can be read as a recipe book for how to turn children (and their parents) into bullshit detectors. Or, as Chalmers put it, to empower them to “detect bullshit when bullshit is being presented to them.” If other educators and policymakers find ways to apply these teachings in their own schools and communities, the potential impact is huge. And with conspiracy theories swaying elections, public health losing the battle against anti-vaccine campaigners, and “alternative facts” being presented as evidence, the findings couldn’t be more timely.

The new science of preventing the spread of bullshit

There have been other attempts to understand whether teaching kids to think critically works, but there’s very little research focusing specifically on health or on teaching these skills early in life.

The Uganda study, which was mostly supported by the Research Council of Norway, was big enough to detect meaningful differences in the critical thinking abilities between the groups of children.
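As a rough illustration of what “big enough” means, here is how a standard power calculation looks with off-the-shelf tools. This is a simplified sketch, not the trial’s actual design analysis: the passing rates are placeholder assumptions, and it ignores the clustering by school, which in a real cluster-randomized trial like this one raises the required sample size:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Illustrative only: power to detect a difference in passing rates
# between two groups of 5,000 children each. The rates below are
# placeholder assumptions, not the trial's design parameters, and the
# calculation treats children as independently randomized, ignoring
# the clustering by school.
effect = proportion_effectsize(0.45, 0.40)  # hypothetical passing rates
power = NormalIndPower().power(effect_size=effect, nobs1=5000,
                               ratio=1.0, alpha=0.05)
print(f"Approximate power: {power:.2f}")
```

With thousands of children per arm, even a modest difference in passing rates would almost certainly be detected, which is why a trial of this scale can speak with some confidence about the effect of the lessons.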

Children studying from an earlier version of the Informed Health Choices textbook at one of the Ugandan pilot schools.
Sarah Rosenbaum/Informed Health Choices

But the study did come with limitations — for example, it’s well known that when an outcome measure (like the multiple-choice tests the kids took in the randomized trial) aligns with the intervention, it can bias the results.

It’s also possible that kids would know how to answer the questions on the test but not how to apply those concepts in real life. As Hilda Bastian, a health researcher at the National Institutes of Health, said, “It doesn't matter what we know, if we don't apply it in real life. Knowledge has to kick in when it's needed. It has to override other influences and impulses.”

Still, independent researchers who read the study were impressed by its rigor and size. “I’m pleasantly surprised with their results,” said Stanford University professor John Ioannidis. “It’s an interesting observation, and it’s at a minimum reassuring. Yes, these kids can learn [critical thinking].”

Ioannidis has also become convinced that the best hope for bullshit prevention lies in early childhood education, since waiting to teach people the standards of evidence-based thinking late in life doesn’t always work. “We need to start early on, to make people understand that basing decisions on fair tests, on science, on evidence is important,” he says.

But whether you believe the results of the Lancet trial is sort of beside the point. The trial brings us closer to understanding how to prevent bullshit from taking off and how to arm children with the skills needed to protect themselves from misinformation in the future. That’s something schools everywhere should pay attention to.

"My hope," Oxman said, "is that these resources get used in curricula in schools around the world, and that we end up with the children ... who become science-literate citizens and who can participate in sensible discussion about policy and our health. ... I’m looking to the future. I think it’s too late for my generation."

With Oxman's help, maybe we'll see fewer patients harmed by unhelpful treatments and fewer quacks profiteering off bogus medical advice — and a world with a little less bullshit in circulation.
