Can we teach students to spot misinformation?
Does the Pacific Northwest Tree Octopus exist?
Last week, the UK government released its long-awaited Curriculum & Assessment Review. One recommendation in particular has been getting a lot of attention: that the government should “strengthen the role of media literacy” with a particular focus on “understanding how to identify and protect against misinformation and disinformation.”
Much of the subsequent discussion about this proposal has focused on the challenges of defining misinformation in an era of political polarisation. However, I want to focus on something even more basic: even when falsehoods are obvious and undeniable, many students struggle to spot them.
The Pacific Northwest Tree Octopus
The Pacific Northwest Tree Octopus is a deliberate, humorous hoax website about a fabricated species of octopus that supposedly lives in the forests of the Pacific Northwest. It has all the features of a serious conservation site, but it’s completely made up.
A couple of research studies have used this website to see how good students are at evaluating the reliability of online sources. In a 2007 study in America, barely any 7th-grade students identified it as a hoax, and a more recent Dutch study found something similar with 11- and 12-year-olds. In the US study, students still insisted the octopus was real even after being told it was a fake.
How could we get students to spot this hoax?
A common response to this problem is to say that we should teach students to be digitally literate, or to teach them some kind of checklist they can use to evaluate websites. For example, there are the CRAAP and SIFT checklists, as well as other resources designed for younger students.
The problem with all of these checklists is that they’re a bit like telling a student to look up words they don’t know in a dictionary. This only works if the student has a big enough vocabulary to know what the words in the definition mean. If they don’t, they’re caught in an infinite loop of looking up the words they don’t know, only to find more words they don’t know.
Most of these checklists recommend that students should check the source of the information to see if it is trustworthy and reliable. But how do you know if a source is trustworthy or reliable? The Pacific Northwest Tree Octopus website is associated with the “Kelvinic University branch of the Wild Haggis Conservation Society.” If you’re an adult, that just sounds a bit off. But lots of students think it sounds like a great endorsement.
Of course, all the checklists recommend doing further research and online searches to verify information. What happens if you do a Google search for Kelvinic University? Currently, the first result tells you that “Kelvinic University is a fully accredited, independent institute of higher learning that offers Bachelor’s, Master’s, and PhD programs.” Well, that sounds OK. I guess the Tree Octopus is real!
It’s the same for critical thinking
In a 2007 article, the cognitive scientist Dan Willingham noted that over the previous 20 years, programmes designed to teach critical thinking had become very popular, but they were not very effective. He concludes with the following:
“Can critical thinking actually be taught? Decades of cognitive research point to a disappointing answer: not really.”
He makes the same point about the limitations of teaching maxims.
“If you remind a student to ‘look at an issue from multiple perspectives’ often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives.”
He gives another example: suppose you want to investigate why one car gets better gas mileage than another. How will you devise your research hypothesis and which factors will you choose to investigate? Your decision about what to investigate depends on specific knowledge.
You won’t choose to investigate a difference between cars A and B that you think is unlikely to contribute to gas mileage (e.g., paint color), but if someone provides a reason to make this factor more plausible (e.g., the way your teenage son’s driving habits changed after he painted his car red), you are more likely to say that this now-plausible factor should be investigated. One’s judgment about the plausibility of a factor being important is based on one’s knowledge of the domain.
Your ability to apply maxims like “devise a research hypothesis” and “control the variables” depends on very specific contextual knowledge.
What is truth, said jesting Pilate, and would not stay for an answer
In order to make a judgement about a truth claim, you have to know something about the claim itself. Here is the philosopher Dan Williams making this point.
The fundamental problem is that there are no intrinsic differences between true and false claims. That is, whether a claim is right or wrong—or informative or misleading—depends not on characteristics of the claim itself but on whether it accurately represents how things are.
If someone tells you that there are tree octopuses living in the forests of the Pacific Northwest, “you cannot simply examine the statement—or even its surrounding rhetorical context—to figure out whether it is true or false; its truth or falsity depends on the world.”
Williams is not claiming that we have to independently verify every single fact before we can trust it. That’s impossible. He is just pointing out that statements and their rhetorical contexts are imperfect guides to truth, and telling students that they are reliable guides is setting them up to fail.
I think history teachers in the UK have had similar experiences of the challenge of trying to teach students to evaluate truth claims with general principles. Are sources written by individuals more or less reliable than ones written by governments? Are sources written by eyewitnesses more or less reliable than ones written centuries after the event? Are sources written for publication more or less reliable than private diaries or letters?
The answer in every case is “it depends”. If only we could say: this source has feature x, therefore it is definitely true. It would be wonderful – we wouldn’t have to teach any history at all! But we do have to teach history, and science, and geography, and it’s good teaching of these kinds of traditional school subjects that will, in the long-term, provide students with the best possible defence against hoaxes and misinformation.
The Curriculum and Assessment Review quite rightly recognises a lot of what I’ve said above. It makes it very clear that background knowledge is necessary to evaluate truth claims, and that “having secure knowledge is essential to discerning truth from falsehoods and is one of the many reasons why a knowledge-rich curriculum is more, not less important in the modern world.”
It is now up to the government to implement the Review’s recommendations about improving media literacy and helping students spot misinformation. So what should they do?
So how can we improve media literacy?
One of the major themes of this Substack is that assessment is where the rubber hits the road, or, in Dylan Wiliam’s terms, assessment operationalises curriculum.
If we’re interested in strengthening media literacy, we need to design assessments that will help us a) work out exactly what it is we are trying to improve and b) find out whether the interventions we’re proposing work or not.
Here is a suggestion about how this could work. We could create a pre-test consisting of four websites. Three are accurate, and one is a hoax, like the Pacific Northwest Tree Octopus website. We could then create a post-test of four new websites, again made up of three accurate ones and one hoax. In each case, we ask the students to identify the hoax website and explain why. (You could use Comparative Judgement to evaluate their explanations!)
In between the pre-test and the post-test, we deliver our intervention and see whether it leads to improvements on the post-test.
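To make the Comparative Judgement step concrete, here is a minimal sketch of how judges’ pairwise decisions about the students’ explanations could be turned into scores, using a simple Bradley-Terry style iteration. This is an illustration only, not a description of any particular tool: the function, data format and names are assumptions made up for the example.

```python
# A rough sketch (my illustration, not any official tool): scoring student
# explanations from pairwise Comparative Judgement decisions using a simple
# Bradley-Terry iteration. The data format and names are invented for the example.
from collections import defaultdict

def bradley_terry(comparisons, iterations=100):
    """comparisons: list of (winner_id, loser_id) pairs, one per judging decision.
    Returns a relative quality score for each explanation (higher = judged better)."""
    wins = defaultdict(int)         # comparisons each explanation won
    pair_counts = defaultdict(int)  # how many times each pair was compared
    items = set()
    for winner, loser in comparisons:
        wins[winner] += 1
        pair_counts[tuple(sorted((winner, loser)))] += 1
        items.update((winner, loser))

    scores = {item: 1.0 for item in items}
    for _ in range(iterations):
        updated = {}
        for i in items:
            # Standard update: wins_i divided by the sum over opponents of n_ij / (s_i + s_j)
            denom = sum(n / (scores[a] + scores[b])
                        for (a, b), n in pair_counts.items() if i in (a, b))
            # Small floor so an explanation that never wins cannot zero out later updates
            updated[i] = max(wins[i] / denom, 1e-6) if denom else scores[i]
        total = sum(updated.values())
        scores = {item: s * len(items) / total for item, s in updated.items()}
    return scores

# Hypothetical usage: each tuple records which of two explanations a judge preferred.
decisions = [("anna", "ben"), ("anna", "cara"), ("cara", "ben"), ("anna", "ben")]
print(bradley_terry(decisions))
```

The point is simply that both the hoax-spotting answers and the quality of the students’ explanations can be turned into numbers we can compare before and after the intervention.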
I would suggest that any new media literacy curriculum should be piloted and evaluated in this way before wider implementation. I think it’s unlikely that any generic checklist approach will be successful – but I might be wrong, and either way, we will be adding to the sum of human knowledge and discovering more about what does and doesn’t work.


In Year 6, we were set an exercise in French which involved writing the names of various animals on the continents in the approximate places in which they were found. A friend of mine, as a prank, put 'Le panda' in North America as well as China and, when challenged by the teacher, spoke convincingly of the 'North American pandas' he claimed to have recently learned about from a nature documentary. The end result was the teacher instructing the whole class to add 'Le panda' to North America.
I am dubious about our ability to spot misinformation in primary schools.
I’ve thought about this a lot. I think it’s a lot less about techniques for fact checking or critical thinking, although those are helpful and necessary. Truth seeking begins with curiosity. What we don’t teach is how to recognize when we are in a certainty mindset, how to let that go, and then tap into our curiosity and abilities to explore, analyze, and synthesize. And of course, recognize when others are so certain they’ve also lost their abilities to reason, take in new information, and wonder.
Only then do the tools become useful. I’ve seen really good results with that approach.