In Year 6, we were set an exercise in French which involved writing the names of various animals on the continents in the approximate places in which they were found. A friend of mine, as a prank, put 'Le panda' in North America as well as China and, when challenged by the teacher, spoke convincingly of the 'North American pandas' he claimed to have recently learned about from a nature documentary. The end result was the teacher instructing the whole class to add 'Le panda' to North America.
I am dubious about our ability to spot misinformation in primary schools.
I agree, which is why the burden of proof has to be on any new programme.
I could actually see a programme working if it involved spotting misinformation within a set domain - eg hoax animals. If you knew the post-test would be on hoax animals you could gear the instruction towards that. I actually think something like that could be useful & fun and would lead to better understanding of the natural world. It’s similar to the way I’ve argued that reading tests should focus on specific curriculum knowledge.
There should be a rule: Every time a politician asks for schools to teach more of X, they should tell us what we should be teaching less of to make space for it.
Definitely!
Couldn't agree more. This resonates so much with what I see daily in my classes. It perfectly follows your previous point about needing practical digital literacy, not just theory. The tree octopus example is quite telling. If they can't spot that, how do we prepare them for more complex online disinformation?
Thank you for this post! Source savviness (or lack thereof) is something that keeps me up at night when thinking about the most essential skills with which we equip our students! Love your idea for the pre-and post assessment.
I wrote an article about this for TES last year: https://bernardandrews.substack.com/publish/post/178420823
Thanks for an interesting post. When I was teaching primary-aged youngsters we'd devise various strategies to help develop research skills. In simple terms this meant giving them things to find out, usually factual, and a range of books and photocopied articles to use. This was pre-Internet days. They had to use at least three sources and cite them to back up their answers. Not dissimilar to "... create a pre-test consisting of four websites...". I think the trickier thing now is that there is so much information out there. So many shiny things to pick from, or to distract you. Is there any secure research on whether the mass of information, including on social media, has made people more gullible? I'm aware that many people do get scammed, sometimes at great personal cost, but has the percentage increased, I wonder? I realise too that my adolescent naivety, for example thinking The Beatles were just four fun-loving, innocent mop-top lads, probably wouldn't have put me in a life-threatening situation. By the time I was in my 20s I could see it was all drugs and sex and rock 'n' roll. Now kids seem to be sucked into some nasty experiences through websites. I wish schools well with this endeavour, though I suspect other generations probably need the input as well.
With the Tree Octopus, the issue is that pupils don't stop to think about how realistic what they're reading actually is - does it sound true that an octopus would accept dollar bills from visitors and use them to build its nest? Objectively, it's odd - and that's the bit that pupils need to know: if it sounds too extreme, it may not be true. Also, a side note - I'd argue that the Tree Octopus is disinformation rather than misinformation, given it seems to be a deliberate prank rather than the accidental sharing of inaccurate information despite the best intentions.
More broadly, it's important that pupils know there's no clear-cut way to separate truth from fiction; rather, it's about equipping them with skills to get a sense of what might be inaccurate or more truthful. Helen Blachford and I created the REVIEW model in 2019 after a two-week teacher trip to the USA about media literacy, standing for Reputation, Evidence, Verification, Intent, Emotions and Weigh it Up. It's important to help pupils identify specific evidence that they can verify, but also to consider whether the story feels like it's giving a one-sided view, is an opinion, or has gaps that don't quite make sense. Yes, they can verify content, but the nature of an exclusive story means such an article couldn't be verified.
It's also about explaining types of media or other information; the Information Neighbourhoods from Stony Brook University helps pupils to recognise that advertising and entertainment differ from journalism, and the ideas related to independence and accountability.
In terms of assessment, we can look at pupils' abilities to demonstrate the skills of the REVIEW model (or other options) and to justify their beliefs as to whether or not the content is accurate. Is it perfect? No. But just as it's difficult to know for sure if someone has accidentally passed on false information or is deliberately trying to trick you, it's hard to measure a subjective opinion and decide the extent to which someone's view is valid.
I'm clearly biased given I have taught Citizenship and have spent the last six years writing media literacy lessons and training teachers, but I think it's a good thing to add it to the curriculum; the alternative of not supporting young people is surely a poorer outcome.
How about every student taking a course in logic to begin with? Learn how to identify logical fallacies.
Fully agree with this
I'm afraid you are seriously mischaracterizing the SIFT approach, which was precisely a rejection of any checklist attempt to evaluate the item of interest. Instead, it uses lateral reading to quickly see whether the original source has a reputation for getting things right and, if not, to ask reliable sources about the claim. It gets the octopus hoax correct, needless to say.
SIFT on its own can't get anything right or wrong. The human being who uses it can. So are you saying that 100% of humans who use SIFT will spot the tree octopus hoax?
Media literacy starts here:
https://youtu.be/EjPlfUt4S9U?si=ly1TZHoSNO9Oz58f
https://youtu.be/EczqWXF_ch0?si=BQB1hx9lgKY_4Lgj
Teaching media literacy requires teachers to be media literate which may include a) reading more widely than the Guardian and BBC b) being critical of the Guardian and the BBC.
Never mind tree octopi.
Guardian in 2020
https://www.theguardian.com/commentisfree/2020/jun/09/conspiracies-covid-19-lab-false-pandemic
Also Guardian (2025)
https://www.theguardian.com/commentisfree/2025/jun/25/covid-lab-leak-theory-right-conspiracy-science
I’ve thought about this a lot. I think it’s a lot less about techniques for fact checking or critical thinking, although those are helpful and necessary. Truth seeking begins with curiosity. What we don’t teach is how to recognize when we are in a certainty mindset, how to let that go, and then tap into our curiosity and abilities to explore, analyze, and synthesize. And of course, recognize when others are so certain they’ve also lost their abilities to reason, take in new information, and wonder.
Only then do the tools become useful. I’ve seen really good results with that approach.
I certainly agree about curiosity. It's one of the cornerstones of learning.
I think the arboreal octopus is a very difficult thing to start with. In general, testing the truth of "something somebody said or did for a hoot" is an advanced skill. Misleading or fraudulent advertising is a lot easier to start with.
The crucial thing to me is the understanding of argument, logic and perspective. That is teachable. It’s 90% of history or English literature.
Their prefrontal cortex isn't fully online yet, so they react to tone, confidence and group approval more than to logic.
I get what you're saying, but there's no point in investigating falsehoods that demand nothing of us. These courses should focus specifically on misinformation which contains an implicit call-to-action, phishing scams and propaganda and such. This gives us something much more workable to, well, work with: training students to be skeptical of information that demands something from them, and training them to notice these demands. A media literacy course where all of the graduates fall for harmless pranks but don't fall for propaganda is fine and plausible, I think.