The photo below is of me with my grandmother standing by her market stall in Petticoat Lane, a street market in London.
When that photo was taken, she’d worked on that stall every Sunday for nearly 50 years.
Working on a market stall gives you many skills, and one of hers was incredibly quick mental arithmetic. A couple of years before that photo was taken, I remember asking her if I could count some of the coins from a day’s takings. I had seen her do it so many times, and she made it look so easy. After a minute or so, I realised it wasn’t. My hands and my brain just couldn’t work as quickly as hers.
She was not the only one with those skills. Most of the other stallholders were similar. People used to tell stories about older men who could “beat the Tote” – that is, who could calculate in their heads the results of the on-track betting totalizator before they flashed up on the 1930s equivalent of a big screen. Other people of that generation had useful skills too. At least one person in every extended family seemed able to play the piano to a decent standard.
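For anyone unfamiliar with how a totalizator works, here is a minimal sketch of the arithmetic involved, under the standard pari-mutuel rules: all the stakes go into a pool, the operator takes a cut, and winning bets share what remains in proportion to their stakes. The horse names, stake sizes and takeout rate below are entirely made up for illustration.

```python
# A minimal sketch of the arithmetic behind “beating the Tote”.
# In pari-mutuel (totalizator) betting, winning bets share the pool
# after the operator’s cut. All figures here are illustrative.

def tote_dividend(pool_per_horse, winner, takeout=0.10):
    """Payout per unit staked on the winning horse.

    pool_per_horse: total amount staked on each horse
    winner: name of the winning horse
    takeout: the operator's cut (an assumed, illustrative 10%)
    """
    total_pool = sum(pool_per_horse.values())  # everything bet on the race
    net_pool = total_pool * (1 - takeout)      # what is left after the cut
    return net_pool / pool_per_horse[winner]   # shared among winning stakes

# A hypothetical three-horse race.
pools = {"Horse A": 600, "Horse B": 300, "Horse C": 100}
print(f"Dividend per £1 on Horse B: £{tote_dividend(pools, 'Horse B'):.2f}")
# Prints £3.00 – the kind of figure those men could work out in their heads
```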
This is all the more remarkable when you consider that most of this generation had little formal education. My grandmother’s generation had their education badly disrupted by the war, and the generation before that had only a few years of compulsory elementary education.
Do younger generations have the same set of skills? Do adults born in the 1980s and 1990s have the same speed at mental arithmetic? Is the ability to play a musical instrument as widespread?
My instinct is no. But claims about the relative capabilities of different generations are very controversial, and surprisingly tricky to test with data. Still, there are two important pieces of research that shed light on these questions: 1) data on declining IQ scores and 2) research on what happens to our brains when we use technology.
1) The reversal of the Flynn effect
For most of the 20th century, scores on IQ tests steadily increased. This trend was termed the Flynn effect, after James Flynn, who described it. But in developed countries, this increase appears to have stopped and even reversed for cohorts born in the latter decades of the 20th century.
The most robust evidence for the reversal of the Flynn effect comes from Scandinavian countries, which have comprehensive and consistent data on the IQ scores of 18-year-old military conscripts. Nothing so comprehensive exists for the UK, but Flynn himself analysed historical UK data and concluded that there is a dip in the IQ scores of teenagers from the 1980s onwards. He tentatively suggested that one causal factor might be changes in how teenagers spend their leisure time.
Obviously, one major change that’s happened in this era has been the increasing prevalence of technology. This brings us to the second important area of research: how technology affects our brains.
2) The cognitive offload
The “cognitive offload” is the term for when you use technology to spare yourself some mental effort. Humans really like doing this: we are cognitive misers, and we try to expend as little mental effort as possible. Using a calculator to do a maths problem, writing down a shopping list instead of trying to remember it, using GPS to navigate, pressing play on Spotify instead of sitting down at a piano – all of these are examples of cognitive offloading.
The cognitive offload is a mixed blessing. It will generally let you achieve your immediate goal more efficiently and reliably than otherwise. However, it will stop you developing your own abilities. People who use GPS get to their destination faster than people who don’t, but they develop weaker mental representations of their journey. People who use cameras to take photos instead of drawing will get faster and higher-quality images, but they won’t remember as much about what they see.
That’s why I’ve written before that if your goal is to get a task done quickly, you should definitely use technology. If your goal is to develop your skills, you shouldn’t. If you want to travel 26 miles as quickly as possible, drive or get a taxi – just don’t pretend that you’ve run a marathon.
You also shouldn’t pretend that by cognitively offloading a task, you will magically get smarter because you can focus on more advanced and complex skills. More advanced and complex skills are based on simpler ones. If you don’t have the simpler skills, you can’t develop the more advanced ones. You can’t think strategically about chess unless you know how the pieces move. And the cognitive offload makes it less likely we will get the practice that lets us acquire the basic skills.
In an incredibly important recent paper on the impact of AI on education, Oakley et al. make the link between the reversal of the Flynn effect and the rise of cognitive offloading. In their words: “frequently offloading cognitive work to devices may cause certain ‘mental muscles’ to atrophy.”
We might not be smarter, but we have nicer lives
It is plausible, therefore, that cognitive offloading can lead to societies becoming more prosperous, more advanced, longer-lived – and less smart.
The more we get machines to do things, the better our lives are. You can see this through the evolution of retail from street markets to supermarkets.
It is impossible for old-fashioned London street markets of the type I grew up working on to sustain the kinds of living standards people expect today. If you want to be nostalgic about these markets, go for it – I have my moments – but be honest and acknowledge what it would mean.
To sustain modern society, you need supermarkets, cold chains, barcodes, computerised tills, sophisticated logistics, and much more – all of it underpinned by machines that do calculations far more quickly and reliably than even the most experienced market worker.
But every solution contains within it the seed of a new problem. In fact, there are no solutions, only trade-offs. The new problem here is that a lot of the basic routine tasks we’ve offloaded to machines had very useful by-products that we have now lost.
First, they were the pathway to more advanced skills which machines have not yet mastered, and which therefore retain value in the labour market. For example, if you spent years working on a racecourse calculating basic odds, it would help you develop more advanced skills in probability and statistics, which are still very valuable.
Second, those basic skills allowed people to develop their human potential. For example, if you learnt to play the piano when you were younger, it could become something you did for pleasure and enjoyment when you were older.
Cognitive Scrooges
OK, so we no longer need to do basic routine tasks to earn a living. But if those tasks are so important, surely we can choose to do them?
The guilty truth is that if we don’t have to do something, we often don’t. Most people need some kind of discipline or structure, particularly when they are starting out. A hundred years ago, you were incentivised to learn basic skills by some obvious carrots and sticks.
The carrot was that even quite basic skills could earn you money and make your friends and family happy. The stick was that daily life forced a lot of tedious practice on to you, and if you made mistakes you would look foolish and cause yourself problems.
Today, the incentives have changed. You have to reach much higher standards to add any economic or social value. And daily life doesn’t force the same practice on you.
What’s happened with music and maths in the past century or so is now coming for reading and writing. Large Language Models could mean that for many people – not all, but many – the costs of learning to read and write well may no longer be justified by the benefits, and the daily practice in reading and writing that society and the economy used to force on them is no longer there.
Most text messaging services let you send voice notes and provide you with audio read-outs of text. Consumer software is designed to be so intuitive that it doesn’t need instructions. Entry-level jobs replace text with images: fast food tills have pictures of each food item, not words.
One of the utilitarian justifications for learning to read and write was that it was hard to negotiate life without basic literacy. It’s now getting much easier.
Obesogenic and stupidogenic societies
Previously, I’ve written about some of the parallels between the historical moment when physical machines outstripped humans, and our own moment, when thinking machines may well be outstripping humans.
Here is another parallel.
People often describe our current society as “obesogenic” – that is, a society that makes it easy to become overweight. We have abundant food, and limited need for hard physical labour. The problems that tormented humans for generations – the scarcity of food and the necessity of backbreaking toil – have been replaced by a new one.
Is there a parallel now that thinking machines are getting so good? Most people don’t have to do hard cognitive labour, and technology can provide us with so much easy entertainment and distraction.
But these improvements might have created a new problem: it’s now easy to be stupid.
The Times writer James Marriott has suggested that our current media environment is “moronogenic”. Gurwinder Bhogal suggests we are living through an intellectual obesity crisis.
Relatively speaking, these are nice problems to have. I’d rather live in an obesogenic society than one with famine. I’d rather live in a stupidogenic society than one with no technology.
But they are still problems.
The cost of physical machines is human obesity; the cost of intelligent machines is human stupidity.
How will people react?
Whilst we live in an obesogenic society, there are countervailing trends. Many people have taken up physical activity as a leisure pastime. As recently as the 1970s, joggers were laughed at as faddish weirdos, but now Parkrun attracts millions of participants every Saturday morning. People pay money to go to the gym, run marathons and take part in events like Ironman and Tough Mudder – activities that previous generations would have thought insane.
Some people alive today are probably fitter and healthier than any in history. And some people are not, and there is a huge range in between. Liberating people from physical toil has led to a much wider range of physical fitness.
Maybe something similar will happen with intellectual fitness. Cognitive games, puzzles and competitions will become very popular. Adults will employ personal tutors and coaches to help them win. Who knows what the intellectual equivalent of Tough Mudder will be?
Some people will end up smarter and cleverer than any in human history. Some will not. And there will be a huge range in between.
What should schools do about this?
I think schools should respond to this change by becoming gymnasia for the mind.
They should be places that allow you to practise and acquire the basic skills that are no longer directly rewarded in daily life, but which are still vital.
However, many people have drawn the exact opposite conclusion. They see the development of powerful artificially intelligent thinking machines as an opportunity for students to stop doing the basics, and to “focus on the things that machines can’t do”.
In a world where machines can do so much and are constantly learning to do more, that’s an incredibly reductive vision of education, and one that allows machines to set a ceiling on what humans can do.
It’s a recipe for a stupidogenic society.