Certainty is everywhere, fundamentalism is in full bloom. Legions of authorities cloaked in total conviction tell us why we should invade country X, ban The Adventures of Huckleberry Finn in schools, or eat stewed tomatoes; how much brain damage is necessary to justify a plea of diminished capacity; the precise moment when a sperm and an egg must be treated as a human being; and why the stock market will eventually revert to historical returns. A public change of mind is national news.
This is the first paragraph from On Being Certain by Robert A. Burton. I am just a couple of pages in, but this has stopped me short.
The book is ostensibly about the biological origins of the feeling of knowing, and how it is separate from “reason” and logic. But this paragraph perfectly encapsulates how the way society frames issues ignores, and even rewards, unwarranted certainty.
Modern media has an endemic sense of certainty. Journalistic convention is based on an underlying assumption of causation, of the world in front of us as the direct result of something that can be tracked down and explained. Something happened so there must be someone to talk to, or a bang that preceded it.
There’s no way it’s unknowable, or the result of complex interactions we can only tease out with time and after making many assumptions, dogged by problems of measurement and perception. As a result you get a lot of declarative statements, black and white.
When a professional athlete is doing well, for instance, we are furnished with stories of their extensive workouts. When they do poorly we hear about their troubled childhood and off-court issues. Or maybe they just suck now. There’s little room for underlying randomness, problematic measurements, statistical noise and mean reversion. A cause must be found and responsibility taken.
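The pull of mean reversion here is worth making concrete. A minimal sketch, assuming performance is nothing more than fixed skill plus game-to-game noise (the athlete count, skill distribution and noise level are all my own illustrative choices):

```python
import random

random.seed(42)

# Illustrative assumption: 1,000 "athletes", each with a fixed true
# skill, and observed performance = skill + random noise each season.
athletes = [random.gauss(0, 1) for _ in range(1000)]  # true skill

def season(skills):
    # Noise of the same magnitude as the skill differences
    return [s + random.gauss(0, 1) for s in skills]

year1 = season(athletes)
year2 = season(athletes)

# Take the top ~10% of year-one performers...
cutoff = sorted(year1, reverse=True)[100]
top = [i for i, p in enumerate(year1) if p >= cutoff]

avg1 = sum(year1[i] for i in top) / len(top)
avg2 = sum(year2[i] for i in top) / len(top)

# ...and, on average, they decline in year two. No change in skill,
# no off-court issues, just the lucky noise of year one washing out.
print(f"top group, year 1 average: {avg1:.2f}")
print(f"top group, year 2 average: {avg2:.2f}")
```

The group’s skill never changes between the two seasons; the apparent “slump” is entirely an artefact of having selected on a noisy measurement.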
The issue here is the need for a narrative. As a journalist, narrative is an important tool for grabbing someone’s attention, keeping it, and guiding them through a larger point. Or to highlight something specific and make it memorable.
But what does a narrative need? In this context it almost always entails simplistic cause and effect.
At the end of many news bulletins we get a financial update. We hear how this currency rose, a stock market over there fell, after-hours trading is stagnant, and so on. All fair enough, except that they’re often immediately tied to a news hook.
The Yen went down because a Yeti was spotted in Turkey. The Nasdaq rose because Dutch tulips were especially vibrant this year. Sunspots.
Narrative is useful for audiences to connect with this kind of abstraction. But there’s no way causation for activities this complex was nailed down in the time between the signal and the news piece, if it ever can be.
These triggers are often big enough to have some association, but how much? How that was figured out would honestly be an even better story.
That last line from Burton, about a change of mind being national news, also deserves unpicking.
It is fair enough that leaders changing their minds about something is news. But the problem is in how it is approached. How often is the story about the change itself rather than what underlay the previous “belief” and how that changed? How good is the information, or, if that hasn’t changed, the mental model that reinterpreted it?
Beliefs often aren’t a binary proposition, especially when it comes to policy. Rather, they are about juggling trade-offs, dogged by information asymmetries and stretched resources (mental and otherwise).
But here I am also falling into the trap of treating beliefs as a function of logic and reason. As I’ve documented here, beliefs have many potential fathers. Perhaps biology is one.
Watching professional sports, you often see a team that is behind suddenly go into desperation mode. The clock is ticking, so a flailing three pointer is launched from ten feet behind the line. Or the batter suddenly tries to hit the skin off of every pitch.
In reality it’s often not so dire. And trying to catch up in one go will likely doom you to failure. Hence the refrain – heard in many sports, not just baseball – that the way to go is by hitting singles, not home runs.
Just get onto first base. The person behind you will try to get you to second, and so on. Go for a two-pointer and not a three.
Don’t try to win it in one go. You won’t be the big hero. But you’ve got a better chance of succeeding.
I’ve been thinking about the weight given to big leaps, and in turn the relegation of smaller, safer gains, as I continue to read The Hidden Half.
We need to face the possibility that big influences are not as orderly or consistent as we expect, that the way things turn out is bound less by observable laws, forces or common factors than by the mass of uncommon factors, the jumble of hidden, micro-influences. Our habit of thinking of this as ‘noise’–and then thinking of ‘noise’ in turn as an annoying residual–diminishes one of life’s most magical elements.
Because of course, just as on the playing field, the heroes of academia and intellect are the ones who make the big play, not the tinkerers and exception finders.
But so often these big leaps wind up as incongruities, or pale in significance next to the influence of smaller ones.
It’s not as sexy, but perhaps we should be emphasising something else – singles, not home runs.
We dream of laws and general truths; the practicality is often a patchwork of unexpected anomalies. Run with these ideas, apply them more widely, and you begin to conceive a world bustling with powerful but enigmatic differences that we just don’t see.
As always my emphasis.
…we can’t help turning up our pattern-making instinct to 11–when life offers only a 5. Too often, we make bold claims about big forces with law-like effects, but with culpable overconfidence that leads us to waste time, money, talent and energy, and detract from real progress… I’d like our claims to be more proportionate to the awkwardness of the task. Every new generation needs reminding of the overconfidence of every previous generation, of how much there is still to know and do, and, above all, how resistant the raw materials of life can be.
Reading books like Thinking In Bets, The Lady Tasting Tea and The Drunkard’s Walk, it’s hard not to be thoroughly disaffected with the deterministic model of the universe most of us carry in our heads.
Green tea causes weight loss, your aunt tells you. You should try to get into that school because it’s the best, they say.
In fact, it’s tempting to draw this back to school, where we’re taught to find the right answer, not the best approximation of one. Confounding, selection, randomness and the dozens of other thorns in simple causation aren’t even really hinted at.
It’s like a civilisation-wide Dunning-Kruger effect. We engage in pattern matching, fuelled by ascertainment and confirmation bias.
And, most importantly for The Hidden Half, where these excerpts are from, we try to boil all of this down into iron laws. The “noise” that inevitably screws up these simple heuristics is willed away or ignored, to be settled later.
But it’s here where author Michael Blastland really shines – in a plea to embrace the beauty of that which confounds our attempts at simplification.
I’m only a couple of chapters in but it’s already a rollicking ride.
I’ve no desire to dismiss or discourage genuine, careful and humble efforts to understand, and no desire either to knock down robust houses of brick alongside the mansions of straw. It would be easy, but deluded, to see this book as part of an anti-science cynicism that says everything is uncertain, and therefore nothing can be done. I reject that view entirely. On the contrary, I want more robust evidence precisely so that our decisions and actions can be more reliable. I sympathize entirely with how difficult it is to do that well. I applaud those who devote themselves to the problem conscientiously and carefully. This is why we must recognize our limitations, try to understand how they arise, tread more carefully and test what we know vigorously. It was once said that at certain times the world is over-run by false scepticism, but of the true kind there can never be enough. This book aspires to the true kind. The goal is not cynicism; it is to do better.
As always my emphasis.
People who excel at programming, notes the coder and tech-culture critic Maciej Cegłowski, often “become convinced that they have a unique ability to understand any kind of system at all, from first principles, without prior training, thanks to their superior powers of analysis. Success in the artificially constructed world of software design promotes a dangerous confidence.”
This is from Coders, a book I only just downloaded but am absolutely tearing through.
The subtitle is “how software programmers think, and how their thinking is changing our world”, which is a clue to what Cegłowski is referring to.
When you’re writing code you’re trying to break a process down, to first principles and then into easy steps as you go along.
You build it back up in an environment over which you have a huge amount of control, that thrives on trial, error and iteration.
Where something usually either works or breaks obviously. Everything is very structured and built upon logic.
But by this point you’ve also abstracted so much you can trick yourself into thinking you’ve mastered all the nuances, not just how to get from A to B.
It’s also an alluring way of thinking, which you begin applying to other problems in your life. In a similar way to how you can start thinking in another language if you are sufficiently steeped in it.
This is a fantastic book so far. Hope to post some more.
As always my emphasis.
There is a pervasive idea in Western culture that humans are essentially rational, deftly sorting fact from fiction, and, ultimately, arriving at timeless truths about the world. This line of thinking holds that humans follow the rules of logic, calculate probabilities accurately, and make decisions about the world that are perfectly informed by all available information. Conversely, failures to make effective and well-informed decisions are often chalked up to failures of human reasoning—resulting, say, from psychological tics or cognitive biases… Models of social learning help us see that this picture of human learning and rationality is dangerously distorted. What we see in these models is that even perfectly rational—albeit simple—agents who learn from others in their social network can fail to form true beliefs about the world, even when more than adequate evidence is available. In other words, individually rational agents can form groups that are not rational at all.
This is from The Misinformation Age by Cailin O’Connor and James Owen Weatherall, which I first referenced a couple of days ago.
At first glance, the idea that your beliefs are a product of the people you surround yourself with is quite banal. Of course, most of us haven’t personally verified even a fraction of our “knowledge” – from the mathematical heuristics we learn in school to the actual size of Greenland.
In fact the ability to share information – both bad and good – is a major factor in our success as a species.
But unpack this a little more – as the authors of this book do masterfully – and it starts to dawn on you how devastating poor information hygiene really is. Your personal store of knowledge, or model of the world, isn’t so much the product of your own “filter” as the filter of those around you. And around them. And around them…
And, as the models that O’Connor and Weatherall construct show, it isn’t just deliberate misinformation that can affect those in a social network. In fact, misinformation is only one of the ways companies and other interested parties can subtly put their finger on the scale.
Rather, both deliberate and unconscious misunderstanding of uncertainty and randomness can filter through to the unwitting, not least by curtailing the possible or sending us down wrong tracks. And this is before adding the complexities of conformity bias, clustering and selection, distrust etc.
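A toy sketch of how this can play out. This is my own crude example, nothing like the bandit-style models in the book: agents on a ring each “rationally” adopt the majority view among themselves and their two neighbours, but a clustered false belief turns out to be locally self-confirming.

```python
# My own toy model (not the authors'): 20 agents in a ring, each
# holding a belief. True = the accurate belief, False = the mistaken one.
N = 20
beliefs = [True] * N
for i in range(6, 14):  # seed a contiguous cluster of false believers
    beliefs[i] = False

def step(b):
    # Each agent follows the majority of itself and its two neighbours
    out = []
    for i in range(len(b)):
        votes = [b[i - 1], b[i], b[(i + 1) % len(b)]]
        out.append(votes.count(True) >= 2)
    return out

for _ in range(50):
    beliefs = step(beliefs)

# Every agent updated sensibly on its local evidence, yet the cluster
# of false believers survives indefinitely: the group never converges.
print(beliefs.count(False), "agents still hold the false belief")
```

Each individual rule is defensible – defer to the people around you – but because the mistaken agents mostly hear from each other, the group as a whole never self-corrects, which is exactly the individually-rational-but-collectively-irrational pattern the excerpt above describes.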
So, what to do? I don’t know. But consider this:
…we need to understand the social character of belief—and recognize that widespread falsehood is a necessary, but harmful, corollary to our most powerful tools for learning truths… When we open channels for social communication, we immediately face a trade-off. If we want to have as many true beliefs as possible, we should trust everything we hear. This way, every true belief passing through our social network also becomes part of our belief system. And if we want to minimize the number of false beliefs we have, we should not believe anything.
As always, my emphasis.