One of my favourite YouTube channels is called Kurzgesagt. They recently released a video explaining why you should trust their short, animated videos on topics as diverse as string theory, ageing and homeopathy.
The interesting part is not the deep dive into their research, writing and fact-checking process, but that they spend almost half the video utterly ripping themselves to shreds.
They call out two videos specifically – one on refugees and another on addiction. They explain why those videos were problematic and why they have been removed.
I trust them so much more because of this self-flagellation. Because they are willing to admit their mistakes and bias. To explain why these were failures and how they were made. To flesh out the context, what has changed and why.
Going through this so comprehensively makes me believe they’ve learnt from their mistakes. And doing so in a prominent space (rather than, for instance, newspaper corrections being buried on page 15) shows they take it seriously.
Being right is a process, is hard, is often undignified, and it doesn’t get easier. Just look at how many public institutions we’ve built around these principles.
Unfortunately the same can’t be said for most of our media. They tend to prefer a model of trust built on prominence and obscurity rather than transparency. They seek to wish away bias rather than own and deal with it.
That doesn’t work anymore.
There is a pervasive idea in Western culture that humans are essentially rational, deftly sorting fact from fiction, and, ultimately, arriving at timeless truths about the world. This line of thinking holds that humans follow the rules of logic, calculate probabilities accurately, and make decisions about the world that are perfectly informed by all available information. Conversely, failures to make effective and well-informed decisions are often chalked up to failures of human reasoning—resulting, say, from psychological tics or cognitive biases… Models of social learning help us see that this picture of human learning and rationality is dangerously distorted. What we see in these models is that even perfectly rational—albeit simple—agents who learn from others in their social network can fail to form true beliefs about the world, even when more than adequate evidence is available. In other words, individually rational agents can form groups that are not rational at all.
This is from The Misinformation Age by Cailin O’Connor and James Owen Weatherall, which I first referenced a couple of days ago.
At first glance, the idea that your beliefs are a product of the people you surround yourself with is quite banal. Of course most of us haven’t personally verified even a fraction of our “knowledge” – from the mathematical heuristics we learn in school to the actual size of Greenland.
In fact the ability to share information – both bad and good – is a major factor in our success as a species.
But unpack this a little more – as the authors of this book do masterfully – and it starts to dawn on you how devastating poor information hygiene really is. Your personal store of knowledge or model of the world isn’t so much the product of your own “filter”, but the filter of those around you. And around them. And around them…
And, as the models that O’Connor and Weatherall construct show, it isn’t just deliberate misinformation that can affect those in a social network. In fact, misinformation is only one way that companies and other interested parties can subtly put a finger on the scale.
Rather, both deliberate and unconscious misunderstanding of uncertainty and randomness can filter through to the unwitting, not least by curtailing the possible or sending us down wrong tracks. And this is before adding the complexities of conformity bias, clustering and selection, distrust, and so on.
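To get a feel for how this works, here is a toy sketch in the spirit of the Bala–Goyal-style models the book describes – not the authors’ actual models, and all the parameter names and values are illustrative assumptions. Agents share a network; those who currently believe a new action is better try it and share their results, and everyone updates by Bayes’ rule. The catch: agents who don’t believe in the new action never generate evidence about it, so a group can settle permanently on a false belief even though every individual updates rationally.

```python
import random

def simulate(n_agents=6, rounds=50, p_new=0.51, p_old=0.5, trials=10, seed=1):
    """Toy social-learning model on a fully connected network.

    Each agent holds a credence that a new action (true success rate p_new)
    beats the old one (known rate p_old). Agents with credence > 0.5 try the
    new action; all observed results are shared, and every agent updates its
    credence by Bayes' rule. Returns the final list of credences.
    """
    rng = random.Random(seed)
    credences = [rng.random() for _ in range(n_agents)]

    for _ in range(rounds):
        # Only believers experiment – sceptics generate no evidence at all.
        results = []
        for c in credences:
            if c > 0.5:
                successes = sum(rng.random() < p_new for _ in range(trials))
                results.append(successes)

        # Everyone updates on every shared result (likelihood-ratio update).
        for successes in results:
            failures = trials - successes
            like_new = p_new**successes * (1 - p_new) ** failures
            like_old = p_old**successes * (1 - p_old) ** failures
            for i, c in enumerate(credences):
                credences[i] = (c * like_new) / (c * like_new + (1 - c) * like_old)

    return credences
```

Because p_new and p_old are so close, runs of bad luck early on can push the whole network below the 0.5 threshold, after which no one experiments again and the false belief is locked in – the “individually rational agents, collectively irrational group” result in miniature.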
So, what to do? I don’t know. But consider this:
…we need to understand the social character of belief—and recognize that widespread falsehood is a necessary, but harmful, corollary to our most powerful tools for learning truths… When we open channels for social communication, we immediately face a trade-off. If we want to have as many true beliefs as possible, we should trust everything we hear. This way, every true belief passing through our social network also becomes part of our belief system. And if we want to minimize the number of false beliefs we have, we should not believe anything.
As always, my emphasis.