There is a pervasive idea in Western culture that humans are essentially rational, deftly sorting fact from fiction, and, ultimately, arriving at timeless truths about the world. This line of thinking holds that humans follow the rules of logic, calculate probabilities accurately, and make decisions about the world that are perfectly informed by all available information. Conversely, failures to make effective and well-informed decisions are often chalked up to failures of human reasoning—resulting, say, from psychological tics or cognitive biases… Models of social learning help us see that this picture of human learning and rationality is dangerously distorted. What we see in these models is that even perfectly rational—albeit simple—agents who learn from others in their social network can fail to form true beliefs about the world, even when more than adequate evidence is available. In other words, individually rational agents can form groups that are not rational at all.

This is from The Misinformation Age by Cailin O’Connor and James Owen Weatherall, which I first referenced a couple of days ago.

At first glance, the idea that your beliefs are a product of the people you surround yourself with seems quite banal. Of course most of us haven't personally verified even a fraction of our "knowledge" – from the mathematical heuristics we learn in school to the actual size of Greenland.

In fact, the ability to share information – both bad and good – is a major factor in our success as a species.

But unpack this a little more – as the authors of this book do masterfully – and it starts to dawn on you just how devastating poor information hygiene really is. Your personal store of knowledge, or model of the world, isn't so much the product of your own "filter" as of the filters of those around you. And around them. And around them…

And, as the models that O'Connor and Weatherall construct show, it isn't just deliberate misinformation that can affect those in a social network. Misinformation is only one of the ways companies and other interested parties can subtly put a finger on the scale.

Rather, both deliberate and unconscious misunderstandings of uncertainty and randomness can filter through to the unwitting, not least by curtailing what seems possible or sending us down wrong tracks. And this is before adding the complexities of conformity bias, clustering and selection, distrust, and so on.
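To give a flavour of what these models look like, here's a toy simulation of my own construction – loosely inspired by the bandit-style network models the book draws on, not the authors' actual code. A group of perfectly Bayesian agents share evidence about whether a "good" action really is better than the status quo, but only agents who already lean towards it bother to test it:

```python
import random

random.seed(42)  # reproducible run

N = 10           # agents, all connected to one another
P_GOOD = 0.55    # true success rate of the "good" action (baseline is 0.5)
TRIALS = 5       # tests per round by each agent who favours the good action
ROUNDS = 50

def likelihood_ratio(successes, trials):
    """P(data | good action is better) / P(data | it is not)."""
    p1, p0 = P_GOOD, 0.5
    return ((p1 ** successes) * ((1 - p1) ** (trials - successes)) /
            ((p0 ** successes) * ((1 - p0) ** (trials - successes))))

# Each agent starts with a random credence that the good action is better.
credences = [random.random() for _ in range(N)]

for _ in range(ROUNDS):
    # Only agents who already lean towards the good action test it --
    # the others see no reason to bother.
    evidence = [(sum(random.random() < P_GOOD for _ in range(TRIALS)), TRIALS)
                for c in credences if c > 0.5]
    # Everyone updates (via Bayes) on all evidence shared over the network.
    for s, t in evidence:
        lr = likelihood_ratio(s, t)
        credences = [c * lr / (c * lr + (1 - c)) for c in credences]

believers = sum(c > 0.5 for c in credences)
print(f"{believers}/{N} agents end up believing the better action is better")
```

The trap this sketch illustrates: if every agent's credence ever drops below 0.5 at once – entirely possible on an unlucky run of evidence, since the good action only wins 55% of the time – no one generates evidence any more, and the whole group locks permanently into the false belief. Every individual updated rationally; the group still failed.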

So, what to do? I don’t know. But consider this:

…we need to understand the social character of belief—and recognize that widespread falsehood is a necessary, but harmful, corollary to our most powerful tools for learning truths… When we open channels for social communication, we immediately face a trade-off. If we want to have as many true beliefs as possible, we should trust everything we hear. This way, every true belief passing through our social network also becomes part of our belief system. And if we want to minimize the number of false beliefs we have, we should not believe anything.

As always, my emphasis.