The spectacular bias

We must remind ourselves again that history as usually written (peccavimus) is quite different from history as usually lived: the historian records the exceptional because it is interesting – because it is exceptional. If all those individuals who had no Boswell had found their numerically proportionate place in the pages of historians we should have a duller but juster view of the past and of man.

This is from The Lessons of History, a short book that is deeply problematic in some parts and refreshingly frank in others.

This is somewhat understandable given the book is more than fifty years old. But its exhortation not to strip history from its historians and from ourselves – and so from the context within which it has been understood and transmitted – is timeless.

To begin with, do we really know what the past was, what actually happened, or is history “a fable” not quite “agreed upon”? Our knowledge of any past event is always incomplete, probably inaccurate, beclouded by ambivalent evidence and biased historians, and perhaps distorted by our own patriotic or religious partisanship. “Most history is guessing, and the rest is prejudice”.

I’d argue the same is true for the present. Our view of the world is inevitably shaped by what we find noticeable, what others do, and the context within which this happens.

This could be dictated by the medium – stories related visually are inherently biased by the availability and power of the images. It could also be impacted by time, technology, ideology, culture and many other factors.

But the spectacular reigns supreme. No one sets out to tell a boring anecdote in a bar. The world – the story, the reality – is, as with history, probably far more mundane.

Poisoning the well

Certainty is everywhere, fundamentalism is in full bloom. Legions of authorities cloaked in total conviction tell us why we should invade country X, ban The Adventures of Huckleberry Finn in schools, or eat stewed tomatoes; how much brain damage is necessary to justify a plea of diminished capacity; the precise moment when a sperm and an egg must be treated as a human being; and why the stock market will eventually revert to historical returns. A public change of mind is national news.

This is the first paragraph from On Being Certain by Robert A. Burton. I am just a couple of pages in, but this has stopped me short.

The book is ostensibly about the biological origins of the feeling of knowing, and how it is separate from “reason” and logic. But this paragraph perfectly encapsulates how the way society frames issues ignores and even rewards unwarranted certainty.

Modern media has an endemic sense of certainty. Journalistic convention is based on an underlying assumption of causation, of the world in front of us as the direct result of something that can be tracked down and explained. Something happened so there must be someone to talk to, or a bang that preceded it.

There’s no way it’s unknowable, or the result of complex interactions we can only tease out with time and after making many assumptions, dogged by problems of measurement and perception. As a result you get a lot of declarative, black-and-white statements.

When a professional athlete is doing well, for instance, we are furnished with stories of their extensive workouts. When they do poorly we hear about their troubled childhood and off-court issues. Or maybe they just suck now. There’s little room for underlying randomness, problematic measurements, statistical noise and mean reversion. A cause must be found and responsibility taken.
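The mean-reversion point is easy to demonstrate. Here is a hypothetical simulation (the numbers and the scenario are mine, not the author's): an athlete whose true skill never changes still produces hot and cold streaks from noise alone, and their best games tend to be followed by ordinary ones.

```python
# A hypothetical athlete with CONSTANT skill: every game is true skill
# plus random noise. No workouts, no troubled childhood, just variance.
import random

random.seed(42)
TRUE_SKILL = 20.0  # points per game, never changes
games = [TRUE_SKILL + random.gauss(0, 5) for _ in range(1000)]

# Take the "great" games (top ~10%) and look at the very next game.
threshold = sorted(games)[int(0.9 * len(games))]
followups = [games[i + 1] for i in range(len(games) - 1) if games[i] > threshold]

# The follow-up average sits near 20, not near the great games:
# extreme performances regress to the mean with no cause to find.
print(sum(followups) / len(followups))  # close to 20.0
```

A commentator watching this athlete would still see slumps and purple patches, and could tell a causal story about every one of them.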

The issue here is the need for a narrative. As a journalist, narrative is an important tool for grabbing someone’s attention, keeping it, and guiding them through a larger point. Or to highlight something specific and make it memorable.

But what does a narrative need? In this context it almost always entails simplistic cause and effect.

At the end of many news bulletins we get a financial update. We hear how this currency rose, a stock market over there fell, after-hours trading is stagnant, and so on. All fair enough, except that these moves are often immediately tied to a news hook.

The yen went down because a Yeti was spotted in Turkey. The Nasdaq rose because Dutch tulips were especially vibrant this year. Sunspots.

Narrative helps audiences connect with this kind of abstraction. But there’s no way causation for activities this complex was nailed down in the time between the signal and the news piece, if it ever can be.

These triggers are often big enough to have some association, but how much? How that was figured out would honestly be an even better story.

That last line from Burton, about a change of mind being national news, also deserves unpicking.

It is fair enough that leaders changing their minds about something is news. But the problem is in how it is approached. How often is the story about the change itself rather than what underlay the previous “belief” and how that changed? How good is the information, or, if that hasn’t changed, the mental model that reinterpreted it?

Beliefs often aren’t a binary proposition, especially when it comes to policy. Rather, they are about juggling trade-offs, dogged by information asymmetries and stretched resources (mental and otherwise).

But here I am also falling into the trap of beliefs as the function of logic and reason. As I’ve documented here, beliefs have many potential fathers. Perhaps biology is one.

Technology’s language bias

One of the many underlying tensions in Sri Lanka is that between people fluent in English and those who aren’t. After independence, when Sinhala was made the country’s official language, education was converted to “Swabasha”.

But the elite kept English. It is the medium in most private and international schools. University courses are mostly in English, as are the offices of big corporations. There is a huge wage premium in an English education.

All of which has resulted in inevitable pushback.

The debate over English education has been a backdrop to my last couple of months, as I’ve been learning to code and scraping the web.

Just the other day I was scraping the Sri Lankan parliament website and remarked to my grandpa that the HTML is all in English. But why is this?

This is the central point of a great Wired article:

In theory, you can make a programming language out of any symbols. The computer doesn’t care. The computer is already running an invisible program (a compiler) to translate your IF or <body> into the 1s and 0s that it functions in, and it would function just as effectively if we used a potato emoji 🥔 to stand for IF and the obscure 15th century Cyrillic symbol multiocular O ꙮ to stand for <body>. The fact that programming languages often resemble English words like body or if is a convenient accommodation for our puny human meatbrains, which are much better at remembering commands that look like words we already know.

But only some of us already know the words of these commands: those of us who speak English. The “initial promise of the web” was only ever a promise to its English-speaking users, whether native English-speaking or with access to the kind of elite education that produces fluent second-language English speakers in non-English-dominant areas.

There are a couple of multilingual programming languages, and programming languages based on other natural languages, but they’re nowhere near as well supported or extended.

Without a large community of fellow travellers there aren’t big archives of questions and answers to query when you yourself have a problem, or huge repositories of packages and modules to extend your code.

I’ve recently been studying natural language processing, which is another domain with a huge bias towards English. The letters and symbols of some languages still aren’t even supported by Unicode.
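A small illustration of where the bias bites, using Python's standard unicodedata module (the example word is my own): what a Sinhala reader sees as two letters is three Unicode code points, so any tool that assumes one code point per character, or leans on [a-z] regexes, quietly mishandles the script.

```python
# What a reader sees as two Sinhala characters is three code points:
# two base letters plus a combining sign. English-centric assumptions
# (one code point == one character, [a-z] patterns) break here.
import unicodedata

word = "නම්"  # "if" in Sinhala; renders as two characters
print(len(word))  # 3 code points
for ch in word:
    print(f"U+{ord(ch):04X}", unicodedata.category(ch), unicodedata.name(ch))
```

The combining sign has Unicode category Mn (nonspacing mark), which is exactly the kind of detail English-only tooling never has to think about.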

The Wired article ends with a hopeful message that we might eventually have “Swahili HTML” and “Russian HTML” in addition to “English HTML” (rather than HTML being synonymous with English). The author notes that people learn coding better in their native tongues, and that European writing was once synonymous with Latin before branching into many vernaculars.

But there needs to be a blend of these answers. There likely won’t be any one small natural-language-based programming language that can compete with a large one for usability and flexibility. But that doesn’t mean localised computer languages can’t be powerful, useful or a great pathway.

Until then a lot of the power of modern technologies will remain the privilege of those with English language skills.

Tab dump

Research, articles, podcasts and videos in no particular order.