If you’re going to change the world, you must reflect it first

I find taking public transport or hopping a plane immensely stressful. Not because of the shoddy infrastructure, waiting around, or poor service. Because I’m 6′4″ with disproportionately long legs in a world built by people who aren’t.

As I continue to read Coders, I’m increasingly worried about how this same phenomenon will play out in a world full of algorithmic black boxes: code so complex and systems so arcane that even their creators struggle to understand them.

Techies love to talk about scale and putting their creations in front of millions. But for this to work, they themselves need to be drawn from a representative pool.

Otherwise you get self-driving cars that are more likely to hit black people. Or image recognition that thinks black people are gorillas.

…then Alciné scrolled over to a picture of himself and a friend, in a selfie they’d taken at an outdoor concert: She looms close in the view, while he’s peering, smiling, over her right shoulder. Alciné is African American, and so is his friend. And the label that Google Photos had generated? “Gorillas.” It wasn’t just that single photo, either. Over fifty snapshots of the two from that day had been identified as “gorillas.”

This isn’t only a Google problem. Or even a Silicon Valley problem. There are also stories of algorithms trained in China and South Korea that have trouble recognising Caucasian faces.

As a journalist with a diverse ethnic and cultural background, I had trouble understanding why my editors took so much convincing to run foreign stories. With a family spread around the globe, I could see myself in the Rohingya as much as in an Australian farmer.

These issues are linked: what we value, notice and think of as “normal” is informed by our personal stories. If you grow up or work in a monoculture, that will influence the issues you see, the solutions you propose and the contingencies you plan for.

But the world isn’t a monoculture. There are 6′4″ people who would like to ride the bus. There will be people who aren’t like you but who need to cross the street safely, or be judged fairly.

Who will be deeply offended by racial epithets that are themselves linked to why they aren’t represented in the training data.

If you’re going to try to change the world for the better, you need to be of the world. There will always be edge cases, but without diversity they will be systemic. They will be disastrous.

…why couldn’t Google’s AI recognize an African American face? Very likely because it hadn’t been trained on enough of them. Most data sets of photos that coders in the West use for training face-recognition are heavily white, so the neural nets easily learn to make nuanced recognitions of white people—but they only develop a hazy sense of what black people look like.

As always, my emphasis.
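
The mechanism in that last quote is easy to demonstrate. Below is a minimal, hypothetical sketch (synthetic data, scikit-learn, nothing from the book or from Google’s actual system): a classifier trained on a dataset dominated by one group learns that group well and stays hazy on the under-represented one.

```python
# Hypothetical sketch of the mechanism the quote describes: train a
# classifier on data dominated by one group, then score it per group.
# Everything here is synthetic; it is not Google's system or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, centre):
    # Stand-in "face embeddings" for one group: two classes scattered
    # around the group's own centre, separated by a fixed signal.
    X = rng.normal(centre, 1.0, size=(n, 8))
    y = rng.integers(0, 2, size=n)
    X[y == 1] += 0.8
    return X, y

# Group A dominates the training set; group B is barely present.
Xa, ya = make_group(5000, centre=0.0)
Xb, yb = make_group(100, centre=3.0)
clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Fresh samples from each group expose the gap the imbalance creates:
# the model scores well on group A and near chance on group B.
for name, centre in [("majority group", 0.0), ("minority group", 3.0)]:
    Xt, yt = make_group(1000, centre)
    print(f"{name} accuracy: {clf.score(Xt, yt):.2f}")
```

The point of the toy sketch is the same one the quoted passage makes: the model isn’t malicious, it simply never saw enough of group B to learn what it looks like.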