The importance of play

I’ve just started reading A Mind at Play, a biography of Claude Shannon. He’s the “father of information theory”, without which much of the modern world wouldn’t exist.

Shannon reminds me of stories about Richard Feynman. Bursting with curiosity, yes, but also sparked with joy at tinkering with knowledge.

I sometimes feel like this is lost in a world where education is sold as preparation for employment rather than as a way of rounding someone out or living a fulfilled life.

On Shannon:

“His was a life spent in the pursuit of curious, serious play; he was that rare scientific genius who was just as content rigging up a juggling robot or a flamethrowing trumpet as he was pioneering digital circuits. He worked with levity and played with gravity; he never acknowledged a distinction between the two. His genius lay above all in the quality of the puzzles he set for himself. And the marks of his playful mind—the mind that wondered how a box of electric switches could mimic a brain, and the mind that asked why no one ever decides to say “XFOML RXKHRJFFJUJ”—are imprinted on all of his deepest insights”

And here’s a little vignette from Surely You’re Joking, Mr. Feynman!, one of my favourite books:

“…I laid out a lot of glass microscope slides, and got the ants to walk on them, back and forth, to some sugar I put on the windowsill. Then, by replacing an old slide with a new one, or by rearranging the slides, I could demonstrate that the ants had no sense of geometry: they couldn’t figure out where something was. If they went to the sugar one way, and there was a shorter way back, they would never figure out the short way. It was also pretty clear from rearranging the glass slides that the ants left some sort of trail…”

Not only is this Feynman as a child luxuriating in experimentation and learning for its own sake, it’s also him still retelling the story years later. His career as a myrmecologist was dismal, but that’s how important and formative he thought experiences like this were.

Putting aside personal fulfilment, I wonder how successful Feynman and Shannon would have been had they not “wasted” time building up adjacent stores of knowledge.

Especially considering how successful they were at bringing new perspectives and slants to established ideas.

As always, my emphasis.

Another adjacent possible

I’m working my way through one of the more fascinating technology books I’ve ever read, Code by Charles Petzold, and I stumbled across this passage:

nobody in the nineteenth century made the connection between the ANDs and ORs of Boolean algebra and the wiring of simple switches in series and in parallel. No mathematician, no electrician, no telegraph operator, nobody. Not even that icon of the computer revolution Charles Babbage (1792–1871), who had corresponded with Boole and knew his work, and who struggled for much of his life designing first a Difference Engine and then an Analytical Engine that a century later would be regarded as the precursors to modern computers…

This is from a chapter on Boolean logic (aka Boolean algebra), which you might have come across if you have ever studied programming, statistics or electrical engineering.

I’ve never before had it explained to me in such a cogent fashion. But what this section highlights in particular (and the book as a whole rams home) is the power of bringing together seemingly disconnected ideas, theories and fields.
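
To make the connection concrete: a switch is a Boolean value (closed or open), switches wired in series conduct only when every one of them is closed, which is AND, and switches wired in parallel conduct when any one of them is closed, which is OR. Here’s a minimal Python sketch of that correspondence (the function names are my own, not Petzold’s):

```python
def series(*switches: bool) -> bool:
    """Switches in series: current flows only if every switch is closed (Boolean AND)."""
    return all(switches)


def parallel(*switches: bool) -> bool:
    """Switches in parallel: current flows if any switch is closed (Boolean OR)."""
    return any(switches)


# A light behind two switches in series needs both of them closed...
assert series(True, True) is True
assert series(True, False) is False

# ...while a light behind two switches in parallel needs only one.
assert parallel(True, False) is True
assert parallel(False, False) is False
```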

…What might have helped Babbage, we know now, was the realization that perhaps instead of gears and levers to perform calculations, a computer might better be built out of telegraph relays…

This is a great book if you want to understand how computers work, as it combines engineering and information theory to construct a virtual computer, step by step: starting with a simple light bulb circuit and building up through logic gates, operating systems and graphical interfaces.
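
To give a flavour of that step-by-step construction, here’s a rough sketch in the same spirit (my own illustration, not code from the book): treat each relay as a Boolean switch, build the basic gates from series and parallel wiring, then combine those gates into a one-bit half adder.

```python
def AND(a: bool, b: bool) -> bool:
    """Two relays in series."""
    return a and b


def OR(a: bool, b: bool) -> bool:
    """Two relays in parallel."""
    return a or b


def NOT(a: bool) -> bool:
    """A relay wired so its output is on when its input is off."""
    return not a


def XOR(a: bool, b: bool) -> bool:
    """Exclusive or, built only from the gates above."""
    return AND(OR(a, b), NOT(AND(a, b)))


def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two bits, returning (sum, carry)."""
    return XOR(a, b), AND(a, b)


# 1 + 1 = 10 in binary: sum bit 0, carry bit 1.
assert half_adder(True, True) == (False, True)
# 1 + 0 = 01 in binary: sum bit 1, carry bit 0.
assert half_adder(True, False) == (True, False)
```

Chain the carry from one of these into the next stage and you get a multi-bit adder, which is more or less how the book climbs from a light bulb circuit to arithmetic.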

But it is arguably more valuable in demonstrating how something as complex as a computer draws from many fields.