One of the most galling things about politics and policy is how ancillary “facts” and logic can be. The internet has lowered the barrier to information and expertise to almost nothing, but it has devalued them as well.

The utopia of evidence-based policy seems unlikely to arrive, even putting aside vested interests. This is sort of related to my previous post on how fond we are of reasoning through faulty analogy. But I wonder if it is not a more fundamental issue: the coldness of facts.

This thought struck me while reading The Man Who Solved the Market by Gregory Zuckerman. The book traces the story of Jim Simons and Renaissance Technologies. They were among the pioneers of quantitative trading and have had almost unparalleled success over decades.

What set Renaissance apart (at least initially) was that Jim and most of the others were mathematicians and computer scientists. Many had no experience in, or even interest in, finance. Hence their goal was to create an automated trading system:

Humans are prone to fear, greed, and outright panic, all of which tend to sow volatility in financial markets. Machines could make markets more stable, if they elbow out individuals governed by biases and emotions. And computer-driven decision-making in other fields, such as the airline industry, has generally led to fewer mistakes.

But throughout the story you can feel a tug between this clear drive towards a fully automated system and a reluctance to cede control. This was especially clear in times of turmoil, and as the models became so complex and self-learning that it was all but impossible to understand why exactly a trade was being made (or recommended).

Then, something unexpected happened. The computerized system developed an unusual appetite for potatoes, shifting two-thirds of its cash into futures contracts on the New York Mercantile Exchange that represented millions of pounds of Maine potatoes. One day, Simons got a call from unhappy regulators at the Commodity Futures Trading Commission: Monemetrics was close to cornering the global market for these potatoes, they said, with some alarm. Simons had to stifle a giggle. Yes, the regulators were grilling him, but they had to realize Simons hadn’t meant to accumulate so many potatoes; he couldn’t even understand why his computer system was buying so many of them. Surely, the CFTC would understand that.

Soon, he and Baum had lost confidence in their system. They could see the Piggy Basket’s trades and were aware when it made and lost money, but Simons and Baum weren’t sure why the model was making its trading decisions. Maybe a computerized trading model wasn’t the way to go, after all, they decided.

What’s striking is that these were all people who understood the underlying logic of such a system: that while it’s (probably) impossible to predict any particular stock or commodity, there are patterns in the data. Patterns that represent biases, mistakes, and other phenomena. Patterns that can be identified. And, given enough “bets”, they only had to be correct 50.075% of the time to make an absolute killing.
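
To make that arithmetic a little more concrete, here is a minimal sketch (mine, not anything from the book, and assuming equal-sized bets that simply win or lose one unit) of why a sliver of an edge is enough once the number of bets is large: the expected profit grows linearly with the number of bets, while the noise around it only grows with its square root.

```python
import math
import random

def edge_vs_noise(p_win=0.50075, n_bets=1_000_000):
    """Equal-sized bets that win or lose one unit each, with win probability p_win.

    Expected total profit grows linearly with n_bets, while the standard
    deviation of the total only grows like sqrt(n_bets), so with enough
    bets even a tiny edge dominates the noise.
    """
    edge_per_bet = 2 * p_win - 1                         # expected units won per bet (0.0015 here)
    expected_profit = edge_per_bet * n_bets              # ~1,500 units over a million bets
    noise = math.sqrt(n_bets * (1 - edge_per_bet ** 2))  # ~1,000 units over a million bets
    simulated = sum(1 if random.random() < p_win else -1 for _ in range(n_bets))
    return expected_profit, noise, simulated

print(edge_vs_noise())
```

At a million bets the edge in this toy setup is only about one and a half standard deviations above break-even; the point is simply that the ratio keeps improving as the number of bets grows.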

Most of them had been involved in constructing the model. Many had previously done academic research on, and even invented, the techniques on which it was based. And yet they still had trouble trusting something they didn’t fully understand.

Some rank-and-file senior scientists were upset—not so much by the losses, but because Simons had interfered with the trading system and reduced positions. Some took the decision as a personal affront, a sign of ideological weakness and a lack of conviction in their labor. “You’re dead wrong,” a senior researcher emailed Simons. “You believe in the system, or you don’t,” another scientist said, with some disgust.

Of course, they were also aware of the inverse – how fallible human traders are. And that many of their own mistakes, especially early on, were the result of human intervention.

But it was obviously hard, and contentious, to get over this hurdle. To trust something they didn’t fully understand. The “facts” from nowhere. The truth without a good story. Maybe this is the problem with evidence-based policy.

As always, my emphasis.