Why I’m uneasy about Facebook as arbiter of the acceptable

Moderating content on Facebook sounds like a nightmare. Casey Newton had an explosive story last year about panic attacks and high turnover among moderators. But the problem isn’t just the toll on the moderators themselves; it’s the whole thing in aggregate. As a philosophical problem.

Moderation sets the bounds of reality and acceptability for billions of people. Across cultures and contexts. Across time. Will a private organisation ever be equipped for this? Could anything, really?

I was thinking about this as I finished Steven Levy’s new book on Facebook. It’s a sweeping account of Facebook’s rise and its recent bout of internal territorialism. But it also has some really interesting digressions on moderating content at scale.

Levy describes intense debates over what is and isn’t kosher. I use the word deliberately, as the image of ever-expanding notes and commentary strikes me as a quasi-religious one. A guide that starts out quite narrow and absolute but must be argued over, reinterpreted and contextualised to new situations. Interpreted. At scale.

The rules can venture into confounding, Jesuitical flights of logic. Some things are fairly straightforward. There are attempts to define levels of offensiveness in subjects like exposure to human viscera. Some exposure is okay. Other varieties require an “interstitial,” a warning on the screen like the one before a television show that might show a glimpse of buttocks. Outright gore is banned. It takes a judgment call to fit a given bloodbath into the right box. “If you rewind to Facebook’s early, early days, I don’t think many people would have realized that we’d have entire teams debating the nuances of how we define what is nudity, or what exactly is graphic violence,” says Guy Rosen of the Integrity team… Facebook has created a vast number of non-public supplementary documents that drill on specific examples. These are the Talmudic commentaries shedding light on Facebook’s Torah, the official Community Standards. A New York Times reporter said he had collected 1,400 pages of these interpretations.
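Strip away the commentary and the decision space itself is small. Here’s a toy sketch of the four outcomes that passage names; the category names, the numeric score and the thresholds are my own invention for illustration, not Facebook’s actual system, and the tidy threshold is of course exactly the thing that doesn’t exist in practice:

```python
from enum import Enum, auto

# Hypothetical sketch of the decision space Levy describes -- not
# Facebook's actual taxonomy or code, just the four outcomes the
# book names: leave up, warn, remove, or kick upstairs.

class ModerationAction(Enum):
    ALLOW = auto()          # content stays up untouched
    INTERSTITIAL = auto()   # stays up behind a warning screen
    REMOVE = auto()         # taken down outright
    ESCALATE = auto()       # sent to a manager / the policy team

def classify_graphic_content(severity: int) -> ModerationAction:
    """Toy judgment call: map a 0-10 'graphicness' score to an action.
    The score and thresholds are invented for illustration."""
    if severity <= 3:
        return ModerationAction.ALLOW
    if severity <= 6:
        return ModerationAction.INTERSTITIAL
    if severity <= 9:
        return ModerationAction.REMOVE
    return ModerationAction.ESCALATE
```

The hard part, as the 1,400 pages of interpretation suggest, is that real posts don’t arrive with a severity score attached.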

As an abstract problem, sure, Facebook needs to crack down on abuse and misinformation etc. etc. But in practice they are dealing with oceans of grey.

And there is no average set of values. Give my grandma and me the same stack of posts and we’d whittle them down to different subsets.

This is why the word interpret is important. Not just interpretation of the rules but of the posts themselves. And layered on top of all this, the interpreting apparently takes place at such speed as to make the nuances moot.

Facebook expects moderators to make about 400 “jumps” a day, which means an “average handle time” of around 40 seconds for them to determine whether a questionable video or post must remain, be taken down, or in rare cases, escalated to a manager, who might send the most baffling decision to the policy-crats at Menlo Park. Facebook says that there is no set time limit on each decision, but reporting by journalists and several academics who have done deep dives on the process all indicate that pondering existential questions on each piece of content would put one’s low-paying moderation career at risk.
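To make those numbers concrete, a quick back-of-envelope calculation. The arithmetic is mine, and the eight-hour shift is an assumption, not something Levy reports:

```python
# Back-of-envelope on the figures above: 400 "jumps" a day at
# roughly 40 seconds each. The 8-hour shift is an assumption.

jumps_per_day = 400
handle_time_s = 40           # seconds spent deciding on each item
shift_hours = 8

decision_time_hours = jumps_per_day * handle_time_s / 3600
budget_per_item_s = shift_hours * 3600 / jumps_per_day

print(f"Pure decision time: {decision_time_hours:.1f} h of an {shift_hours}-h shift")
print(f"Budget per item if nothing else happened: {budget_per_item_s:.0f} s")
# => Pure decision time: 4.4 h of an 8-h shift
# => Budget per item if nothing else happened: 72 s
```

Either way you slice it, there is barely a minute per post for judgments that whole teams in Menlo Park argue over.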

The scale and speed, maybe more than anything else, are what concern me. At least for now it doesn’t seem like a technological solution is in the offing. The incentive of those tasked with interpreting marginal content is to be restrictive: better to overdo it than underdo it. And recourse is opaque and minimal.

Even given good will among everyone involved, top to bottom, this isn’t a great recipe. This isn’t how we should be determining reality for billions of people.
