Facebook content moderators are stationed around the globe and are most often hired by third-party companies, which produce slide decks to train their armies of subcontractors. These training slides are the only known repository of Facebook’s guidelines, and they address a vast range of topics, from censoring pornography to defining hate speech. Reading Like a Computer organizes and graphically presents the contents of leaked slide decks that describe Facebook’s hate speech moderation rules.
These training materials exhibit PowerPoint design at its worst: bullet-point oversimplification, organizational incoherence between rules and examples, and internal contradictions. Reading Like a Computer considers the syntactic and semantic discrepancies between what moderators are expected to allow and what they are expected to block on the social networking platform. The title implies that content moderators are trained to think like an algorithm, rendering yes/no decisions on complex and contextually layered communication.
Reading Like a Computer exposes underlying problems in Facebook’s policies and questions the company’s plans to develop AI that operates on these ad hoc rules. The project is not only an exercise in decoding a set of byzantine guidelines, but also a guide to the compounding harms that culturally agnostic, ahistorical, algorithmic thinking can inflict when used to govern a complex, global “community of friends.” - Angie Waller