Last week I thought I’d found the job I’d hate most of all (warning: it involves sewers, a shovel, and “fat mounds”), but then I read this New York Times profile of people who are employed as Internet content screeners, which appears to be the real-world equivalent of web surfing in hell.
For most online companies that allow user-uploaded content, keeping offensive stuff away from customers’ eyeballs is a full-time job. Content algorithms and user flagging only go so far, after which the websites turn either to internal departments or to outsourced companies.
The job description for these human-powered filtering services (review offensive imagery and get paid!) can almost sound like fun, if you assume that “offensive” just means pornographic. In reality, the content under review spans everything from animal abuse to sexual violence to murder.
The president of one of those outsourced companies told the New York Times that his employees are like “combat veterans [who are] completely desensitized to all kinds of imagery,” but he might be spinning the business a bit for the press. For contrast, here’s what one of his employees told the Times:
Mr. Bess insists he is still bothered by the offensive material, and acknowledges the need to turn to the cubicle workers around him for support.
“We help each other through any rough spots we have,” said Mr. Bess, 52, who previously worked in the stockrooms at Wal-Mart and Target.
The VP at a competitor says she’s the one called in to deal with truly disturbing material like child pornography, and that she sometimes takes it “really personally,” but she also says you eventually get desensitized.
At least some of the companies say they offer counseling services to help when an employee is traumatized. An industry group created by Congress this summer says such services aren’t widespread enough, and it’s pushing for therapeutic care throughout the industry.
“Policing the Web’s Lurid Precincts” [New York Times]