When The Guardian broke its Facebook story this week, revealing what it called the site's rule book on sex, terrorism and violence, the picture that emerged was one of a technology company that has accidentally grown into something else: the world's biggest media platform.
It is a company making up the rules as it goes along, having recently almost doubled its number of content moderators.
And Facebook is doing that while trying to maintain what is, and will always be, its primary mission: keeping people on the site.
Boosting its number of content moderators was Facebook’s response to a horrific case earlier this month, when a man in Thailand posted a video of himself murdering his young daughter before taking his own life.
Like other postings of shootings, rapes and suicides, that video was deleted, but only after users complained about it. Facebook lacks the capability to scrutinise such material before it hits your news feed.