Scrubbing the net: The content moderators – The Listening Post (Feature)

Many social media users assume that content moderation is automated – that when an inappropriate image or video is uploaded, a computer removes it. In reality, there are reportedly more than 150,000 content moderators working today.

The job involves sifting through images, videos and text, and assessing whether the content contravenes a platform's policies. The work can take its toll.

In December last year, two Microsoft employees sued the company, saying that years of content moderation had left them with post-traumatic stress disorder (PTSD).

The work can be unpleasant but is necessary, and many social media companies based in the West outsource it to countries such as the Philippines and India.

But the question is: do they do so responsibly? Or do they simply take advantage of cheap labour, with little consideration for the labourers?

Contributors:
Sarah Roberts, assistant professor, University of California
Ben Wells, attorney
Ciaran Cassidy, filmmaker
Suman Howlader, CEO, Fowie Info Global Solutions

More from The Listening Post on:

YouTube – http://aje.io/listeningpostYT
Facebook – http://facebook.com/AJListeningPost
Twitter – http://twitter.com/AJListeningPost
Website – http://aljazeera.com/listeningpost

