Many social media users assume that content moderation is automated – that when an inappropriate image or video is uploaded, a computer removes it. In reality, there are reportedly more than 150,000 content moderators working today.
Their job involves sifting through images, videos and text, assessing whether or not the content contravenes a platform's policies. And the work can take its toll.
In December last year, two Microsoft employees sued the company, saying that years of content moderation had left them with post-traumatic stress disorder (PTSD).
The work can be unpleasant, but it is necessary, and many social media companies based in the West outsource it to places like the Philippines or India.
But the question is: do they do that responsibly? Or do they simply take advantage of cheap labour, with little consideration for the labourers?
Contributors:
Sarah Roberts, assistant professor, University of California
Ben Wells, attorney
Ciaran Cassidy, filmmaker
Suman Howlader, CEO, Fowie Info Global Solutions
More from The Listening Post on:
YouTube – http://aje.io/listeningpostYT
Facebook – http://facebook.com/AJListeningPost
Twitter – http://twitter.com/AJListeningPost
Website – http://aljazeera.com/listeningpost