Facebook training 7,500 content reviewers to handle objectionable posts


San Francisco, Jul 28 (IANS): After facing ire over reports that its moderators protected far-right activists and under-age accounts, Facebook says it is continuously training over 7,500 content reviewers in how to handle posts related to hate speech, terrorism and child sexual exploitation on its platform.

The content reviewers are a mix of full-time employees, contractors and companies Facebook partners with -- covering every time zone and over 50 languages across the world.

"Content review at this size has never been done before. After all, there has never been a platform where so many people communicate in so many different languages across so many different countries and cultures. We recognise the enormity of this challenge and the responsibility we have to get it right," Ellen Silver, Vice President of Operations at Facebook, wrote in a blog post on Friday.

"Language proficiency is key and it lets us review content around the clock. If something is reported in a language that we don't support 24/7, we can work with translation companies and other experts who can help us understand local context and language to assist in reviewing it," Silver added.

The company came under heavy criticism after Channel 4's Dispatches -- a documentary series -- sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor.

It showed that moderators were preventing Pages run by far-right activists from being deleted even after they had violated the rules.

In a blog post, Monika Bickert, Vice President of Global Policy Management at Facebook, said the TV report on Channel 4 in the UK raised important questions about their policies and processes.

Facebook has also promised to double the number of people working on its safety and security teams this year to 20,000.

Silver said the company trains its content reviewers in three areas: pre-training, which covers what to expect on the job; hands-on learning, which includes a minimum of 80 hours with a live instructor followed by hands-on practice; and ongoing coaching.

"We want to keep personal perspectives and biases out of the equation entirely -- so, in theory, two people reviewing the same posts would always make the same decision. Of course, judgments can vary if policies aren't sufficiently prescriptive.

Facebook said it audits a sample of reviewer decisions each week to find out if a wrong call was made.

"Our auditors are even audited on a regular basis. In addition, we have leadership at each office to provide guidance, as well as weekly check-ins with policy experts to answer any questions," said the social media giant.

Facebook said it has a team of four clinical psychologists across three regions who are tasked with designing, delivering and evaluating resiliency programmes for everyone who works with graphic and objectionable content.

"This group also works with our vendor partners and their dedicated resiliency teams to help build industry standards," said Silver.

