Sunday, December 30, 2018



Revealed: Facebook's secret censorship rule-book

Facebook's secret rules governing which posts are censored across the globe have been revealed.

A committee of young company lawyers and engineers has drawn up thousands of rules outlining which words and phrases constitute hate speech and should be removed from the social media platform.

They have also drawn up a list of banned organisations laying down which political groups can use the platform in every country on earth, a New York Times investigation has revealed.

An army of 7,500 low-paid moderators, many of whom work for contractors that also run call centers, enforces the rules for 2 billion global users with reference to a baffling array of thousands of PowerPoint slides issued from Silicon Valley.

They are under pressure to review each post in under ten seconds and judge a thousand posts a day.

Moderators often rely on outdated and inaccurate PowerPoint slides and Google Translate to decide whether users' posts should be allowed on the social network, a Facebook employee revealed.

The 1,400-page rulebook is drawn up by dozens of Facebook employees who gather to decide what the site's two billion users, posting in hundreds of different languages, should be allowed to say.

The guidelines are sent out to the more than 7,500 moderators around the world, but some of the slides contain outdated or inaccurate information, the newspaper reports.

Facebook employees, mostly young engineers and lawyers, meet every Tuesday morning to set the guidelines, trying to distill complex issues in more than 100 different languages into simple yes-or-no rules that can be applied in a matter of seconds.

The company also outsources much of the individual post moderation to firms that use largely unskilled workers, many hired from call centers.

Posts about Kashmir were flagged if they called for an independent state, as a slide indicated that Indian law bans such statements. But legal experts dispute this reading of the law.

The slide instructs moderators to 'look out for' the phrase 'Free Kashmir', even though the slogan, common among activists, is entirely legal.

One slide inaccurately described Bosnian war criminal Ratko Mladic as still being a fugitive, though he was arrested in 2011.

Another slide incorrectly described an Indian law and advised moderators that almost any criticism of religion should be flagged as illegal.

Moderators were once told to remove fund-raising appeals for Indonesian volcano victims because a co-sponsor of the drive was on Facebook's list of banned groups.

A paperwork error in Myanmar allowed a prominent extremist group accused of inciting genocide to stay on the social media platform for months.

Moderators will sometimes remove political parties, like Golden Dawn in Greece, but also mainstream religious movements in Asia and the Middle East, an employee revealed.

One moderator told the Times there is a rule to approve any post written in a language that no available moderator can read and understand.

The rulebook is made up of dozens of unorganized PowerPoint presentations and Excel spreadsheets with titles like 'Western Balkans Hate Orgs and Figures' and 'Credible Violence: Implementation standards'.

One of the documents sets out several different rules just to determine when a word like 'jihad' or 'martyr' indicates pro-terrorism speech.

Moderators have to review each post and decide whether it falls into one of three tiers of severity, guided by lists that set out the six 'designated dehumanizing comparisons', such as comparing Jewish people to rats.

Sara Su, a senior engineer on the News Feed at Facebook, told the New York Times: 'It's not our place to correct people's speech, but we do want to enforce our community standards on our platform.

'When you're in our community, we want to make sure that we're balancing freedom of expression and safety.'

SOURCE 


2 comments:

Anonymous said...

Avoid Facebook.

Anonymous said...

If Facebook is controlling the material on their site then they are not a mere conduit and shouldn't have the protections of the Communications Decency Act.