
The dark side of Facebook

Our social networking pages are being policed by outsourced, unvetted moderators.







By Iain Hollingshead and Emma Barnett





For most of us, our experience on Facebook is a benign – even banal – one. A status update about a colleague’s commute. A “friend” request from someone we haven’t seen for years (and hoped to avoid for several more). A picture of another friend’s baby, barely distinguishable from the dozen posted the day before.
 

Some four billion pieces of content are shared every day by 845 million users. And while most are harmless, it has recently come to light that the site is brimming with paedophilia, pornography, racism and violence – all moderated by outsourced, poorly vetted workers in third world countries paid just $1 an hour.
 

In addition to the questionable morality of such paltry pay from a company that is about to create 1,000 millionaires when it floats, there are significant privacy concerns for the rest of us. Although this invisible army of moderators receives basic training, its members work from home, do not appear to undergo criminal checks, and have worrying access to users’ personal details. In a week in which there has been an outcry over Google’s privacy policies, can we expect a wider backlash over the extent to which we trust companies with our intimate information?
 

Last month, 21-year-old Amine Derkaoui gave an interview to Gawker, an American media outlet. Derkaoui had spent three weeks working in Morocco for oDesk, one of the outsourcing companies used by Facebook. His job, for which he claimed he was paid around $1 an hour, involved moderating photos and posts flagged as unsuitable by other users.
 

“It must be the worst salary paid by Facebook,” he told The Daily Telegraph this week. “And the job itself was very upsetting – no one likes to see a human cut into pieces every day.”
  
Derkaoui is not exaggerating. An articulate man, he described images of animal abuse, butchered bodies and videos of fights. Other moderators, mainly young, well-educated people working in Asia, Africa and Central America, have similar stories. “Paedophilia, necrophilia, beheadings, suicides, etc,” said one. “I left [because] I value my sanity.” Another compared it to working in a sewer. “All the ---- of the world flows towards you and you have to clean it up,” he said.
 
Who, one wonders, apart from the desperate, the unstable and the unsavoury, would be attracted to doing such an awful job in the first place?
 
Of course, not all of the unsuitable material on the site is so graphic. Facebook operates a fascinatingly strict set of guidelines determining what should be deleted. Pictures of naked private parts, drugs (apart from marijuana) and sexual activity (apart from foreplay) are all banned. Male nipples are OK, but naked breastfeeding is not. Photographs of bodily fluids (except semen) are allowed, but not if a human being is also shown. Photoshopped images are fine, but not if they show someone in a negative light.
 
Once something is reported by a user, the moderator sitting at his computer in Morocco or Mexico has three options: delete it; ignore it; or escalate it, which refers it back to a Facebook employee in California (who will, if necessary, report it to the authorities). Moderators are told always to escalate specific threats – “I’m going to stab Lisa H at the frat party” is given as the charming example – but not generic, unlikely ones, such as “I’m going to blow up the planet on New Year’s Eve.”
 
It is, of course, to Facebook’s credit that they are attempting to balance their mission “to make the world more open and connected” with a willingness to remove traces of the darker side of human nature. The company founded by Mark Zuckerberg in his Harvard bedroom is richer and more populated than many countries. These moderators are their police.
 
Neither is Facebook alone in outsourcing unpleasant work. Adam Levin, the US-based chief executive of Criterion Capital Partners and the owner of British social network Bebo, says that the process is “rampant” across Silicon Valley.
 
“We do it at Bebo,” he says. “Facebook has so much content flowing into its system every day that it needs hundreds of people moderating all the images and posts which are flagged. That type of workforce is best outsourced for speed, scale and cost.”
 
A spokesman for Twitter said that they have an internal moderation team, but refused to answer a question about outsourcing. Similarly, a Google spokesperson would not say how Google+, the search giant’s new social network, will be moderated. Neither Facebook nor oDesk were willing to comment on anything to do with outsourcing or moderation.
 
Levin, however, estimates that Facebook indirectly employs between 800 and 1,000 moderators via oDesk and others – nearly a third of its more handsomely remunerated full-time staff. Graham Cluley, of the internet security firm Sophos, calls Silicon Valley’s outsourcing culture its “poorly kept dirty secret”.
 
The biggest worry for the rest of us, however, is that the moderation process isn’t nearly secretive enough. According to Derkaoui, there are no security measures on a moderator’s computer to stop them uploading obscene material themselves. Despite coming into daily contact with such material, he was never subjected to a criminal record check. Where, then, is the oversight body for these underpaid global police? Quis custodiet ipsos custodes?
 
Facebook itself is guarding them, according to a previous statement to which the Telegraph was referred. “These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service,” it read. “No user information beyond the content in question and the source of the report is shared. All decisions made by contractors are subject to extensive audits.”
 
And yet in the images due for moderation seen by the Telegraph, the name of anyone “tagged” in an offending post – as well as the user who uploaded it – could be clearly discerned. A Facebook spokesman said that these names are shared with the moderators to put the content in context – a context sufficient for Derkaoui to claim that he had as much information as “looking at a friend’s Facebook page”. He admits to having subsequently looked up more information online about the people he had been moderating. Cluley is worried that Facebook users could be blackmailed by disgruntled moderators – or even see pictures originally intended for a small circle of friends pasted all over the web.
 
Shamoon Siddiqui, chief executive of Develop.io, an American app-building firm that employs people in the developing world for a more generous $7 to $10 an hour, agrees that better security measures are needed. “It isn’t wrong for Facebook to have an Indian office,” he says. “But it is wrong for it to use an arbitrary marketplace with random people it doesn’t know in that country. This will have to change.”
 
In Britain, for example, all web moderators have to undergo an enhanced CRB check. eModeration, whose clients range from HSBC to The X Factor, pays £10 an hour and never lets its staff spend too long on the gritty stuff. They wouldn’t go near the Facebook account. The job, says Tamara Littleton, its chief executive, is too big, the moderating too reactive, and they couldn’t compete on cost with the likes of oDesk.
 
So, if no one can undercut the likes of oDesk, could they not be undermined instead? If Mr Zuckerberg will not dig deeper into his $17.5 billion pockets to pay the street-sweepers of Facebook properly, maybe he could be persuaded by a little moral outrage?
 
Levin disagrees. “Perhaps a minute percentage of users will stop using Facebook when they hear about this,” he says. “But the more digital our society becomes, the less people value their privacy.”
 
Perhaps. But maybe disgruntled commuters, old schoolfriends and new mothers will think twice before sharing intimate information with their “friends” – only to find that two minutes later it’s being viewed by an under-vetted, unfulfilled person on a dollar an hour in an internet café in Marrakech.
 



Facebook in new row over sharing users’ data with moderators



New information about Facebook’s outsourced moderation process shows that the social network shares more personal information with moderators than it has so far acknowledged.
 








By Emma Barnett, Digital Media Editor






The social network was criticised last week after gossip site Gawker exposed it as employing third-party content moderators in the developing world for one dollar an hour.
 

Facebook responded saying: “No user information beyond the content in question and the source of the report is shared.”
 

However, new evidence seen by The Telegraph shows that these moderators, who deal with the distressing images and messages reported every day, are clearly able to see the names of the person who uploaded the ‘offensive’ content and of anyone tagged in a photo, as well as the name of the person who reported the content.
 

Moreover, there are currently no security measures in place to stop these moderators from taking screenshots of people’s personal photos, videos and posts.
 

When challenged about the data displayed on the screenshots shown to The Telegraph by an ex-Facebook moderator, a spokesman for the social network said: “On Facebook, the picture alone is not the content. In evaluating potential violations of our rules it is necessary to consider who was tagged and by whom, as well as additional content such as comments…Everything displayed is to give content reviewers the necessary information to make the right, accurate decision.”
 
The former Facebook moderator, 21-year-old Amine Derkaoui from Morocco, has shown The Telegraph several screenshots of what these outsourced workers see when deciding if a piece of content is suitable to remain on the site.
 
Derkaoui, who was employed by oDesk – the company through which Facebook hired outsourced content moderators – claimed there was no decent security at all in the content system, and that looking at each report was like “looking at a friend’s Facebook page”, such was the amount of information on display.
 
He has since looked up information online about the people he had been moderating.
 
Security experts are concerned about the amount of personal information Facebook allows these poorly paid third-party workers in the developing world to access.
 
Graham Cluley, of the British internet security firm Sophos, said: “When people report content on Facebook, I don’t think they expect all of their details to end up in India, with someone who doesn’t directly work for Facebook…By sharing information about a Facebook account holder, there is obviously the potential for abuse and blackmail.
 
“Some of the photos that people post, which under Facebook’s rules may be deemed inappropriate, such as your children running around naked or a mum breastfeeding, could still end up on the open internet, if a moderator, who is able to copy the images, publishes them.”
 
Cluley is calling for Facebook to improve its content moderation system, which currently relies on users to report offensive material – something that does not always happen quickly.
 
“There is a lot of obscene material on Facebook which stays up there for a long time, until someone spots it. The company needs to improve the moderation so that there is some kind of in-built scanner which will prevent the awful stuff from going up in the first place,” he explains.
 
Philip James, a privacy specialist at Pitmans law firm, says the onus is on Facebook to improve the security of the content system these third-party workers use, so that they cannot easily take screenshots of people’s information.
 
“Facebook should be carrying out better due diligence on the systems its third party contractors are using to ensure they are secure and not open to images and information being easily exported by these workers,” he explained.
 
Derkaoui, who now works for Zenoradio, a New York-based technology company, having quit his oDesk job in protest at the poor wages, was never explicitly told that the site he was moderating was Facebook (the social network is secretive about the obscene material people upload every day). He is now calling on cash-rich Facebook to increase the wages it pays these workers.
 
“Facebook has to increase these wages. One dollar an hour is the lowest wage at oDesk and I believe it must be the worst salary paid by Facebook. They also have to recruit people to do this job from around the world, not only those from the third world… And they need to keep users’ data private too." 

The social network has not commented on these third-party workers’ rate of pay.
 
A spokesman for the social network said: "These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service."
 
Facebook refused to tell The Telegraph whether or not it was still using oDesk’s services. And oDesk failed to return any calls on this subject.
 
Last week Derkaoui revealed to Gawker the bizarre set of content rules which Facebook uses when deciding whether or not a piece of content is allowed.
 
For instance, an image containing “any OBVIOUS sexual activity, even if naked parts are hidden from view” must be deleted - while “deep flesh wounds”, “excessive blood” and “crushed heads, limbs”, are acceptable as long as no insides are showing.
 
He said the poorly paid work of content moderation was extremely distressing. 

“The job was upsetting – no one likes to see a human cut into pieces every day,” Derkaoui told The Telegraph. “No one from oDesk or Facebook appeared to care about the psychological condition of the moderators."
 
Facebook has not made a specific comment on this point. 

Derkaoui added: “I was paid one dollar an hour, plus small commissions for four hours a day. Generally with four hours of work, you can get six to seven dollars a day. We worked during weekends too.”
 
While Facebook has its own internal content moderation team, outsourcing this type of work is considered normal across many technology firms, given how quickly such companies grow.
 
Facebook now has more than 850 million members, employs more than 3,200 full-time staff and has just moved into a huge new campus in the US – which can hold 9,000 members of staff.
 
When it floats later this year, it is expected to break technology records, with a $100bn valuation.
 
