On a busy day, contract employees in India monitoring nudity and pornography on Facebook and Instagram will each view 2,000 posts in an eight-hour shift, or more than four per minute.
They are part of a 1,600-member team at Genpact Ltd, an outsourcing firm with offices in the southern Indian city of Hyderabad that is contracted to review Facebook content.
Seven content reviewers at Genpact said in interviews late last year and early this year that their work was underpaid, stressful and sometimes traumatic.
The reviewers, all in their 20s, declined to be identified for fear of losing their jobs or breaching nondisclosure agreements. Three of the seven have left Genpact in the past few months.
“I have seen women employees breaking down on the floor, reliving the trauma of watching suicides real-time,” one former employee said, adding that he had seen this happen at least three times.
Reuters was unable to independently verify the incidents or determine how often they might have occurred.
Genpact declined comment.
The working conditions described by the employees offer a window into the moderator operations at Facebook Inc and the challenges faced by the company as it seeks to police what its 2 billion users post.
Their accounts contrast in several respects with the image presented by three Facebook executives in interviews and statements of a carefully selected, skilled workforce that is paid well and has the tools to handle a difficult job.
Facebook vice president of operations Ellen Silver acknowledged that content moderation “at this size is uncharted territory.”
“We care deeply about getting this right,” she said in January. “This includes the training reviewers receive, our hiring practices, the wellness resources that we provide to each and every person reviewing content, and our overall engagement with partners.”
While rejecting the Hyderabad employees’ assertions about low pay, Facebook has said that it had begun drafting a code of conduct for outsourcing partners, but declined to give details.
It has also said that it would be introducing an annual compliance audit of its vendor policies this year to review the work at contractor facilities.
The company is organizing a first-ever summit next month to bring together its outsourcing vendors from around the world, with the aim of sharing best practices and bringing more consistency to how moderators are treated.
These efforts were announced in a blogpost on Monday by Facebook vice president of global operations Justin Osofsky.
Facebook works with at least five outsourcing vendors in at least eight countries on content review, a Reuters tally showed.
About 15,000 people, a mix of contractors and employees at more than 20 content review sites worldwide, were working on content review at Facebook as of December last year, Silver said.
More than a dozen moderators in other parts of the world have talked of similar traumatic experiences.
A former Facebook contract employee, Selena Scola, filed a lawsuit in California in September last year, alleging that content moderators who face mental trauma after reviewing distressing images on the platform are not being properly protected by the social networking company.
Facebook in a court filing has denied all of Scola’s allegations and called for a dismissal, contending that Scola has insufficient grounds to sue.
Some examples of traumatic experiences among Facebook content moderators in the US were described this week by technology news Web site The Verge.
The Genpact unit in Hyderabad reviews posts in Indian languages, Arabic, English and some Afghan and Asian tribal dialects, Facebook said.
On one team, employees spend their days reviewing nudity and explicit pornography.
Meanwhile, the “counterterrorism” team watches videos that include beheadings, car bombings and electric shock torture sessions, the employees said.
Those on the “self-harm” unit regularly watch live videos of suicide attempts — and do not always succeed in alerting authorities in time, two of the employees said, adding that they had no experience with suicide or trauma.
Facebook said that its policies called for moderators to alert a “specially trained team” to review situations where there was “potential imminent risk or harm.”
The moderators who spoke to reporters said that in the instances they knew of, the trained team was called in when there was a possibility of a suicide, but the reviewers continued to monitor the feed even after the team had been alerted.
Job postings and salary pay slips seen by reporters showed that annual compensation at Genpact for an entry-level Facebook Arabic-language content reviewer was 100,000 Indian rupees (US$1,409), or just more than US$6 per working day.
Facebook contended that benefits made the real pay much higher.
The workers said that they did receive transportation to and from work, a common noncash benefit in India.
Moderators in Hyderabad employed by another information technology outsourcing firm, Accenture PLC, monitor Arabic content on YouTube on behalf of Google for a minimum of 350,000 rupees annually, according to two of its workers and pay slips seen by reporters.
Accenture declined to comment, citing client confidentiality.
Facebook disputed the pay analysis, saying that Genpact is required to pay more than industry averages.
The outsourcer, while declining to comment on its work for Facebook, said in a statement that its wages are “significantly higher than the standard in the industry or the minimum wage set by law.”
The Genpact moderators in Hyderabad said that Facebook sets performance targets, which are reassessed periodically, that are called “average review time” or “average handling time.”
“We have to meet an accuracy rate of 98 percent on massive targets,” one of the moderators told reporters. “It is just not easy when you are consistently bombarded with stuff that is mostly mind-numbing.”