At Risk Of Losing Their Jobs, Facebook Content Moderators In Ireland Speak Out Against…


[This story was updated at 6:12 PM Eastern with statements from Facebook and Covalen.]

Facebook content moderators in Ireland met with the country’s deputy prime minister Leo Varadkar on Friday, seeking support against unsafe working conditions. This marks the first time that a government official from any country has personally met with the social network’s frontline workers, who have effectively become the guardians of public discourse around the world.

Moderators Ibrahim Halawa and Paria Moshfeghi, with the support of Foxglove, a UK-based nonprofit tech watchdog, held a press conference moments after the meeting. “A lot of employees are really worried about their safety and about their mental health,” said Halawa. In addition to the mental trauma of sifting through horrific images and hate speech, moderators in Ireland were required to return to the office late last year, despite an August announcement from the social media giant that employees could work from home until August 2021. Notably, Facebook’s approximately 15,000 content moderators are mostly employed through third-party contractors. Both Halawa and Moshfeghi are employed by the third-party firm Covalen.

As a result of the meeting, Irish deputy PM Varadkar told the two workers that he would write a letter to Facebook and open an investigation into their workplace conditions.

“We disagree with these characterizations of how we’ve supported people during Covid-19,” a spokesperson for Facebook said, adding that the company has increased its use of technology to allow a majority of reviewers to work from home. “But considering some of the most sensitive content needs to be reviewed in the office, we’ve worked with our partners to bring reviewers back into some of our sites as government guidance has permitted.”

Covalen also cited “the sensitivity of the content” as a reason why some employees had to go back into the office, according to a written statement from the company.

“One of the things that makes this meeting historic is that these two have been the first who have been prepared to stick their necks out in public and run the risk of retaliation from the companies,” said Cori Crider, co-founder of Foxglove, which helped convince Halawa and Moshfeghi to speak out in public and take the meeting with Varadkar. The two employees are breaking non-disclosure agreements with Covalen, and they are unsure of how the company will respond.

“Me and Paria don’t know what is going to happen once we go back to work,” Halawa said. He added that they had already tried other channels of communication within the company to bring up their concerns. “A lot of employees have written to the media with their identities hidden and that is not fair… to work in a place in a modern time in Ireland and be in such fear to speak up just for simple improvements.”

Asked whether a similar meeting would be sought with an elected U.S. official, given that Facebook content moderators based in Texas have also reportedly been told to go back into the office amid the pandemic, Foxglove’s Crider responded, “We’d love to.”

In November, over 200 Irish content moderators wrote and signed a letter, addressed to Facebook’s Mark Zuckerberg and Sheryl Sandberg, as well as the CEOs of Covalen and Accenture, two of the largest outsourcing firms for this kind of work, with a list of demands. 

The letter asked the business leaders to reinstate a work-from-home policy after the contractors were told to come back to the office in August, just as Ireland was reverting to lockdown measures due to a resurgence of Covid-19 cases. This requirement was imposed even as full-time Facebook employees who supervise moderation of the same content were allowed to work from home. The moderators also asked for hazard pay, and for Facebook to end the practice of outsourcing and hire them as full-time employees.

“Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there,” reads the letter. “Without our work, Facebook is unusable. Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can. Facebook needs us. It is time that you acknowledged this and valued our work.”

Facebook’s content moderation workers — and the level of psychological trauma they endure every day by watching, reading and filtering through the worst of the Internet and human nature — were exposed to the world in 2012 through a series of investigative reports by Adrian Chen. In 2018, Selena Scola, a former contract Facebook content moderator based in California, sued Facebook in a class-action case, alleging that she had developed post-traumatic stress disorder (PTSD) after being forced to sift through thousands of photos and videos of rapes, suicides, beheadings and other killings.

Several months later, a series of reports from The Verge revealed the dismal conditions facing Facebook content moderation teams in America, operated by third-party vendor Cognizant. The reports described workers developing post-traumatic stress disorder from watching disturbing content daily while living on a $28,000-per-year salary, the frequency of verbal and physical fights in the office, and one worker in the Tampa, Florida office dying at his desk after a heart attack.

Scola’s lawsuit was settled in May 2020, with the social network behemoth agreeing to pay $52 million to current and former content moderators to fund treatment for mental health issues they developed while on the job. Facebook admitted no wrongdoing as part of the settlement, but agreed to add more mental health counseling services in the workplace.

But while the settlement was a milestone for the 11,000 Facebook content moderators based in California, Arizona, Texas and Florida, there are thousands of other content moderators outside of America who were not included in this settlement. 

Content moderators in Ireland sued Facebook in December 2019 in Ireland’s High Court over the psychological trauma they allege to have suffered while on the job. That case, which is being supported by Foxglove, has not yet been settled.

What Halawa and Moshfeghi said about their working conditions echoes the stories of other content moderators who have spoken out in recent years. While they were unable to discuss details about the actual content they must look through as part of their work, Foxglove’s Crider filled in the blanks for them: “Beheadings, child abuse, terrorism and bullying, day in and day out.”

“We signed a contract clearly telling us we are not allowed to talk about the work,” Moshfeghi added. “I’m not even allowed to talk to my husband about the job.”
