Facebook Responds to Scathing Letter From 200+ Content Moderators

Facebook has responded to an open letter sent on Wednesday by more than 200 moderators concerned about their working conditions during the pandemic.

The letter was addressed to CEO Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, and the CEOs of two companies that provide content moderators to the social network on a contract basis: Anne Heraty of CPL/Covalen and Julie Sweet of Accenture.

The moderators raised concerns about being forced to return to the office during the pandemic and about the shortcomings of Facebook’s artificial intelligence systems, and they called for better pay, permanent employee status, and access to better health and mental health care.

“We appreciate the valuable work content reviewers do, and we prioritize their health and safety,” a Facebook spokesperson said. “While we believe in having an open internal dialogue, these discussions need to be honest. The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic. All of them have access to health care and confidential wellbeing resources from their first day of employment, and Facebook has exceeded health guidance on keeping facilities safe for any in-office work.”

The moderators said in their letter that while those with doctors’ notes attesting to personal Covid-19 risk have been excused from working in offices, those with vulnerable relatives have not, and they urged Facebook to maximize working from home.

Facebook said the majority of its global content reviewers work from home and have done so since the beginning of the pandemic.

The moderators also pointed out that multiple Covid-19 cases have occurred in several offices.

Facebook said it has taken extensive safety measures in its offices, including significantly reduced capacity to allow for social distancing; room occupancy limits and signage on how to follow these protocols; mandatory temperature checks before entry; mandatory face masks; daily deep cleaning, with all desks wiped down at the end of each shift and high-touch surfaces cleaned several times a day; wide availability of supplies such as hand sanitizer, wipes, and face masks; and upgraded air filters with more frequent air changes and flushing of building air.

The company also said these protocols are consistent with those across all of Facebook’s facilities globally, including offices where full-time employees have returned to work.

The moderators also called for hazard pay for those working on high-risk material such as child abuse, saying these workers should receive 1.5 times their usual wages. They also called for an end to outsourcing, writing: “If anything, there is now more demand than ever for aggressive content moderation on Facebook. That requires our work. Facebook should bring the content moderation workforce in-house and give us the same rights and benefits as all Facebook staff.”

They noted that a content moderator in Accenture’s Austin, Texas, office generally makes $18 an hour.

The letter’s signatories said that content moderators are offered 45 minutes a week with a wellness coach. These coaches are generally not psychologists or psychiatrists and are contractually prohibited from providing any diagnosis or treatment.

According to Facebook, content reviewers have access to health care from their first day on the job, there is no weekly limit on wellbeing resources, and reviewers are encouraged to use them as needed.

The letter also blasted Facebook’s AI systems, which are supposed to ease the content moderators’ workload.

The moderators wrote: “Without informing the public, Facebook undertook a massive live experiment in heavily automated content moderation. Management told moderators that we should no longer see certain varieties of toxic content coming up in the review tool from which we work, such as graphic violence or child abuse. The AI wasn’t up to the job. Important speech got swept into the maw of the Facebook filter, and risky content, like self-harm, stayed up. The lesson is clear. Facebook’s algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there.”
