Image Analyzer has urged website operators to take note that thirty content moderators are suing Facebook and its recruitment agencies, CPL Solutions, Accenture, Majorel and CCC, for exposing them to toxic visual content that has harmed their mental health. The moderators, from Ireland, Germany and Spain, have filed their lawsuit with the Irish High Court in Dublin, claiming that they took on community moderation roles with inadequate training and no access to psychiatrists, and that their working conditions resulted in severe mental trauma that left some moderators feeling suicidal.
The first case was brought before the Irish High Court in December 2019 by Chris Gray, a former community operations analyst at Facebook, who was recruited by CPL Solutions. His case is being handled by Coleman Legal Partners, who have filed a complaint alleging that Gray was expected to review up to 600 pieces of content per shift and was under intense pressure to achieve an accuracy rate of 98% to avoid erroneously removing content that did not contravene Facebook’s community guidelines. This required him to scrutinise graphic footage including child sexual abuse, fatal beatings, executions and torture. Twenty-nine more moderators have subsequently filed similar lawsuits with the Irish High Court, claiming that images they witnessed in the course of their duties have injured their mental health.
Commenting on the case, Crispin Pikes, CEO and founder of Image Analyzer, a provider of technology that automatically moderates harmful online images and video, said, “Back in 2019, when the first moderator brought his case to the High Court in Dublin, the employee rights advocate Foxglove compared Facebook’s community moderation system to an unsafe factory floor, where workers are recklessly exposed to an injurious working environment. That is precisely what this case describes.
We have known since the First World War that continuous exposure to violent and disturbing images can cause chronic damage to people’s mental health. The Lancet first described shell shock, now recognised as post-traumatic stress disorder, in soldiers returning from the front in 1915. More than a hundred years later, we’re still seeing employees sent into the trenches to defend the general public from viewing toxic images on major social media platforms. I’m particularly disheartened to hear that human moderators are expected to act like robots and achieve a 98% accuracy rate, when this work could be handled by technology, leaving only the most nuanced images to be reviewed by human moderators.
This issue now spans multiple jurisdictions, with moderators in Ireland, Germany and Spain coming forward to report that they have been injured by being required to wade through toxic content for extended periods. In addition, third-party agencies that recruit staff for organisations that fail to safeguard their employees’ mental health and safety are being exposed to litigation.
In the UK and Europe, new regulations are coming in the form of the Online Safety Bill and the European Digital Services Act. Any organisation that enables users to upload visual content should be watching this case with interest because, under the new regulations, all interactive website operators will have a legal duty of care to remove harmful content from their websites. However, as this case demonstrates, relying on humans to clean up toxic content is not in the spirit or intent of these laws.”
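The triage model Pikes alludes to, in which automated analysis handles the bulk of uploaded content and escalates only the most nuanced cases to people, can be illustrated with a minimal sketch. The classifier, thresholds, category labels and image identifiers below are hypothetical placeholders invented for illustration, not Image Analyzer’s product or API; the sketch simply shows how confidence-based routing keeps most clear-cut material away from human reviewers.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class ModerationResult:
    item_id: str
    category: str      # illustrative label, e.g. "violence" or "benign"
    harm_score: float  # classifier confidence that the image is harmful, in [0, 1]
    action: str        # "auto_remove", "auto_allow", or "human_review"

# Hypothetical thresholds: only mid-confidence ("nuanced") items reach a person.
REMOVE_THRESHOLD = 0.95   # confident the image is harmful -> remove automatically
ALLOW_THRESHOLD = 0.05    # confident the image is benign  -> allow automatically

def triage(item_id: str,
           classify: Callable[[str], Tuple[str, float]]) -> ModerationResult:
    """Route one uploaded image based on an automated classifier's confidence."""
    category, harm_score = classify(item_id)
    if harm_score >= REMOVE_THRESHOLD:
        action = "auto_remove"
    elif harm_score <= ALLOW_THRESHOLD:
        action = "auto_allow"
    else:
        action = "human_review"   # only ambiguous content is escalated
    return ModerationResult(item_id, category, harm_score, action)

def moderate_batch(item_ids: List[str],
                   classify: Callable[[str], Tuple[str, float]]) -> List[ModerationResult]:
    return [triage(i, classify) for i in item_ids]

if __name__ == "__main__":
    # Stub standing in for an image-analysis model; real systems would call one here.
    def fake_classifier(item_id: str) -> Tuple[str, float]:
        scores = {
            "img-001": ("violence", 0.99),  # clear-cut: removed without human exposure
            "img-002": ("benign", 0.01),    # clear-cut: allowed automatically
            "img-003": ("violence", 0.60),  # nuanced: escalated to a human reviewer
        }
        return scores.get(item_id, ("unknown", 0.50))

    for result in moderate_batch(["img-001", "img-002", "img-003"], fake_classifier):
        print(result)
```

Under this kind of arrangement, the volume of graphic material a human moderator sees is governed by the two thresholds rather than by a per-shift quota, which is the contrast the quote draws with the 98% accuracy target imposed on people.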