Ibrahim Halawa's head is full of Facebook. Well known for spending more than four years of his life in an Egyptian prison, he is now a Facebook content moderator. But he does not work for Facebook, and he dare not speak its name. It is simply "the client".
Ibrahim is one of thousands of outsourced workers across the world who view and remove violent and graphic content for the social media giant.
"We are expected to take care of the people on the social platform. And we want to do that," Ibrahim told Prime Time. "We want to create a safe platform for my nephews to use, for your kids to use, for everyone to use."
But though content moderation is core to its business, Facebook outsources this work. Workers feel left to carry the burden of rapidly processed graphic content without proper care or support, and are bound to secrecy in their contracts.
Ibrahim has decided to speak out. Today, he will be one of the workers' representatives who meet the Tánaiste and Minister for Enterprise, Trade and Employment, Leo Varadkar.
The workers, and a London-based digital justice group that advocates on their behalf, Foxglove, believe it will be the first time a government politician anywhere in the world has met moderators to discuss their grievances.
Their cause has been well-publicised over the last two years. They claim that they are treated as second-class workers, and have a range of complaints: compared to Facebook’s own staff, the pay and perks are poor. They also have issues with workplace protections, particularly during the Covid-19 pandemic. But their chief concern is their mental health.
Both Covalen, one of the outsourcing firms, and Facebook dispute their claims.
"In my opinion, it’s very disturbing work," said Dr Ann Leader, a consultant psychiatrist at Dublin’s Bon Secours hospital who has examined over 30 content moderators experiencing mental health distress. "Images of beheadings, of child rape, murders, dismemberments and really images that you couldn’t imagine, or you would never have seen anywhere else."
The videos that moderators must review can include, among other harrowing sounds, the screams of people being tortured. As a result, moderators often experience flashbacks, nightmares, anxiety symptoms and panic attacks, Dr Leader said.
"They feel changed. They sometimes feel that their core values have shifted," she told Prime Time.
Ibrahim was detained in Egypt for four years without trial after he was arrested in August 2013, aged just 17, in connection with protests supporting the ousted Egyptian president, Mohamed Morsi.
Released in October 2017, after being acquitted at trial, he was welcomed home to Ireland. But then, he said, he went through his own experience of online abuse – targeted harassment, bullying, Islamophobia, and racism.
A friend suggested he take a job online. It was with CPL – since rebranded as Covalen – which was seeking content moderators for a big tech client.
Covalen’s client was Facebook. But strict confidentiality clauses and secrecy rules curtail the moderators’ freedom to talk about what they do or, they say, even name who the client is.
Facebook compares the secrecy to standard non-disclosure agreements, or NDAs, in businesses where employees are working on sensitive content and user information.
All content moderators are required to sign confidentiality clauses with its outsourcing partners, partly to protect user information, and partly for the moderators’ own safety, Facebook told Prime Time in a statement.
"We want to ensure that they aren’t put in a position of being targeted based on that work, or the perceived work that they do", the company said.
But it also said that content moderators should "feel absolutely comfortable discussing the general challenges of their jobs with family and loved ones, so long as they’re not discussing specific details of the tools they use or personal identifiable information from the content."
Dr Leader said that moderators she has assessed who have worked for either Covalen or another provider, Accenture, have told her that they are Facebook’s "dirty little secret".
Moderators often work in separate locations from Facebook's own employees. Partly because of the working conditions, and partly because of mental health issues, there is also a huge turnover in staff, with many moderators lasting only a short time in the job.
Facebook’s reasoning for its outsourcing is clear: it’s about flexibility to meet global challenges.
"We work with a global network of partners, so we can quickly adjust the focus of our workforce as needed," it said.
"For example, it gives us the ability to make sure we have the right language expertise and can quickly hire in different time zones as new needs arise or when a situation around the world warrants it."
But workers see a contradiction between how they are treated and the parallel claims, by both Facebook and the outsourcing firms, that their work is consequential. Or, as Covalen puts it, that the "work carried out by content moderators is critical in keeping our online communities safe."
Moderators "see themselves very much as outsiders who are often working in completely different buildings, and do not have the same perks. I think they are very dispensable people who have very low value and they quickly realise that themselves," said Dr Leader.
Covalen staunchly disputes the workers’ claims that they are not properly protected from the mental health hazards of their work.
"The health, safety and well-being of our employees is our top priority and we have many measures in place to ensure employee well-being, including unrestricted access to counselling services", it told Prime Time.
"Our Wellness Coaching team members are highly qualified professionals, many of them holding Masters or PhD level qualifications, in the areas of psychology, counselling and psychotherapy. This team provides 1:1 counselling support for all employees," it said.
"If required, follow-on referrals to occupational health are also supported by the Covalen team."
Workers and their lawyers are not convinced. Cori Crider, from Foxglove, said that the wellness coaching that workers are typically offered is limited and weak.
The workers believe they are being offered standard corporate wellness aids, when what they really need is something far more focused on what they have to do every day: rapidly process gruelling and sometimes unthinkable content.
Ibrahim Halawa thinks his experience in prison steeled him for the difficult job of dealing with disturbing content.
"A lot of people are dreaming bad dreams. And for me, it took me a very very long time to actually properly sleep at night without getting any nightmares", he said.
"It was very, very difficult. I was able to overcome it, because I had in the past such an ordeal, but some people won't be able to overcome it. Some people need people to be there for them."
In 2020, Facebook agreed to pay $52m in compensation to more than 10,000 content moderators in the US who suffered mental health impacts, including post-traumatic stress disorder, or PTSD.
And, last year, the Financial Times reported that another outsourcing provider, Accenture, had asked its content moderators to sign waivers that concluded with a stark warning: "I understand the content I will be reviewing could even lead to Post Traumatic Stress Disorder."
Amid intense political scrutiny, Facebook has faced increasing pressure over content on its platform. So too have its outsourced content moderators.
Yet it is a job Ibrahim wants to keep while he completes his studies in law school, provided more supports are put in place.
But Ibrahim said he is speaking out because doing this job should not come at the cost of his mental health.
"I didn't want to be reaching a numb stage, where I'm watching this stuff and I'm not feeling anything anymore," Ibrahim said. "I still want to be a human."