Jennifer Zamparelli was joined by ex-Facebook content moderator Chris Gray on RTÉ 2FM to speak about his new book, The Moderator.
Most of us use social media on a daily basis, but rarely do we wonder about what's happening behind the scenes at these platforms. Every day, thousands upon thousands of videos are reported to teams of content moderators - to be watched, assessed, and potentially taken down.
Some of the content flagged is of an extremely graphic nature.
"When you start you're really excited, you're all pumped up," ex-Facebook moderator Chris Gray told Jen Zamparelli, explaining that when he started out he felt like a bit of a superhero, on a mission to "protect the innocent" and shield people from seeing things they shouldn't.
Despite receiving eight days of training ahead of the job, Gray says he didn't feel like he knew how to process some of the things he was seeing. And he saw a lot.
During a shift, he says he would review an average of a hundred posts an hour for six to eight hours at a time. While much of the content was simple enough to deal with, some posts were deeply disturbing and left a lasting impact on the moderator.
"You're immersed in it and you notice the certain attitudes and emotions that are prevalent throughout all this content. People who are not doing very well in their lives are looking for somebody to blame, somebody to punch down at."
"You start to become super, super sensitive to that - to somebody having a bad day and shouting down the customer service phoneline, for example, or just complaining online about something that didn't work.
"You're putting those people in the same category as the mass murderers, and the terrorists and the child molesters. You start to just see everybody as being right on the edge of becoming the world's worst person. It really pulls you down."
Although the job began to take a toll on his mental health, Gray says that non-disclosure agreements prevented him from speaking to friends and family about the things he was seeing.
"You can't tell anybody what you do or where you work," he explained. "We don't have a logo outside the building because we don't want anybody who is angry at a decision turning up to have an argument with the moderators. It's all under wraps and you're not allowed to discuss it."
Eventually, he decided to leave his role at Facebook, and says that therapy and a new job have helped him turn things around.
"I've been out for a few years and I work now with people. Face to face contact with real people. As I've become more self-aware, I've learned not only to manage my feelings but to understand other people's feelings."
"We've taken a bit more drastic action with some of the therapy to try and fix the wiring problems so that when I encounter difficult situations and conflicts I can kind of go, '[deep breath], right, I know what's happening and I know how to deal with it'. It's sometimes hard but that's hard for anybody."
In time, Chris decided to speak about his experience - dictating the story into his phone until it took the form of his new book, The Moderator.
When offered a right to reply on this topic by The Jennifer Zamparelli Show, Facebook said:
"We are committed to ensuring that our partners provide support for those that review content for Meta [previously named Facebook] as we recognise that reviewing certain content can sometimes be difficult.
"Everyone who reviews content for Meta goes through an in-depth, multi-week training programme on our community standards and has access to extensive psychological support to ensure their wellbeing. This includes 24/7 on-site support with trained practitioners, an on-call service and access to private healthcare from their first day of employment.
"For many years, we have had technical solutions to limit reviewers exposure to graphic material as much as possible. This is an important issue and we are committed to getting this right."
If you have been affected by issues raised in this story, please visit: www.rte.ie/helplines.