A Facebook content moderator has spoken of the mental health impact of her job and has called for improvements in the regulation of such work.

Isabella Plunkett, who works for Covalen, a firm to which Facebook outsources content moderation, said that for months she has been taking anti-depressants and having "horrible lucid dreams" about things she has seen on the platform.

Appearing before the Oireachtas Joint Committee on Enterprise, Trade and Employment, she told politicians that she often comes across content such as hate speech, bullying, graphic violence, suicide, abuse and child exploitation.

She explained that her job is to "train the algorithm". She believes that social media companies, like Facebook, hope that one day human content moderators will no longer be required.

Isabella Plunkett was joined by Cori Crider, co-founder of the non-profit organisation Foxglove, which works to ensure "justice in technology".

Ms Crider told TDs and Senators that "content moderation, a new form of work, is here to stay".

She called for content moderation to be regulated, as other hazardous businesses are regulated. This would allow for opt-outs for toxic content, exposure limits set by independent experts and psychiatric support, she explained.

Foxglove also wants major social media companies to bring moderation "in house".

In response to queries from RTÉ, Facebook said: "We tackle harmful content and illegal behaviour through a combination of technology and human reviewers. We work with a global network of partners, so we can quickly adjust the focus of our workforce as needed.

"For example, it gives us the ability to make sure we have the right language expertise and can quickly hire in different time zones as new needs arise or when a situation around the world warrants it."

Covalen told RTÉ that there is "24/7 health support and wellness coaching on site", adding that "this team provides 1:1 counselling support for all employees".

The company set out a range of other supports it offers, including "training on anxiety awareness, trauma, stress management, and personal resilience; Breakout areas to give content reviewers the option to step away from their desks if needed" and a "company paid healthcare plan from the first day of employment."

A content moderator has spoken about the mental health impact of the job (File pic, Getty Images)

Isabella Plunkett told the committee that some of her colleagues work in child abuse and self-harm "queues" all day. Managers have told them to limit their exposure to this content to two hours a day, but Ms Plunkett maintains that this is not happening.

She said her employer, Covalen, provides wellness coaches, but while the coaches mean well, they are not doctors. They have suggested karaoke or painting as ways to cope with the impact of watching harmful content, she said.

"You don't always feel like singing, frankly, after you've seen someone battered to bits".

Ms Plunkett said Facebook staff reviewing similar content are paid double what a moderator like her earns, and receive paid sick days as well. She said she is allocated seven unpaid days a year.

Fionnuala Ní Bhrógáin, head of organising with the Communications Workers Union, told the committee that moderators do not enjoy "anywhere near the support, benefits and protections of direct employees".

She said such workers are exposed to "grotesque and traumatising violence, extremism, child exploitation and more".

Ms Ní Bhrógáin said artificial intelligence and other technologies are nowhere near being able to review content and make the judgement calls taken by thousands of moderators in Ireland, across Europe and the rest of the world.

She expressed concern over "where the buck stops" when it comes to the welfare of staff, given that many major social media companies outsource content moderation to other companies.

Cori Crider of Foxglove told politicians that she was pleased to discuss these issues with Tánaiste Leo Varadkar in January. However, Foxglove only received a response to the issues raised at that meeting yesterday evening at 7pm, which she said "raises more questions than it answers".

Sinn Féin's Louise O'Reilly told Isabella Plunkett that she was "a brave young woman" who was doing her colleagues and her union "a great service".

Deputy O'Reilly acknowledged that the Irish Parliament was the first in the world to address the issue of social media content moderators.

She expressed concern over the outsourcing of content moderation to outside companies.

"It is deeply worrying on a number of levels", she said. "From the point of view of workers' rights, it is deeply worrying and also from the point of view of users."

Ms O'Reilly told the committee it was worrying that such "essential frontline work" was being conducted "at a remove from the main companies".

Cori Crider of Foxglove told Louise O'Reilly that moderators, like Isabella Plunkett, are allocated a "short period of time each week with a so-called wellness coach".

She explained that these coaches are not psychiatrists and cannot diagnose.

Ms Plunkett said workers are allocated an hour a week with such coaches, who "do their best". She said workers are also given an hour and a half of wellness time each week, allowing them to take a break to "get a cup of tea, for example, if people want to get a cigarette" after seeing upsetting content.

However, she believes that these measures are not enough.

The committee heard concerns around non-disclosure agreements (File pic, Getty Images)

In a written response to RTÉ, Facebook said: "Reviewers can step away from their desk at any time if they need to take a break. These breaks, which include breaks for wellness support, have no time limits".

Fionnuala Ní Bhrógáin of the Communication Workers Union said outsourcing made it difficult to organise workers.

"When an individual or a worker raises an issue with their direct employer, the outsourced company, they're referred to the business requirements of the client and if they attempt to raise any issues with the client company, they're correctly advised that they're not employed by the client company", she explained.

The committee heard concerns around non-disclosure agreements which moderators sign.

Cori Crider said workers found it difficult to get a copy of such agreements. She said Facebook told her that such a document would have to be sought through a "subject access request under GDPR".

However, Ms Crider said that she is not satisfied that requests she has made through this process have been properly answered.

She also suggested that such a document may be a "paper tiger" and that, if it were analysed by a legal expert, workers might find they have more room to speak about workplace grievances than they initially thought.

Covalen told RTÉ that "there is never an instance where an employee does not receive a copy of their confidentiality agreement. If an employee misplaces their copy of the confidentiality agreement, our Human Resources team is more than happy to provide them with a copy".

It said: "Therefore, the impression that was given to the Oireachtas Committee and to other media outlets, that employees do not receive a copy of such agreements, is completely inaccurate and untrue."

It added that "over the last number of months we have rolled out and promoted our 'speaking up policy' and if staff have any issues, they are actively encouraged to raise these through identified channels".