
TikTok completes review into harmful content following RTÉ story

TikTok has completed a review following a Prime Time report last month

TikTok has completed an "urgent review", triggered by a recent RTÉ Prime Time report into content on the app related to self-harm and suicide, of hundreds of videos displayed to would-be 13-year-old users.

However, the company is refusing to disclose how many videos linked to the review were removed from the platform, or hidden from teenage users' feeds, for violating its guidelines.

The content displayed to the accounts set up by Prime Time included references to suicide and self-harm.

In many instances, images or videos were overlaid with text captions saying things like ‘I just want to sleep and never wake up,’ ‘if i lose my battle, stay alive for me ok,’ and ‘all the voices in my head get loud, I wish I could shut them out.’

Others referred to self-harm and showed images of blades and text about scars on arms and wrists.


Read more: 13 on TikTok: Self-harm and suicide content shown shocks experts


All were displayed to would-be 13-year-old account holders as part of the Prime Time report, the results of which were published in mid-April.

Following the report, TikTok told the Oireachtas Committee on Children that it was initiating an urgent review of content.

Now that the review is complete, the company is refusing to provide specifics on how many videos were removed or restricted.

In recent days, when Prime Time viewed the watch histories of the test accounts that TikTok understood to be controlled by 13-year-old users, the videos containing the content detailed above were not visible.

However, other videos with captions such as ‘holding it together so they don’t have to sit at a funeral and wonder what they did wrong’, or ‘can i ask you a question, it’s just hypothetical, how would it make you feel if you never saw me again?’ remained viewable within the accounts.

Content with captions including ‘my room is clean, but my arms definitely aren’t,’ or ‘be honest, if a car was coming towards you full speed, would you move?’ were also visible.

TikTok's EU headquarters are in Dublin

This week, Prime Time obtained a letter sent by TikTok to the Department of Health following the publication of the programme’s report. In it, TikTok told the Department "a small number" of the videos shown on the accounts breached its community guidelines.

It said under its guidelines it does "not permit TikTok community members to share content depicting them partaking in, or encouraging others to partake in, dangerous activities that may lead to serious injury or death."

TikTok further told the Department of Health that on foot of the Prime Time report "we are now expediting the deployment of additional technologies which will further increase our ability to disperse content of this nature."

Prime Time asked TikTok to explain what was meant by ‘dispersing content’ of a similar nature but it did not provide details.

The original report highlighted how the design of TikTok’s algorithm means users’ feeds can quickly become dominated by increasingly harmful content. The reference to ‘dispersing content’ appears to relate to changing how TikTok’s algorithm functions for younger users, to interrupt that ‘down the rabbit hole’ effect.

The Department of Health is located at Miesian Plaza

The Irish Society for the Prevention of Cruelty to Children (ISPCC) CEO John Church told Prime Time that further dispersing content "is quite simply not good enough."

"It is painfully evident that content on these platforms and their algorithms is amplified, not dispersed," he said.

"We see the effects of this every day - children and young people, many of whom are already vulnerable, are being bombarded with dangerous and harmful content," he added.

The ISPCC said that in a 10-week period between March and mid-May, 239 children contacted its Childline support service in relation to suicidal ideation, and that social media companies have a role to play in helping reduce that figure.

"These are truly shocking stories, and we must remember that behind the numbers are hundreds of vulnerable children and young people," he said.

"It is the responsibility of technology platforms such as TikTok, which create and implement the algorithms, to protect their users from such harm."

"We need to see exactly what these platforms are doing to safeguard children and young people from this harmful content," Mr Church said.

TikTok told Prime Time that although the "experiment did not represent how real teens interact with the app", it "took action against any content that was violative or shouldn't have appeared in teenagers' feed."

TikTok said the community guideline most typically breached was ‘Mental Behavioural Health.’

The European Commission is investigating TikTok.

With TikTok’s European headquarters in Dublin, Ireland’s online media regulator, Coimisiún na Meán, bears significant responsibility for regulating how TikTok operates in the EU.

In relation to the review undertaken by TikTok, Coimisiún na Meán told Prime Time its forthcoming Online Safety Code will oblige video sharing platforms such as TikTok to "prohibit the sharing of content which promotes self-harm or suicide."

It also said it is supporting the ongoing European Commission investigation into TikTok’s use of algorithmic recommender systems, and the measures it takes to keep children safe.

In February, the European Commission launched an investigation into TikTok over concerns the platform is not adequately protecting children. The investigation focuses on TikTok's design and alleged shortcomings in adhering to the Digital Services Act (DSA).

Coimisiún na Meán said it "would not be appropriate for us to comment on TikTok’s claims while this investigation is on-going."

Prime Time’s report followed concerns published by researchers and advocacy groups like Amnesty International about young teens' mental health being negatively influenced by content on TikTok.

The programme conducted an experiment between March and April in which three new TikTok accounts were created on phones with newly-installed operating systems.

Each time, when asked to provide the user’s age, Prime Time gave a date of birth in 2011. As a result, TikTok understood the user to be 13 years old.

Prime Time did not search for topics, 'like’ or comment on videos, or engage with content in any similar way. Videos shown by TikTok on the ‘For You’ feed which related to topics like parental relationships, loneliness, or feelings of isolation, were watched twice.

When the experiment concluded, TikTok was provided with the usernames of the three test accounts, and therefore could access all data related to those accounts.

It was also provided with 10 sample screenshots from videos that concerned mental health experts interviewed by Prime Time.

At the time, TikTok told Prime Time that it had removed, or restricted from teenage users’ feeds, seven of those 10 videos.


If you have been affected by the issues raised in this article, visit rte.ie/helplines