
Internal TikTok documents reveal moderation failures and concerns

The details were revealed as part of a case filed in Louisville, Kentucky (pictured).

Confidential internal TikTok documentation, disclosed and referenced in a US court filing obtained by RTÉ, reveals how some senior employees were aware of potential harms to young users linked to compulsive use of the app.

They also detail the percentage of content that violates TikTok's own rules but is not moderated or removed.

These include, according to documents referenced in the court filings, 35.71% of content categorised as 'Normalisation of Pedophilia'; 33.33% of 'Minor Sexual Solicitation' content; 39.13% of 'Minor Physical Abuse' content; 50% of 'Glorification of Minor Sexual Assault'; and 100% of content categorised as 'Fetishising Minors'.

The details were released in error, after a faulty digital redaction process was used when publishing court documents as part of a case filed by the State of Kentucky Attorney General’s Office.

They were first reported last week by US outlets NPR and Kentucky Public Radio. Following the erroneous publication, the documents were resealed – in other words, removed from public access.

The documents were released in error after a faulty digital redaction process and have since been resealed

Kentucky is one of 14 US states separately suing TikTok alleging that it created an app "designed to addict and otherwise harm minors."

Dates are not listed on all the referenced internal documentation in the filings. Of those for which dates are provided, some are from as recently as May 2022.

In each of the separate lawsuits, information from dozens of internal TikTok communications, documents and research data was released on the agreement that references would be redacted under confidentiality agreements.

TikTok told Prime Time in a statement the US states' "complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety."


WATCH: Internal TikTok documents reveal moderation failures and concerns


The social media giant, which has its EU headquarters in Dublin, is also facing an investigation from the European Commission regarding "protection of minors" and the "risk management of addictive design and harmful content."

Among the details revealed in the US case filings is data from an internal TikTok presentation on a study about moderation of suicide and self-harm content.

The study showed that videos about suicide and self-harm – referred to as 'SSH' – do pass through the initial stages of TikTok's moderation process, and "unmoderated or incorrectly moderated videos can spread widely before being caught."

The moderation stages are referred to as ‘R1’ and ‘R2.’

"The SSH videos that passed R1 and R2 received an average of 75,370 views on TikTok before being identified and removed," according to the study referenced in the filings.

In response to queries about the case, TikTok told Prime Time that "of the content we remove for violating our policies, 99% has fewer than 10,000 views when it is removed."

The court filings also reference an internal presentation by TikTok’s Trust and Safety group, which noted that about "42% [of users] are ‘comment only’ users," but human moderation of comments is "disproportionately low."

"Human moderation for comment review is at 0.25%," according to the presentation, meaning the overwhelming majority of concerning comments never go through human review.

Social media giant TikTok has its EU headquarters in Dublin

The filings also quote from an internal document which highlights concerns about the process whereby content around unhealthy eating and weight loss is moderated.

A document referenced says certain content is labelled as 'not recommended' within the app, rather than removed. In effect, it does not appear in users' feeds, but remains on the app and findable through the search function.

A TikTok user's feed is an algorithmically-driven stream of content which has not been selected for viewing by the user, but is shown to them by TikTok based on content they have previously engaged with or viewed.

Regulators and policymakers in the US and Europe are concerned that TikTok’s powerful algorithm drives young people towards increasingly radical and extreme footage through the content shown within their feed.

They are also concerned the app is addictive. The combination of overuse and exposure to increasingly extreme content is sometimes referred to as "the rabbit hole effect."

Such processes were understood and noted by TikTok executives, according to communications referenced in the court documents.

"The reason kids watch TikTok is because the algo[rithm] is really good," one executive is quoted as writing in internal communications, before adding: "But I think we need to be cognisant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at somebody in the eyes."

On foot of concerns about the rabbit hole effect, the impact on users’ mental health - and to assess how young users may be experiencing the app - TikTok conducted internal experiments whereby employees set up new accounts.

One employee noted "after following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble. The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life."

Similarly, the filings say an internal TikTok report, referred to as the TikTank Report, also found that "compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety."

TikTok told Prime Time "safety is one of the core priorities that defines TikTok."

Publicly, TikTok's response to criticism of the addictive nature of the app and concerns about the rabbit hole effect has been to say that it implements 'well-being measures' and that it is increasingly 'dispersing' content for users.

‘Dispersion of content’ means presenting users with a wider range of content topics through their feed.

For instance, the company said it was extending such measures following concerns raised on foot of a Prime Time report in June.


READ: 13 on TikTok: Self-harm and suicide content shown shocks experts
READ: TikTok completes review into harmful content following RTÉ story


However, the court filings in Kentucky say that experts TikTok consulted in recent years "‘unanimously’ recommended a different strategy, instead of dispersion, to deal with dangerous rabbit holes."

The consulted experts recommended an approach that "increased user agency, and building algorithm changes that afford users the opportunity to find other interesting content that shifts away from a given rabbit hole," the documents say.

Other measures TikTok says it has introduced to counter addictiveness for young users are ‘Screen Time Management’ features, which include a prompt to take a break after an hour of using the app.

Referencing TikTok documents, the court filing from the Kentucky attorney general says this measure "proved to have negligible impact."

"After running an experiment, the company found that the default screen time use prompts reduced the average time per day teens spent on TikTok per day from approximately 108.5 minutes to approximately 107 minutes," the filing says.

Quotes from internal TikTok discussions on that measure are also detailed. In one internal written message, a senior employee said it was not expected to have a significant impact on how long young users spend on the app.

"After discussing these potential tradeoffs with [a senior TikTok executive] he proposed that we can accept a 5% drop in stay time for Screen Time Management features for special user groups like minors and excessive users."

"This should however not come at the expense of retention. That said, we don’t expect significant impact to stay time with this feature since it is only improving awareness and is not an intervention."

Another employee was quoted as saying: "Our goal is not to reduce the time spent, but should improve user experience satisfaction, finally contribute to DAU [daily active users] and retention."

TikTok told Prime Time it has "robust safeguards, which include proactively removing suspected underage users."

The documents filed by the Kentucky AG accuse TikTok of assessing the success of the measures "not by whether it actually reduced the time teens spent on the platform to address this harm, but by three unrelated ‘success metrics,’ the first of which was 'improving public trust in the TikTok platform via media coverage.’"

In response to queries from Prime Time about the contents of the filings, TikTok said it has "robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screen-time limits, Family Pairing, and privacy by default for minors under 16."

The social media giant, which is facing major cases in the US and Europe related to the harms posed to society by content hosted on its platform, also said it was "highly irresponsible" of named media outlets to publish "information that is under a court seal."


Kate McDonald's report on TikTok broadcasts on the 17 October edition of Prime Time at 9.35pm on RTÉ One.