Sarah Magliocco explores the rise of "algospeak", a form of online language that is thriving on content filter-restricted social media platforms, and how it might affect how we speak day to day.

The study of etymology has been developing for hundreds of years, tracing how words slowly and colloquially shift and change. As a social media user and eternal scroller, I've watched some of those changes become embedded in the content I consume every day, and they have not gone unnoticed.

With the high-speed evolution of the internet and social media, there are new words and viral phrases being introduced to us daily, some of which will never make it into the day-to-day speech habits of the masses, while others become ingrained in the cultural lexicon.

Terms like 'slay', 'boomer', and 'it's giving' - while all originating from preexisting communities - have been taken and popularised online, becoming part and parcel of casual conversation both online and off.

Language changing is nothing new, but with the introduction of content moderation online, people who want their content to be seen risk down-ranking or removal if they don't adapt their speech to fall in line with the preferences of The Great Algorithm.

This has caused the accelerated development of a censorship-avoidant code that constantly shifts to evade software that is trained to catch up and catch you out - aka, the reason half the time you’re scrolling nowadays you haven’t a clue what people on TikTok are on about, and why hundreds of explainer articles have been written on deciphering what "the kids" mean when they say a certain abstract phrase.

While slang terms come and go, this new turn of phrasing has been dubbed "algospeak" - a shortened term for algorithm speak. This is different to regular internet slang because of its reason for existing - to be avoidantly expressive.

Most people may have noticed it first during the earlier months of the Covid-19 pandemic. With disinformation about the virus and vaccinations rife, many social media platforms locked down on accounts discussing the pandemic, adding a disclaimer banner to content using the words "Covid-19" or "pandemic."


Social media users adopted new ways of saying these words without quite saying them, in order to avoid their content being penalised by moderation algorithms. You may have noticed "the panini" or "the panorama" or even "the panDemi Lovato," depending on what kind of pop cultural online circles you run in, being used as a stand-in for pandemic.

Over on YouTube, creators have been complaining for years about demonetisation, i.e. the removal of advertisements from videos deemed controversial or not family-friendly, leaving the creator unable to earn ad revenue from them.

With TikTok now creeping up in popularity to run alongside longstanding top dogs like Facebook, YouTube, and WeChat, some of the most visible examples of algospeak exist on the platform.

The algorithm rewards content that gets the most engagement, and oftentimes, that can be the most controversial content. To maintain that engagement without triggering any content filters, creators have lived up to their name and gotten, well, creative with their means of expression, using euphemisms and omissions to avoid words that might get their content restricted.

A short glossary of recent algospeak that can be seen on TikTok:

Accountant - a code word for sex worker. A number of sex workers share discussions of their life on the app, but use phrases like "a day in my life as an accountant" to avoid having their content shadow-banned due to the adult references.

Unaliving* - a code for killing or being killed. Many true crime content creators use this code when discussing cases, for example: "The suspect was apprehended by police after unaliving his wife."

Going camping - This code phrase came into existence after the overturning of Roe v Wade in the USA, which gave states the ability to regulate or ban abortion. People in states where abortion is still legal posted videos offering their support and a place to stay for people who were "going camping", i.e. seeking a termination in a different state after reproductive rights were removed in their home state.

Corn emoji star emoji - This one is hard to articulate via the medium of the written word, but the ear of corn emoji followed by the star emoji means porn star or pornography, which is probably the easiest to decipher on this list. In a similar vein, online adult content creators often refer to their OnlyFans profiles as their "Just Friends" or "Only Friends" links.

Mascara - The word mascara has recently been used to discuss the topic of sex on TikTok. The code went viral due to the confusion it caused many people, who were unclear as to whether or not posters were really talking about mascara at all - plot twist: they were not.


This particular code word landed it-girl and actress Julia Fox in a spot of hot water recently, when she commented "Idk why but I don’t feel bad for you lol" on one TikTokker's video. The video read: "I gave this one girl mascara one time and it must’ve been so good that she decided her and her friend should both try it without my consent."

Julia was called out hugely for being insensitive, and apologised, explaining that she genuinely was unaware the term mascara was being used to describe sex, or in this case sexual assault. "Hey babe I’m so sorry I really thought you were talking about mascara, like, as in makeup. I’m sorry that happened to you," she said after her misinterpretation was pointed out.

While algospeak has not yet truly jumped off the pixelated screen and into the casual vocabulary of the masses, we have to consider if it might embed itself into the spoken word out of habit and by happenstance.

As people across generations share the same social spaces for the first time via online platforms, hyperfast slang may no longer be the sole domain of teenagers. The 90s movie trope of the Californian Valley Girl whose every sentence is inflected with 'like', 'totally', and 'radical' may come to mind when those who are not chronically online hear algospeak in the wild, as the unusual phrases sound alien when taken offline.

Taking these phrases offline would be a form of linguistic accommodation, a concept whereby the listener adjusts their own speech patterns in response to the behaviour of the speaker, as described in Linguistic Accommodation by David Beaver and Kristie Denlinger.

However, it seems unlikely that algospeak will become as popular as other online slang. Its codes and phrases are often short-lived thanks to the intelligence of the very algorithms they seek to evade, and because of their milk-like shelf life, creators are constantly having to develop new phrases to express non-family-friendly topics.

One element of algospeak that needs to be mentioned is that while many people use it to keep money in their pockets from content creation, it can also be used by marginalised communities to prevent their voices from being lost.

People from BIPOC and LGBTQIA+ communities have already pointed out that algorithms impact their ability to express their lived experiences, as words they have created or reclaimed are suppressed - most algorithms do not have a human understanding of the social context of words.


In one 2019 study from the US, researchers found that AI models used to filter hate speech were more likely to flag as offensive tweets written by African Americans and tweets written in African American Vernacular English (AAVE).

TikTok acknowledged in 2020 that it restricts LGBT-related hashtags in some countries as part of its "localised" approach to moderation, but said in a statement to the BBC that it is "deeply committed to inclusivity".

Another example: creators who make sexual health and sex education videos complain that their content is taken down for violating community guidelines, while skits about sex from famous comedians and stealthy fetish videos remain up and rake in millions of views.

While it may be jarring to hear algospeak in videos that your brain can’t quite make sense of, it can’t be ridiculed as simply a trend or passing fad. Important content may never be made or seen without it.

While algospeak may have originated for financial reasons, it can be a lifeline for those wishing to express themselves without the risk of having their content quashed.

The views expressed here are those of the author and do not represent or reflect the views of RTÉ.

* If you have been affected by issues raised in this story, please visit: