Opinion: Instagram's algorithm amounts to large-scale censorship, disproportionately affecting accounts belonging to women and the LGBTQ community.
Recently, Facebook announced that Instagram’s algorithm had changed with immediate effect to reduce the reach of posts that don’t violate community guidelines but are still deemed inappropriate. While they didn’t give specific examples of what constitutes an inappropriate post, they did provide some vague categories like violence, misinformation, hate speech, and sexually suggestive content.
The way it works is that Facebook, Instagram's parent company, is using machine learning to flag posts deemed "non-recommendable" and demote them, so they see only a fraction of the organic reach they normally would. It has also started deleting popular hashtags in those areas, so people are less likely to find those pages.
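Instagram has not published how this demotion actually works, so any concrete description is guesswork. Purely as an illustration, the logic described above could look something like the following sketch, in which the classifier name, threshold, and demotion factor are all invented for the example:

```python
# Hypothetical illustration only: Instagram has not disclosed its real
# model, score thresholds, or demotion factors.

def demoted_reach(base_reach: int, borderline_score: float,
                  threshold: float = 0.5, demotion_factor: float = 0.1) -> int:
    """Return organic reach after a hypothetical 'non-recommendable' check.

    borderline_score is an invented classifier output in [0, 1] estimating
    how likely a post is to be flagged as borderline content.
    """
    if borderline_score >= threshold:
        # The post breaks no rules, but is quietly suppressed: it is
        # shown to only a fraction of its normal audience.
        return int(base_reach * demotion_factor)
    return base_reach

print(demoted_reach(10_000, 0.2))  # below threshold: full reach of 10000
print(demoted_reach(10_000, 0.8))  # flagged: demoted to 1000
```

The point of the sketch is that the suppression is invisible to the poster: nothing is removed and no notice is given, the audience simply shrinks.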
This isn’t the first time Facebook has made changes like this. In recent years they’ve dramatically reduced organic reach to business pages on their own platform in order to boost ad sales. However, many fear that this latest algorithm change on Instagram’s platform amounts to large scale censorship.
The worry is not about the reduction of spam and fake news - I think we can all agree that less of that would be cause for a sigh of relief. It is the "sexually suggestive" category that should concern us all.
This means that any post, even one containing no explicit nudity, is up for scrutiny. Unfortunately, the people who will be affected by this sort of system - and, let's face it, always are - are women and LGBTQ people. For example, Jade Beall (@jadebeallphotography) is a photographer who creates intimate nude portraits of individuals, couples, and mothers breastfeeding their babies. Within days of Instagram's announcement, Jade's account became difficult to find: it would only appear in search results if every single character of her handle was typed correctly.
Intentionally or not, the new algorithm judges us by our bodies. Many plus-size women receive abuse and trolling for the selfies they post, especially if those photos show off their skin. Sarah Tyrrell (@sarahandselflove), a writer and podcast host from Dublin, and Kit Richardson (@coquettewonkette), a sex educator and human rights activist from New York, both had selfies taken down multiple times after trolls reported them as offensive.
While no one apart from the developers at Instagram knows for sure how the algorithm judges content, it is highly likely that those takedowns flag both of their accounts for further scrutiny. Therefore anyone who falls outside the most acceptable body type - slim, white, able-bodied, fitting nicely within a gender binary - is much more likely to be the victim of judgement, both from other users and from the platform itself.
Using an algorithm and machine learning for what they admit are grey areas of content is foolish because it robs those photos of their context and allows a machine to make value judgements on what’s too sexy, who’s sexy, and what is offensive. People who are sharing their bodies to promote body positivity, art, feminism, or just to show off a hot outfit they’re proud of, are all having their experiences scrubbed from a platform that 1/7th of the world uses to share their lives.
In recent years we have cheered on young women in the US and UK who have pushed back against unjust school dress codes, the burden of which falls unfairly upon the very shoulders they are told to cover up, lest they become a distraction to other students. We now have a similar dress code imposed upon us on a global scale by a tech company that conflates human bodies with hate speech and violence.
Mark Zuckerberg’s companies have huge amounts of power to shape Western society, and while they’ve created huge advances in photo sharing and recognition technology, any algorithm they write is going to have the same biases as the developers who create it. That is why it is so important for the end consumers, like us, to make our voices heard across all platforms when they decide they don’t want us to be seen on any of them.
Shawna Scott is a sex educator and owner of Ireland's multi-award winning, sex-positive online boutique www.sexsiopa.ie.
The views expressed here are those of the author and do not represent or reflect the views of RTÉ.