Instagram has introduced a feature that asks would-be online bullies if they are "sure" they want to post something inappropriate to another user, in an effort to cut the amount of abuse on the platform. 

Instagram now uses artificial intelligence to spot comments that could be bullying in nature, analysing text and highlighting posts that fit the patterns of comments most often flagged as inappropriate by users. 

In the examples given by Instagram, a bully could type "you are so ugly and stupid" and receive a prompt reading "Are you sure you want to post this? Learn more" before posting the comment. 

If this user taps "learn more", a notice reads: "We are asking people to rethink comments that seem similar to others that have been reported."

Elaborating on the move in a blog post entitled "Our Commitment to Lead the Fight Against Online Bullying", the firm's chief executive, Adam Mosseri, said the company must do more on the matter. 

"We can do more to prevent bullying from happening on Instagram, and we can do more to empower the targets of bullying to stand up for themselves," Mr. Mosseri wrote.

"These tools are grounded in a deep understanding of how people bully each other and how they respond to bullying on Instagram, but they're only two steps on a longer path."

He added that while users can post negative comments if they want to, being confronted with a notice like the one above gives them a chance to reflect on their actions.

"From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect," Mr Mosseri wrote. 

According to the BBC, Instagram is introducing the feature in English-speaking regions first, with plans to roll it out globally in the future. 

Instagram also announced another feature intended to give bullied users control over the kind of interactions they receive from their bullies. 

Called Restrict, it is intended to help teenagers and vulnerable people filter out bullying comments from those they feel they cannot outright block or report. 

This may apply to teens who attend school with their bullies, or those who see their bullies in everyday life. 

"We've heard from young people in our community that they’re reluctant to block, unfollow, or report their bully because it could escalate the situation, especially if they interact with their bully in real life," Mr Mosseri wrote.

Similar to how people can "hide" their Instagram stories from certain followers, a person who is restricted will not know it. Once restricted, their comments will be visible only to themselves, and they will not be able to see when the person who restricted them has read their messages or is active on Instagram. 

The features come amid increased pressure on Instagram, and on social media platforms more broadly, to curb the amount of abuse that takes place on their channels. 

The death of 14-year-old Molly Russell, who took her own life, has added to this pressure: her father recently said that distressing content about depression and suicide on Instagram was partly responsible for his daughter's death.

Instagram has also tried to reduce the negative impact of social media on mental health by proposing to remove the "likes" on comments. The feature was trialled in Canada and is intended to minimise the pressure people feel to post content for validation based on the number of likes it racks up.