Instagram to alert parents if teens search suicide terms

Instagram has announced a new safety feature that will notify parents if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time.

The alerts will start for parents who use Instagram's parental supervision tools in the US, UK, Australia, and Canada next week, and will become available in Ireland and other regions later this year.

Meta, the parent company of Instagram, said it is also building similar parental notifications for teenagers' conversations with AI, to come later this year.

Social media companies have faced years of criticism for the algorithms that funnel content into users' feeds.

These systems are frequently branded as "toxic" by researchers and campaign groups, who say they promote posts that are inappropriate and harmful for children, such as content relating to suicide, self-harm and eating disorders.

Instagram said the new alert system announced today is the latest addition to its Teen Accounts protections and parental supervision features.

The alerts will be sent to parents via email, text or WhatsApp, depending on the contact information available, as well as through an in-app notification.

Tapping the notification will open a full-screen message explaining that the teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time.

"The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support," Instagram said.

"These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen," it added.

The platform said it wants to avoid sending notifications unnecessarily, as alerts triggered too often could make them less useful overall.

"In working to strike this important balance, we analysed Instagram search behaviour and consulted with experts from our Suicide and Self-Harm Advisory Group," Instagram said.

"We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution," it added.

Instagram said it has strict policies against content that promotes or glorifies suicide or self-harm.

"While we do allow people to share content about their own struggles with these issues, we hide this content from teens, even if it’s shared by someone they follow," the platform said.