Ten platforms to be covered by new online safety code

The code requires social media firms to protect children from harmful content (Stock image)

The media regulator, Coimisiún na Meán, has named the ten video-sharing platforms that will be covered by its new online safety code.

They are Facebook, Instagram, YouTube, Udemy, TikTok, LinkedIn, X, Pinterest, Tumblr and Reddit.

Under the Online Safety and Media Regulation Act 2022, Coimisiún na Meán is responsible for regulating video-sharing platforms which have their EU base in Ireland.

The online safety code is being finalised. Once the rules are established, they will be legally binding and platforms will face fines of up to €20 million for breaches of the code.

Social media firms will have to protect children from specific types of harmful online material including cyberbullying, content that promotes eating disorders and content that promotes self-harm or suicide.

Platforms will also have to prevent the uploading or sharing of a range of illegal content, including incitement to hatred or violence.

The finalised code will form part of Ireland's overall online safety framework, which takes effect from February and also includes the EU Digital Services Act and the EU Terrorist Content Online Regulation.

Because so many of the big tech firms have their European headquarters in Ireland, the country will play a leading role in policing the new EU online safety rules, a task that will fall to Coimisiún na Meán.

The draft online safety code is open to consultation and the public have until 19 January to submit their views on the new rules.

Meta limiting teen access

Meanwhile, Meta Platforms is to hide more content from teens on Instagram and Facebook, after regulators around the globe, including in Ireland, pressed the social media giant to protect children from harmful content on its apps.

All teens will now be placed into the most restrictive content control settings on the apps and additional search terms will be limited on Instagram.

The move will make it more difficult for teens to come across sensitive content relating to suicide, self-harm and eating disorders when they use features like Search and Explore on Instagram, Meta said.

The company said that the measures, which are expected to be rolled out over the coming weeks, are more "age-appropriate" than current settings.