Opinion: proposed legislation to tackle online and social media abuse, harassment and bullying faces significant structural issues
Bullying, harassment, slander, racism and misogyny are all too common across online media. The Irish Government's Online Safety and Media Regulation Bill and the EU's Digital Services Act attempt to address the proliferation of harmful online content. Both rightly address how online platforms like Facebook and Twitter systemically facilitate and shape online communication and interaction in their present formulation. However, both are clouded by two significant structural issues, namely the problem of scale and the problem of regulatory centralisation. The two are interconnected.
The Online Safety and Media Regulation Bill is currently before the Dáil. It aims to provide a means whereby hate speech and illegal content can be identified and quickly removed from online platforms. A new regulatory authority, the Media Commission, will oversee this.
From RTÉ Radio 1's Drivetime, panel discussion on online safety with Catherine Martin (Minister for Tourism, Culture, Arts, An Ghaeltacht, Sport and Media), Barry O'Sullivan (Insight Centre for Data Analytics at UCC); Tanya Ward (Children's Rights Alliance) and Matthew Feldman (Centre for the Analysis of the Radical Right)
Parts of the Bill have their basis in European legislation, which, for the first time, attempts to regulate video-sharing platforms as part of a wider audiovisual environment. Under the Country of Origin principle, Ireland, which is the European headquarters for many tech giants, will manage harmful audiovisual content on video-sharing platforms with significant reach across all 27 member states.
The sheer scale of the content to be regulated entails regulatory centralisation with wide-reaching implications for the rights and freedoms of citizens across Europe. To take an example of how this matters, one of the criticisms of the Bill is that it lacks a direct complaints mechanism, whereby individuals can directly report harmful content or contest the takedown of material. The Australian government has included such a mechanism in its recent legislation.
But as Minister for Tourism, Culture, Arts, An Ghaeltacht, Sport and Media Catherine Martin has argued, such a system might work for Australia's population of 25 million, but cannot work for a population of 450 million. The problem is scale. Half of Europe uses Facebook alone. To regulate content, the regulator will set codes and guidelines for platforms and then monitor and assess how they adhere to them.
From RTÉ Radio 1's Today with Claire Byrne, Alicia O'Sullivan from the Safety over Stigma campaign and Senator Malcolm Byrne on tackling social media abuse
As a means of harmonising European regulation, the Country of Origin principle has worked relatively well in traditional audiovisual media markets, such as the Europe-wide distribution of broadcast television programmes. However, national regulators already exerted considerable influence over television content, and transnational television content rarely travelled beyond particular regions united by geography, language and custom.
Online media platforms are a different beast. Multiple users produce multiple media that can circulate, potentially, on a global basis. Platforms such as Facebook, WhatsApp, Instagram, YouTube, Twitter and TikTok have considerable reach across Europe. No single entity has ever taken responsibility for regulating such wide-reaching communication and interaction. This is regulatory centralisation on an unprecedented scale.
Because the Bill lacks a direct complaints mechanism in Ireland, it will deprive all 27 member states of one. In addition, definitions of harmful content based on Irish legislation do not reflect the cultural and historical diversity of expression across all European states. Moreover, for many member states that already have regulations addressing online harm, this will be deregulation by default.
From RTÉ Radio 1's Today With Claire Byrne, Brian O'Connell talks to Holly Carpenter and Linda Hayden about their experiences with online abuse and harassment
What may be done to avoid these somewhat predictable outcomes? Enter the Digital Services Act, a suite of regulations currently under consideration by the EU that includes oversight of content moderation. It is not directly concerned with harmful online content, but it does provide measures to compel platforms to identify when their systems are being manipulated to spread harmful content.
Under this Act, Digital Services Coordinators in each nation-state will oversee content moderation. In addition, the European Commission will take responsibility for 'very large online platforms' (those with more than 45 million users) from 2024. The latter appears to solve the issue of enforcement but not regulatory centralisation and recognition of citizens' digital rights and freedoms.
Firstly, the transfer of responsibility to the EU for very large platforms does not address harmful online content. It still leaves the public without a direct complaints mechanism. It also contributes to centralised regulation that will dilute the public's rights and freedoms.
It is time for the EU to look for measures, instruments and organisational designs to match the power and resources of global platforms
Secondly, the new Digital Services Coordinators will not integrate existing media regulatory bodies such as the proposed Media Commission. Fragmented supervision risks inconsistencies in the interpretation of overlapping legislation, which will again undermine individual rights and freedoms by generating loopholes and lacunae in the legislation.
It is arguably time for the European Union to look for new measures, instruments and organisational designs to match the power and resources of global platforms with trans-European regulatory capacity. The current proposals amplify the problem with the Country of Origin principle and the impossible balance between harmonising markets through one-stop regulation and building regulatory capacity.
Options may include integrating existing media regulators within the proposed Digital Services Coordinators and strengthening transnational cooperation within pan-European initiatives like the European Regulators Group for Audiovisual Media Services (ERGA). These measures would support citizens' digital rights and freedoms at a national level. In addition, networked national regulators would have the critical mass to regulate harmful online content on global platforms in coordination with the EU.
The views expressed here are those of the author and do not represent or reflect the views of RTÉ