Analysis: Replacing independent fact-checkers with social media users to moderate online posts raises many questions about content moderation
In the modern media landscape, control over the conversation is in the hands of ordinary people in a way not seen since the invention of the printing press. Social media has opened up the means of deliberative discussion to the entire world. But it has also fuelled polarisation, distrust and misinformation on an unprecedented scale.
It is clear that traditional approaches to moderating posts – primarily the use of independent fact-checkers – are an inadequate response to this issue. Elon Musk's site, X, has offered a radically new approach: it is now users who moderate each other's posts. This move by X, and by other social media sites, raises important questions about content moderation online, as seen particularly clearly in the context of this year's Irish presidential election.
From RTÉ Radio 1's The Business, is there enough being done to attract people to the notoriously challenging field of content moderation? With Dr Robbie Smyth from the Faculty of Journalism and Media Communications at Griffith College and former content moderator Chris Gray
What is a community note?
A 'community note' is a small text box appended to a post that X users, known as Community Notes 'contributors', have suggested needs additional context. One contributor proposes a note and other contributors, and the wider X user base, then rate this proposed note as either helpful, somewhat helpful or not helpful. It was originally pitched as a complement to X's traditional, third-party content moderation.
The posting of a note requires a ‘robust consensus’ across the political spectrum, understood by X to mean that those who typically disagree on their ratings of a note must agree that a note is helpful. Conversely, if users agree a note is not helpful, it is ditched. The vast majority of notes, meanwhile, end up stuck in the rating process, unable to reach a consensus as either helpful or not helpful (more on that later).
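For readers curious about the mechanics, the sketch below illustrates this rating logic in simplified form. It is an illustration only, not X's published algorithm: the real system infers each rater's viewpoint from their rating history rather than from known labels, and the clusters, thresholds and function names here are all hypothetical.

```python
# A minimal sketch of the 'bridging consensus' idea behind Community Notes.
# Illustration only: the real system infers raters' viewpoints from their
# rating history, and X's actual thresholds are not public.

from dataclasses import dataclass

@dataclass
class Rating:
    cluster: str  # simplifying assumption: each rater's viewpoint cluster is known ("A" or "B")
    score: float  # helpful = 1.0, somewhat helpful = 0.5, not helpful = 0.0

def note_status(ratings: list[Rating], threshold: float = 0.7, min_per_side: int = 3) -> str:
    """Publish a note only if raters who typically disagree both find it helpful."""
    sides: dict[str, list[float]] = {"A": [], "B": []}
    for r in ratings:
        sides[r.cluster].append(r.score)

    # Too few ratings from either cluster: the note stays stuck in the rating process.
    if any(len(scores) < min_per_side for scores in sides.values()):
        return "needs more ratings"

    avg_a = sum(sides["A"]) / len(sides["A"])
    avg_b = sum(sides["B"]) / len(sides["B"])

    if avg_a >= threshold and avg_b >= threshold:
        return "helpful"         # shown under the post
    if avg_a <= 1 - threshold and avg_b <= 1 - threshold:
        return "not helpful"     # ditched
    return "needs more ratings"  # the common fate of notes on divisive posts

# A politically contested note: one cluster rates it helpful, the other does not,
# so it never reaches the cross-cluster consensus needed to be shown.
contested = [Rating("A", 1.0)] * 5 + [Rating("B", 0.0)] * 5
print(note_status(contested))  # -> needs more ratings
```

The sketch makes the failure mode discussed later concrete: when the two clusters split along political lines, a note lands in neither bucket and simply waits.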
After Musk's takeover of the platform and his dismissal of content moderators, Community Notes has quickly become central to the billionaire's grand plan to make X "the best source of truth on Earth". The system is now being rolled out on other social media sites too, including YouTube, Meta (owner of Facebook, Instagram and WhatsApp) and TikTok.
From AP, Meta to replace fact-checking with X-style community notes
Social media and political campaigns
Before this year's presidential election campaign got underway in earnest, social media had already made an impact in the shape of former MMA fighter Conor McGregor, who pulled out of the presidential election contest via a post on X in September.
A recent report by researchers at the Irish hub of the European Digital Media Observatory (EDMO) found that more than half of McGregor's X posts related to the presidential election contained false information, while over 28% contained ethnonationalist rhetoric. However, only 23 of McGregor's posts had a proposed Community Note, and none had reached the consensus required for a note to be shown on the site.
Why do community notes fail to stop misinformation?
To understand why Community Notes fails to stop misinformation, a comparison can be made with the audit, a key concept from the discipline of accounting. Community Notes largely resembles an audit: both involve the checking and verifying of information by a third party to create trust in that information.
Compared to a conventional audit, Community Notes misses two key aspects of creating trust: independence and expertise. Auditors are independent of the companies they investigate and are bound by a professional code of ethics which keeps them accountable. They also receive rigorous specialised training which allows them to adjudicate on – and be trusted to adjudicate on – complex issues. Social media users, on the other hand, are not trained to identify misleading information – far less to adjudicate on it. Nor are they independent: on social media we act in a personal capacity, sharing opinions on contentious issues which affect our lives.
Our era of political polarisation reveals a perhaps more important distinction between traditional audit and its manifestation in Community Notes: the nature of consensus. Unlike the relatively apolitical concerns of audit, social media is inherently political and, when it comes to the political, the "truth" largely depends on your viewpoint. Adjudicating on issues of intense political debate is therefore near-impossible.
Any consensus that can be reached must be understood as partial, interested and often biased in some way. This matters because the Community Notes system breaks when users cannot reach a consensus: as discussed, notes are only published when users who generally disagree with each other can agree, and notes which cannot reach consensus remain stuck in an interminable rating cycle. To seek consensus on that which, by definition, cannot be agreed upon, is a recipe for disaster.
This is clearly true of Community Notes. Last year the Centre for Countering Digital Hate (CCDH) analysed over one million notes related to the US election, concluding that "Community Notes are particularly poor when it comes to divisive topics where it's rare for contributors from a range of political leanings to reach consensus". This means that some of the most divisive posts go unnoted, stuck in the rating process because the "truth" of the issue is itself politically contested. On social media we deal with political issues, and political issues cannot be 'solved'.
Political misinformation is not a new phenomenon, but Community Notes, in promising a solution which is so ineffective, represents a particularly dangerous turn of events. While X boss Musk presents Community Notes as a relinquishing of power by social media executives to 'the people', it is rather an accountability sink, to borrow a phrase from Dan Davies’ recent book The Unaccountability Machine.
As a system, it allows X to claim that it is 'empowering users' while, at the same time, absolving itself of responsibility for the content posted on its site. In focusing on consensus across political divides, it is particularly poorly equipped to handle the scale of misinformation afflicting our increasingly polarised political environment. When it comes to political issues, Community Notes is bound to fail.
The views expressed here are those of the author and do not represent or reflect the views of RTÉ