In September 2024, a delegation of Moldovan clerics returned from a trip to Moscow, where they had been lavishly hosted by the Russian Orthodox Church.
On their return to Chisinau, the debit cards they had been gifted in Russia turned out to be loaded with up to $1,200 each.
One of the priests, Fr Mihai Bicu, told Reuters that in return they were expected to create social media channels across Moldova to warn parishioners about the dangers of their government's pursuit of EU membership.
Parishioners created almost 90 new Telegram channels, which pumped out an identical message of hostility to EU membership on a near-daily basis for a year.
This was just one part of Russia’s battery of measures allegedly used to bribe, confuse or intimidate Moldovan voters ahead of the parliamentary election on 28 September.
The Kremlin has denied any involvement.
The party of the pro-Western president Maia Sandu won just over 50% of the vote, but the sheer scale of interference - widely documented by independent media, civil society and Moldova’s police and intelligence services - overshadowed the launch this week of the EU Democracy Shield, the signature policy of Ireland’s EU commissioner Michael McGrath.
"The reality is that as [candidate countries] get closer to EU membership," he told a news conference on Wednesday, "the intensity of the threat they face in terms of foreign interference is only going to grow. Our response has to grow in tandem and match it".
Officials say the Democracy Shield is designed to support frontline accession countries but also to galvanise European societies into recognising the threat posed to democracies.
Despite a blizzard of announcements, many critics have been underwhelmed.
"It was just not very convincing," said Eileen Culloty, Professor and deputy director of the DCU Institute for Media, Democracy, and Society.
"It sounds ambitious on the one hand, but often it's just repackaging what already exists."
Undoubtedly, the EU is scrambling to keep up with both accelerating technology and a more determined Russian hybrid threat.
On another flank, the Trump administration has been hostile to any European efforts to curb its Big Tech allies, the very ones accused of enabling disinformation and undermining European elections.
National elections are also a sensitive area where the European Commission is loath to tread, say officials. Much of what is involved is voluntary.
"This has been a foundational challenge for the EU," said Rasmus Kleis Nielsen, Professor of Communication at the University of Copenhagen and senior research associate at the Reuters Institute for the Study of Journalism.
"Some of these threats come from inside Europe, from political parties and in some cases governments within the European Union itself. It could require the EU to name these threats. That may not be politically possible because it risks casting the EU as a partisan actor in national politics, which is poisonous for its legitimacy."
According to a Moldovan government report circulated to member states, and seen by RTÉ News, Russia spent a year preparing for an all-out disinformation assault on the September election.
At best, Moscow intended to ensure a win for pro-Russian parties, followed by a purge of officials and institutions combating electoral corruption, then fresh elections to secure a stronger majority, derailing the country’s EU membership bid.
Failing that, the campaign would result in a prolonged political crisis, voter apathy and economic instability, with pro-Russian parties benefiting in follow-up elections.
At the very least, pro-Russian parties might lose but claim the election results were fraudulent.
According to the report, from November last year until mid-July, Russia created troll farms, recruiting volunteers to boost hundreds of new Telegram and TikTok accounts. These were designed to foment instability through highly skewed surveys and the manufacture of an artificial energy crisis, followed by a social media campaign across the EU to saturate any Moldova-related topics with disinformation.
As well as bribing Orthodox priests to promote new Telegram accounts, Russia allegedly used cryptocurrencies to pour money into anti-EU campaigns and protests, expanding a vote-buying network first mobilised in 2024, and creating fake "pro-EU" political parties to split the moderate vote.
There were over 1,000 cyber- and ransomware attacks on critical infrastructure, including airports, media organisations, the Ministry of Finance and Moldova’s Central Electoral Commission. According to the BBC, from early 2025 some 90 TikTok accounts posted thousands of videos gathering more than 23 million views and 860,000 likes.
Separately, the European Platform for Democratic Elections (EPDE), a monitoring organisation, reported that nearly 1,000 divisive videos generated 93.1 million views on TikTok in the three weeks before the election.
"In a country with only 2.3 million citizens, these figures indicate a propaganda machine operating at full capacity, flooding the digital space of Moldovans 24/7," the report concluded.
Researchers found certain types of information were dumped into the political debate to crowd out all other discourse.
In a speech on 11 October, commemorating 35 years of the Venice Commission, a Council of Europe body, President Sandu said Russia’s aim was to "seize Parliament, install a Kremlin-controlled government, crush our democracy, drag Moldova into a gray zone and use it against Ukraine and against Europe".
Launching the Democracy Shield this week, Michael McGrath insisted the very intensity of the Moldovan disinformation operation had exposed existing EU tools as insufficiently agile.
"When you have cases of large scale campaigns with rapid consequences affecting public security, more measures are needed to respond in a timely manner," said one senior official.
"Time is of the essence," they said.
A second official said: "In Moldova, one could see the entire arsenal of hybrid threats being very thoroughly deployed. But Moldova is also a very positive example, because it shows how democratic defences can work, and how critical it is to have all the tools and all the actors indeed working together to defend democracy."
As such, the Democracy Shield promises an ambitious, all-of-society approach, from grassroots fact-checkers, to schools, to national governments, so that European democracy itself is primed to respond when a threat is imminent.
A new Centre for Democratic Resilience (CDR) would become a hub coordinating rapid responses, drawing on expertise from national capitals, such as the Swedish Psychological Defence Agency, as well as existing systems for monitoring electoral disinformation.
An informal network of fact-checkers, civil society activists and digital experts will be invited to feed into the CDR once it’s up and running next year.
Even social media influencers, who might have millions of followers, will be encouraged to think about the democratic implications of endorsing a political party instead of a skincare product.
"The key thing is that we better coordinate what is happening," said Michael McGrath ahead of the launch, "that we have more of that capacity within the Centre and that we make it available throughout the EU".
There is a commitment to building media and digital literacy, with greater involvement of civil society.
Officials say the European Media Freedom Act could be strengthened (it’s designed to protect the pluralism of news organisations in the face of rampant free content from social media untroubled by traditional editorial standards, as well as boosting ownership transparency and advertising).
The Commission has promised more funding for public service media.
Yet, the Democracy Shield has drawn accusations of duplication.
The EU’s diplomatic arm, the European External Action Service (EEAS), has already been operating a Rapid Alert System to coordinate responses on disinformation since 2018.
The Florence-based European Digital Media Observatory (EDMO) has been running 15 hubs across 28 countries, coordinating the work of fact-checkers, media literacy experts and open source researchers in combating disinformation.
The Audiovisual Media Services Directive, in place since 2018, already provides for protecting children and consumers, safeguarding media pluralism and combating racial and religious hatred.
"All of these things already exist," said Professor Eileen Culloty of DCU, "[so you’re] saying you're just committing to them properly, rather than creating something new. If you talk to people in civil society, they're pretty sick of being consulted at this point."
That sense of frustration is fuelled by a complaint that existing and legally binding EU weapons - the Digital Services Act (DSA) and the AI Act - are still not dramatically changing the behaviour of large tech platforms.
The DSA, adopted in 2022, obliges platforms with over 45 million monthly users to prevent algorithmically-boosted disinformation from flooding the digital space at election time - precisely the charge levelled at TikTok and Telegram over the Moldovan elections.
Tech companies can be fined up to 6% of global turnover if found in breach.
The AI Act requires that "deployers of an AI system that generates or manipulates image, audio or video content constituting a deep fake, shall disclose that the content has been artificially generated or manipulated".
In other words, at election time, TikTok and others have to either identify deep fakes as fake or take them down altogether.
In Moldova, the government reported a much more aggressive use of AI deep fakes targeting the pro-EU president Maia Sandu compared to the 2024 election, while the European Platform for Democratic Elections identified 93 accounts specialising in manipulative AI content directed "almost exclusively" against Sandu and her Action and Solidarity Party (PAS).
The Moldovan police reported that 80% of disinformation content on TikTok during the election had been AI-generated.
Under the act, which identifies election fraud as one of four "risk categories", the European Commission liaises with national Digital Services Coordinators (DSCs) - in Ireland’s case, Coimisiún na Meán - to take action if Big Tech operators, such as TikTok, Meta, Google, YouTube, Instagram, X and others, are not swiftly tackling massive disinformation campaigns.
In turn, those companies, under a voluntary code of conduct, should cooperate with national regulators.
Michael McGrath said the code of conduct approach would continue.
The Irish Council for Civil Liberties (ICCL) described this as a "missed opportunity".
"Rather than shut down the algorithms that push hate and disinformation into people’s feeds, the Commission will worryingly continue to rely on self-assessment by US tech giants," said the ICCL’s Joe O’Brien.
"The politics of the continent are in crisis, artificially spurred by US and Chinese social media algorithms. The Democracy Shield should have been Europe’s turning point to protect democracy," he said.
EU officials insist that, while still in their infancy and straining to keep pace with the stunning speed of digital innovation, the DSA and other tools are working.
They point to recent elections - Slovakia, Finland, Poland, the Netherlands, and last year’s European Parliament election - where Big Tech, under the guiding hand of national regulators, anticipated risks, took mitigating action and changed content policy, thereby reducing the impact of disinformation and AI deep fakes.
"That would have never happened without the DSA," said one EU official.
However, officials admit the system failed during the Romanian presidential election in 2024, when the ultranationalist, pro-Russia candidate Călin Georgescu came out of nowhere to win in the first round.
After Romanian authorities declassified documents purporting to show Georgescu’s campaign had been boosted by paid influencers and bots, the European Commission launched an investigation into TikTok’s recommendation systems for "coordinated inauthentic manipulation or automated exploitation of the service."
Patryk Jaki, a right-wing Polish MEP, told his opponents: "I don't see this kind of evidence" of foreign interference. "What I see [is] you like censorship."
Either way, the Commission has concluded that the system was not sufficiently agile.
However, the slow pace of enforcement when a big platform is in breach is a further bugbear of MEPs and activists.
The European Commission has opened proceedings against 20 large online platforms since the DSA took effect. Of those, nine - TikTok, X, Microsoft, Google, Pinterest, Meta, LinkedIn, Shein and Apple - fall under the Irish Digital Services Coordinator, Coimisiún na Meán.
The Irish regulator this week launched its first formal investigation under the DSA into whether X users are able to appeal the company’s content removal decisions, whether they are properly informed of the outcome of any such decisions and whether X has an easily accessible internal complaints handling system.
In a statement, Coimisiún na Meán said it "cooperates closely with the European Commission and other EU member state Digital Services Coordinators on the consistent implementation of the DSA for online platforms established in Ireland, as a member of the European Board for Digital Services, and is supporting some of the European Commission’s open investigations".
Undoubtedly, the most high-profile of those investigations has been into Elon Musk’s social media giant X.
In December 2023, the Commission opened an investigation into X’s blue check option, on the basis that the $8 monthly charge was no deterrent to an individual determined to boost disinformation (the Commission has already made a preliminary finding against X). Further investigations focus on whether independent analysts are being shut out of X’s data systems, and whether the company’s "community notes" system of fact-checking, and its policies on hate speech and violent content, are effective.
However, two years on, the main case against X remains unresolved, amid suspicion that the Commission is fearful of alienating President Trump.
"The real gap is enforcement," said Professor Eileen Culloty of DCU.
"All the legislation introduced in recent years, all the existing legislation around elections - why aren't they being enforced?
"We see illegal content, we see electoral interference, we see public service and independent media being attacked, we’re starting to see MEPs questioning the legitimacy of NGOs and environmental groups. If you're serious about defending democracy, why not act on those things rather than add more stakeholder forums and guidance documents?"
EU officials counter that the DSA was deliberately set up so as to militate against Commission over-reach, and that every case will need to be watertight in order to withstand a challenge at the European Court of Justice.
The Seville-based European Centre for Algorithmic Transparency (ECAT) can pore over hundreds of thousands of lines of code, as well as documentation, to figure out if X or others are facilitating the artificial boosting of one particular message to a disproportionate number of voters in a way that renders an election free, but not fair.
Yet, tech companies have very expensive lawyers who will in turn demand thousands of pages of documents and seek to rebut every line of accusation, say officials.
Michael McGrath insists the Democracy Shield will strengthen and update the entire system of preventing electoral interference, by learning from the vivid experience of events like the Moldovan and Romanian elections, and, in due course, better resourcing fact-checkers, civil society, independent and public service media.
"The ambition of the Centre for Democratic Resilience is to do better in terms of situational awareness - this upfront knowledge, and then to support that on the response side," he said.
Other experts suggest there is merit in bundling together new elements reflecting lived experience, so long as they’re well-funded.
"We know fact-checkers make a difference," said Professor Rasmus Kleis Nielsen, of the Reuters Institute for the Study of Journalism.
"We know it's important to have the kind of research that the European Digital Media Observatory does. We know it's a good idea to keep these things separate from public authorities to ensure the independence of such work. These are all sensible, incremental proposals.
"But if public authorities really want to act they need to commit significant funding. It’s not clear the Democracy Shield represents a real qualitative jump in terms of money," he said.
Michael McGrath has said the Centre for Democratic Resilience will be a work in progress, and that further legislation may be warranted as the threat evolves.
He was adamant that none of these measures would amount to censorship, insisting that freedom of speech is a non-negotiable pillar of the EU’s Charter of Fundamental Rights.
That notion has been rejected by far-right groups, while Hungary and Slovakia are expected to oppose the Democracy Shield outright.
Hungarian MEP Kinga Gál, vice-president of the far-right Patriots for Europe group, said on X that "fact checkers" and "politically motivated" NGOs would gain control over public debate.
"This is not about defending democracy, it is about protecting Brussels’ liberal narratives and silencing patriotic voices," she said.
The Irish Commissioner has his work cut out for him.