In August 2014 a blog entry entitled "The Zoe Post", by the ex-boyfriend of video game designer Zoe Quinn, falsely insinuated that Quinn's game had received a favourable review because of her sexual relationship with a journalist.
The post sparked a co-ordinated misogynistic online harassment campaign against Quinn and other women in the gaming industry which became known as Gamergate.
It was the first widespread, organised trolling and abuse campaign of the internet age and has been credited with initiating the emergence of the alt-right online.
Ten years on, this kind of co-ordinated online abuse happens almost daily and has become standard practice in US politics.
The day after Joe Biden’s disastrous debate performance, long before he dropped out of the race, the online abuse of Kamala Harris started.
Whereas Gamergate started on the relatively obscure websites of 4chan and 8chan, the attack on Ms Harris was immediately more visible on platforms such as X, TikTok and Instagram and even came from the Republican campaign.
There were memes implying a sexual narrative about Ms Harris, manipulated audio falsely presenting the vice-president as slurring her remarks, and altered images.
"I think the Internet has enabled networked misogyny in a way that previously we didn't see at that scale," Claire Wardle of Cornell University, who has been researching disinformation for almost a decade, told RTÉ News.
Ms Wardle says things feel more dangerous now: swatting - a practice where people call in fake threats that prompt a police SWAT team to arrive at someone’s house - and deepfake pornographic videos make any sort of public profile more hostile for women, not just in the world of politics.
Ben Decker, CEO of US threat analysis firm Memetica, says that as social media has developed, it has become normal practice to try to exploit platform and game algorithms to create outrage.
"What used to be dirty political tricks and fighting, we've watched kind of increase at scale," he said.

He recounts the riot orchestrated by former Trump campaigner Roger Stone in Florida in 2000, during the disputed Bush-Gore election recount.
"Back then you had a couple of hundred guys in chinos, but then the internet created the scale that existed on January 6th when you almost had an overthrow of the government."
Disinformation has been central to US elections since the Russian troll farm in St Petersburg tried to influence the 2016 election, but Mr Decker says that 2024 is the first AI election.
"Even in the time since Kamala Harris announced her presidency, the Trump campaign has actually started using synthetic media, AI generated photos, some showing or purporting to show Taylor Swift fans supporting Trump's campaign."
He says that where 2016 had poorly photoshopped Russian memes - images or videos, often humorous, that spread quickly online - this year will bring very realistic images of scenes designed to inspire action.
"I think the difference more than anything else is scale," he said. "You can produce thousands of images at a speed with AI that you could not do with Photoshopped memes."

Former Trump campaign manager Steve Bannon famously said in 2018: "the real opposition is the media, and the way to deal with them is to flood the zone with shit".
He meant that filling the media landscape with so much disinformation makes it difficult for people to tell what’s real and what’s not, thereby undermining the media, which he saw as left-leaning.
Mr Decker says AI allows people to fill that zone at a much larger scale.
"We've already seen Iranian troll networks actually using generative AI content to target US communities," added Mr Decker. "And I think that's very much the canary in the coal mine of how this is going to play out over the coming months."
The other big difference between 2024 and 2020, that both Mr Decker and Ms Wardle point to, is that social media platforms have reduced the size of their trust and safety teams.
In reaction to the foreign interference in the 2016 campaign, companies like Facebook and Twitter invested heavily in teams tasked with detecting and removing disinformation from the platforms.
"There's no doubt that there's foreign interference campaigns happening," said Ms Wardle. "And I feel less confident that they're going to get caught in 2024 than I would have done in 2020. These teams have lost a lot of talent and a lot of institutional knowledge."
As a result of the cuts, companies will rely more on automated systems, which Mr Decker says are by design easier to game.
In many ways the biggest difference with this election is that you don’t need as much disinformation: the country is already so divided that the damage is done.
"I don't think you need as many falsehoods," says Ms Wardle. "You just need content that reinforces people's existing belief systems."
After 6 January 2021, many right-wing voices were banned from mainstream social media platforms like Twitter and Facebook, and migrated to more right-wing platforms with fewer restrictions on abusive content, such as Telegram, Gab or Rumble.

You’ve also had the creation of platforms specifically designed for Republican content like Trump’s Truth Social, and My Pillow’s Frank Social, and then there’s a plethora of podcasts such as Steve Bannon’s 'The War Room'.
This new right-wing media landscape means that tackling misinformation on mainstream platforms might not make much of a difference, as much of the audience has already left.
"You really don't hear the other side at all," says Ms Wardle. "You're not stumbling across anything that might challenge your view."
Ms Wardle’s biggest worry around disinformation isn’t for the election campaign but rather what might happen post-election on 6 November.
"I think there's two things, there will be AI generated videos that suggest that there has been wrongdoing in the voting system, and I also think Trump will use evidence supplied by the Democrats to show that it was a safe vote and claim it was AI generated."
This so-called 'liar's dividend' goes back to Steve Bannon’s idea that you can no longer tell what is real. Not only can you create false content, but you can also claim that any genuine content you don't like has been manufactured.
"Whatever happens on election day, it's going to be so freaking close," says Ms Wardle. "There's going to be this insane opportunity for Trump to say, you can't believe what you saw."