Facebook said it is tightening its live streaming video rules after the service was used to broadcast gun attacks on two mosques in New Zealand that left 50 people dead.

In an online post, the social media giant's chief operating officer, Sheryl Sandberg, said many people have "rightly questioned how online platforms such as Facebook were used to circulate horrific videos of the attack". 

"In the wake of the terrorist attack, we are taking three steps: strengthening the rules for using Facebook Live; taking further steps to address hate on our platforms; and, supporting the New Zealand community," she added.

Facebook is looking into barring people who have previously violated its community standards from live streaming on the platform, Ms Sandberg said.

The social network is also investing in improved software to quickly identify edited versions of violent videos or images and prevent them from being shared or re-posted.

Ms Sandberg said: "While the original New Zealand attack video was shared Live, we know that this video spread mainly through people re-sharing it and re-editing it to make it harder for our systems to block it.

"People with bad intentions will always try to get around our security measures."

The company identified more than 900 different videos showing portions of the streamed violence.

Ms Sandberg said the social network is using artificial intelligence tools to identify hate groups in Australia and New Zealand, and that those groups will be banned from Facebook's services.

This week, the social media giant announced it would ban praise or support for white nationalism and white separatism as part of a stepped-up crackdown on hate speech.

The ban will be enforced starting next week on the social network and its image-sharing service, Instagram.