Facebook says it supports "the overall goals" of the proposed online safety bill, which threatens to fine or even block technology firms that do not comply with online safety codes.

Monika Bickert, the social media firm's vice president for global policy management and counter-terrorism, said she hoped the company would be "very involved" in the discussions around the bill as it progresses.

Ms Bickert was speaking in Dublin following the publication of Facebook's white paper on content moderation worldwide - which it says is a contribution to the discussion around regulation, rather than a set of proposals.

In it the company outlines a number of potential approaches to content regulation, but also warns about the knock-on effects that could come with certain rules.

However, the document was immediately criticised by European Union officials, with industry commissioner Thierry Breton describing it as "insufficient".

He also said Facebook needed to adapt to European standards and not the other way around.

Ahead of the white paper launch Facebook CEO Mark Zuckerberg suggested the firm should face a level of regulation somewhere between that faced by a telecoms firm and a newspaper.

In its white paper the firm suggests the need for clearer rules around social media sites' policies, as well as better structures for users to report issues and appeal decisions.

However, it also warns that overly prescriptive rules could have unintended consequences.

For example, Ms Bickert suggested that setting time limits on responding to reported content could force firms to prioritise complaints based on when they were received, rather than the nature of the content itself.

Facebook has been accused of acting too slowly to deal with issues, including the attempted manipulation of elections, the publication of exploitative content, hate speech and threats of violence on its platforms.

It has also been accused of trying to push the issue of moderation on to governments and regulators, when it is something the company could deal with itself.

Following a meeting with Mr Zuckerberg, Vera Jourova, the European Commission's vice president for values and transparency, said Facebook could not "push away all the responsibility".

"It will not be up to governments or regulators to ensure that Facebook wants to be a force of good or bad," she said.

Ms Bickert said Facebook had invested heavily in improving its handling of content and had refined its processes over the years.

She said the company now had 35,000 people working on safety and security worldwide, and that it was constantly refining its algorithms to try to catch offensive content as quickly as possible.