Facebook founder, chief executive and chairman Mark Zuckerberg will begin two days of answering questions in front of US legislators later today.
With so much controversy surrounding the social network right now, the questions are likely to be many, the topics varied.
In particular, they will want to know more about the background to and the handling of the improper sharing with UK political consultancy Cambridge Analytica of data gathered by a third-party app.
But there is also likely to be questioning around fake news and the alleged manipulative role played by Russian actors in the democratic process in the US and elsewhere.
Last night, Mr Zuckerberg’s opening statement to the joint Senate committee to which he will testify tomorrow was released.
The Facebook boss accepts responsibility for the social network's failure to protect private data and prevent manipulation of the platform.
"But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well," he is set to tell the committee.
"That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy.
"We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry.
"I started Facebook, I run it, and I’m responsible for what happens here."
But contrition, welcome as it is even at this late stage, will be no substitute for explanation and justification of actions or inactions in the eyes of those to whom he will give answers.
So, here are some questions we think they should ask him that get to the nub of what this controversy is all about:
Why did it take Facebook three years to close the loophole that allowed third-party app developers to gather data not only from Facebook users but also from their friends, after it was pointed out to the company in 2011?
When Facebook first launched in 2004, Zuckerberg's vision was that more apps should be social. To allow that to happen, the platform enabled people to log into third-party apps and share who their friends were, along with some information about them. But that loophole ultimately enabled the app "thisisyourdigitallife", built by University of Cambridge academic Aleksandr Kogan in 2013, to reach not only into the profiles of users but also those of their friends, under the guise of a personality test.
That ultimately allowed the data of 87 million people to be accessed, including 45,000 here in Ireland. But the Office of the Data Protection Commissioner here had warned Facebook about the loophole in 2011 and 2012 during routine and follow-up audits. Yet it wasn’t finally closed until 2014. Why was that, and who is responsible?
If "thisisyourdigitallife" had access to huge amounts of user data before the loophole was closed in 2013/14, what other apps had similar access, and what did they do with that data?
Aleksandr Kogan and the team he worked with are clearly clever people. But it stretches credibility to think that they were the only ones who had the idea of gathering data in this way at that time. Facebook has already said it is working to establish this and surely by now it must have a pretty good idea if there are other apps also implicated.
So the committee should ask the Facebook founder how many apps were doing similar things, what data they could have collected, where that data is now, and what it could potentially have been used for. Even in the last 48 hours, another group of apps was suspended from the platform while the company investigates a report by CNBC that they also broke the rules on data gathering and sharing. There surely must be more.
Why did Facebook not tell the public and regulators that personal information gathered by a third party had been improperly shared with Cambridge Analytica when it first found out about it in 2015?
Facebook knew a full three years ago that the "thisisyourdigitallife" app had gathered data which was then provided to Cambridge Analytica. It was told by journalists at the Guardian. Facebook sought and received assurances that the data had been deleted. But if it needed to be deleted, then by definition it was sensitive.
Therefore, why did nobody in Facebook think this was something that those who owned the data (i.e. the users) should be told about? And why were regulators not informed, so that they could independently put pressure on Cambridge Analytica to dispose of the information and verify that this had happened?
Is it not time for Zuckerberg to cede some control of Facebook at the very least, if not resign completely?
In the corporate world, the buck stops at the top. Time and again in the last fortnight, Mark Zuckerberg has said he is responsible for this mess and that he is sorry. But if he is responsible, should he not relinquish some of his control of the social network? Not only is he the founder and chief executive, but he is also chairman of the board and controls 59.7% of the voting stock. In reality, that means he cannot be fired. Not exactly a model of corporate governance for one of the world's largest tech firms.
You could make the argument that he is a brilliant mind and that nobody knows the company better than the 33-year-old, which puts him in the best place to fix the problems. But equally, hasn’t he been best placed in the past 11 years to get it right and foresee what might happen as the company and its influence grew? Last week, he was asked about this by journalists during a conference call and responded that he was the best person to continue leading Facebook because life is about learning from mistakes. But the committee should nonetheless ask once more whether it’s time for a fresh pair of eyes at the top.
Right now, can you say without hesitation that the voting preferences of Facebook users around the world will not be manipulated by fake news on the platform and by the malicious use of their own personal information against them?
Mr Zuckerberg has repeatedly pointed out recently that, along with the US midterms, there are elections in India, Brazil, Mexico and Pakistan, as well as other electoral contests elsewhere this year. Facebook has said there is a huge amount of work to be done to right the wrongs that have been identified, and that this work will take years.
If that’s the case, don’t users and others have the right to know how vulnerable they are to the threat of fake news and the possibility that their personal information is being used to try to manipulate how they vote? At least then they can adapt their behaviour accordingly and be alert to the threats. Otherwise, we risk a repeating cycle over the coming years of claims that the outcomes of democratic processes around the world have been undermined by a social network that some would argue is out of control.
Comments welcome via Twitter to @willgoodbody