Analysis: social media platforms have a range of tools to address the problem of cyberbullying, but many users may not be aware of them
Many parents and teachers might be wondering what to do when cyberbullying takes place, especially in the aftermath of a recent tragic case where a young girl in Ireland died after years of cyberbullying. Some are calling for legislative action to make cyberbullying illegal and there have been significant efforts in the UK and EU to regulate social media platforms’ liability for harmful content on their sites.
As I have discussed previously, it is necessary to ensure that platforms take responsibility for this matter and that the responses and tools they provide are effective. But we also need to design such regulation carefully, lest it fail to address the problem, as some past legislation has.
Regulation alone will not solve the issue of bullying and cyberbullying, as it is a broader cultural and social problem. Australia introduced legislation under which the government-appointed eSafety Commissioner has the authority to review online content that has been reported to a company but not acted upon. If the commissioner establishes that the content needs to be taken down, the company faces the prospect of a fine if it does not remove the content by a designated deadline.
Last year's review of the 2015 legislation shows that the commissioner worked effectively with the companies to swiftly remove much of the reported content. Something similar has recently been proposed in Ireland.
From RTÉ Radio 1's This Week, Carole Coleman reports on the impact cyber-bullying is having on young people and examines calls for stronger criminal sanctions against offenders
Regardless of the outcome of the calls for regulation, most of the major social media platforms, such as Facebook, Snapchat, Instagram and Twitter, currently provide tools that aim to address the problem of cyberbullying. It is the responsibility of the industry to ensure that these tools are well known not only to children, but also to teachers and parents.
Many will be familiar with blocking and reporting, which allow users to prevent abusive people from further contacting them or to report abusive content to the platform. But fewer may have heard of Safety or Help centres, which provide advice as to what to do when cyberbullying happens and how to assist the child on and off the platform itself.
The EU Kids Online network recently completed surveys with 9 to 17-year-old internet users in 14 European countries. The results from Norway, for instance, where independent use of digital media is very high among children, show that almost two-thirds have used the blocking button, while about 10% said they did not know what it was or had not seen it. Two-fifths have used a reporting button, and about 15% did not know what a reporting button was or said they had not seen one. Only 10% have actually used Safety or Help centres: more than half said they knew what these were, but 35% did not know of them or had never seen them.
From RTÉ Radio 1's Morning Ireland, James O'Higgins Norman from the National Anti-Bullying Research and Resource Centre on how over one in ten schoolchildren have been cyberbullied
Results from the same survey in Serbia also indicate that children are less familiar with Safety and Help centres than with reporting and blocking tools. There, 84% of children have seen blocking options, 76% have seen reporting options and 59% have seen Help or Safety centres.
Some bigger companies, like Facebook, have developed more nuanced tools such as social reporting. Based on neuroscience research, it allows users to reach out to those who they feel have bullied them, using pre-made messages designed to trigger empathy, and ask them to take down hurtful posts. In an earlier small-scale study I carried out in Norway, I found that only 13% of the 152 children surveyed had actually seen this tool.
Of course, awareness is only part of the equation; the other issue is ensuring that these tools are actually helpful. Taking down content may not solve the offline component of the problem and, as focus group respondents pointed out, it may not address the relational difficulty that two or more teens are having.
From RTÉ Radio 1's Drivetime, Tanya Lokot from DCU on what can be done about hate online
Blocking or muting content may leave the victim wondering what is now being said about them behind their back, while reporting to the company may provoke more mocking or even retaliation. Reporting may be perceived as a sign of weakness, and the child may fear that the perpetrator will find out who reported them once their content is taken down, so they may avoid reporting even when they know the option is available. Telling adults that they feel bullied online or excluded from a community may not be an easy step for a young person to take, and adult involvement is not always well handled.
My respondents pointed out that it might also be difficult to persuade the perpetrator to heed advice from the Safety Centre telling him or her to be a good digital citizen and refrain from hurting others. In Norway, 48% of respondents thought that social media platforms were able to help children "a little" when bullying happened, 11% thought platforms were able to help "a lot" and 23% believed they were not able to help at all. These findings highlight the point that bullying and cyberbullying require multi-stakeholder responses and sustained investment in capacity building with everyone involved.
The views expressed here are those of the author and do not represent or reflect the views of RTÉ