Cutting content in a crisis

The responsibility for maintaining online safety rests largely with content moderators, particularly in times of crisis. However, not all platforms have moderation systems in place, and so disinformation, misinformation, propaganda, and fake news often circulate freely. The COVID-19 pandemic was a case in point, but fake news also spreads during periods of political change and in the wake of other crises and socioeconomic upheaval. Moreover, much online content is not merely fake but illegal, and such material must be removed summarily.

Some social media platforms and websites do have individuals, and even teams, tasked with checking user-generated content to ensure it does not contravene the law. Elena Martellozzo, Paula Bradbury, Ruth Spence, and Jeffrey DeMarco of Middlesex University, London, UK, and Paul Bleakley of the University of New Haven, West Haven, USA, point out that during and after the COVID-19 pandemic there was a surge in the volume of illegal content. They report their findings and the implications in the International Journal of Technology, Policy and Management.

The researchers looked at the experience of content moderators during this period, and their findings offer new insights into how this important online role can affect moderators’ mental well-being. Indeed, the upward trend in illegal material being shared online, exacerbated by lockdown measures during the pandemic, put content moderators under immense pressure. They faced a heightened risk of burnout, mental health problems, and even trauma, particularly when moderating certain kinds of illegal content. The findings suggest there is an urgent need to improve both the working conditions and the support systems available to such moderators.

Lessons drawn from the pandemic era should give service providers and their staff, including content moderators, useful guidance on improving working conditions. Employers must prioritize mental health support, fair compensation, and comprehensive training, the research suggests. This is especially important given the role content moderators play in removing illegal content from the internet.

The researchers add that clear communication, professional development opportunities, and tailored support mechanisms are important considerations for employers and service providers, particularly for staff working remotely or in hybrid arrangements.

Martellozzo, E., Bleakley, P., Bradbury, P., Spence, R. and DeMarco, J. (2024) ‘Supporting digital key workers: addressing the challenges faced by content moderators during and after the COVID-19 pandemic’, Int. J. Technology, Policy and Management, Vol. 24, No. 2, pp.212–228.