Books

The Moderator: Immersed in a world of dark, disturbing content for €13 an hour

Chris Gray recounts his time working as a content moderator for Facebook, where he says training for policing the internet was inadequate and workers suffered depression and mental illness

Facebook content moderators watch toxic content to decide whether it can be shown to viewers. Picture: Getty

The ethics of social media companies are never far from our minds – just look at the current news cycle regarding Elon Musk’s Twitter acquisition. What often gets lost in these discussions is the human element, such as job losses. Or, in the case of Chris Gray, a former content moderator who experienced post-traumatic stress simply from carrying out his duties.

Gray’s name will be familiar to anybody who’s followed Facebook’s controversies over recent years. ...