In digital healthcare, who moderates the moderators?

The collapse of The BeatBullying Group raises serious questions about duty of care in digital support services, and about the extent to which such services need to be regulated.

As an independent consultant to the charity’s mental health service, Mindfull, I was aware of the lengths taken to ensure rigorous safeguarding procedures worthy of the trust of users and third-party stakeholders. My experience of working with the charity, and with the staff there, was entirely positive. It’s for this reason that the events of the last few weeks – the sudden closure of services and the secrecy surrounding it – have been such a surprise, to myself and to many others.

The charity’s safeguarding protocol involved a sophisticated hierarchy of moderators, administrators and mental health professionals – all watched by NetMod, a piece of algorithmic technology that picks up on keywords and text patterns. The aim was to provide a comprehensive safety net for users so that they could depend on the service. Apparently, in spite of such measures, there was no safety net for the service itself.
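To illustrate the kind of first-pass filtering described above, here is a minimal sketch of keyword- and pattern-based flagging. This is a hypothetical illustration only: the patterns, risk tiers, and function names are invented for the example and bear no relation to how NetMod was actually built.

```python
import re

# Hypothetical illustration: a minimal keyword/pattern flagger of the kind
# a first-pass moderation tool might use. The patterns and tiers below are
# invented for this example.
RISK_PATTERNS = {
    "high": [r"\bhurt myself\b", r"\bend it all\b"],
    "medium": [r"\bhate myself\b", r"\bno one cares\b"],
}

def flag_message(text: str):
    """Return the most serious risk tier whose pattern matches, else None.

    In a layered system like the one described, a flag would escalate the
    message to a human moderator rather than trigger any automatic action.
    """
    lowered = text.lower()
    for tier in ("high", "medium"):  # check the most serious patterns first
        if any(re.search(pattern, lowered) for pattern in RISK_PATTERNS[tier]):
            return tier
    return None
```

The point of such a filter is not to make decisions, but to triage: algorithmic matching surfaces candidate messages quickly, while the hierarchy of human moderators and mental health professionals provides the actual judgement.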

Since the service was taken offline, communication from The BeatBullying Group’s Trustees has been slim to none. Staff and consultants were aware of delays in paying salaries some months ago, but were unaware that the service was under threat. Emails since the closure have been deliberately vague, speaking of “intense negotiations with a number of third parties”, but revealing no details. For those professionals with concern for the service, all that’s really known is that the Trustees decided it would be better to pull the plug on services than to openly discuss the challenges of keeping them online.

I have no reason to doubt that the charity’s trustees did what they thought right for users, or that they consulted legal professionals and mental health experts before switching off services. But had they sought the views of those of us working for the services, I have no doubt that the idea of switching services off without notice would have received no serious consideration.

At the heart of any digital support service is trust. Those turning to an anonymous service like BeatBullying are often doing so because they find it difficult to find places of trust elsewhere; trust may have been ripped away from them by troubled family lives or abusive parents. BeatBullying promised to be a place of trust and dependability, and the thousands of young people using the service saw it as such. That trust and dependability has disappeared – not just because of the loss of the service, but because of the manner in which it was switched off. To do so without prior notice or consultation has the whiff of a foster parent who has tucked their children into bed and then vanished during the night.

Were it a face-to-face support service or rehabilitation centre with regular day users, closing its doors to users without notice would surely trigger an immediate investigation – particularly for a registered charity with public funding. There would be an explicit duty of care to service users. At present, online support services appear to be held to a different standard, or none at all. For the public and mental health professionals to have faith in such services, this must be addressed.

As for the future of The BeatBullying Group and its services, staff and volunteers continue to wait patiently for news whilst worrying about the young people who were depending on the service. For those feeling let down by the charity, the only consolation any conclusion may bring is the hope that important lessons will be learned.

Anyone needing support can access a 24-hour confidential listening service through