In this guest blog, Dr Andrew Hider introduces the key themes of the workshop he’ll be delivering at this year’s Restraint Reduction Conference. Focusing on a new system of data analytics, he examines how it can be used to oversee the use of restrictive interventions and to ensure that clinicians and senior leaders are notified when an incident occurs.
It is verging on cliché to say that mental healthcare is complex to provide, and mental healthcare systems are complex to run. We work in a system that tries to support people with complex problems. In practice, there is often little consensus about the nature, causes and appropriate response to the kind of problems that users of our services experience. Our understanding is limited. There’s much we don’t know, and there’s much that we are probably some way off from knowing.
In the midst of this complexity, our first duty is to be confident that we know what is actually happening in our services, with our eye first and foremost on the safety of the people who end up in our care. We can never hope to prevent all adverse incidents or eradicate clinical errors, but unless we are confident that we know what is going on in the services we work in, and in those that some of us are responsible for leading, we are failing. More of our energy should be spent on oversight.
In the light of the huge opportunity that technology offers to improve the oversight and monitoring of complex systems, it is interesting that the use of technology to support the operational delivery of mental health services remains limited relative to its potential scope. My view is that this may be due to a persistent reluctance to apply quantitative methods. The roots of this reluctance are themselves complex, but it may be characterised as a ‘Tin Man’ issue. Mental health services are relational in nature. We try to heal, and are often healed ourselves, through relationships. Relationships are our technology. Relationships resist quantification – they are messy, entangled, laden with emotion. They are intrinsically valuable, and in mental health services they should be the most salient feature of the environment. Mental health services should run substantially on ‘heart’, and they do, and the heart is often beating fast, activated by compassion, care and adrenaline.
Data, on the other hand, are relational only in an operational sense. To lapse into cliché again, they are cold, hard facts. They are reductive. They transform human behaviour into numbers. Some people see this as a bad thing: as dehumanising, and as inevitably failing to capture the rich complexity of human beings. Many clinicians hate numbers, hate graphs, hate data-driven decisions. We act on what we see, and when you are with a distressed person who is asking you for help, your first response wouldn’t (and shouldn’t) be to turn on a computer and run an algorithm. If you did, you’d be like the Tin Man. The Tin Man doesn’t have a heart, and that’s scary.
Human behaviour, though, is part of nature and – if psychology has taught us anything in its first 100 years – it is often predictable. When you start looking at the behaviour of large numbers of people, patterns start to emerge (the workshop will show some examples of this). Since we are part of nature, why wouldn’t they? People living in closed systems, exposed to certain sorts of relationships, tend to behave in similar ways. This is one of the messages of ‘Positive and Proactive Care’ (or certainly one of its underlying assumptions). Closed systems characterised by a high degree of coercion, where one group of people holds power over another, can easily degrade and become abusive. People who feel unfairly controlled tend to fight against control; fighting in an active way is what we call ‘violence’.
So, assuming that we will always sometimes need, for clinical reasons, to keep people safe through coercive means when they are distressed and behaviourally unsettled, we should be using our knowledge about human behaviour to oversee the systems where this type of coercion occurs. This means using data, and with technological advances we now have the opportunity to use granular data far more comprehensively, and to oversee the behavioural status of a mental health organisation in more detail:
- To analyse the impact of individual and system wide interventions.
- To check in detail when and where behaviour causing safety issues is occurring.
- To take a properly analytic approach to the conditions most likely to give rise to violence and restrictive intervention in our services, and, potentially, to use predictive algorithms to give us probabilistic estimates of the safety status of our services over defined time periods in the future.
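To give a flavour of what this kind of analytic oversight can look like at its simplest, the sketch below applies a basic statistical control limit (a C-chart heuristic) to weekly counts of restrictive interventions and flags any week that exceeds it. The incident counts, the weekly granularity and the choice of control chart are all invented for illustration; they are not the author’s actual system.

```python
# Minimal sketch: flag weeks where restrictive-intervention counts exceed
# a simple control limit (mean + 3 * sqrt(mean), a C-chart heuristic).
# All incident counts are invented for illustration.
from statistics import mean
from math import sqrt

weekly_counts = [4, 6, 5, 3, 7, 5, 4, 14, 6, 5]  # hypothetical weekly incident totals

baseline = mean(weekly_counts)                  # average weekly count
upper_limit = baseline + 3 * sqrt(baseline)     # C-chart upper control limit

# Collect (week number, count) pairs that breach the limit
alerts = [(week, count)
          for week, count in enumerate(weekly_counts, start=1)
          if count > upper_limit]

for week, count in alerts:
    print(f"Week {week}: {count} incidents exceeds control limit ({upper_limit:.1f})")
```

A real system would of course segment by ward, site and intervention type, and route the alert to the relevant clinicians and directors rather than printing it; the point is only that a defensible alerting rule can be very simple.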
My presentation will discuss the development of an operational system that allows live, fully granular oversight of behavioural data in a multi-site mental health organisation, and that automates feedback of data to service users, clinicians and directors in order to support clinician- and board-level responsibility and accountability for monitoring the use of restrictive interventions. I will talk about experiences of using this system in the day-to-day operational delivery of both clinical and governance functions. I’ll end by discussing potential future developments in the use of predictive analytics to detect risk signals in large multivariate data sets – signals that might help us identify early on when systems are developing problems, and that may assist clinicians and boards to prioritise resources and take pre-emptive action to improve safety.
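One common approach to the kind of early-warning signal mentioned above is to smooth a noisy daily series and alert on sustained drift, rather than waiting for a single period to breach a hard limit. The sketch below uses an exponentially weighted moving average (EWMA) for this; the daily counts, the smoothing factor and the threshold are all hypothetical, and this is one illustrative technique, not the system the workshop will describe.

```python
# Hypothetical sketch of an early-warning signal: an exponentially weighted
# moving average (EWMA) of daily incident counts, flagging sustained upward
# drift before any single day looks obviously abnormal.
# All figures are invented for illustration.

def ewma_alerts(counts, alpha=0.3, threshold=8.0):
    """Return the 1-based days on which the EWMA of counts exceeds threshold."""
    smoothed = None
    alerts = []
    for day, count in enumerate(counts, start=1):
        # EWMA update: new value weighted by alpha, history by (1 - alpha)
        smoothed = count if smoothed is None else alpha * count + (1 - alpha) * smoothed
        if smoothed > threshold:
            alerts.append(day)
    return alerts

daily_counts = [5, 6, 5, 7, 8, 9, 9, 10, 11, 12]  # invented gradual rise
print(ewma_alerts(daily_counts))
```

Because the EWMA carries memory of recent days, a slow upward creep trips the threshold even though no individual day is dramatic – which is exactly the kind of pattern a board would want surfaced early.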
All of the above is very technical – yes – but behavioural data should be part of the beating heart of our mental healthcare systems, not our Tin Man. Clinicians should perhaps start seeing it as an aid to humane and compassionate care, rather than as a reductive enemy suited only to the IT department and best kept well away from the therapy room.
Dr Andrew Hider is a Consultant Clinical and Forensic Psychologist and is Clinical Director of Ludlow Street Healthcare Group.