has a legal duty to monitor the deprivation of liberty safeguards. Regulations give them powers to visit hospitals and care homes, interview patients or residents, and inspect the paperwork in relation to anyone who is deprived of their liberty under the safeguards. They are required, by law, to produce a report for parliament, as soon as possible after the end of the financial year, ‘when requested to do so by the secretary of state’.* The safeguards came into force in April 2009. It is now March 2011, and this is the first report we have seen by CQC (here). There have been other reports, though not by the CQC. The NHS Information Centre produced what is essentially a statistical report into the DoLS last year (here); it told us that there was very low and uneven uptake of the safeguards, but offered little analysis of why that might be the case. The Mental Health Alliance report last year (here) helped to fill that vacuum of analysis; they drew on the considerable pool of experience of their members to paint a picture of a legal framework that had ‘barely begun to function’. Key problems identified by the Alliance included poor understanding of the safeguards by both managing authorities and supervisory bodies, and in some cases outright resistance to their application or manipulation of their provisions. They stressed that there were many disincentives to their use – excessive bureaucracy, unwelcome additional scrutiny of care practices, the empowerment of detainees and relatives who might object to care plans – coupled with very little real threat of litigation if they were not applied, or were applied inappropriately. Some might say that this was a situation where a regulator, with oversight not only of the managing authorities of care homes and hospitals but of the supervisory bodies themselves, could have a powerful impact on compliance.
Today the CQC published their report. When the CQC’s first report into the Mental Health Act was published (here), there was a certain amount of fanfare – a press release on the front of their website, distribution to various media networks, reports in major newspapers. Since I’ve been awaiting this report for some time I’ve been checking the CQC’s pages for special reports into mental health – the hyperlink for the DoLS report is still dead. It’s hardly a great metric for quality, but the length of the report has absolutely left me gasping. Eighteen pages, including the title page. I’m not joking when I say I have checked and rechecked several times to make sure I’m not just reading the executive summary, and the real report is elsewhere. By contrast the Care and Social Services Inspectorate Wales (which was expected in the original impact assessment to have one-twentieth the number of DoLS authorisations that England has) has produced a report twice as long, with statistical tables and a press release to accompany it (here).
The real comparator, in my eyes, should be the way the CQC, and formerly the Mental Health Act Commission (MHAC), monitor the Mental Health Act. The depth, quality and insight of the MHAC reports into the functioning of the Mental Health Act left nothing to be desired, and much to be admired. They were eloquent and considered. They drew from the experiences of the Mental Health Act Commissioners themselves, reported the comments and experiences of detainees, and even engaged in complex legal analysis (if you read the acknowledgements, the reports are informed by the perspectives of a range of academic and legal practitioners). They were rich with a sense of tradition and history, even quoting from John Perceval to highlight parallels, from across the centuries, in the experiences of those detained for mental health treatment. And they were long. Placed Amongst Strangers, 312 pages. In Place of Fear, 443 pages. Rights, Risks, Recovery, 290 pages. The first and only monitoring report by the CQC into the Mental Health Act is noticeably shorter, at 124 pages, but the quality is still high and the analysis still penetrating. There is so much to be said about the deprivation of liberty safeguards – an incredibly complex legal framework with such poor compliance – that it is really disappointing that the CQC has not approached its legal duties to monitor and report with the same degree of rigour and care that the former Mental Health Act Commissioners took.
The content of the report
The content of the report tells us very little that wasn’t already in the public domain, or in the two reports previously issued. It tells us that ‘some’ managing authorities ‘are demonstrating good practice’, and that ‘many’ PCTs and councils ‘made good progress’, but offers little by way of in-depth analysis of what this means. The main methodology the CQC has relied upon for reporting on how supervisory bodies are implementing the safeguards appears to be some kind of survey (but no copies of the survey are appended to the report). This seems to follow an increasing trend by the CQC as a whole to rely upon self-report, rather than inspection and observation, to inform its judgments of how organisations are functioning. In November 2010 Paul Burstow, Minister for Care Services, announced that yearly inspection of councils by CQC would end. DoLS used to be inspected within this framework, but according to my correspondence with the Department of Health and CQC no decisions have yet been taken about how this will continue without the performance assessment framework.
The survey ‘asked about’:
- What mechanisms the organisations had put in place to make authorisations.
- How they were dealing with the expectations now being placed on them.
- The consistency of the threshold that assessors appeared to be applying when determining whether there was or wasn’t a deprivation of liberty.
- What arrangements were in place in their area to support the professional development of those acting as assessors.
They state that only 60% of councils supplied sufficient detail in response to their requests, and less than half of PCTs. In research, we worry about selection bias, whereby participants taking part in a study or responding to a survey are not representative of the wider population, because there are systematic differences between the sample group and those who didn’t take part. In this case, it’s highly possible that the PCTs and councils who did not respond, or who supplied insufficient detail in their response, are the ones that we should be worrying about. Failure to comply with a request for information by the regulator is an indicator that a body is concerned that the information it holds may not be wholly positive, that it simply doesn’t hold that information, or that it is under higher levels of work pressure than responding bodies. Thus, the sample the CQC has relied upon is likely, if anything, to overestimate compliance – but they don’t qualify their findings by making this explicit.
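To make the direction of that bias concrete, here is a toy simulation – all of the numbers and the response model are invented assumptions for illustration, not anything drawn from the CQC report. If weaker performers are less likely to return a usable survey, the average compliance observed among responders overstates the true population average.

```python
import random

random.seed(1)

# Hypothetical population of 150 supervisory bodies; 'compliance' is a
# score between 0 and 1. These figures are invented for illustration.
population = [random.random() for _ in range(150)]

# Assumed response model: the better a body's compliance, the more
# likely it is to supply a sufficiently detailed survey response.
responders = [c for c in population if random.random() < 0.2 + 0.7 * c]

true_mean = sum(population) / len(population)
observed_mean = sum(responders) / len(responders)

print(f"true mean compliance:  {true_mean:.2f}")
print(f"mean among responders: {observed_mean:.2f}")  # typically higher
```

Under any response model in which willingness to respond rises with performance, the responder average sits above the true average – which is why findings drawn only from the 60% of councils (and under half of PCTs) that responded should be qualified explicitly.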
Respondents were allocated to a category on the basis of how well their responses indicate they are implementing the safeguards: comprehensive (14% of councils, 37% of PCTs), solid (21% of councils, 45% of PCTs), partial (31% of councils, 10% of PCTs), early (33% of councils, 8% of PCTs). Without more information about what the survey asked [edit: the survey was exactly the four questions quoted above], and what kinds of responses were received, it’s hard to view this information as telling us anything of real value. Set against a wider picture of low uptake of the safeguards, and against the picture of poor understanding and compliance painted by the Mental Health Alliance report, these vague and ill-defined categories tell us nothing about how supervisory bodies are failing to comply and where exactly they need to improve. Making recommendations for ‘training’ is hopelessly vague – what should the trainers be focussing on?
Despite the CQC’s wider aims to focus on ‘outcomes’ for service users rather than processes, the subjects the CQC has asked about are strikingly process-oriented. Where is the perspective of detainees and families in their dealings with managing authorities and supervisory bodies?
In any case, there actually are some processes in the deprivation of liberty safeguards that it would be really useful to know about – for instance, how reliably are supervisory bodies referring unpaid representatives to advocacy services (as they are required to do under s39D Mental Capacity Act)? Are supervisory bodies tending to use paid representatives or IMCAs (section 39C) for unbefriended persons, and are there differences in outcome between these? Are supervisory bodies following the Department of Health’s guidance not to avoid appointing representatives on the basis that they might oppose the safeguards, and to refer cases to the Court of Protection when disputes are ongoing? How are supervisory bodies dealing with the sometimes intractable situations where deprivation of liberty is not in a person’s best interests, but is actually occurring? These things, and more, I would like to have known – and would have expected to be contained within the CQC report. But if anything, the report’s authors appear just as curious about these issues as its readers will be.
The section on managing authorities is particularly interesting to me, as I have recently been reviewing a sample of CQC inspection reports for dementia care homes, for signs that they are looking into matters relating to the deprivation of liberty safeguards. To put my study of inspection reports in context, in 2009 the CQC issued guidance on how it was going to monitor the deprivation of liberty safeguards (available here (doc)). Amongst other things, the guidance stated that
During 2009/10 we will always include a reference to the Mental Capacity Act deprivation of liberty safeguards in the Management and Administration section of the report…The report will say:
- whether there are any people living at the home subject to a deprivation of liberty authorisation, and if so whether the Mental Capacity Act deprivation of liberty safeguards and authorisation conditions are being met
- whether anyone living at the home is having their liberty deprived without an authorisation, and if so what has been done to make sure that the law will be met in future
- that (where relevant) individual people’s experiences of the authorisation, the care they receive under it, and any requirements and recommendations made, have been included in the relevant outcome areas of the report
The long and the short of it is that the vast majority of the sample of inspection reports I looked at made no mention of DoLS whatsoever. Of those that did, the comments were primarily to the effect that staff had said they had been on training on the safeguards, or that the manager said she knew about them. Not one of the reports I looked at commented on the paperwork, or the relevant individuals’ experiences. Only a tiny handful mentioned how many people were subject to authorisation, and I happen to know that those involved situations where deprivation of liberty came to light following criminal assault by the care providers themselves. It simply cannot be the case that these inspectors were not coming across cases of authorised deprivation of liberty; the region I looked at has one of the highest rates of application and authorisation. It is also in the South West, which the CQC says is ‘good’ at inspecting for DoLS. It also seems unlikely that no care homes were visited where unlawful deprivation of liberty was occurring. The answer must lie with the inspectors themselves either not being aware of, or not complying with, the guidance. Interestingly, the CQC has subsequently (and silently) released new guidance, which omits the sections quoted above (available here). Shifting the goalposts of how inspections are conducted without public consultation, or even publicity, is increasingly typical of the CQC. I have already described here how they are currently reviewing how visitation under the Mental Health Act is conducted, without public announcement or consultation.
One area of real concern is that, given the low uptake of the safeguards and poor compliance highlighted elsewhere, people are being deprived of their liberty without authorisation, and thus without access to the safeguards. For someone who has a mental disorder, who lacks mental capacity, and who is deprived of their liberty, the likelihood that they will have the wherewithal and ability to alert someone to their situation is pretty slim. This is precisely the kind of situation where it would be hoped the regulator would pick up on unlawful deprivation of liberty and make requirements of managing authorities to seek authorisation. Yet we have no data, in this report, on how often that is happening. I have been collecting data myself on how often CQC refers care home managing authorities to supervisory bodies for possible unauthorised deprivation of liberty (the CQC having refused to give me this data on the basis it would be in this report – it isn’t), and the picture so far is that it is overall a very rare occurrence. Unlawful deprivation of liberty is not likely to be picked up by desktop self-assessments. It is not likely to be picked up by scanning the internet. Given that in most circumstances families themselves will not be aware of the existence of the safeguards, and in some cases may themselves have commissioned the care that constitutes a ‘deprivation of liberty’, it seems unrealistic and unsafe to rely on them to alert the authorities to unlawful deprivation of liberty. In the distant past, when care homes were inspected every six months, and inspectors were on first-name terms with care providers, there was a possibility that unlawful deprivation of liberty could have been picked up on and dealt with accordingly. I see that possibility as increasingly remote now, with inspection frequency dwindling.
In fairness to the CQC, one difficulty is that the nature of what the courts consider to be a ‘deprivation of liberty’ is itself shifting, and it is shifting in such a way as to make it harder to pick up on in the brief time they have for visits to hospitals and care homes. The CQC’s discussion of managing authorities focuses extensively on two issues, training and restraint. Training is a relatively easy thing for inspectors to check for as it will be documented (whether staff have understood and implemented the training is another matter). Restraint, too, may be fairly apparent to inspectors making visits. The comments made in the report indicate that they view restraint and ‘restrictions on liberty’ as pretty central to whether a deprivation of liberty is occurring:
When using forms of restraint, it must always be considered whether the extent of this restraint means that the person is being deprived of their liberty. (p11)
Some care homes restricted residents’ access by locking doors or having doors with keypads. While this may be necessary to prevent some people from being harmed, the care home also needs to follow the Deprivation of Liberty Code of Practice if the restrictions amount to a deprivation of liberty. (p12)
Tangible and clear cut restrictions such as locked doors are sometimes not even considered in the sense that they may in themselves tip the balance from a restriction of liberty to a deprivation of liberty. (p13)
We came across too many examples of people using services who were being cared for in ways that potentially amounted to an unlawful deprivation of their liberty, and therefore potentially a breach of their human rights. In most cases, this was because services imposed significant restrictions of liberty without any consideration of the Safeguards. (p15)
The problem is that although European case law and the code of practice do stress that ‘The difference between deprivation of liberty and restriction upon liberty is one of degree or intensity’, domestic case law is increasingly moving away from this position. There have been several recent cases where people have been subject to pretty tight restrictions on their liberty, including restraint and locked doors, but have not been found to be deprived of their liberty by the courts (see this blog post, and this, and this case). Increasingly, what influences the courts in deciding whether someone is deprived of their liberty is whether they, or their families, are objecting. But objections could easily happen at times or in places where the inspectors would not see them. And a person could be objecting, and deprived of their liberty, yet not subject to particularly restrictive care or restraint (as in this case). Without meeting with residents and families, the inspectors would be unlikely to pick up on this. That’s probably why they are empowered by the monitoring regulations to interview service users and their families. But how often is this being done? We have no idea. The report does not tell us. Restraint and locked doors may, sometimes, be useful triggers to seek authorisation – but their overlap with ‘deprivation of liberty’ as it is now understood is low, and the report does not acknowledge that. In fact, it seems to be wholly uninformed by recent case law.
Reflecting on the CQC’s own monitoring role
Towards the end of the report the CQC acknowledges that its monitoring role is very much still in development. The CQC was formed in 2009, in the same month as the safeguards came into force. As former Mental Health Act Commissioner Chris Heginbotham stated, this ‘poor timing may have important implications for the effective and accurate implementation of the DoLS’ (here). But it has now been nearly two years** since the CQC was created and the DoLS came into force. They state that ‘the evidence we have gathered during our first year of monitoring has been necessarily limited’ – necessarily limited by what? Certainly not by time, but perhaps by the resources made available to the project. They state ‘This report also revealed some uncertainty among our own staff, as well as among care home staff, as to which circumstances may require a managing authority to apply to deprive someone of their liberty’. This is a really concerning state of affairs, if the very body charged with regulating the safeguards, and with ensuring they are implemented when they should be, is itself unsure about their application.
The CQC bemoan the infrequency and inaccessibility of the guidance on case law issued by the Department of Health, but that is hardly likely to improve. The Department of Health disbanded its MCA/DoLS implementation team last month – job, apparently, done. One might have hoped that this report itself would be a source of information and guidance for managing authorities and supervisory bodies. Certainly the Mental Health Act Commission reports discussed case law and policy developments in considerable detail. Part of the problem is that the CQC itself seems to lack authoritative expertise on the safeguards. This appears to be reflected in the shallow coverage the report gives to the crucial processes that underpin them: appointing and supporting representatives, referring cases to the Court of Protection, encouraging applications, ensuring assessments are timely, and acting in line with case law. The report reiterates, as surely everybody already knew, that there are ‘problems’ with the safeguards, that people don’t always understand them, that they are complicated. But it offers very little by way of exploration as to how those problems are occurring. For professionals and families who struggle with the safeguards, this report tells them very little. The overwhelming sense I have when talking to people who are involved with them is that there is a pressing need for leadership and information. One would have thought that the CQC could be one source of that. This report suggests it will not be taking up that role any time soon.
*My mistake; in the end there was no annual reporting requirement in the regulations. During the consultation into ‘Mental Capacity (Deprivation of Liberty: Monitoring and Reporting; and Assessments – Amendment) Regulations 2009’, the Department of Health asked:
Do you support the proposal that the Care Quality Commission should provide an annual report to the Secretary of State for Health as soon as possible after the end of each financial year? Do you have views on what this report should contain in respect of the monitoring of Schedule A1?
In the final consultation report it stated ‘All respondents said yes to the provision of an annual report’, although responses were mixed as to whether it should be part of the CQC’s general annual report, or a separate report modelled on the MHAC reports. Respondents asked for the report to cover numerical data on DoLS, details of the methodology used to monitor DoLS, the findings of the CQC in relation to compliance, ‘the experience of people who have been deprived of their liberty’, and other developments and case law. The government decided not to go forward with the proposal it consulted on (i.e. to require an annual report), but to rely on the existing requirement for the CQC to produce a general annual report – which would include DoLS. They left it to the CQC ‘to determine the scope, content and form of the report.’ They made provision, however, for further reports to be produced at the request of the Secretary of State for Health.
**In an article published in Community Care yesterday, Cynthia Bower – CQC Chief Executive – was quoted as saying ‘This was the first year of implementation of the safeguards, and all the organisations involved were feeling their way’. This is a slightly confusing statement – the report states it is for the first year of operation of the safeguards, which would have been until March 2010. But some of its contents relate to the current, soon-to-end financial year. If Cynthia Bower’s statement only related to the first year of operation, it would be interesting to know if she feels that at the end of the second year all organisations are ‘feeling their way’, or whether there has been some change. If it reflects the current position, then talking in terms of the ‘first year of implementation’ is somewhat misleading.