As regular readers of this blog will know, I am interested in how human rights can be protected through social care regulation. A while ago I wrote a piece
which discussed how, over the last decade, the approach to regulating social care had changed from one that relied upon high levels of inspection to one based on risk-based, proportionate inspection. I wanted to follow up on a couple of things from that piece.
How do CQC decide when to inspect a service?
Firstly, I have had some correspondence with CQC since then about ‘Quality and Risk Profiles’ (QRPs) in the regulation of adult social care. CQC have been really generous with their time in helping me understand their approach. QRPs are a new methodology used by the CQC to provide information for inspectors about the services they regulate. They built upon screening tools used by the former Healthcare Commission, which I critiqued previously. Currently QRPs are used in monitoring healthcare providers, but CQC are developing a similar tool for monitoring compliance in adult social care. After my discussion with CQC, my understanding is that the screening approach I critiqued in my piece dated 11 July 2011 is no longer applied in this manner. Instead of inspecting only a sample picked out by the screening tool, today individual inspectors are responsible for monitoring compliance on an ongoing basis. QRPs are not relied upon as an automatic trigger for inspection of a proportion of services; instead, they are just one source of information inspectors can draw upon when deciding whether a service is compliant.
The Health Select Committee flagged up concerns about possible over-reliance on QRPs by inspectors in their recent – highly critical – report. They noted that the very serious failings at Mid Staffordshire NHS Trust were not flagged by screening data. They acknowledge that today the QRP is not the only source of information, but say:
‘…there is a risk that, especially given the increase in inspectors’ caseload detailed earlier, inspectors may be less able to cultivate and monitor other sources of information to complement the QRP. The threshold at which a risk pattern on the QRP is considered significant enough to trigger a visit could also rise. It is important that the CQC is able to rely on its inspectors’ judgement in these cases but there needs to be consistency of approach.’
‘The CQC must ensure its inspectors do not become over-reliant on QRPs. Even if the quality of data included in QRPs was excellent, such a tool could only ever present a patchy picture of the quality of care.’
So some key issues for CQC in their latest consultation will be to consider:
- What information inspectors can cultivate to base their decisions on in addition to QRPs;
- Whether certain types of services should be considered potential ‘information black holes’ where service users are unlikely to flag up abuses or poor practice;
- What guidance inspectors should be given on when to inspect a service for compliance; and whether there should be any automatic triggers for inspection (perhaps abuse allegations by whistleblowers?)
It’s interesting to note that much more responsibility to ascertain compliance now falls on the individual inspector, rather than the organisation as a whole. This may be problematic if inspectors have large caseloads, and will depend on the quality of guidance and supervision available to them. If you have a view on this, then I really recommend responding to their consultation.
Updated figures on the drop-off of adult social care inspections
I wrote in July that recent concern at the drop-off in inspection frequency over the last year (expressed by Community Care and the Financial Times) had missed the bigger picture – that the greatest decline in inspection frequency had occurred in 2007, following a decision to change the minimum inspection frequency to once every three years (from twice yearly). Firstly, CQC have given me some updated data on care home inspections – so I can reproduce a more accurate version of the graph showing how inspection of care homes has dropped off since 2003:
So although the number of care home inspections in 2010/11 was only 38% of the previous year’s figure – the bigger picture is that the number of inspections in 2010/11 was only 9% of the number conducted in 2003/4 (10% if you equalise for the number of services registered).
Another way of looking at this is to consider the number of inspections a care home will receive on average per year. I have produced a graph showing how the average inspection frequency of adult social care residential services has fallen over the years:
Whereas in 2003/4 services were being inspected almost twice yearly, last year services received on average 0.19 visits per annum. For interest I also included figures from Ofsted showing how their inspection of children’s residential services has remained stable, and at a much higher rate. Obviously adult care homes are very different from children’s residential services (which are not all for children with disabilities), but I think it is worth considering why the state feels that children’s services still need such high levels of regulatory control whereas services for adults with disabilities do not. Not all adults with disabilities are vulnerable, of course, but many are – particularly many of those in residential care. I’m not sure what warrants such a discrepancy between the inspection rates of the two kinds of services.
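Another way to read these frequencies is as the average gap between visits – the reciprocal of the annual rate. A minimal sketch, using only the two averages quoted above:

```python
def years_between_visits(visits_per_year: float) -> float:
    """Average gap between inspections implied by an annual visit rate."""
    return 1 / visits_per_year

# 2003/4: almost twice-yearly inspection -> a visit roughly every six months
print(round(years_between_visits(1.9), 2))

# 2010/11: 0.19 visits per service per year -> one visit every ~5.3 years
print(round(years_between_visits(0.19), 1))
```

On last year’s average, in other words, a typical home could expect to see an inspector roughly once every five years.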
What was the rationale for reducing inspection frequencies?
So that’s the picture of how inspections have declined over the years, but what justified this drop-off at the time? That question piqued my interest, so I started looking into the consultation by the Department of Health (note – not CSCI or CQC) that preceded these regulatory changes, called ‘Changes to regulatory framework for adult social care services’ (2005). The questions asked in the consultation document could be used in a research methods course for a lecture entitled ‘how not to do a survey’. The first question is ‘Do you agree that CSCI should be enabled to focus its efforts where these are most needed?’ Talk about a leading question – who on earth would say no to that?! It rather raises the question – why ask? Well, perhaps if you wanted to influence the answers to the following questions – this is called an ‘order effect’ in survey methodology, beloved of dodgy opinion polls because it elicits a response bias in later questions. The later questions were:
- Do you agree that the statutory inspection frequencies should be amended so that each provider is inspected at least once every three years, but with random and/or more frequent inspections depending on CSCI’s assessment of the quality of its services and risks to service users? (Agree/Disagree/Not Sure)
- If you are not in favour of a minimum three-year period for inspection, what period do you favour (e.g. two years, five years, or no specific minimum requirement at all)? (Two years; One year; None)
- Are you in favour of giving statutory force, via regulations, to requirements on providers to produce annual quality assurance assessments and (where requested) improvement plans? (Agree/Disagree/Not sure)
- Do you support the introduction of penalties for non-compliance with the requirements for annual quality assurance assessments and improvement plans? (Agree/Disagree/Not sure)
The consultation report gives statistics on the responses to each question, and we see that 62% of respondents were in favour of amending the minimum inspection frequency to once every three years. But the problem with the data as presented is that we have no idea whether support for this proposal was across the board – or markedly more pronounced among service providers themselves (who made up the largest proportion of respondents). I asked the Department of Health to give me the breakdown of responses by whether they were a public body, a service provider or a campaign group, but unhappily they had lost the analysis. However, they were able to share with me the responses of local authorities, providers and voluntary groups, and also the responses of CSCI and CSCI inspectors.
You can take a look at these yourself, they are quite interesting, although I’m afraid I’m not going to give a detailed analysis here. As a general comment though, the picture is quite mixed – in fact quite a few care providers are keen to have a robust and frequent inspection system so that the public have faith in the services they provide.
What I would like to draw your attention to, however, is the discrepancy between the ‘official’ views of CSCI on reducing inspection frequency, and those of the individual inspectors who responded. CSCI stated:
‘To continue with the current regime would mean that we have to continue frequently to inspect some services, which are very well managed and closely in touch with the needs of their customers, at the expense of focusing enough on those services which are providing a poor service… CSCI warmly welcomes the contents of this consultation document, which it consider essential in enabling it to modernise the way which it regulates, so that it can practice a proportionate regulatory regime, which concentrates firmly on those providers which are not delivering satisfactory services and provides a less demanding regime for those that are. The proposals therefore will enable CSCI to operate in a more flexible risk based way so that it can ensure its resources are used in ways which most benefit people who use regulated services.’
Now, this is quite an interesting argument, because nowhere does it suggest that the overall number of inspections is going to be reduced – only that they are going to be more ‘flexibly’ distributed. The rationale is the more appropriate concentration of resources, rather than a reduction in the overall resources needed to inspect. Anyone responding to this consultation could be forgiven for thinking that average inspection levels would remain fairly similar, but that some services would receive closer attention and some slightly less. The graphs above show that this is not what happened – inspections reduced across the board. In fact, it would be quite interesting to know how many services today actually do receive more than two visits a year – since redirecting attention towards poorer services was the stated rationale for reducing frequency elsewhere.
Some responses came from individual CSCI inspectors, and they took a markedly different tone. Each of the following quotes is from a different inspector’s response:
‘The reduction in statutory inspections is too drastic. A provider can go from being very good (outcomes for SU’s) to poor virtually ‘overnight’. This may be from a change of manager/staff/residents. The Risk assessment approach depends on having good information via complaints/notifications etc. Are we sure that we will get these info/indicators to feed our assessment?’
‘The manager of a service is absolutely key to whether a quality service is provided. A good home or service can go downhill very rapidly if a good manager leaves. Just 3 years is too long if the manager has changed. I believe a full inspection should be mandatory for all homes/services just six months after a registered manager has left and every six months until a new registered manager has been in post approximately six months and has proved his/her ability to provide good quality services.’
‘Three years is a long time with no contact. A lot can change in three years and the annual quality assurance assessment may not be robust enough to provide adequate information to make an assessment of the health of the service. Current self-assessment information we receive tends to provide minimal information, lacks evidence to support assertions and when checked against inspections is often found to be inaccurate.’
‘I read with grave concern the proposals made by Liam Byrne to enable CSCI to inspect care homes as a minimum every three years. Although I understand this is a minimum and many homes will be inspected more frequently, I am very concerned this could become the ‘norm’ if inspectors are busy or staff levels are cut. I do not believe there are any care homes where such minimal regulation would be appropriate. I believe such a move would be reckless, irresponsible and dangerous… On so many occasions I have unearthed poor practice by talking to service users and staff in what may appear a good care home. In many cases the registered person has not told me of these incidents. I am absolutely certain this decision will need to be revised once the next care scandal (e.g. Longcare) happens. In Cornwall we are currently investigating poor practices in care homes for people with learning disabilities, unregistered and run by the health service. Minimal or no regulation equals poor practices in many cases. Many of these services will now need to be registered.’
The last comment seems particularly prescient in the light of what followed at Winterbourne View. There was also a response from a business services administrator who agreed with the consultation questions, and a single inspector (who made no written comments) agreed with the reduction in inspection frequency. But overall, the tone of the front-line staff within the organisation was much less enthusiastic about cutting inspection frequencies than that of the management tiers. Let us hope that this time the CQC is able to do better than the Department of Health and listen to the comments of their own staff – which were remarkably accurate in their predictions.
Increasing inspection frequency again
Since Winterbourne View, and since coming before the Health Select Committee, CQC have committed to increasing their inspection frequency to annual inspections for each service. Note that this is not a pledge to introduce a statutory minimum, binding in law (indeed, no such minimum has existed at all since the Health and Social Care Act 2008), so it will rest upon the public and the select committee holding them to that promise. I think it would be a good start if they can do this, but I am a little sceptical that this is possible with their current financial settlement. I also asked CQC for some financial information on the cost of regulating services, and they were very helpful in providing it. I have produced a graph showing the average cost of an inspection, and the average cost of regulating a service, over the years:
What we can see is that whilst the cost of regulating a service has decreased over the years, the cost of inspecting a service has climbed slowly – and then very dramatically over the last few years. Now, you would expect a gradual increase in cost per inspection as the overall number of inspections drops, because there will be overheads that do not decline with the number of inspections. Additionally, if inspections became more targeted on poor services, then they would probably be more resource intensive. The sudden spike last year was explained to me by Alan Pickstock at the CQC, who said:
It’s important to note that the figures for the cost of inspection (£38,432,000, averaging £9,520 per inspection) are artificial ones. The financial year 2010-11 was an extraordinary year, as you know, because it straddled the changeover to the new regulatory system and a disproportionate amount of time was spent on registration, to the detriment of inspection. But we didn’t record how much time was actually spent on each. The figures are therefore calculated on the proportion of time that was spent on inspection in 2009-10 (29%), which we know could not have reached that level in 2010-11.
So, we actually don’t have an accurate picture of how much was spent on inspection last year. But what we can say is this – in order to get the number of inspections up to one per service per year, the CQC’s inspection activity would have to increase five-fold. If we assume the costs have stayed the same as in 2009/10 (which is likely to be an underestimate, given inflation and a new regulatory framework), it is worth asking where the money will come from to achieve this. The amount of money spent on regulating adult social care has fallen over the last decade:
As you can see, the CQC’s total budget (i.e. for healthcare, dentists, Mental Health Act visits, etc.) is actually less now than CSCI’s was in the past. The amount spent on regulating and inspecting adult social care has reduced dramatically. I am no financial whizz, but I just can’t see how, with such a large reduction in the resources available to inspect adult social care, CQC will be able to increase the number of annual inspections by such a large amount. In part it is likely to be through inspecting against only some standards, so the inspections themselves will be less thorough – which is what their consultation considers. But even so, it does seem to be a very tough challenge they have set themselves. It will not be helped by the Health Select Committee’s reluctance to endorse giving them another 10% for their budget (CQC’s request for the extra funding next year is currently being discussed with the Department of Health). I’d like to hope they can do it, but I can’t help but feel that once again parliament are setting them up to fail by refusing to give them adequate resources for the regulatory regime. I just hope that if that is also the view of the management, they will be able to say it publicly on this occasion.
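Going back to the ‘five-fold’ figure: it follows directly from numbers already quoted in this post, and a quick back-of-envelope sketch shows the scale of the task (remembering CQC’s own caveat that the 2010/11 cost figures are artificial, apportioned estimates):

```python
# Figures quoted earlier (2010/11); CQC caution these are apportioned,
# not actual recorded spend.
inspection_spend = 38_432_000      # £ spent on inspection
cost_per_inspection = 9_520        # £ average per inspection
avg_visits_per_service = 0.19      # inspections per service per year

implied_inspections = inspection_spend / cost_per_inspection   # ~4,000 inspections
scale_up = 1.0 / avg_visits_per_service                        # ~5.3x more activity
required_inspections = implied_inspections * scale_up          # ~21,000 a year

print(round(implied_inspections), round(scale_up, 1), round(required_inspections))
```

On these figures, even holding unit costs flat, annual inspection of every service would imply spending of the order of £200m a year on inspection alone – which is why the question of where the money comes from matters.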
Out of interest I just made a chart showing the annual expenditure of the CQC alongside the total annual expenditure of its three predecessor commissions. I’ve taken these data from the annual reports of each commission. It’s important to note that it’s not a totally fair comparison, because these bodies also covered Wales, whereas CQC only monitors England. Obviously the overheads and running costs of three separate bodies are likely to be higher than those of a single one (especially a single one that has introduced home working and cut back on capital expenditure on office space). However, it might be of interest to note that historically the vast majority of regulatory expenditure was spent on social care – whereas it’s less than half now. It’s also interesting to note how little is spent on Mental Health Act visitation, which suggests that funding a parallel scheme for DoLS might be a realistic possibility in the longer term.
The CQC’s expenditure is in purple, CSCI’s in yellow, the Healthcare Commission’s in green and the MHAC’s in blue. The red line is the expenditure of the previous commissions added together.