One of the features of the Crime Survey for England and Wales (CSEW) is that it provides us with insights into the question of police crime recording practice. It does this by asking survey respondents whether they reported to the police any victimisation disclosed to the survey. This can then be turned into an estimate of the number of crimes reported to the police in England and Wales, which in turn can be compared to what the police themselves recorded in a ‘comparable subset’ of Police Recorded Crime (PRC) crime types.
The ONS publishes a ‘User Guide to Crime Statistics for England and Wales’, the most recent iteration of which was published in January 2018. This includes a chart that illustrates the relationship (as a ratio) between the CSEW estimate of crime reported to the police, and the number recorded by the police. It tells us a very interesting story, in four parts.
(Note: the offences included in the comparable subset for the period from the year ending December 1981 to the year ending December 1999 differ slightly from those used from the year ending March 2003 onwards, due to changes in offence coverage).
1. From 1981 to 1999, we can see that for the ‘comparable subset’ of crime (see the User Guide for details) the police recorded between 5 and 6 crimes for every 10 that the CSEW (or rather its predecessor before 2012, the British Crime Survey) found had been reported to them by the public.
2. The National Crime Recording Standard (NCRS) was introduced in April 2002 to ‘promote accurate and consistent crime recording between police forces’ and ‘to take a victim oriented approach to crime recording’. It can be seen to have immediately resulted in the recording ratio rising to around 9 in 10. The NCRS took a couple of years to bed in, and the recording ratio peaked in 2003/04.
3. From 2003/04 to 2012/13, we can see a gradual erosion in the ratio, bottoming out at around 7 crimes in every 10. This is generally understood to have reflected a progressively weaker emphasis on the NCRS, compounded by the prevailing emphasis on police ‘performance’ that prioritised crime reduction and created perverse incentives for police forces to not record crime where possible.
4. Subsequent scrutiny, particularly by HMIC (in 2012) and the Public Administration Select Committee (in 2013), highlighted wide variations in crime recording practice and resulted in a renewed concern for crime data integrity, reflected in a rolling programme of HMIC inspections. This focus drove up the crime recording ratio from 2013/14 onwards, to the point that there are now more than 12 crimes recorded by the police for every 10 that the CSEW estimates were reported to them.
So, when arguments rage over whether the CSEW or police recorded crime are ‘better’ sources to understand what is happening to crime, we can see that at least in aggregate police recorded crime has a distinctly chequered history.
That is not to say, however, that the CSEW is perfect either. The final point above – that the ratio has now reached 12 police recorded crimes for every 10 that the CSEW estimates were reported – may seem counterintuitive, but it highlights some important differences between the coverage of the CSEW and PRC, even within broadly (but not perfectly) ‘comparable’ crime types.
Most obviously, the CSEW only includes adult householders aged 16 or over in its headline published data, excluding younger respondents from estimates (although data on 10- to 15-year-olds are published separately); this will have a particular impact on offence types with proportionately more child victims. Other exclusions include businesses and people not living in domestic households, such as students living in university halls, armed service personnel living in barracks, care home residents, prisoners, and the homeless. It is also believed that some ‘harder to survey’ residents of normal housing are under-represented, most notably young men in inner-city areas – who tend to be relatively heavily victimised. When the CSEW and police recorded crime tell different stories about trends – as has very clearly been the case recently – these details are part of the explanation.
It should also be noted that the design of the CSEW, with a rolling quarterly sample asking about victimisation in the year prior to interview, means the reference period differs slightly from the PRC data, with some of the offences reported to the CSEW occurring up to a year prior to the PRC data period. The implication is that short-term trends may differ slightly, with PRC reflecting changes before the CSEW.
Breaking down the analysis
What I didn’t know until I happened across them relatively recently is that the ratios mentioned above are also available for a number of subsets of crime, in tables (specifically, Table UG14) that accompany the crime statistics User Guide. These data shed additional light not only on the strengths and weaknesses of both the CSEW and police recorded crime (PRC), but also on trends over time.
Here I have charted the data, with 95 per cent confidence intervals shown. There are eight crime categories, covering the years 2002/03 to 2016/17 (although only every other year is labelled on the horizontal x-axis for legibility). To view the chart in full, click on the image below.
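For anyone who wants to reproduce this kind of chart, a minimal sketch follows. It assumes the relevant figures have been copied out of Table UG14 into a CSV file – the file name (ug14_ratios.csv) and column names (year, offence, ratio, ci_lower, ci_upper) are my own invention, not the ONS’s – and uses pandas and matplotlib to draw one small panel per crime category.

```python
# Minimal sketch: one panel per crime category, showing the PRC:CSEW recording
# ratio over time with its 95 per cent confidence interval shaded.
# Assumes a hand-built CSV extracted from Table UG14; the file and column names
# are hypothetical, not the ONS's own.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ug14_ratios.csv")  # columns: year, offence, ratio, ci_lower, ci_upper

fig, axes = plt.subplots(2, 4, figsize=(16, 7))
for ax, (offence, grp) in zip(axes.flat, df.groupby("offence")):
    grp = grp.sort_values("year")
    x = range(len(grp))
    ax.plot(x, grp["ratio"], marker="o")
    ax.fill_between(x, grp["ci_lower"], grp["ci_upper"], alpha=0.2)
    ax.axhline(1.0, linestyle="--", linewidth=0.8)  # parity: 1 crime recorded per 1 reported
    ax.set_title(offence)
    ax.set_xticks(list(x)[::2])  # label every other year for legibility
    ax.set_xticklabels(grp["year"].iloc[::2], rotation=45)

fig.suptitle("PRC:CSEW recording ratios, 2002/03 to 2016/17")
fig.tight_layout()
plt.show()
```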
The first thing to say is that increases in the PRC:CSEW recording ratio are evident in the most recent year(s) for all crime types. Beyond that, however, there are interesting differences by crime type that I will examine in turn – although I will deal with violence last.
It is perhaps worth repeating briefly what the charts show: it is the ratio between the number of crimes of a given type (e.g. violence) recorded by the police and the number of crimes of that type the CSEW estimates were reported to the police by adult victims – based on the number of CSEW respondents who said they reported their victimisation of that crime type to the police. So, if 8 crimes were recorded by the police for every 10 the CSEW estimates were reported to them, then the ratio is 0.8. Note that a broadly ‘comparable subset’ of crimes is used so that the PRC and CSEW figures used are measuring a reasonable approximation of the same things.
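To make that arithmetic concrete, here is a tiny worked sketch of the calculation as I understand it from the User Guide’s description; all of the figures are invented purely for illustration.

```python
# Invented figures, for illustration only - not ONS data.
# The CSEW estimate of crimes *reported to the police* is derived from its
# estimate of offences experienced and the proportion of victims who said
# they reported them.
csew_offences_estimate = 1_000_000   # CSEW estimate of offences of this type
proportion_reported    = 0.40        # share of CSEW victims saying they told the police

csew_reported_estimate = csew_offences_estimate * proportion_reported   # 400,000
police_recorded        = 320_000     # PRC count, 'comparable subset', same offence type

ratio = police_recorded / csew_reported_estimate
print(f"PRC:CSEW recording ratio = {ratio:.2f}")   # 0.80, i.e. 8 recorded per 10 reported
```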
So, let’s look at each crime type in turn.
– Robbery: From 2002/03 to 2011/12 we can see a gradual erosion from 0.65 to 0.50. This is then followed by apparent increases, albeit with generally much larger confidence intervals (probably reflecting the fact that robberies are relatively rarely reported to the CSEW, whose sample size has been falling) and a rather anomalous low point in 2015/16. The likelihood that an important proportion of police recorded robbery victims are aged under 16 suggests that under-recording may in fact be even greater than it appears here (whether because no crime was recorded at all, or because it was recorded under another category, such as theft).
– Theft from the person: A generally gradual erosion from 0.67 (in 2002/03 and again from 2004/05 to 2006/07) to 0.50 in 2014/15, followed by a sharp increase to 0.75 by 2016/17. This suggests theft from the person offences are now recorded by the police more often than ever.
– Domestic burglary: We should expect the CSEW and PRC data to be well-aligned here, given that the focus is domestic burglary, the CSEW samples from residential households, and burglary is generally well reported. Starting from a relatively high point of 0.84, the ratio reached 0.88 in 2004/05, shortly after the introduction of the NCRS, before falling to 0.60 in 2012/13. A marked increase is then seen in the last few years, reaching 0.89. This suggests that there was consistent and progressively more marked under-recording of burglary between 2004/05 and 2012/13 (some of which may have involved classification as other offences, such as criminal damage), which has now been largely reversed.
– Vehicle-related theft: At the end of 2016, one quarter of the 37.3m vehicles registered in Great Britain were company owned. Bearing in mind that the CSEW is a household survey (and so largely excludes thefts involving company vehicles), this should serve to increase the overall PRC:CSEW ratio, and indeed that is what we see, with the ratio touching 0.99 in 2003/04. However, as with other crime types, the ratio falls to 0.84 in 2011/12 before rising to 1.16 in 2016/17.
– Bicycle-related theft: This seems to be the poor cousin of the crime types here, alongside theft from the person, suggesting police forces have always been sceptical about recording bicycle thefts (unless, for example, the CSEW for some reason over-samples the victims of bicycle theft). Even here, however, we see the ratio rise in the last year, to 0.71, up from 0.50 in 2014/15.
– Criminal damage to vehicles and dwellings: Given their very consistent patterns, we can take these two together – and conclude that they seem to have provided especially fruitful ground for suppressing recorded crime, both falling sharply from highs in 2003/04 (1.24 and 1.03 respectively) to lows on either side of 0.6, before rising from 2012/13 to around 1.00.
Finally, I want to give a little more attention to violence, given the extreme pattern we can see in the chart and underlying data, especially in the last few years.
– First we can see that the recording ratio rose to a peak of 1.01 in 2007/08, later than other crime types, and likely in part driven by the ongoing impact of the NCRS. It then eroded at a fairly consistent rate to reach 0.83 by 2012/13, suggesting progressively increased under-recording by the police during this period.
– Since 2012/13, however, we can see four years of sharp increases in the recording ratio, especially in the last year, reaching 2.01 – so two crimes recorded by the police for every one reported to them according to the CSEW.
How can we explain this?
– In 2014, HMIC (now HMICFRS) published ‘Crime-recording: making the victim count’, which estimated that violent crime was the category of crime most under-recorded by the police nationally, with only two-thirds (67 per cent) of reports being recorded when they should have been. Since then, forces have been subject to a programme of inspections of their crime data integrity by HMICFRS, with clear signs of improved adherence to the NCRS. This has been reflected in increases in police recorded violent crime across almost all forces – with rises as high as 62 per cent in South Yorkshire, 61 per cent in Greater Manchester and 52 per cent in Durham in the year to September 2017 compared with the previous year. In 2017, the ONS reported that ‘[t]he 65% increase in police recorded violent crime between the years ending March 2013 and March 2016 has been largely driven by the police response to findings of 2 recent HM Inspectorate of Constabulary (HMIC) inspections’.
– In respect of domestic violence, it is considered likely that victims have become more willing to report their victimisation to the police, not least given the effort and scrutiny committed in recent years to improving the police response to victims. At the same time, the CSEW estimates are derived from the main face-to-face victimisation module of the survey, which is known to be associated with under-reporting of these offences and has been subject to ‘capping’ – that is, only the first 5 victimisations of a particular crime type reported by an interview subject have been counted (a practice now being revised; you can read more about it elsewhere). On the other hand, what is called the ‘finished incident rule’ only applies to police recorded crime – so a victim attending a police station to report being assaulted 10 times by their spouse is counted as one offence, while in the CSEW it would be counted as 10 (subject to capping). There is a toy illustration of this counting difference after this list.
– One particular change to police recorded crime that I confess I was not aware of is that, from April 2015, the Home Office Counting Rules for recorded crime were amended so that reports of crime from professional third parties had to be recorded and counted. The ONS reports (in the User Guide to Crime Statistics for England and Wales) that this is ‘thought to have led to some increases in crimes against vulnerable people such as victims of child abuse, domestic abuse and elder abuse’. Many of these victims will not be included in the CSEW, either because they are under 16, don’t live in domestic housing (for example, because they are in a care setting), or because they are reluctant to disclose victimisation to the police (for example in the case of domestic abuse).
– Finally, it is possible that some forms of violence – notably public order offences – would be ‘coded out’ of the CSEW (not being victim-based) but counted by the police in violent crime statistics, underlining the point that the ‘comparable subset’ of offences is imperfect.
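To make the domestic violence counting point above concrete, here is a toy sketch of how the same series of incidents would be counted under the two approaches as described in that bullet; the variable names are my own and the cap value of 5 simply restates what is said there.

```python
# Toy illustration of the counting difference described above (invented example).
# A victim reports being assaulted 10 times by their spouse.
incidents_in_series = 10

# Police recorded crime: under the finished incident rule (as described above),
# the whole series is counted as a single offence.
prc_count = 1

# CSEW: each incident counts, but repeat victimisation of the same crime type
# has historically been capped at the first 5 per respondent.
CSEW_CAP = 5
csew_count = min(incidents_in_series, CSEW_CAP)

print(f"PRC counts {prc_count} offence; the CSEW counts {csew_count}")
```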
What can we conclude?
There is a lot of devil in the detail here, but it needs to be grasped to appreciate the relative strengths and weaknesses of the Crime Survey and Police Recorded Crime series, and therefore to interpret the respective (and differing) trends seen in the data.
Having examined the detail, I think we can draw three conclusions.
First, that while all crime types seem to have been under-recorded by the police in the past, some have been affected more than others. Criminal damage seems to be an obvious example where under-recording has been especially high.
Second, that all crime types have seen increases to their recording ratios in recent years, suggesting that changes in police recording practices have impacted across the board – if not evenly.
Third, violence continues to be a particularly difficult test for the credibility of both police recorded crime and the CSEW, especially given the obvious divergence between the two. Any discussion of which is ‘best’ must at least be considered problematic, given that they seem to measure (in overall terms) rather different things. Nevertheless, it is true to say that the CSEW has been affected neither by changing police recording practices nor by the apparently increased willingness of victims to report allegations to the police. For the crime and victim types it covers, the CSEW is much the more consistent source of insights into trends.
A final observation, set against this rather complex background, is that the quarterly release of quite detailed crime statistics, reporting a rolling 12-month period, seems at times to generate rather more heat than light. In no small part, this seems to be because the arguments about the ‘best’ source for insights into trends are constantly being rehashed, with widespread cherry-picking to support particular perspectives and agendas.
My personal opinion is that the public, police, wider criminal justice system and policy makers might be better served by a less frequent update, perhaps even reverting to an annual digest with all of the caveats clearly spelled out once a year rather than four times.