You might think postcodes are simple, but they are not. Many people use out-of-date or obsolete postcodes without knowing it. At UKCrimeStats, we hold not just the standard 1.7 million live postcodes but 700,000 obsolete ones too, and we have just added a further 70,000 newly released postcodes. Keeping postcodes current is an ongoing job, and we work at it all the time.
Today we are delighted to be publishing the first in a series of posts from Nick Ross, probably best known as the long-time presenter of Crimewatch and author of a recent book, Crime: How to solve it – and why so much of what we’re told is wrong – which, if you’ll forgive the plug, is one of the most interesting, well-written and thought-provoking books I’ve read for a long time. So you should read it – and no, before you ask, I’m not on commission!
Today then, we have a treat for you – the publisher has agreed to let Nick reproduce on www.ukcrimestats.com the entire Chapter 9 on Crime Statistics.
Over to Nick.
Chapter 9: Statistics
False facts are highly injurious to science because they often endure so long. – Charles Darwin
If crime is a normal part of the human repertoire, why is the crime rate so low? The question sounds perverse given that crime statistics have caused public consternation and political paranoia. But the answer is instructive. All crime statistics vastly underrate actual victimisation. And among the flakiest of all are the figures most people think are most reliable: those that come from the police.
It is essential to grasp how untrustworthy their records are – and how misleading they can be – to understand why the police get so distracted and why the courts are so feeble at controlling crime. But we also need to find a better metric, because if we can’t measure crime properly we can’t tell if it’s going up or down, we can’t calibrate our responses, and we can’t know if our solutions are making things better or worse.
It all once seemed so simple: if you want to know the crime, ask a policeman. The police are the experts and there is something comfortably definite about police statistics, not least that they can be traced back to actual victims. When the figures are published as tables or graphs they seem so tangible they must be real. Despite long-standing criticisms most policy-makers and commentators still take them at face value. The government even insisted that police statistics should be plotted on street maps and posted online so that, in theory, citizens can judge how safe they are. (I was privileged to be in at the pilot stage of one of these, in the English West Midlands, which showed a huge and obviously dangerous hotspot. It turned out to be the police station where suspects were searched for drugs or stolen property.)
There are three glaring problems in trusting police experience of how big or bad things are, and they all go back to a fundamental problem: crime, by definition, is illicit. As a general rule, people who break the law try not to draw attention to themselves. Sometimes their misdeeds are conspicuous, like a smash-and-grab raid in the high street, but mostly crime is surreptitious, intimate or even virtual. Every now and then someone will confess to a whole string of offences that were unknown to the police, but as a general rule, bullies, fraudsters, drink drivers, drug dealers, music pirates and rapists try to keep their crimes a dirty secret.
Accordingly, we expect the police to go and find crime for themselves. But officers rarely come across a crime in progress and, oddly, when they are proactive they actually distort the picture. A swoop on clubs will expose drug problems; a search for knives will uncover weapons. One area may have had a blitz on burglary, another on domestic violence or uninsured drivers. The arrival of a new chief constable or borough commander can have a huge impact on how the police operate, whom they target and what they prosecute. Some chiefs will turn a blind eye to street prostitution, others will clamp down on it. Often this gives rise to the perverse effect that better policing is rewarded with higher crime rates: if the police persuade more victims of rape to come forward, their tally of sexual offences will surge. Curiously, we can also get ‘more crime’ if those in government demand it.
Officers have often been given targets, such as two arrests per month, and charges are inflated (from, say, drunkenness to harassment – which counts as violence) to meet the quota. The Police Federation, which represents the rank and file in Britain, has justifiably called it ‘ludicrous’.
Similarly disturbing crime waves happen when charities or statutory agencies launch an initiative, or when the media mount a big investigation. Who knew child sex abuse was common until ChildLine came along?
But we the public are by far the biggest source of police intelligence. In other words, police crime figures are largely what we as witnesses, victims and occasional informants choose to tell them. Which is surprisingly little. Even if we see a crime actually taking place. According to a poll for the Audit Commission, almost two-thirds of us would walk on by. We can’t be bothered, don’t want to get involved or don’t think the police would do anything anyway. The reality may be worse than that survey suggests. Avon and Somerset Police set up a small experiment in which a plainclothes officer blatantly brandished bolt cutters to steal bikes, and though at least fifty people clearly saw what he was doing, not one person intervened or rang 999. The video went online and proved extremely popular.
That leaves the great bulk of recorded crime figures in the hands of victims. And, again, a big majority of us have reasons to keep quiet. When people are asked about crime they’ve suffered and whether or not they asked for help, it turns out that only 44 per cent of personal crimes are reported to the police. Even that reporting rate is a big improvement, caused partly by the spread of mobile phones. And it doesn’t count many of at least 9 million business crimes a year, most of which we only hear about through surveys, or commercial frauds which companies and managers would rather not make public.
Why do we suffer so in silence? The answer is fear, guilt and cynicism. In many ordinary crimes, and some extraordinary ones too, private citizens want to stay clear of the authorities. This is often the case in pub brawls, fights at parties, clashes in the street, domestic violence and a lot of sexual assaults which are too embarrassing to talk about. I saw this for myself when auditing crime in Oxford over two weeks for the BBC. On a typical Friday night at the John Radcliffe Hospital we filmed twelve people wounded badly enough to come to A&E, all male, all the result of alcohol, one with a bottle wound just beneath the eye, one with a double-fractured jaw, and one in a coma. But the police recorded only seven violent crimes that night, including some not hurt badly enough to have needed medical attention. Even more surprising, there was little correlation between the severity of the injury and the likelihood of telling the police.
A pioneering emergency surgeon – we shall meet him later – has systematically checked hospital records over many years and is blunt: ‘Police figures are almost hopeless when it comes to measuring violent crime.’
Then there are crimes people tend not to make a formal fuss about. Sometimes the victims perceive what is technically a crime to be normal, as with childhood bullying and theft among school kids. This is even true with full-blown rape, which you might think needs few definitions, but, as we shall see later, it is not just perpetrators who deny it happened; half of all women who have been attacked in a manner that fulfils the legal description do not consider themselves to have been raped. Many victims blame themselves and some are very vulnerable. One of the worst aspects of concealed crime is often dismissed as antisocial behaviour and is targeted at people with disabilities, causing huge distress and sometimes serious harm.
More often it’s simply not worth the effort of telling the police, as when an uninsured bicycle is stolen. In fact, some official theft rates do more to measure changes in insurance penetration than trends in stealing. One of the reasons that street crime appeared to rise steeply in the late 1990s was that mobile phone companies were promoting handset insurance. On the other hand, people are cautious if they are insured and don’t want to jeopardise their no-claims bonus, as where a car is vandalised or broken into.
Apologists for the official figures sometimes demur from such pettifogging and claim that at least the more serious crimes will be recorded. Not so: under-reporting is rife in stabbings or even shootings, so much so that British police chiefs want the medical profession to break patient confidentiality and report patients treated for knife or gunshot wounds.
Even murder is surprisingly hard to count. First it has to be discovered. Britain’s biggest peacetime killer, the family physician Harold Shipman, probably killed 218 patients over a span of thirty years, but none was regarded as homicide until shortly before his arrest in 1998. There are thousands of missing persons and no one knows if they are dead or alive unless a body turns up. Even with a corpse, pathologists and coroners may disagree on whether death was natural, accidental, suicide or at the hands of others; and scientific advances can suggest different causes from one year to the next. The statistical effects of all this are not trivial. Prosecutors can have a big effect too. Most years in England and Wales about 100 cases that are initially recorded as homicide become ‘no longer recorded’ as homicide because of reclassification. On the other hand, other defendants have the book thrown at them, as when reckless misadventure was reclassified as homicide after fifty-eight Chinese nationals suffocated while being smuggled into Britain in 2000, or when twenty-one cockle-pickers drowned in Morecambe Bay four years later.
Since in Britain murder is relatively rare, multiple deaths like these, or the fifty-two killed in the 7/7 bomb attacks, can massively distort the figures, warping short-term trends. Long term trends are even more difficult because of gaps in the records, especially from the age before computers, when information was kept locally on cards or paper.
Which opens another can of worms.
A third of all crime reported to the police is not recorded as a crime.
A great deal depends on whether an officer considers that an offence has taken place and, if so, whether it gets put down in the logs, when it is recorded and how it is categorised. Traditionally the police have a great deal of discretion. Retired officers will sometimes readily concede that, in years gone past, many quite unpleasant crimes were not taken very seriously: people who were racially abused, young men ‘having a scrap’, and even serious bodily harm if inflicted by a husband on his wife. Apart from anything else, turning a blind eye could save a lot of work.
There will always be a lot of wriggle room. When is a young man with a screwdriver equipped for burglary; when is a small amount of drugs not worth bothering about; when is a discarded handbag indicative of a mugging; when is it best to turn a blind eye in the hope of gaining some intelligence; when is a drunken brawl best dealt with by calming people down; when if someone reports a disturbance should one finish one’s paperwork or rush round and intervene? Not infrequently these ambiguities are manipulated cynically, with offences being shuffled from one category to another to reflect better on police performance. As one officer famously put it, the books are frequently ‘cooked in ways that would make Gordon Ramsay proud’.
In recent years Home Office counting rules have greatly improved consistency. Even so, in 2000 the Police Inspectorate found error rates ranging from 15 to 65 per cent, and in 2013 the Office for National Statistics was still sufficiently concerned about big discrepancies that it warned police may be tinkering with figures to try to fulfil targets.
Moving the goalposts
Even if all crime were reported and consistently recorded, police statistics can be terribly misleading. Lawyers, legislators and officials keep changing the rules. Karl Marx came across the problem somewhat before I did, correctly noting in 1859 that an apparently huge decrease in London crime could ‘be exclusively attributed to some technical changes in British jurisdiction’.
The most blatant example of moving the goalposts was between 1931 and 1932 when indictable offences in London more than doubled because of a decision to re-categorise ‘suspected stolen’ items as ‘thefts known to the police’. More recently, changes in counting rules led to an apparent and terrifying surge in violent crime in 1998 and then again in 2002. It started as a noble idea to get more uniformity and be more victim-focused but resulted in completely redefining violent crime. From that point on, half of all police-recorded violence against the person involved no injury.
In 2008 violence was reclassified again and this time many less serious offences were bumped up to big ones. For example, grievous bodily harm now included cases where no one was badly hurt. Inevitably the Daily Mail reported ‘violent crime up 22 per cent’.
It is not just journalists who get confused. Many political advisers and university researchers are also taken in, which can lead to silly ideas and unproductive policy. People often get irate at those who refuse to take police statistics at face value. ‘We all know what they mean,’ they say. It is as though challenging the figures is somehow to be soft on crime. But we don’t know what they mean, and nor do the police.
International comparisons of police statistics are even more unreliable. Different countries have different laws, different customs and very different reporting rates. On the face of it, Australia has seventeen kidnaps per 100,000 people while Colombia has only 0.6. Swedes suffer sixty-three sex crimes per 100,000 against only two in India.
Some people actually believe this stuff.
Evidently they don’t read the warning on the crime statistics tin. The Home Office has long warned that ‘police-recorded crime figures do not provide the most accurate measure of crime’, and for years the FBI was so cautious it sounded almost tongue-in-cheek: police data ‘may throw some light on problems of crime’. Yet however shallow, however defective, however inconsistent the figures, they have almost always been treated as far more meaningful than they are. Police responses, policy-makers’ strategies and public opinion navigated according to a tally which sometimes reflects the real world and sometimes doesn’t.
It is not as though we didn’t have a better mousetrap. Back in 1973 when crime was racing up the political agenda, the US Census Bureau started asking people for their actual experience of crime. For the first time they could get consistent data from year to year and from state to state. It was explosive stuff and immediately confirmed how incomplete police statistics were. The UK was already beginning to track crime as part of a General Household Survey, but from 1982 it followed the US lead with dedicated victimisation polls called the British Crime Survey or BCS. Other countries soon followed suit and over eighty now use a common methodology. That means we can now compare crime across borders as well as time.
The big picture
There is a lot wrong with the British Crime Survey. For a start, its name. The BCS only audits England and Wales – Scotland started its own SCS – and by the time they finally rebadged it (as the Crime Survey for England and Wales) the term BCS had become ingrained. So, confusingly, historical reports have to be called BCS and new ones, CSEW. If Wales goes its own way it may have to be rebranded yet again. It is also expensive. Since barely a quarter of the population suffers any sort of crime in any year you have to talk to a lot of citizens before you come up with a representative sample of, say, attempted burglary victims, let alone people who have suffered major trauma. That requires almost 50,000 face-to-face questionnaires, and not everyone will give up forty-five minutes for intrusive questions. It means researchers must doggedly go back to find the hard-to-get-at people, especially where victimisation is at its worst, and get them to trust in the anonymity of the process. It’s not like an opinion poll; it’s a mini-census that costs £100 per interview.
Even so it leaves a lot of gaps. Most obviously, it leaves out business crime, which has had to have a separate survey of its own. It is also hopelessly unreliable on rare crimes – one would have to interview almost a million people to get representative data on homicide. For a long time it missed out on under-sixteens too, fearing parents might object, but that has now been sorted. Past surveys also neglected homeless people and those in communal dwellings like student halls of residence, old people’s homes or hostels. An increasingly significant problem is that it largely ignores internet crime, but then so does almost everyone. And it almost certainly undercounts the most vulnerable in society who are victimised repeatedly and whose complaints are arbitrarily capped at five. Finally, being national, it has limited value in describing local crime.
Yet for all that, it has a huge advantage. Respondents may misremember or lie, but there is no reason to assume that memories or candour will change much from one year to the next. In other words, these big victimisation surveys have a power to describe trends.
So why did surveys like the BCS/CSEW take so long to catch on with the politicians, press and public?
The answer is, they didn’t come up with the right answers. Governments wanted to look competent, but since victim surveys uncovered far more crime than was realised hitherto they made the problem look even worse: the BCS revealed 8 million crimes a year compared to 3 million recorded by the police. Perhaps unsurprisingly, the early reports were met with a ‘conspiracy of silence’. One of the pioneers, Jan van Dijk, describes how his home country, the Netherlands, reacted with dismay in 1989 when the first international survey put it top of the league table for several property crimes, including burglary. The findings were lambasted for weeks by Dutch politicians, the media and criminologists.
On the other hand, crime surveys came to be disparaged by curmudgeons, including most journalists, because from 1995 they started to show crime was coming down. In fact in ten years, BCS crime fell 44 per cent, representing 8.5 million fewer crimes each year. Critics believed that this was just not credible and preferred police statistics which were far less encouraging and sometimes – on vandalism for example – continued in the opposite direction.
Thus it was that the British media continued to report that crime was climbing long after it had peaked and, incredibly, they went on with their rising crime agenda throughout a decade and a half of steep decline.
That is a story in itself.
© Nick Ross 2014
Crime: how to solve it and why so much of what we’re told is wrong. Biteback, £17.99
For background, detailed references and more see www.thecrimebook.com
Deflating or inflating the impact of crime relative to the size of the population has long been a hot topic among criminologists when measuring crimes committed against the person. A static residential population has limited utility unless the area itself is relatively static, with few people coming and going. How this translates into everyday scenarios is probably best illustrated by the Westminster area of London: its population swells during the day with a large working population, a different set of people hit the bars and restaurants in the evening, and a third set actually live there (but may not even work there) – a small fraction of those who are around during the day. With daytime-population-adjusted crime rates, areas in Westminster come out around half-way nationally – i.e. average – instead of top.
So, we now have daytime population data for the following:
- Postcode Sector
- Postcode District
- Lower Layer Super Output Area (LSOA)
These are all available with a subscription, and I think daytime-population-adjusted crime rates give a much truer and fairer picture of relative risk, if only because most crime happens during waking hours.
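To make the adjustment concrete, here is a minimal sketch of how a daytime-adjusted rate differs from a residential one. All the figures below are hypothetical, purely for illustration – they are not UKCrimeStats data.

```python
# Minimal sketch: crime rates per 1,000 people, computed against
# residential vs daytime population. All numbers are invented.

def crime_rate(crimes: int, population: int, per: int = 1000) -> float:
    """Crimes per `per` people."""
    return crimes * per / population

# Hypothetical Westminster-style numbers: many daytime visitors, few residents.
crimes = 12_000
residential_pop = 230_000
daytime_pop = 990_000

print(f"Residential rate: {crime_rate(crimes, residential_pop):.1f} per 1,000")
print(f"Daytime rate:     {crime_rate(crimes, daytime_pop):.1f} per 1,000")
```

The same number of offences produces a far lower rate once the denominator reflects everyone actually present during the day, which is why Westminster drops from top nationally to roughly average.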
We have also made postcode-matched LSOA daytime crime rates free to view – just type your postcode into the search box, click on the link – Get Daytime Population Crime rated figures – and contrast the daytime rate with the residential one.
Ok, we’ve uploaded Scottish crime data. Compared to England, Wales and Northern Ireland, it’s far less detailed: no categorisation by crime type, just annual totals (2007-08 and 2011-12) matched to Scottish Datazones. What this means is that you can type a Scottish postcode into the search box and it will automatically match that postcode centroid to the relevant Datazone – a small census area equivalent to a Lower Layer Super Output Area or NI Super Output Area. We have then percentage-ranked it against all other Scottish Datazones. One word of warning: Scottish crime rates are per 10,000 residents, so figures look about ten times higher than elsewhere on this website. We are going to change this shortly to be consistent. I’m guessing that after the Scottish referendum, crime data will be made more freely available than it is now. So I think we can well and truly call ourselves UKCrimeStats now.
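Until we make that change, a quick way to put a Scottish per-10,000 figure on the same footing as the per-1,000 rates used elsewhere on the site is to rescale it. A minimal sketch, with made-up numbers:

```python
# Sketch: normalising a rate quoted per 10,000 residents to the
# per-1,000 basis used for the rest of the UK. Values are hypothetical.

def normalise_rate(rate: float, per_from: int, per_to: int) -> float:
    """Re-express a rate quoted per `per_from` people as per `per_to` people."""
    return rate * per_to / per_from

scottish_rate = 450.0  # hypothetical Datazone rate, per 10,000 residents
comparable = normalise_rate(scottish_rate, per_from=10_000, per_to=1_000)
print(comparable)  # 45.0 per 1,000 residents
```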
In the meantime, we have also updated the site to April 2014. As always, if you see anything unusual, don’t hesitate to get in touch.
As we are the only aggregators of the crime data, our monthly updates take a little longer here at UKCrimeStats: we like to check things through and ask questions, which is one of our analytical USPs. We now have about 18.5 million crimes and ASB incidents, spread over a little more than a million locations and 39 months. It also means we notice errors and like to raise them with the Home Office – one outstanding issue, for example, is how Thames Valley Police have been locating some crimes inside Warwickshire, which has had a major distortionary impact on what would otherwise be a very quiet part of the world. To be fair, big datasets always have errors, but after three years we ought to be seeing far fewer.
Regarding the Thames Valley/Warwickshire issue, we have noticed that this is a bigger problem than we realised, involving many other Police Forces, and we are still awaiting a response from the Home Office – here is a spreadsheet of locations shared between Police Forces – there are 8,000 of them. The bit that gets me is how Devon and Cornwall (and four other Police Forces) have managed to locate some events on Belleville Road in Clapham. You would expect some shared locations given the anonymisation of crime locations – every event is located to the nearest snap-point, which might be between 1 metre and 4 acres away (if all snap-points were evenly distributed). The problem comes when that snap-point falls outside the Police Force area and outside the boundary of the Police neighbourhood team. My view is that crimes should never be located outside either the covering Police Force or the neighbourhood team. If a location is simply too sensitive for that, then record it as a crime with no location, ascribed to the relevant Force and neighbourhood.
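One way such cases could be caught automatically is a point-in-polygon check of each snap-point against the force boundary. This is only a sketch of the idea: the square `force_boundary` below is invented, and real force boundaries would come from published boundary files, not two-point coordinates like these.

```python
# Sketch: flag snap-points that fall outside a force boundary using a
# standard ray-casting point-in-polygon test. Boundary is a made-up square.

def point_in_polygon(x: float, y: float, polygon) -> bool:
    """Ray-casting test: does (x, y) fall inside `polygon` (list of (x, y))?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count edge crossings of a horizontal ray extending right from (x, y).
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical force boundary (a unit square) and two snap-points.
force_boundary = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_in_polygon(0.5, 0.5, force_boundary))  # True: location is fine
print(point_in_polygon(2.0, 0.5, force_boundary))  # False: flag for review
```

A snap-point failing the test would be the trigger to strip the location and ascribe the event to the Force and neighbourhood instead, as suggested above.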
In the meantime, we have noticed that Gwent Police’s figures appear to have changed for February 2014 – we are in the throes of updating this in the next day or so, because the earlier figures look wrong. It would be a huge advance if all Forces kept a changelog with the history of each monthly file, explaining why it had to be changed and what the impact is on the underlying figures. This would be in keeping with the Crime Data Guidance, which is still not followed. British Transport Police saying every single month, without fail, “Crime data refresh, all months” with no further explanation should really be totally unacceptable.
UKCrimeStats is now updated to December 2013. We now have 37 months of data, spanning more than three calendar years. Since the very beginning we’ve made huge strides and, with your support, will continue to do so. If you like what we’re doing, please do like us on Facebook.
We have introduced another free facility on UKCrimeStats. All valid postcodes across England, Wales and Northern Ireland are now matched to Lower Layer Super Output Areas and Northern Ireland Super Output Areas and percent-ranked by crime rate – crimes per 1,000 residents over the last 24 months. We think this is the easiest way to compare all the crime categories across different areas relative to the national average. Try it – just enter your postcode and wait a moment while the results are calculated.
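For anyone curious how a percent rank of this kind might be computed, here is a rough sketch. The LSOA codes, crime counts and populations are all invented for illustration; this is not our actual pipeline.

```python
# Sketch: percent-rank areas by crime rate per 1,000 residents.
# All area codes and figures below are hypothetical.

def percent_rank(rates: dict) -> dict:
    """For each area, the percentage of other areas with a lower rate."""
    values = sorted(rates.values())
    n = len(values)
    ranks = {}
    for area, rate in rates.items():
        below = sum(1 for v in values if v < rate)
        ranks[area] = round(100 * below / (n - 1)) if n > 1 else 0
    return ranks

# 24-month crime counts over residents, scaled to per-1,000.
rates = {
    "E01000001": 1200 * 1000 / 1500,
    "E01000002": 300 * 1000 / 1600,
    "E01000003": 450 * 1000 / 1400,
}
print(percent_rank(rates))  # highest-rate area ranks at 100, lowest at 0
```

An area at the 50th percentile sits at the national average; one at the 100th has a higher rate than every other area in the comparison.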
When Police.uk started back in January 2011, the UK had precisely 1,470 distinct types of criminal offence and ASB, which were categorised in the following way on www.police.uk:
Burglary – 9
Robbery – 4
Violent – 144
Vehicle – 5
Other – 1293
ASB – 15
So out of a maximum potential of 1,470, breaking them down into six categories was not actually a very big step towards granularity, even when the need for anonymisation was taken into account. The “Other” category clearly did not tell you very much at all, and contained a significant number of crimes – like bicycle theft – that required no categorical or geospatial anonymity whatsoever. Crime data should be understood as existing in three subsets – about the crime, the offender and the victim. What we have here on www.ukcrimestats.com is limited information about the crime.
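The arithmetic behind those figures is easy to check: the six categories sum to 1,470 offence types, and “Other” swallows nearly nine in ten of them.

```python
# Check of the category breakdown above: totals and the share of
# offence types that fell into "Other".

categories = {
    "Burglary": 9,
    "Robbery": 4,
    "Violent": 144,
    "Vehicle": 5,
    "Other": 1293,
    "ASB": 15,
}

total = sum(categories.values())
other_share = 100 * categories["Other"] / total
print(total)                  # 1470
print(f"{other_share:.0f}%")  # 88% of offence types were just "Other"
```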
(The United States, incidentally, has over 5,000 types of criminal offence and has arguably long since lost count.)
So I thought it would be helpful to paste up a couple of images showing the evolution of the crime categories we use, which are passed down to us by our rival, the tax-funded monopoly (because it unquestionably has first use of, and discriminatory access to, the data), www.police.uk.
Anyway, as we anticipated, a number of new categories have been created, as illustrated above, in response to public and commercial interest.
In more detail, here they are.
- Anti-Social Behaviour (from Dec 10)
- Robbery (from Dec 10)
- Burglary (from Dec 10)
- Vehicle (from Dec 10)
- Violent and Sexual Offences (from Dec 10)
- Other Crime (from Dec 10)
- Theft – Other (from Dec 10)
- Theft – Shoplifting (from Sept 11)
- Drugs (from Sept 11)
- Criminal Damage and Arson (from Sept 11)
- Public Disorder and Weapons (from Sept 11 to May 13)
- Bike Theft (from May 13)
- Theft From the Person (from May 13)
- Possession of Weapons (from May 13)
- Public Order (from May 13)
This has been running for a while, but I realised it was time to tell you. Here at UKCrimeStats we make lots of incremental improvements all the time – so much work goes into this platform. If you type your postcode into the search box and scroll to the bottom of the results, you will see the matching Lower Layer Super Output Area and Middle Layer Super Output Area. We’ll shortly be doing the same with Northern Ireland Super Output Areas. Still no monthly crime data from Scotland. Who knows – by the time it is available, it may not be in the UK anymore and we’ll need a new website name!
October 2013 update coming through shortly.
We now have monthly Northern Ireland crime data stretching back to September 2011. Here is the Police Service of Northern Ireland page. We have also added postcode sectors and postcode districts for Northern Ireland, so you can see crime data for these too. And last but certainly not least, we have added all 890 Northern Ireland Super Output Areas. As always, if you spot any bugs, please tell us at firstname.lastname@example.org. Neighbourhoods and postcodes automatically matched to NISOAs are coming shortly.