Guest post – Nick Ross republishes his chapter on Crime Statistics from the Crime Book

Nick Ross

Today we are delighted to be publishing the first in a series of posts from Nick Ross, probably best known as the long-time presenter of Crimewatch and author of a recent book, Crime: How to solve it – and why so much of what we’re told is wrong – which, if you’ll forgive the plug, is one of the most interesting, well-written and thought-provoking books I’ve read for a long time. So you should read it – and no, before you ask, I’m not on commission!

Today, then, we have a treat for you – the publisher has agreed to let Nick reproduce the whole of Chapter 9, on crime statistics, here on www.ukcrimestats.com.

Over to Nick.

Chapter 9: Statistics

False facts are highly injurious to the progress of science, for they often endure long. – Charles Darwin

If crime is a normal part of the human repertoire, why is the crime rate so low? The question sounds perverse given that crime statistics have caused public consternation and political paranoia. But the answer is instructive. All crime statistics vastly underrate actual victimisation. And among the flakiest of all are the figures most people think are most reliable: those that come from the police.

It is essential to grasp how untrustworthy their records are – and how misleading they can be – to understand why the police get so distracted and why the courts are so feeble at controlling crime. But we also need to find a better metric, because if we can’t measure crime properly we can’t tell if it’s going up or down, we can’t calibrate our responses, and we can’t know if our solutions are making things better or worse.

It all once seemed so simple: if you want to know the crime, ask a policeman. The police are the experts and there is something comfortably definite about police statistics, not least that they can be traced back to actual victims. When the figures are published as tables or graphs they seem so tangible they must be real. Despite long-standing criticisms most policy-makers and commentators still take them at face value. The government even insisted that police statistics should be plotted on street maps and posted online so that, in theory, citizens can judge how safe they are. (I was privileged to be in at the pilot stage of one of these, in the English West Midlands, which showed a huge and obviously dangerous hotspot. It turned out to be the police station where suspects were searched for drugs or stolen property.)

There are three glaring problems in trusting police experience of how big or bad things are, and they all go back to a fundamental problem: crime, by definition, is illicit. As a general rule, people who break the law try not to draw attention to themselves. Sometimes their misdeeds are conspicuous, like a smash-and-grab raid in the high street, but mostly crime is surreptitious, intimate or even virtual. Every now and then someone will confess to a whole string of offences that were unknown to the police, but as a general rule, bullies, fraudsters, drink drivers, drug dealers, music pirates and rapists try to keep their crimes a dirty secret.

Accordingly, we expect the police to go and find crime for themselves. But officers rarely come across a crime in progress and, oddly, when they are proactive they actually distort the picture. A swoop on clubs will expose drug problems; a search for knives will uncover weapons. One area may have had a blitz on burglary, another on domestic violence or uninsured drivers. The arrival of a new chief constable or borough commander can have a huge impact on how the police operate, whom they target and what they prosecute. Some chiefs will turn a blind eye to street prostitution, others will clamp down on it. Often this gives rise to the perverse effect that better policing is rewarded with higher crime rates: if the police persuade more victims of rape to come forward, their tally of sexual offences will surge. Curiously, we can also get ‘more crime’ if those in government demand it.

Officers have often been given targets, such as two arrests per month, and charges are inflated (from, say, drunkenness to harassment – which counts as violence) to meet the quota. The Police Federation, which represents the rank and file in Britain, has justifiably called it ‘ludicrous’.

Similarly disturbing crime waves happen when charities or statutory agencies launch an initiative, or when the media mount a big investigation. Who knew child sex abuse was common until ChildLine came along?

But we the public are by far the biggest source of police intelligence. In other words, police crime figures are largely what we as witnesses, victims and occasional informants choose to tell them – which is surprisingly little, even when we see a crime actually taking place. According to a poll for the Audit Commission, almost two-thirds of us would walk on by. We can’t be bothered, don’t want to get involved or don’t think the police would do anything anyway. The reality may be worse than that survey suggests. Avon and Somerset Police set up a small experiment in which a plainclothes officer blatantly brandished bolt cutters to steal bikes, and though at least fifty people clearly saw what he was doing, not one person intervened or rang 999. The video went online and proved extremely popular.

That leaves the great bulk of recorded crime figures in the hands of victims. And, again, a big majority of us have reasons to keep quiet. When people are asked about crime they’ve suffered and whether or not they sought help, it turns out that only 44 per cent of personal crimes are reported to the police. Even that reporting rate is a big improvement, caused partly by the spread of mobile phones. Nor does it count the 9 million or more business crimes a year, most of which we only hear about through surveys, or the commercial frauds which companies and managers would rather not make public.

Why do we suffer so in silence? The answer is fear, guilt and cynicism. In many ordinary crimes, and some extraordinary ones too, private citizens want to stay clear of the authorities. This is often the case in pub brawls, fights at parties, clashes in the street, domestic violence and a lot of sexual assaults which are too embarrassing to talk about. I saw this for myself when auditing crime in Oxford over two weeks for the BBC. On a typical Friday night at the John Radcliffe Hospital we filmed twelve people wounded badly enough to come to A&E – all male, all the result of alcohol, one with a bottle wound just beneath the eye, one with a double-fractured jaw, and one in a coma. But the police recorded only seven violent crimes that night, including some victims not hurt badly enough to have needed medical attention. Even more surprisingly, there was little correlation between the severity of the injury and the likelihood of telling the police.

A pioneering emergency surgeon – we shall meet him later – has systematically checked hospital records over many years and is blunt: ‘Police figures are almost hopeless when it comes to measuring violent crime.’

Then there are crimes people tend not to make a formal fuss about. Sometimes the victims perceive what is technically a crime to be normal, as with childhood bullying and theft among school kids. This is even true of full-blown rape, which you might think needs little definition; but, as we shall see later, it is not just perpetrators who deny it happened – half of all women who have been attacked in a manner that fulfils the legal description do not consider themselves to have been raped. Many victims blame themselves and some are very vulnerable. One of the worst forms of concealed crime – harassment targeted at people with disabilities – is often dismissed as antisocial behaviour, yet it causes huge distress and sometimes serious harm.

More often it’s simply not worth the effort of telling the police, as when an uninsured bicycle is stolen. In fact, some official theft rates do more to measure changes in insurance penetration than trends in stealing. One of the reasons that street crime appeared to rise steeply in the late 1990s was that mobile phone companies were promoting handset insurance. On the other hand, people are cautious if they are insured and don’t want to jeopardise their no-claims bonus, as when a car is vandalised or broken into.

Apologists for the official figures sometimes dismiss all this as pettifogging and claim that at least the more serious crimes will be recorded. Not so: under-reporting is rife even for stabbings and shootings – so much so that British police chiefs want the medical profession to break patient confidentiality and report patients treated for knife or gunshot wounds.

Even murder is surprisingly hard to count. First it has to be discovered. Britain’s biggest peacetime killer, the family physician Harold Shipman, probably killed 218 patients over a span of thirty years, but none was regarded as homicide until shortly before his arrest in 1998. There are thousands of missing persons and no one knows if they are dead or alive unless a body turns up. Even with a corpse, pathologists and coroners may disagree on whether death was natural, accidental, suicide or at the hands of others; and scientific advances can suggest different causes from one year to the next. The statistical effects of all this are not trivial. Prosecutors can have a big effect too. Most years in England and Wales about 100 cases that are initially recorded as homicide become ‘no longer recorded’ as such because of reclassification. On the other hand, some defendants have the book thrown at them, as when reckless misadventure was reclassified as homicide after fifty-eight Chinese nationals suffocated while being smuggled into Britain in 2000, or when twenty-one cockle-pickers drowned in Morecambe Bay four years later.

Since in Britain murder is relatively rare, multiple deaths like these, or the fifty-two killed in the 7/7 bomb attacks, can massively distort the figures, warping short-term trends. Long-term trends are even more difficult because of gaps in the records, especially from the age before computers, when information was kept locally on cards or paper.

Which opens another can of worms.

A third of all crime reported to the police is not recorded as a crime.

A great deal depends on whether an officer considers that an offence has taken place and, if so, whether it gets put down in the logs, when it is recorded and how it is categorised. Traditionally the police have had a great deal of discretion. Retired officers will sometimes readily concede that, in years gone by, many quite unpleasant crimes were not taken very seriously: people being racially abused, young men ‘having a scrap’, even serious bodily harm if inflicted by a husband on his wife. Apart from anything else, turning a blind eye could save a lot of work.

There will always be a lot of wriggle room. When is a young man with a screwdriver equipped for burglary; when is a small amount of drugs not worth bothering about; when is a discarded handbag indicative of a mugging; when is it best to turn a blind eye in the hope of gaining some intelligence; when is a drunken brawl best dealt with by calming people down; and when someone reports a disturbance, should one finish one’s paperwork or rush round and intervene? Not infrequently these ambiguities are manipulated cynically, with offences being shuffled from one category to another to reflect better on police performance. As one officer famously put it, the books are frequently ‘cooked in ways that would make Gordon Ramsay proud’.

In recent years Home Office counting rules have greatly improved consistency. Even so, in 2000 the Police Inspectorate found error rates ranging from 15 to 65 per cent, and in 2013 the Office for National Statistics was still sufficiently concerned about big discrepancies that it warned police may be tinkering with the figures to try to meet targets.
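Putting two of the chapter’s figures together shows just how steep the attrition is: if only 44 per cent of personal crimes are reported, and a third of what is reported is never recorded, fewer than a third of offences ever make it into the official ledger. For readers who like to see the working, here is a quick back-of-envelope sketch in Python (the starting figure of 100 crimes is purely illustrative):

```python
# Back-of-envelope attrition, using two figures quoted in this chapter:
# 44% of personal crimes are reported, and a third of reported crime
# is never recorded. The starting figure of 100 is purely illustrative.

crimes = 100                       # notional personal crimes
reported = crimes * 0.44           # 44 reach the police
recorded = reported * (1 - 1 / 3)  # a third of reports go unrecorded

print(f"Reported: {reported:.0f} of {crimes}")  # Reported: 44 of 100
print(f"Recorded: {recorded:.0f} of {crimes}")  # Recorded: 29 of 100
```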

Moving the goalposts

Even if all crime were reported and consistently recorded, police statistics could still be terribly misleading. Lawyers, legislators and officials keep changing the rules. Karl Marx came across the problem somewhat before I did, correctly noting in 1859 that an apparently huge decrease in London crime could ‘be exclusively attributed to some technical changes in British jurisdiction’.

The most blatant example of moving the goalposts was between 1931 and 1932 when indictable offences in London more than doubled because of a decision to re-categorise ‘suspected stolen’ items as ‘thefts known to the police’. More recently, changes in counting rules led to an apparent and terrifying surge in violent crime in 1998 and then again in 2002. It started as a noble idea to get more uniformity and be more victim-focused but resulted in completely redefining violent crime. From that point on, half of all police-recorded violence against the person involved no injury.

In 2008 violence was reclassified again, and this time many less serious offences were bumped up into graver categories. For example, grievous bodily harm now included cases where no one was badly hurt. Inevitably the Daily Mail reported ‘violent crime up 22 per cent’.

It is not just journalists who get confused. Many political advisers and university researchers are also taken in, which can lead to silly ideas and unproductive policy. People often get irate at those who refuse to take police statistics at face value. ‘We all know what they mean,’ they say. It is as though challenging the figures is somehow to be soft on crime. But we don’t know what they mean, and nor do the police.

International comparisons of police statistics are even more unreliable. Different countries have different laws, different customs and very different reporting rates. On the face of it, Australia has seventeen kidnaps per 100,000 people while Colombia has only 0.6, and Swedes suffer sixty-three sex crimes per 100,000 compared with just two in India.

Some people actually believe this stuff.

Evidently they don’t read the warning on the crime statistics tin. The Home Office has long warned that ‘police-recorded crime figures do not provide the most accurate measure of crime’, and for years the FBI was so cautious it sounded almost tongue-in-cheek: police data ‘may throw some light on problems of crime’. Yet however shallow, however defective, however inconsistent the figures, they have almost always been treated as far more meaningful than they are. Police responses, policy-makers’ strategies and public opinion have all navigated by a tally which sometimes reflects the real world and sometimes doesn’t.

It is not as though we didn’t have a better mousetrap. Back in 1973, when crime was racing up the political agenda, the US Census Bureau started asking people about their actual experience of crime. For the first time there was consistent data from year to year and from state to state. It was explosive stuff and immediately confirmed how incomplete police statistics were. The UK was already beginning to track crime as part of a General Household Survey, but from 1982 it followed the US lead with dedicated victimisation polls called the British Crime Survey or BCS. Other countries soon followed suit and over eighty now use a common methodology. That means we can now compare crime across borders as well as over time.

The big picture

There is a lot wrong with the British Crime Survey. For a start, its name. The BCS only audits England and Wales – Scotland started its own SCS – and by the time it was finally rebadged (as the Crime Survey for England and Wales) the term BCS had become ingrained. So, confusingly, historical reports have to be called BCS and new ones CSEW. If Wales goes its own way it may have to be rebranded yet again. It is also expensive. Since barely a quarter of the population suffers any sort of crime in any year, you have to talk to a lot of citizens before you come up with a representative sample of, say, attempted burglary victims, let alone people who have suffered major trauma. That requires almost 50,000 face-to-face questionnaires, and not everyone will give up forty-five minutes for intrusive questions. It means researchers must doggedly go back to find the hard-to-get-at people, especially where victimisation is at its worst, and get them to trust in the anonymity of the process. It’s not like an opinion poll; it’s a mini-census that costs £100 per interview.
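To see why the sample must be so large, consider the arithmetic. A minimal sketch follows: only the ‘any crime’ rate of about a quarter comes from the text above; the prevalence rates for the individual crime types are illustrative assumptions, not CSEW figures.

```python
# Why a victimisation survey needs tens of thousands of interviews.
# Only the 'any crime' rate (about a quarter) comes from the text;
# the other prevalence rates are illustrative assumptions.

N = 50_000  # roughly the number of face-to-face interviews

def victims_and_error(prevalence: float) -> tuple[float, float]:
    """Expected victims in the sample, and an approximate 95% margin
    of error on the prevalence estimate."""
    expected = N * prevalence
    moe = 1.96 * (prevalence * (1 - prevalence) / N) ** 0.5
    return expected, moe

for crime, rate in [("any crime", 0.25),
                    ("attempted burglary", 0.01),  # assumed
                    ("robbery", 0.002)]:           # assumed
    expected, moe = victims_and_error(rate)
    print(f"{crime}: ~{expected:,.0f} victims, {rate:.1%} ± {moe:.2%}")
```

A sample of 50,000 yields thousands of victims of common crimes but only a handful for rarer ones, which is why anything smaller than a mini-census quickly becomes useless for the crimes people worry about most.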

Even so it leaves a lot of gaps. Most obviously, it leaves out business crime, which has had to have a separate survey of its own. It is also hopelessly unreliable on rare crimes – one would have to interview almost a million people to get representative data on homicide. For a long time it left out under-sixteens too, fearing parents might object, but that has now been sorted. Past surveys also neglected homeless people and those in communal dwellings like student halls of residence, old people’s homes or hostels. An increasingly significant problem is that it largely ignores internet crime, but then so does almost everyone. And it almost certainly undercounts the most vulnerable in society, who are victimised repeatedly and whose complaints are arbitrarily capped at five. Finally, being national, it has limited value in describing local crime.
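That capping point is worth a moment’s pause, because its effect is easy to demonstrate. A minimal sketch, with invented incident counts for eight hypothetical respondents:

```python
# How capping repeat victimisation at five incidents suppresses the
# totals for the most heavily victimised. Incident counts are invented.

CAP = 5
incidents = [1, 1, 2, 3, 5, 8, 15, 40]  # hypothetical respondents

true_total = sum(incidents)                         # 75
capped_total = sum(min(n, CAP) for n in incidents)  # 27

print(f"Uncapped total: {true_total}")
print(f"Capped total:   {capped_total}")
print(f"Undercount:     {1 - capped_total / true_total:.0%}")  # 64%
```

The cap barely touches most respondents, but it wipes out most of the suffering of the handful who are victimised again and again – precisely the people the survey most needs to hear.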

Yet for all that, it has a huge advantage. Respondents may misremember or lie, but there is no reason to assume that memories or candour will change much from one year to the next. In other words, these big victimisation surveys have a power to describe trends.

So why did surveys like the BCS/CSEW take so long to catch on with the politicians, press and public?

The answer is, they didn’t come up with the right answers. Governments wanted to look competent, but since victim surveys uncovered far more crime than had hitherto been realised, they made the problem look even worse: the BCS revealed 8 million crimes a year compared with 3 million recorded by the police. Perhaps unsurprisingly, the early reports were met with a ‘conspiracy of silence’. One of the pioneers, Jan van Dijk, describes how his home country, the Netherlands, reacted with dismay in 1989 when the first international survey put it at the top of the league table for several property crimes, including burglary. The findings were lambasted for weeks by Dutch politicians, the media and criminologists.

On the other hand, crime surveys came to be disparaged by curmudgeons, including most journalists, because from 1995 they started to show crime coming down. In fact, in ten years BCS crime fell by 44 per cent, representing 8.5 million fewer crimes each year. Critics believed this was simply not credible and preferred police statistics, which were far less encouraging and sometimes – on vandalism, for example – pointed in the opposite direction.
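(As a rough consistency check on those two figures: a 44 per cent fall amounting to 8.5 million fewer crimes a year implies a peak of roughly 8.5 million ÷ 0.44 ≈ 19 million BCS crimes a year.)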

Thus it was that the British media continued to report that crime was climbing long after it had peaked and, incredibly, they went on with their rising crime agenda throughout a decade and a half of steep decline.

That is a story in itself.

© Nick Ross 2014

Crime: How to solve it – and why so much of what we’re told is wrong. Biteback, £17.99

For background, detailed references and more see www.thecrimebook.com