New postcodes added and June 2014 update on the way

Posted: August 4th, 2014

You might think postcodes are simple but they are not. Many people use out-of-date or obsolete postcodes without knowing it. At UKCrimeStats, we have not just the standard 1.7 million live postcodes, but 700,000 obsolete ones too. We have also just added a further 70,000 newly released postcodes. Keeping postcodes up to date is a work in progress and we work at it all the time.


Guest post – Nick Ross republishes his chapter on Crime Statistics from the Crime Book

Posted: July 9th, 2014

Nick Ross

Today we are delighted to be publishing the first in a series of posts from Nick Ross, probably best known as the long-time presenter of Crimewatch and author of a recent book, Crime: How to solve it – and why so much of what we’re told is wrong – which, if you’ll forgive the plug, is one of the most interesting, well-written and thought-provoking books I’ve read for a long time. So you should read it – and no, before you ask, I’m not on commission!

Today then, we have a treat for you – the publisher has agreed to let Nick reproduce on www.ukcrimestats.com the entire Chapter 9 on Crime Statistics.

Over to Nick.

Chapter 9: Statistics

False facts are highly injurious to science because they often endure so long. – Charles Darwin

If crime is a normal part of the human repertoire, why is the crime rate so low? The question sounds perverse given that crime statistics have caused public consternation and political paranoia. But the answer is instructive. All crime statistics vastly underrate actual victimisation. And among the flakiest of all are the figures most people think are most reliable: those that come from the police.

It is essential to grasp how untrustworthy their records are – and how misleading they can be – to understand why the police get so distracted and why the courts are so feeble at controlling crime. But we also need to find a better metric, because if we can’t measure crime properly we can’t tell if it’s going up or down, we can’t calibrate our responses, and we can’t know if our solutions are making things better or worse.

It all once seemed so simple: if you want to know the crime, ask a policeman. The police are the experts and there is something comfortably definite about police statistics, not least that they can be traced back to actual victims. When the figures are published as tables or graphs they seem so tangible they must be real. Despite long-standing criticisms most policy-makers and commentators still take them at face value. The government even insisted that police statistics should be plotted on street maps and posted online so that, in theory, citizens can judge how safe they are. (I was privileged to be in at the pilot stage of one of these, in the English West Midlands, which showed a huge and obviously dangerous hotspot. It turned out to be the police station where suspects were searched for drugs or stolen property.)

There are three glaring problems in trusting police experience of how big or bad things are, and they all go back to a fundamental problem: crime, by definition, is illicit. As a general rule, people who break the law try not to draw attention to themselves. Sometimes their misdeeds are conspicuous, like a smash-and-grab raid in the high street, but mostly crime is surreptitious, intimate or even virtual. Every now and then someone will confess to a whole string of offences that were unknown to the police, but as a general rule, bullies, fraudsters, drink drivers, drug dealers, music pirates and rapists try to keep their crimes a dirty secret.

Accordingly, we expect the police to go and find crime for themselves. But officers rarely come across a crime in progress and, oddly, when they are proactive they actually distort the picture. A swoop on clubs will expose drug problems; a search for knives will uncover weapons. One area may have had a blitz on burglary, another on domestic violence or uninsured drivers. The arrival of a new chief constable or borough commander can have a huge impact on how the police operate, whom they target and what they prosecute. Some chiefs will turn a blind eye to street prostitution, others will clamp down on it. Often this gives rise to the perverse effect that better policing is rewarded with higher crime rates: if the police persuade more victims of rape to come forward, their tally of sexual offences will surge. Curiously, we can also get ‘more crime’ if those in government demand it.

Officers have often been given targets, such as two arrests per month, and charges are inflated (from, say, drunkenness to harassment – which counts as violence) to meet the quota. The Police Federation, which represents the rank and file in Britain, has justifiably called it ‘ludicrous’.

Similarly disturbing crime waves happen when charities or statutory agencies launch an initiative, or when the media mount a big investigation. Who knew child sex abuse was common until ChildLine came along?

But we the public are by far the biggest source of police intelligence. In other words, police crime figures are largely what we as witnesses, victims and occasional informants choose to tell them. Which is surprisingly little, even if we see a crime actually taking place: according to a poll for the Audit Commission, almost two-thirds of us would walk on by. We can’t be bothered, don’t want to get involved or don’t think the police would do anything anyway. The reality may be worse than that survey suggests. Avon and Somerset Police set up a small experiment in which a plainclothes officer blatantly brandished bolt cutters to steal bikes, and though at least fifty people clearly saw what he was doing, not one person intervened or rang 999. The video went online and proved extremely popular.

That leaves the great bulk of recorded crime figures in the hands of victims. And, again, a big majority of us have reasons to keep quiet. When people are asked about crime they’ve suffered and whether or not they asked for help, it turns out that only 44 per cent of personal crimes are reported to the police. Even that reporting rate is a big improvement, caused partly by the spread of mobile phones. And it doesn’t count many of at least 9 million business crimes a year, most of which we only hear about through surveys, or commercial frauds which companies and managers would rather not make public.

Why do we suffer so in silence? The answer is fear, guilt and cynicism. In many ordinary crimes, and some extraordinary ones too, private citizens want to stay clear of the authorities. This is often the case in pub brawls, fights at parties, clashes in the street, domestic violence and a lot of sexual assaults which are too embarrassing to talk about. I saw this for myself when auditing crime in Oxford over two weeks for the BBC. On a typical Friday night at the John Radcliffe Hospital we filmed twelve people wounded badly enough to come to A&E, all male, all the result of alcohol, one with a bottle wound just beneath the eye, one with a double-fractured jaw, and one in a coma. But the police recorded only seven violent crimes that night, including some not hurt badly enough to have needed medical attention. Even more surprising, there was little correlation between the severity of the injury and the likelihood of telling the police.

A pioneering emergency surgeon – we shall meet him later – has systematically checked hospital records over many years and is blunt: ‘Police figures are almost hopeless when it comes to measuring violent crime.’

Then there are crimes people tend not to make a formal fuss about. Sometimes the victims perceive what is technically a crime to be normal, as with childhood bullying and theft among school kids. This is even true with full-blown rape, which you might think needs few definitions, but, as we shall see later, it is not just perpetrators who deny it happened; half of all women who have been attacked in a manner that fulfils the legal description do not consider themselves to have been raped. Many victims blame themselves and some are very vulnerable. One of the worst aspects of concealed crime is often dismissed as antisocial behaviour and is targeted at people with disabilities, causing huge distress and sometimes serious harm.

More often it’s simply not worth the effort of telling the police, as when an uninsured bicycle is stolen. In fact, some official theft rates do more to measure changes in insurance penetration than trends in stealing. One of the reasons that street crime appeared to rise steeply in the late 1990s was that mobile phone companies were promoting handset insurance. On the other hand, people are cautious if they are insured and don’t want to jeopardise their no-claims bonus, as where a car is vandalised or broken into.

Apologists for the official figures sometimes demur from such pettifogging and claim that at least the more serious crimes will be recorded. Not so: under-reporting is rife in stabbings or even shootings, so much so that British police chiefs want the medical profession to break patient confidentiality and report patients treated for knife or gunshot wounds.

Even murder is surprisingly hard to count. First it has to be discovered. Britain’s biggest peacetime killer, the family physician Harold Shipman, probably killed 218 patients over a span of thirty years, but none was regarded as homicide until shortly before his arrest in 1998. There are thousands of missing persons and no one knows if they are dead or alive unless a body turns up. Even with a corpse, pathologists and coroners may disagree on whether death was natural, accidental, suicide or at the hands of others; and scientific advances can suggest different causes from one year to the next. The statistical effects of all this are not trivial. Prosecutors can have a big effect too. Most years in England and Wales about 100 cases that are initially recorded as homicide become ‘no longer recorded’ as homicide because of reclassification. On the other hand, other defendants have the book thrown at them, as when reckless misadventure was reclassified as homicide after fifty-eight Chinese nationals suffocated while being smuggled into Britain in 2000, or when twenty-one cockle-pickers drowned in Morecambe Bay four years later.

Since in Britain murder is relatively rare, multiple deaths like these, or the fifty-two killed in the 7/7 bomb attacks, can massively distort the figures, warping short-term trends. Long-term trends are even more difficult because of gaps in the records, especially from the age before computers, when information was kept locally on cards or paper.

Which opens another can of worms.

A third of all crime reported to the police is not recorded as a crime.

A great deal depends on whether an officer considers that an offence has taken place and, if so, whether it gets put down in the logs, when it is recorded and how it is categorised. Traditionally the police have a great deal of discretion. Retired officers will sometimes readily concede that, in years gone past, many quite unpleasant crimes were not taken very seriously: people who were racially abused, young men ‘having a scrap’, and even serious bodily harm if inflicted by a husband on his wife. Apart from anything else, turning a blind eye could save a lot of work.

There will always be a lot of wriggle room. When is a young man with a screwdriver equipped for burglary; when is a small amount of drugs not worth bothering about; when is a discarded handbag indicative of a mugging; when is it best to turn a blind eye in the hope of gaining some intelligence; when is a drunken brawl best dealt with by calming people down; when if someone reports a disturbance should one finish one’s paperwork or rush round and intervene? Not infrequently these ambiguities are manipulated cynically, with offences being shuffled from one category to another to reflect better on police performance. As one officer famously put it, the books are frequently ‘cooked in ways that would make Gordon Ramsay proud’.

In recent years Home Office counting rules have greatly improved consistency. Even so, in 2000 the Police Inspectorate found error rates ranging from 15 to 65 per cent, and in 2013 the Office for National Statistics was still sufficiently concerned about big discrepancies that it warned police may be tinkering with figures to try to fulfil targets.


Moving the goalposts

Even if all crime were reported and consistently recorded, police statistics can be terribly misleading. Lawyers, legislators and officials keep changing the rules. Karl Marx came across the problem somewhat before I did, correctly noting in 1859 that an apparently huge decrease in London crime could ‘be exclusively attributed to some technical changes in British jurisdiction’.

The most blatant example of moving the goalposts was between 1931 and 1932 when indictable offences in London more than doubled because of a decision to re-categorise ‘suspected stolen’ items as ‘thefts known to the police’. More recently, changes in counting rules led to an apparent and terrifying surge in violent crime in 1998 and then again in 2002. It started as a noble idea to get more uniformity and be more victim-focused but resulted in completely redefining violent crime. From that point on, half of all police-recorded violence against the person involved no injury.

In 2008 violence was reclassified again and this time many less serious offences were bumped up to big ones. For example, grievous bodily harm now included cases where no one was badly hurt. Inevitably the Daily Mail reported ‘violent crime up 22 per cent’.

It is not just journalists who get confused. Many political advisers and university researchers are also taken in, which can lead to silly ideas and unproductive policy. People often get irate at those who refuse to take police statistics at face value. ‘We all know what they mean,’ they say. It is as though challenging the figures is somehow to be soft on crime. But we don’t know what they mean, and nor do the police.

 

International comparisons of police statistics are even more unreliable. Different countries have different laws, different customs and very different reporting rates. On the face of it, Australia has seventeen kidnaps per 100,000 people while Colombia has only 0.6. Swedes suffer sixty-three sex crimes per 100,000 against only two in India.

Some people actually believe this stuff.

Evidently they don’t read the warning on the crime statistics tin. The Home Office has long warned that ‘police-recorded crime figures do not provide the most accurate measure of crime’, and for years the FBI was so cautious it sounded almost tongue-in-cheek: police data ‘may throw some light on problems of crime’. Yet however shallow, however defective, however inconsistent the figures, they have almost always been treated as far more meaningful than they are. Police responses, policy-makers’ strategies and public opinion have all navigated according to a tally which sometimes reflects the real world and sometimes doesn’t.

It is not as though we didn’t have a better mousetrap. Back in 1973 when crime was racing up the political agenda, the US Census Bureau started asking people for their actual experience of crime. For the first time they could get consistent data from year to year and from state to state. It was explosive stuff and immediately confirmed how incomplete police statistics were. The UK was already beginning to track crime as part of a General Household Survey, but from 1982 it followed the US lead with dedicated victimisation polls called the British Crime Survey or BCS. Other countries soon followed suit and over eighty now use a common methodology. That means we can now compare crime across borders as well as time.

The big picture

There is a lot wrong with the British Crime Survey. For a start, its name. The BCS only audits England and Wales – Scotland started its own SCS – and by the time they finally rebadged it (as the Crime Survey for England and Wales) the term BCS had become ingrained. So, confusingly, historical reports have to be called BCS and new ones, CSEW. If Wales goes its own way it may have to be rebranded yet again. It is also expensive. Since barely a quarter of the population suffers any sort of crime in any year you have to talk to a lot of citizens before you come up with a representative sample of, say, attempted burglary victims, let alone people who have suffered major trauma. That requires almost 50,000 face-to-face questionnaires, and not everyone will give up forty-five minutes for intrusive questions. It means researchers must doggedly go back to find the hard-to-get-at people, especially where victimisation is at its worst, and get them to trust in the anonymity of the process. It’s not like an opinion poll; it’s a mini-census that costs £100 per interview.

Even so it leaves a lot of gaps. Most obviously, it leaves out business crime, which has had to have a separate survey of its own. It is also hopelessly unreliable on rare crimes – one would have to interview almost a million people to get representative data on homicide. For a long time it missed out on under-sixteens too, fearing parents might object, but that has now been sorted. Past surveys also neglected homeless people and those in communal dwellings like student halls of residence, old people’s homes or hostels. An increasingly significant problem is that it largely ignores internet crime, but then so does almost everyone. And it almost certainly undercounts the most vulnerable in society who are victimised repeatedly and whose complaints are arbitrarily capped at five. Finally, being national, it has limited value in describing local crime.

Yet for all that, it has a huge advantage. Respondents may misremember or lie, but there is no reason to assume that memories or candour will change much from one year to the next. In other words, these big victimisation surveys have a power to describe trends.
So why did surveys like the BCS/CSEW take so long to catch on with the politicians, press and public?

The answer is, they didn’t come up with the right answers. Governments wanted to look competent, but since victim surveys uncovered far more crime than was realised hitherto they made the problem look even worse: the BCS revealed 8 million crimes a year compared to 3 million recorded by the police. Perhaps unsurprisingly, the early reports were met with a ‘conspiracy of silence’. One of the pioneers, Jan van Dijk, describes how his home country, the Netherlands, reacted with dismay in 1989 when the first international survey put it top of the league table for several property crimes, including burglary. The findings were lambasted for weeks by Dutch politicians, the media and criminologists.

On the other hand, crime surveys came to be disparaged by curmudgeons, including most journalists, because from 1995 they started to show crime was coming down. In fact in ten years, BCS crime fell 44 per cent, representing 8.5 million fewer crimes each year. Critics believed that this was just not credible and preferred police statistics which were far less encouraging and sometimes – on vandalism for example – continued in the opposite direction.

Thus it was that the British media continued to report that crime was climbing long after it had peaked and, incredibly, they went on with their rising crime agenda throughout a decade and a half of steep decline.

That is a story in itself.


© Nick Ross 2014

Crime: how to solve it and why so much of what we’re told is wrong. Biteback, £17.99


For background, detailed references and more see www.thecrimebook.com



Top 10 Postcode Sectors for Bike Theft

Posted: September 12th, 2013

We have just run the update for May 2013 and are currently debugging a few issues before we go ahead with June and then July. But I have for some time been fascinated by the crime of Bike Theft. We put it to the Home Office two years ago to include this as a separate category and are glad to see that since May they have done so. Last year, we also entered the ONS Geovation competition to win sponsorship money to build an anti-bike-theft app. In fact, I even went to visit a major bike retailer to drum up some support but alas, none was forthcoming. Anyway, one month’s data doesn’t tell you everything. It takes time to build up a picture, but to give you a taster, here are the top 10 Postcode Sectors for Bike Theft in May 2013 and the number of reported bikes stolen.

W9 4     108
OX4 1     41
OX1 3     30
OX1 4     29
CB5 8     22
CB1 2     20
MK40 3    19
PO1 1     18
PO1 2     18
CO1 1     17

Oxford and Cambridge you’d have to expect. But the real surprise to me was Maida Vale, W9 4, coming in at number 1 across England and Wales – more than 2.5 times worse than the worst area in Oxford.
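For the curious, a ranking like the one above can be generated straightforwardly from raw incident data. Here is a minimal Python sketch; the incident list is made up for illustration and stands in for the real monthly police data files:

```python
from collections import Counter

# Made-up bike-theft incidents, each tagged with its postcode sector
# (e.g. "W9 4"); in practice these come from the monthly police files.
incidents = ["W9 4", "OX4 1", "W9 4", "OX1 3", "W9 4", "OX4 1", "CB5 8"]

# Group by sector, count, and list the worst sectors first.
top_sectors = Counter(incidents).most_common(3)
for sector, count in top_sectors:
    print(f"{sector:7} {count}")
```

The same group-count-sort pattern scales to the full national file; only the top slice of the sorted counts is kept for the table.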


Crime by LSOA – now publicly available on UKCrimeStats

Posted: July 3rd, 2013

Take a look at our new Crime by LSOA page – that’s Lower Layer Super Output Area. Nowhere else will you find such a comprehensive resource.

http://www.ukcrimestats.com/LSOA/

We’ve actually had this data for some time and we have all the historical data going back to December 2010. All the LSOA individual pages are free to view. If you want to run reports and export, this requires a paid-for login at just £9.99 for a single, non-recurring month. So if you try to run a report, it will just take you through to the membership section.


UKCrimeStats now updated to April 2013 – Postcode Sectors and Districts now free to view

Posted: June 28th, 2013

More upgrades coming through as well!

Thank you for all your support. As always, any questions please email us at crime@economicpolicycentre.com



September 2012 data now live on UKCrimeStats

Posted: November 1st, 2012

And as you would expect, with temperatures dropping and nights getting longer, we have seen a fall across England and Wales of around 10% on aggregate from August to September.


Crime in Essex – some hard numbers – as presented on BBC Essex today

Posted: October 30th, 2012

This morning I was invited onto Dave Monk’s programme on BBC Essex at 10.10 a.m. to talk briefly about UKCrimeStats and crime in Essex compared to the rest of the country. I explained where we got the data from, what the ASB categories were, talked generally about crime data and put Essex in a national context.

I ran some reports looking at crime rates over the last 12 months for all 43 Police Forces. The combined ASB and crime rate per 1,000 residents in Essex came in at 100.3547, which gave Essex a middle ranking of 25th out of 43. Where there was a real difference, of course, was in the breakdown – it had the 6th highest vehicle crime rate, the 11th highest robbery rate and the 35th lowest ASB rate. If you’ll forgive the plug, you can get all of these with a login on our site, which only costs £9.99 a month.
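The arithmetic behind that ranking is simple: divide each force's combined count by its population, scale to 1,000 residents, and sort. A hedged sketch – the two comparison forces and all the raw figures are invented, with Essex's count and population chosen only so its rate lands near the one quoted above:

```python
# Invented (ASB + crime count, population) pairs; only Essex's rate is
# tuned to come out near the 100.35 per 1,000 quoted in the post.
forces = {
    "Essex": (175_000, 1_744_000),
    "Force A": (200_000, 1_500_000),
    "Force B": (90_000, 1_200_000),
}

# Rate per 1,000 residents, then rank from highest (worst) to lowest.
rates = {name: 1000 * count / pop for name, (count, pop) in forces.items()}
ranking = sorted(rates, key=rates.get, reverse=True)
essex_rank = ranking.index("Essex") + 1  # 1 = highest crime rate
```

With all 43 forces in the dictionary, the same two lines produce the full league table and each force's position in it.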

I came on straight after a spokesman for the Green Party who didn’t agree with elected PCCs and so announced that they would not be putting up a candidate, not least because of the £5000 deposit per candidate. I have much sympathy with the latter point – £5000 is certainly too high and a barrier to entry. And yet, it is a flawed argument to say that because turnout will be low the solution is to revert to even less democracy, with a committee of unelected officials. Nor is it somehow wrong that politicians get involved. There is little more politically valuable than democratic oversight of how resources are allocated to fight crime, and the ability to boot out those responsible if people think they have not done a good enough job. It’s a bit short-sighted of the Green Party, too, to be so dismissive of this when there is ample scope for research into the relationship between neurotoxin pollution and crime. Crime, or the lack of it, is intricately linked to the environment.

For all that, where I have found the elections thus far a disappointment is the absence of data in the debate by PCCs – and we will be doing something about this very soon.



August 2012 data now live on UKCrimeStats

Posted: October 3rd, 2012

We have also introduced a new feature for our members – it is now possible to drill down within a selected Police Force or municipality and ask which neighbourhood or subdivision had the biggest or smallest total, increase or decrease in any given type of crime between time point A and time point B.

I ran it for the Metropolitan Police for the last month and was staggered to see the biggest increase was in the neighbourhood of Pembridge – from a total combined ASB and crime count of 132 in July 2012 to 318 in August 2012, driven mostly by the categories of “Drugs” and “Other Crime”. I’ve double-checked the data, though, and this is clearly what it says.
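Under the hood this kind of drill-down reduces to comparing two monthly snapshots per neighbourhood and picking the extreme. A minimal sketch – only Pembridge's figures come from the post; the other neighbourhood names and numbers are invented for illustration:

```python
# Combined ASB + crime totals per neighbourhood for two months.
july = {"Pembridge": 132, "Notting Hill": 410, "Queen's Park": 95}
august = {"Pembridge": 318, "Notting Hill": 402, "Queen's Park": 120}

# Month-on-month change, then pick the neighbourhood that rose the most.
change = {n: august[n] - july[n] for n in july}
biggest_riser = max(change, key=change.get)
print(biggest_riser, change[biggest_riser])  # prints: Pembridge 186
```

Swapping `max` for `min` gives the biggest fall, and filtering the input dictionaries by crime category gives the per-category version of the same question.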


Why I support elected PCCs – in a nutshell

Posted: September 8th, 2012

Here goes:

There seems to be a sizeable difference of opinion between what the general public think the Police can achieve and what the Police think the general public actually want.  Elected PCCs should go some way to closing that gap. But this new system will take time to bed down. As political wags will tell you, the first test of a democratic society is not the first election, but the second and it’s just the same with these new elected officials. We will need a good 10 years of PCCs in order to tell how successful they have been.

And expect some failures as well as successes – which is what happens with democracy.


UKCrimeStats upgrade – now all wards matched to all Municipal authorities

Posted: September 6th, 2012

Following on from a suggestion by one of our members, and thanks to our dedicated team of programmers, we have recently invested in making sure that all wards – the smallest electoral division – are matched to the relevant County Council, London Borough, Metropolitan Council etc. This was actually harder to do than it sounds!

We have also included a full listing of all wards under each relevant local authority – e.g. Leeds City Council, which has 33 wards – making comparisons easier. With membership, you can now export as CSV files all the wards and their underlying crime data in a time series, matched with the names of the authorities.
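The export described above amounts to flattening a ward-month time series into rows, one per ward per month, each tagged with its authority. A hypothetical sketch of that CSV shape – the ward name and figures here are invented:

```python
import csv
import io

# Invented ward-level rows in the shape the export would take:
# one row per ward per month, tagged with its local authority.
rows = [
    {"authority": "Leeds City Council", "ward": "Ward A",
     "month": "2012-07", "crime": 412},
    {"authority": "Leeds City Council", "ward": "Ward A",
     "month": "2012-08", "crime": 389},
]

# Write to an in-memory buffer; a real export writes to a file instead.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["authority", "ward", "month", "crime"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
```

Keeping the authority name on every row is what makes the file self-describing once it leaves the site, so a spreadsheet pivot by authority or ward needs no extra lookup table.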

