Quote of the day

“I find economics increasingly satisfactory, and I think I am rather good at it.” – John Maynard Keynes

Wednesday, 25 October 2023

A good look at issues with gathering unemployment data

 


DAVID SMITH

Newport, we have a problem: how much can we trust ONS figures?

The Times

For those of us whose life is governed by the calendar of official statistical releases — sad but partly true in my case — something has been missing this week. The Office for National Statistics (ONS) has released some information on the labour market, but not the usual range of data.

We have had information on pay, which is now outstripping inflation but shows signs of decelerating, and on job vacancies, which are falling quite sharply. They are now below one million and have fallen by more than 250,000 over the past year. There was also limited information on employment, which appears to be falling. What was missing, however, was what used to be regarded as the centrepiece of these data releases: what is happening to unemployment, and to employment, on the ONS’s preferred measure, based on the labour force survey (LFS).

That information has been delayed until October 24, as the statistical agency announced in a surprise press release on Friday. That does not sound too dramatic. However, behind that week-long delay lie significant problems with what used to be regarded as a gold standard of statistical measurement.

It may surprise some readers to learn that the unemployment and employment statistics are based not on an actual count, but on what is claimed to be the largest household survey in the UK. Just as opinion polls of, say, 1,000 people can give a good indication of the state of the political parties, so the LFS, which covered more than 53,000 individuals in nearly 25,000 households in the latest quarter, can do the same for the labour market.

That at least is the idea, and it has worked well until now. But the survey has run into problems, some of which are a direct result of the pandemic, and some of which were emerging before it. A survey is only as good as its sample, and the LFS is plagued by falling response rates and the difficulty of reaching all the types of people it wants to. Face-to-face interviews, which became impossible during Covid, gave way to phone polling, but this suffers from the problem that many people no longer use landlines, particularly those in rented accommodation. The survey is thus biased towards older people.

Nye Cominetti, an economist with the Resolution Foundation think tank, points out that the survey, which dates back 50 years, had a working-age sample of 99,000 in 1992, but just 31,000 in the latest quarter. For some groups in that sample, the numbers are not large enough to be accurate.
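The arithmetic behind both the polling analogy and the worry about a shrinking sample is the standard sampling one: the margin of error on an estimated rate scales with one over the square root of the sample size. The sketch below uses the textbook simple-random-sample formula rather than the ONS’s own weighted and clustered survey methodology, and the 4 per cent unemployment rate and the 500-person subgroup are illustrative assumptions, not ONS figures.

```python
from math import sqrt

def margin_of_error(rate, n, z=1.96):
    """95% margin of error for a proportion estimated from a simple random sample."""
    return z * sqrt(rate * (1 - rate) / n)

unemployment_rate = 0.04   # illustrative ~4% rate, not an ONS figure

sample_sizes = {
    "opinion poll":                       1_000,
    "LFS working-age sample, 1992":      99_000,
    "LFS working-age sample, latest":    31_000,
    "small subgroup within the survey":     500,   # hypothetical cell size
}

for name, n in sample_sizes.items():
    moe = margin_of_error(unemployment_rate, n)
    print(f"{name:35s} n={n:>7,}  +/- {100 * moe:.2f} percentage points")
```

On that rough arithmetic, cutting the working-age sample from 99,000 to 31,000 nearly doubles the margin of error, and for a small subgroup the uncertainty can exceed the very movements the survey is supposed to detect.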

You may ask why this matters. The answer is that accurate information on the labour market matters a lot for policy, and for the interest-rate setters at the Bank of England, where it has become a key indicator.

Even on the most basic question of how many people are in work, the LFS throws up a conundrum. The latest official employment figures based on it, published a month ago, showed that total employment in May to July this year, at just under 33 million, was fractionally lower than at the end of 2019, before the pandemic. Two other employment measures, however, show a very different picture. The ONS’s workforce jobs measure, largely based on a survey of employers, showed 36.7 million people in work in June, a rise of a million since December 2019. So-called real-time HMRC data, which cover employees only and so exclude the self-employed, also showed a rise of more than a million compared with late 2019 levels, to 30.1 million.

Image caption: Covid forced the ONS to rely on phone polling but many people no longer use landlines (Getty Images)

These are big differences, with significant implications. If there has been a big rise in employment since before Covid, simple arithmetic would suggest there has been a sizeable drop in productivity. There are other implications. The ONS publishes figures for the number of EU and non-EU workers in the UK but says these should be “used with caution” because they are based on 2021 patterns of migration. A plan to reweight the LFS this month has now been put off until March next year.
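To make the “simple arithmetic” concrete, the sketch below holds output fixed and compares the two employment stories. The assumption that GDP is roughly back at its late-2019 level is made purely for illustration and is not a figure from the article; only the employment changes come from the numbers above.

```python
# Back-of-the-envelope version of the productivity point: productivity is
# output divided by the number of people producing it, so the same output
# spread across more workers means lower output per worker.

pre_covid_employment_m = 33.0    # millions, roughly the LFS level quoted above

employment_change_m = {
    "LFS (employment roughly flat since 2019)":      0.0,
    "Workforce jobs / HMRC (up about a million)":    1.0,
}

gdp_change_pct = 0.0             # ASSUMPTION: output roughly back at its late-2019 level

for measure, change_m in employment_change_m.items():
    employment_growth_pct = 100 * change_m / pre_covid_employment_m
    # growth-rate approximation: productivity growth ~ output growth minus employment growth
    productivity_change_pct = gdp_change_pct - employment_growth_pct
    print(f"{measure:45s} implied output per worker: {productivity_change_pct:+.1f}%")
```

Roughly a million extra people producing the same output means output per worker about 3 per cent lower than the flat LFS employment figures imply, which is why it matters a great deal which measure is right.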

This is not the only problem faced by the ONS. There are longstanding issues with the reliability of migration statistics. Separately, it stunned economists and inadvertently launched a thousand conspiracy theories when it released revised GDP data at the start of last month, lifting the UK from worst G7 performer during the pandemic to mid-table. The Office for Statistics Regulation is conducting a review, though it was asked to do so by the ONS, which believes such revisions are a fact of life for GDP data and that there’s nothing much to see here.

Some people attribute the ONS’s problems to its shift from London to Newport in 2006-07. The move cost the ONS hundreds of experienced staff who decided that the other side of the Severn Bridge was not for them. But that was a long time ago and, pun intended, it is now water under the bridge. Newport, which has become something of a base for commuters to Cardiff and Bristol, featured in two recent surveys as, respectively, the second and third most desirable place to live in the UK, which surprised me. But there is still an issue in having the country’s statistical agency so far from the seat of national government.

As for the reliability of economic statistics, it is only fair to point out that the pandemic posed intense problems, many of which have also been experienced by statistical agencies of other nations. It is also fair to say that the ONS was innovative in its use of new data sources during Covid, and that its infection survey, now discontinued, provided the most accurate information on the spread of the disease.

We should not be too gloomy. Economic statistics are moving from the era of the clipboard and the face-to-face interview to big data, which, privacy concerns aside, should provide more complete information. Statistics should become better, not worse, though the ride is proving to be a bumpy one.

David Smith is Economics Editor of The Sunday Times
david.smith@sunday-times.co.uk
