Another week, another headache for the Office for National Statistics (ONS) and those who rely on their data to inform economic decisions.

A few weeks ago we explained how estimating GDP had proved more difficult than normal during the Covid-19 pandemic, and how this has resulted in large data revisions further down the track.

This week, the Labour Force Survey (LFS) is under scrutiny. Response rates, which were 50 percent a decade ago, are now deemed so low (15 percent this year) that the survey has lost its 'National Statistics' quality mark. Again, the pandemic has played a part, as the survey's data collection method was forced to change (from face-to-face to telephone) during the first national lockdown. ONS are in the process of overhauling the survey design to restore quality, but in the meantime are placing greater emphasis on administrative data (namely Department for Work and Pensions data on 'out-of-work' benefit claimants and HMRC Pay-As-You-Earn records) to ascertain what is happening in the labour market. Both of these sources have their own limitations when used for statistical purposes.

What does this mean for Buckinghamshire?

Firstly, some context. The Labour Force Survey is a survey of the employment circumstances of the UK population. It is a large survey and has provided policy makers with official measures of employment and unemployment for the last 30 years. Key variables from the LFS and its boost samples are used to generate the Annual Population Survey (APS), which provides local-level estimates of labour market participation.

As Buckinghamshire's APS sample size is relatively small, we've long cautioned against using the survey to estimate local levels of unemployment, but have historically deemed it 'reliable enough' for estimating levels of employment, economic inactivity, self-employment and work-related training. The decline in response rates, however, means we should now treat all local data derived from the survey with a large degree of caution.
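
To illustrate why sample size matters so much, the short sketch below uses purely hypothetical figures (an estimated employment rate of 75 percent and illustrative achieved sample sizes, not actual APS numbers) to show how the uncertainty around a survey-based estimate widens as the number of respondents falls. It applies the textbook simple random sampling formula, so if anything it understates the true uncertainty in a weighted survey like the APS.

```python
# Illustrative only: hypothetical figures, not actual ONS/APS sample sizes.
from math import sqrt

def rate_confidence_interval(rate, respondents, z=1.96):
    """Approximate 95% confidence interval for a survey-estimated rate,
    using the simple random sampling formula. Real APS estimates use
    complex weighting, so true uncertainty is wider than shown here."""
    se = sqrt(rate * (1 - rate) / respondents)
    return rate - z * se, rate + z * se

rate = 0.75  # hypothetical estimated employment rate

for n in (4000, 1200, 600):  # hypothetical achieved sample sizes
    low, high = rate_confidence_interval(rate, n)
    print(f"n={n}: estimate 75.0%, 95% CI roughly {low:.1%} to {high:.1%}")
```

As the achieved sample shrinks (which is what falling response rates do to a fixed issued sample), the confidence interval around the same headline figure grows, which is why small local estimates deserve extra caution.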

When assessing the labour market fortunes of our residents, or measuring the success of local labour market policies and initiatives, we must avoid definitive takes from any single data source and instead triangulate data from a variety of sources, including local qualitative intelligence, before drawing conclusions.

Whilst we'd all love perfect data, the reality is that there are always limitations. And unfortunately, at present, data from some of our key sources is less reliable than normal.