Thursday, December 5, 2019

Burton on Stats



Tony Burton's back on deck with another column over at The Spinoff, this time on the problems at StatsNZ.

You'll probably enjoy reading the whole thing.

Here's an extended snippet:
In itself the review is great. The public service response, not so much. Government statistician and Stats NZ chief executive Liz MacPherson has stepped down, but she was merely unlucky enough to be the boss when the most public of many, many Stats NZ initiatives was mishandled. In my view she has sacrificed herself for the greater complacency of the state services commissioner.

The Stats NZ website camouflages its failure behind a forest of initialisms: the New Zealand Progress Indicators (NZPI); Indicators Aotearoa New Zealand (IANZ); the slow and clumsy introduction of the Integrated Data Infrastructure (IDI); SoFIE. And so on.

The last was nearly a decade ago and gives a flavour of the common or garden Stats NZ debacles that you rarely hear about outside the small community of expert data analysts. SoFIE, the Survey of Family, Income and Employment, was run between 2002 and 2010. It was a longitudinal survey that followed the same people over time and is the only way to properly understand issues like social mobility and retirement income where we need to know what happens in families over many years.

Ministers of all parties have been enthusiastic about these studies and the initial funding for SoFIE was for eight years. Unless you intend to build roads, hospitals and the like, that is as secure as funding gets in government.

Yet in 2011 SoFIE’s funding was not renewed. A comparison with SoFIE’s Australian cousin HILDA makes it easy to understand why. While fewer than 50 studies in total use SoFIE, more than 100 used HILDA in 2018 alone. Remember, this is the data that helps us understand long term poverty, whether education really gives New Zealanders a “fair go”, what matters in the long run for staying healthy, and whether we are saving enough for retirement.

This gap between Australia and New Zealand is a gap in understanding what matters most to New Zealanders. The SoFIE equivalent of under-recruiting census staff was making life difficult for the analysts who turn data into information that makes a difference in people’s lives.

Trinh Le of economic research agency MOTU, one of the superhumanly persistent people who did manage to use SoFIE, believes it was superior to HILDA in some ways, particularly its more inclusive sampling and at times more in-depth coverage.

However in 2010 Stats NZ brought in charges for using SoFIE – $95 for a half-day per terminal in the data lab, and $115 an hour for checkers to check the output. “Research usually takes years, so this added enormous cost. So many projects had to be cancelled due to this,” Le says. One Marsden Fund project had to switch to using HILDA because the data lab costs were unaffordable, she says.

Stats NZ seeking cost recovery isn't completely nuts, but you might worry about it for work that's already had to pass a public-interest hurdle to get permission to access the data in the first place. The real comparison with HILDA isn't the charges, but the reason for them. Stats makes it very, very hard to work on microdata unless you're happy to do it in the data lab. You aren't required to be in a data lab to use HILDA; they don't need cost recovery because they've adopted a lower-cost access model.

Stats has long seemed to see itself as the guardian and storehouse of data. In 2017, the Government Statistician suggested I 'watch this space' for better access to confidentialised microdata. It's more than two years later; I wonder whether it'll ever happen.
