BLOG 1: The boring side of public data analytics

Published on: 26/06/2020

When we think of data-driven innovation, we tend to think of fast developments, hackathons and fancy data visualisation: a lone data wizard making breakthrough discoveries on their laptop in a weekend sprint.

The reality is less fancy. It is true that you can build stunning interactive visualisations in a matter of minutes. But this is only the tip of the iceberg.

For data analytics to make a genuine difference in crucial matters such as delivering child benefits quickly to the right people, or making accurate predictions for epidemics, it is crucial to have access to a large pool of high-quality, trusted data, and to run and test algorithms iteratively before they are deployed. And this requires long-term effort and investment in “boring” activities such as defining data standards, documenting data assets through metadata, curating data, and securing the political agreement and legal basis for data sharing.

For instance, in the Danish case, it took three years to build a fully functional data unit, including the human resources and the necessary data from relevant authorities; today it delivers savings for municipalities of more than 60 million EUR yearly (and spares citizens unwarranted fines). Similarly, the recent Covid-19 crisis has shown that no matter how sophisticated the models, there were fundamental issues with the availability and quality of the data that could not be solved in a matter of weeks. Only Germany had a real-time register of Intensive Care Unit availability, and, even more fundamentally, countries – and even regions – had inconsistent base registries that counted deaths in different ways. Containing an epidemic and reducing death counts requires a long-term investment in high-quality data that cannot be built during the emergency.

One of the lessons learnt from the study is that if we promise decision makers fancy and fast results, we are likely to create disappointment that will harm data-driven innovation in the long term. To grasp the opportunities of data analytics, we need long-term investment in core, infrastructural activities for data quality and access, and we need to give credit to the often invisible “data stewards” who make it work – as New Zealand has done in its strategy. This might not be sexy, but it saves lives.


Do you agree? This is just one of the findings from the case studies. You can find more in the downloads section.


Wed, 01/07/2020 - 10:15

What are the possible procedures to ensure data quality? Frameworks for data collection and quality evaluation? What would criteria and indicators look like?

Mon, 06/07/2020 - 15:37

Indeed, the path towards delivering public value may be long when starting out with a data analytics project or strategy. It is key to define realistic milestones along the way so progress can be monitored and demonstrated to all the stakeholders involved.

To keep everyone on board, motivated and focused, always keep your eye on the prize: what is the public value you want to generate? Is it addressing subversive crime, as in one of the flagship projects of the Dutch Data Agenda Government? Or is it preventing citizens from incurring debt from wrongly received social benefit payments, as we've seen in the Welfare Fraud Analytics approach in Denmark?