
Recommendation 13

Recommendation 13: Manage location data quality by linking it to policy and organisational objectives, assigning accountability to business and operational users and applying a “fit for purpose” approach

Why

 

  • Research indicates that poor data quality is costing organisations an average of €8.4 million per annum, a figure likely to worsen as information environments become increasingly complex.
  • Improved data quality is a primary source of value for many IT-enabled business initiatives. Data quality has the potential to improve labour productivity by as much as 20%; conversely, research shows that 40% of the anticipated value of all business initiatives is never achieved, with poor data quality in both the planning and execution phases a primary cause (source: Measuring the Business Value of Data Quality, Gartner, 2011). Poor data quality also affects operational efficiency, risk mitigation and agility by compromising the decisions made in each of these areas.
  • INSPIRE is creating a data infrastructure where we can anticipate reuse of the data. Public administrations are publishing open data. The same data is reused in many circumstances and, unlike other resources, the value of this data increases rather than decreases with use. Consequently, there is a need for a balanced approach to managing data quality and metadata across different EU Member States to support effective reuse.
  • Managing data quality with a common approach/framework will enable a seamless exchange of data between different public service providers reusing this data. This can be done, for example, when administrations share their data through a common service.
  • Managing data quality with a common approach will also enable the exchange of data between data providers. Providers can define “fitness for purpose” quality levels (including frequency of updates), produce data of a specified level of quality and detail with an adequate level of resources, and define appropriate licensing. Data providers can also contribute to and enhance each other’s data, thus sharing resources.
  • As more business processes become digitalised, data quality becomes the limiting factor for overall process quality.


How

 

Fit for purpose data quality design approach

  • Determine what is meant by and what is needed in terms of data quality. The dimensions of data quality include timeliness, accuracy, completeness, integrity, consistency, compliance with specifications / standards / legislation, being well described, etc.
  • Achieving perfect data quality across all data quality dimensions (typically three to six, but sometimes up to several hundred) is impossible at reasonable cost for most organisations. Instead, it is essential to define clearly what is meant by "fit for purpose" data quality. By initiating an ex-post evaluation of existing data quality issues against data quality best-practice guidance, an organisation can define what “good enough” data quality means and develop and apply a framework for analysis. This framework enables a common data quality language, better communication of issues, less confusion and better positioning of governance.
  • Establish a clear line of sight between the impact of data and data quality improvement. This can be best achieved by:
    • Identifying the application systems and external services that produce data to support business activities and policy making;
    • Measuring conformance of data to quality parameters set out in the data policy on an agreed frequency;
    • Assessing the current business value in terms of the existing data quality level and engaging with relevant stakeholders to assess the value of improving specific data quality items.
  • Use data profiling techniques early and often to assess data quality, and present profiling results in a way that issues can be acted upon, identifying outliers, anomalies, cross-referencing errors, gaps, etc. A useful approach is to design and implement data quality dashboards for critical information such as authentic data and to embed this as a business-as-usual IT process (a minimal profiling sketch follows this list).
  • Establish a data quality standard which incorporates multilingualism to ensure consistency and appropriateness in the way key enterprise data is applied and reported across the National and European Data Infrastructures.
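
As an illustration of the profiling and dashboard guidance above, the following is a minimal sketch (in Python) of a recurring quality check over a location dataset. The record layout (id, municipality_code, lat, lon, last_updated), the sample records and the thresholds are illustrative assumptions, not part of the Blueprint; adapt them to your own "fit for purpose" definition.

    # Minimal data quality profiling sketch (illustrative only).
    # Assumes a record layout with id, municipality_code, lat, lon, last_updated;
    # adapt the fields and thresholds to the agreed "fit for purpose" definition.
    from datetime import date, datetime

    RECORDS = [
        {"id": "A1", "municipality_code": "BE100", "lat": 50.85, "lon": 4.35, "last_updated": "2024-03-01"},
        {"id": "A2", "municipality_code": "",      "lat": 51.22, "lon": 4.40, "last_updated": "2021-06-15"},
        {"id": "A3", "municipality_code": "BE202", "lat": 95.00, "lon": 4.72, "last_updated": "2024-01-20"},
    ]

    MAX_AGE_DAYS = 365  # illustrative timeliness threshold


    def profile(records):
        """Count issues for three quality dimensions: completeness, accuracy, timeliness."""
        today = date.today()
        issues = {"incomplete": 0, "out_of_range": 0, "stale": 0}
        for rec in records:
            if not all(str(rec[field]).strip() for field in ("id", "municipality_code")):
                issues["incomplete"] += 1
            if not (-90 <= rec["lat"] <= 90 and -180 <= rec["lon"] <= 180):
                issues["out_of_range"] += 1
            age = (today - datetime.strptime(rec["last_updated"], "%Y-%m-%d").date()).days
            if age > MAX_AGE_DAYS:
                issues["stale"] += 1
        return issues


    if __name__ == "__main__":
        results = profile(RECORDS)
        total = len(RECORDS)
        for dimension, count in results.items():
            print(f"{dimension}: {count}/{total} records flagged")

Run on an agreed frequency, counts like these can feed a data quality dashboard and be tracked against the conformance targets set out in the data policy.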

Common metadata approach

  • Data quality standards are linked to data standards; ensure the completeness and adequacy of the metadata, as this supports reusability.
  • Implement an agreed metadata standard across the public sector, which is based on or is consistent with the INSPIRE approach.
  • When using common metadata standards, agree among the different stakeholders on the meaning of each metadata field; this ensures semantic interoperability of the data (an illustrative record sketch follows this list).
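
To illustrate what agreeing on metadata fields can look like in practice, the sketch below assembles a minimal, INSPIRE-inspired metadata record as a plain Python dictionary and checks that the agreed fields are filled in. The field set and the record content are assumptions for illustration only; a real implementation would follow the agreed metadata standard (for INSPIRE, the metadata technical guidance based on ISO 19115/19139).

    # Minimal, INSPIRE-inspired metadata record sketch (illustrative field set,
    # not the normative INSPIRE schema).

    REQUIRED_FIELDS = {
        "title", "abstract", "keywords", "spatial_extent",
        "temporal_extent", "lineage", "conformity", "licence", "point_of_contact",
    }

    record = {
        "title": "Addresses - Region X",
        "abstract": "Address points maintained by the regional mapping agency.",
        "keywords": ["Addresses", "INSPIRE"],
        "spatial_extent": {"west": 2.5, "east": 6.4, "south": 49.5, "north": 51.5},
        "temporal_extent": {"from": "2020-01-01", "to": "2024-12-31"},
        "lineage": "Derived from the regional address register; weekly updates.",
        "conformity": "Conformant to the agreed data specification",
        "licence": "CC BY 4.0",
        "point_of_contact": "geodata@example.org",
    }


    def missing_fields(metadata: dict) -> set:
        """Return the agreed metadata fields that are absent or empty."""
        return {f for f in REQUIRED_FIELDS if not metadata.get(f)}


    if __name__ == "__main__":
        gaps = missing_fields(record)
        print("Metadata complete" if not gaps else f"Missing fields: {sorted(gaps)}")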

Combining authoritative and non-authoritative data

  • Combine authoritative and non-authoritative data to enhance public services, but define a framework or use cases where this is allowed, so as not to create legal uncertainty or infringements in public service delivery.
  • Identify authoritative and non-authoritative data using the quality framework, and standardise how this status is referenced, for example with a specific metadata field in a common standard (see the sketch after this list).
  • Allow the combined publication of authoritative data and non-authoritative data on common platforms so as to favour marketplaces driving innovation in public services.
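
One possible way to standardise the referencing of authoritative and non-authoritative data is a dedicated flag in the common metadata profile that platforms and services can filter on. The sketch below is illustrative only; the authoritative field name and the catalogue entries are assumptions, not part of any existing standard.

    # Illustrative sketch: flag datasets as authoritative or not via a dedicated
    # metadata field, so combined publication on a common platform stays unambiguous.
    from dataclasses import dataclass


    @dataclass
    class DatasetEntry:
        identifier: str
        provider: str
        authoritative: bool  # hypothetical flag standardised in the common metadata profile


    CATALOGUE = [
        DatasetEntry("addresses-region-x", "Regional mapping agency", True),
        DatasetEntry("crowdsourced-poi", "Community mapping project", False),
    ]


    def select(entries, authoritative_only=False):
        """Return catalogue entries, optionally restricted to authoritative sources."""
        return [e for e in entries if e.authoritative or not authoritative_only]


    if __name__ == "__main__":
        for entry in select(CATALOGUE, authoritative_only=True):
            print(f"{entry.identifier} ({entry.provider}) - authoritative")

A common platform could use the same flag to label combined publications clearly, so reusers always know which sources are authoritative.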

Data quality governance

  • Make data quality a recurring agenda item at the information governance steering group meetings to ensure the data quality improvement roadmap is aligned with the information governance vision and strategy.
  • Establish data quality responsibilities as part of the information steward role.
  • Establish a cross-unit or cross-organisation special interest group for data quality, led by the Information Management team or equivalent body.
  • Establish a data quality review as a "stage gate" in the release management process.
  • Communicate the benefits of better data quality to departments regularly, for example by benchmarking improvements against similar organisations or by publishing a regular data quality bulletin highlighting what could be achieved with better data quality management.
  • Leverage external/industry peer groups by inviting them to present at special interest group meetings.
  • Encourage feedback from users to report problems and help improve data quality. This process can be incorporated in licensing agreements.
  • Use artificial intelligence (AI) and machine learning techniques to make suggestions for improving data quality (a minimal anomaly-detection sketch follows this list).
  • Involve citizens and the private sector actively in enhancing public data quality (completeness, correctness, predictions, metadata completion, etc.), potentially leveraging technology such as digital platforms to support these processes.
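
As one example of the AI/machine learning suggestion above, simple anomaly detection over coordinates can propose records for human review. The sketch below uses scikit-learn's IsolationForest as one possible technique; the coordinate sample and the contamination rate are illustrative assumptions.

    # Illustrative anomaly-detection sketch: suggest location records that may need
    # a data quality review. Requires scikit-learn (pip install scikit-learn).
    from sklearn.ensemble import IsolationForest

    # Hypothetical (lon, lat) pairs; the last point is deliberately far from the rest.
    coordinates = [
        (4.35, 50.85), (4.40, 51.22), (4.72, 50.88), (4.48, 50.90),
        (4.38, 50.84), (4.41, 50.86), (4.70, 51.02), (40.0, 10.0),
    ]

    # contamination is the assumed share of suspect records; tune it to your data.
    model = IsolationForest(contamination=0.1, random_state=0)
    labels = model.fit_predict(coordinates)  # -1 marks suspected outliers

    for point, label in zip(coordinates, labels):
        if label == -1:
            print(f"Suggest review of record at {point}")

Flagged records are suggestions only; an information steward would still decide whether they represent genuine quality issues.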

Challenges

  • Chief data officers (CDOs) and information management leaders continue to struggle with getting data quality onto their digital business agendas. This is often due to an overemphasis on enabling technology rather than a focus on organisational culture, people and processes.
  • Few organisations attempt to use a consistent, common language for understanding business data quality. Instead, they maintain divergent and often conflicting definitions of the same logical data.
  • Information leaders struggle to make data quality improvements beyond the level of a project and do not embed them at the programme level as part of their digital business information culture.
  • Required data quality may come at a price that is not affordable.
  • Drawing together data from multiple sources for analysis increases the likelihood that effort will be needed to transform the data into a usable form.

Best Practices

LIFO Monitoring

The Location Information Framework Observatory (LIFO) monitors the implementation of EULF Blueprint recommendations in European countries. Read about the implementation of Recommendation 13 in the LIFO Country Factsheets or the LIFO European State of Play Report. Explore the results for selected countries at LIFO Interactive Dashboards - Recommendations.

Related Frameworks: European Interoperability Framework (EIF)

EIF Pillars and Recommendations

Underlying Principle 9: Multilingualism
  • Recommendation 16: Use information systems and technical architectures that cater for multilingualism when establishing a European public service. Decide on the level of multilingualism support based on the needs of the expected users.

Interoperability Layer 5: Semantic Interoperability
  • Recommendation 31: Put in place an information management strategy at the highest possible level to avoid fragmentation and duplication. Management of metadata, master data and reference data should be prioritised.

Basic Component 3: Base registries
  • Recommendation 37: Make authoritative sources of information available to others while implementing access and control mechanisms to ensure security and privacy in accordance with the relevant legislation.
  • Recommendation 38: Develop interfaces with base registries and authoritative sources of information, publish the semantic and technical means and documentation needed for others to connect and reuse available information.
  • Recommendation 40: Create and follow data quality assurance plans for base registries and related master data.

Basic Component 4: Open data
  • Recommendation 42: Publish open data in machine-readable, non-proprietary formats. Ensure that open data is accompanied by high quality, machine-readable metadata in non-proprietary formats, including a description of their content, the way data is collected and its level of quality and the licence terms under which it is made available. The use of common vocabularies for expressing metadata is recommended.

Related Frameworks: UN-GGIM Integrated Geospatial Information Framework (IGIF)


Strategic Pathway 4: Data

Documentation Elements
  • Implementation Guide: Data Themes; Custodianship, Acquisition and Management; Data Supply Chains; Data Curation and Delivery
  • Appendices: the APP4.x tools listed under Actions and Tools below

Actions and Tools
  1. Getting Organised
    • Data Framework - APP4.1: Data Theme Description Template; The Global Fundamental Geospatial Data Themes
    • Data Inventory - APP4.2: Data Inventory Questionnaire
    • Dataset Profile - APP4.3: Dataset Profile Template
  2. Planning for the Future
    • Data Gap Analysis - APP4.4: Gap Analysis Matrix
    • Data Theme Roadmap - APP4.5: Data Theme Roadmap Template
  3. Capturing and Acquiring Data
    • Data Acquisition Programme
  4. Managing Data Sustainably
    • Data Governance - APP4.7: Data Governance Roles and Responsibilities
    • Data Management Plan - APP4.8: Data Management Plan Elements
    • Maintained Metadata - APP4.9: Metadata Creation Checklist
  5. Maintaining Accurate Positioning
    • Maintained Geodetic Infrastructure - APP4.11: Guidance for Improving Geodetic Infrastructure
  6. Integrating Data
    • Data Supply Chains
    • Data Interoperability

ELISE Resources

  • Webinar - Emerging approaches for data innovation in Europe (2022)
  • Training - INSPIRE training platform: Data harmonisation (2014)
  • Training - INSPIRE training platform: Metadata and catalogue services (2014)
  • Training - INSPIRE training platform: Geospatial data quality (2017)
  • Training - INSPIRE training platform: INSPIRE data specifications (2018)
  • Training - INSPIRE training platform: Procedures for data and metadata harmonisation (2018)
  • Training - INSPIRE training platform: Examples of data transformation (2018)
  • Training - INSPIRE training platform: Metadata and data validation for INSPIRE (2018)
  • Training - INSPIRE training platform: Principles for data and metadata harmonisation according to INSPIRE (2020)
  • Pilot / Testbed - Emerging approaches for data innovation in Europe (2022)

Further Reading


Version: EULF Blueprint v5.1