Catch-up on the INSPIRE Conference
The ARE3NA Team had a busy and interesting time at the INSPIRE Conference in Aalborg (Denmark), as reported last month. You can now find details of our presentations (and some videos) on the conference website, including the work, reported in other news, on mappings between INSPIRE metadata and the DCAT application profile.
Looking for tools and rating them
The conference was also a chance to quiz around 20 delegates about the tools they are using to support INSPIRE implementation across a number of tasks, and to ask them to rate those tools. This followed a recommendation from EU Member States to explore tool-rating mechanisms, as well as helpful input from participants in our workshop at the conference.
Results of our quiz
Although this was only a small group, we found considerable variation in the tools that could be used. To present these initial results, we looked at tools mentioned by more than one delegate and then averaged the ratings those delegates gave. This produced a list of ‘top tools’ (see Table 1) for each type of INSPIRE implementation task (discovery, view, download, etc.), as well as examples of other potential tools (see Table 2).
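The aggregation step above can be sketched as follows. This is an illustrative example only (the tool names and ratings are invented, not the actual survey data): tools cited by more than one delegate are kept, and their ratings averaged.

```python
# Sketch of the aggregation: keep tools cited by more than one delegate,
# then average their ratings. The data below is hypothetical.
from collections import defaultdict

# Hypothetical (tool, rating) responses from individual delegates
responses = [
    ("ToolA", 4), ("ToolA", 5),
    ("ToolB", 3),
    ("ToolC", 4), ("ToolC", 2), ("ToolC", 5),
]

ratings = defaultdict(list)
for tool, rating in responses:
    ratings[tool].append(rating)

# Tools mentioned only once are excluded from the 'top tools' list
top_tools = {
    tool: sum(r) / len(r)
    for tool, r in ratings.items()
    if len(r) > 1
}

print(top_tools)  # ToolB is dropped, as it was cited only once
```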
Table 1: Average rating of most frequently cited tools reported by INSPIRE Conference 2014 delegates (n.b. other tools came a close second in all categories)
Table 2: List of other tools reported by INSPIRE Conference 2014 delegates
(n.b. listed as reported, in increasing order of the number of times cited)
Tools were reported for most types of activity supporting INSPIRE implementation (including a tie in the Data Portals/Geoportals category). The exception was software for registries. Partly in response, ARE3NA will be promoting the Re3gistry software in the coming months, including the upcoming release of version 0.4.
All of this will feed into work later in the year, when ARE3NA will gather more details about the tools used for INSPIRE implementation and for the reuse of INSPIRE data, metadata and services in e-government.
This small exercise is not statistically representative, but it has improved our understanding of how sharing and ranking tools could work through ARE3NA. We would nonetheless already like to discuss this further and gather feedback and opinions to inform our activities.
Please tell us:
- If you are surprised about the ‘top tools’ in the different types listed
- If you think there are other tools we should know about (and what rating you would give them)
- If you think rating is useful, what five stars (or INSPIRE ‘pineapples’) could mean, and what other details people looking for tools would be interested in
Please send your comments and suggestions to: email@example.com