


Figure 9: Quality metrics from OpenDataMonitor

With no funding or resources available to update or extend the OpenDataMonitor project and platform after the project completed, its report needs to be built upon to delve further into the concepts of quality and quantity metrics, and more general automated metrics. As part of this report, we asked portal owners if they were aware of the OpenDataMonitor project. Almost all supported it, and several told us they were keen to learn from it. One said: ‘It’s interesting… [it] seems to know more about our data than we do.’

One of the benefits of a project like OpenDataMonitor and the European Data Portal’s metadata quality assessment is that they enable portal owners to assess the quality of data accessed via their portal alongside portals in other regions and countries. Learnings from these projects can help portal owners plan for portal enhancements and argue for greater funding.

Metadata quality assessments on the European Data Portal 88

The EDP’s Metadata Quality Assurance (MQA) monitors the quality of metadata that is harvested from portals across the European Union or entered manually via the EDP metadata creation form. 89 It is based on validation against the metadata standard DCAT-AP 1.1, and is run on a weekly basis. The European Data Portal includes a Metadata Quality Dashboard 90, providing information about every catalogue included in the portal, distribution availability and data schema violations. The catalogue dashboard provides the same information as the overall dashboard, but only for the selected catalogue. Metadata quality overviews can be downloaded per catalogue. 91

Distribution statistics

In accordance with the DCAT standard, each dataset is required to have an Access URL, with a Download URL recommended. An HTTP GET request is executed against each URL, and distribution statistics are calculated from the responses. Distribution information on the dashboard includes: error status codes; the ratio of machine-readable distributions; and the most commonly used distribution formats. Catalogue availability measures the percentage of the catalogue’s distributions that are available, while catalogue machine readability measures the proportion of machine-readable datasets; a dataset counts as machine readable if at least one of its distributions is machine readable.

88 European Data Portal, 2016, Data Portal User Manual v. 1
89 European Data Portal, 2016, Data Portal User Manual v. 1
90 European Data Portal, Metadata Quality Dashboard
91 European Data Portal
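The checks described above can be sketched as follows. This is a minimal illustration, not the EDP's actual implementation: the `check_url` helper, the `catalogue_stats` function, and the small set of formats treated as machine readable are all assumptions made for the example; the real MQA validates against the full DCAT-AP specification and a longer official format list.

```python
import urllib.request
from urllib.error import HTTPError, URLError

# Hypothetical subset of formats counted as machine readable;
# the real assessment uses a broader, officially maintained list.
MACHINE_READABLE_FORMATS = {"CSV", "JSON", "XML", "RDF", "GEOJSON"}


def check_url(url, timeout=10):
    """Issue an HTTP GET against a distribution URL and return the
    status code, or 0 if the request fails at the network level."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code          # e.g. 404, 500 — recorded as an error status
    except URLError:
        return 0                 # DNS failure, timeout, refused connection


def catalogue_stats(distributions):
    """Compute availability and machine-readability percentages for a
    catalogue. Each distribution is a dict with a 'status' (int, as
    returned by check_url) and a 'format' (str) key."""
    total = len(distributions)
    if total == 0:
        return {"availability_pct": 0.0, "machine_readable_pct": 0.0}
    available = sum(1 for d in distributions if 200 <= d["status"] < 300)
    readable = sum(
        1 for d in distributions
        if d["format"].upper() in MACHINE_READABLE_FORMATS
    )
    return {
        "availability_pct": 100.0 * available / total,
        "machine_readable_pct": 100.0 * readable / total,
    }
```

For example, a catalogue with one reachable CSV distribution and one broken PDF distribution would score 50% on both measures, since only the CSV responds with a 2xx status and only the CSV format is on the machine-readable list.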