Improving data quality in the linked open data: A survey


Abstract

Linked Open Data (LOD) is a "web of data", a paradigm that differs from the "web of documents" in common use today. However, the vast LOD still suffers from data quality problems concerning completeness, consistency, and accuracy. These problems call for effective methods to manage and retrieve information at varying levels of data quality. Based on a review of the literature, addressing data quality requires standards covering (1) identification of data quality problems, (2) assessment of data quality for a given context, and (3) correction of data quality problems. However, most existing methods and strategies for LOD data quality do not follow an integrative approach. Building on those standards and an integrative approach, there are opportunities to improve LOD data quality with respect to incompleteness, inaccuracy, and inconsistency by taking its schema and ontology into account, namely through ontology refinement. Ontology refinement in this sense serves not only to improve data quality but also to enrich the LOD. It therefore requires (1) a standard for data quality assessment and evaluation that is better suited to the LOD, and (2) a framework of methods based on statistical relational learning that can improve the correction of data quality problems as well as enrich the LOD.
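As an illustration of the assessment step mentioned in the abstract, the sketch below estimates the completeness of a single property for a class in a public LOD dataset. It is a minimal example and not the method proposed by the paper: the DBpedia endpoint, the dbo:Person and dbo:birthDate choices, and the SPARQLWrapper library are assumptions made purely for illustration.

# Minimal completeness-assessment sketch (assumes DBpedia and the SPARQLWrapper package).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://dbpedia.org/sparql"  # public DBpedia SPARQL endpoint (assumed)

def property_completeness(cls_uri: str, prop_uri: str) -> float:
    """Return the fraction of instances of cls_uri that have at least one prop_uri value."""
    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setReturnFormat(JSON)
    # Count all instances, and separately count instances with the property bound.
    # Large classes may hit endpoint timeouts; a real assessment would sample instead.
    sparql.setQuery(f"""
        SELECT (COUNT(DISTINCT ?s) AS ?total) (COUNT(DISTINCT ?withProp) AS ?covered)
        WHERE {{
            ?s a <{cls_uri}> .
            OPTIONAL {{ ?s <{prop_uri}> ?v . BIND(?s AS ?withProp) }}
        }}
    """)
    row = sparql.query().convert()["results"]["bindings"][0]
    total = int(row["total"]["value"])
    covered = int(row["covered"]["value"])
    return covered / total if total else 0.0

if __name__ == "__main__":
    ratio = property_completeness(
        "http://dbpedia.org/ontology/Person",
        "http://dbpedia.org/ontology/birthDate",
    )
    print(f"birthDate completeness for dbo:Person: {ratio:.2%}")

A low ratio would flag an incompleteness problem for that property, which the correction step (e.g., a statistical relational learning model predicting missing values) would then address.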

Citation (APA)

Hadhiatma, A. (2018). Improving data quality in the linked open data: A survey. In Journal of Physics: Conference Series (Vol. 978). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/978/1/012026
