Ontology Quality Assurance with the Crowd


Abstract

The Semantic Web has the potential to change the Web as we know it. However, the community faces a significant challenge in managing, aggregating, and curating the massive amount of data and knowledge. Human computation is only beginning to serve an essential role in the curation of these Web-based data. Ontologies, which facilitate data integration and search, serve as a central component of the Semantic Web, but they are large, complex, and typically require extensive expert curation. Furthermore, ontology-engineering tasks require more knowledge than is required in a typical crowdsourcing task. We have developed ontology-engineering methods that leverage the crowd. In this work, we describe our general crowdsourcing workflow. We then highlight our work on applying this workflow to ontology verification and quality assurance. In a pilot study, this method approaches expert ability, finding the same errors that experts identified with 86% accuracy in a faster and more scalable fashion. The work provides a general framework with which to develop crowdsourcing methods for the Semantic Web. In addition, it highlights opportunities for future research in human computation and crowdsourcing.
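
To make the verification workflow concrete, the sketch below illustrates one plausible reading of it: ontology hierarchy assertions are phrased as yes/no microtask questions for crowd workers, and worker answers are combined by majority vote. This is a minimal illustration under that assumption; the function names, question wording, example concepts, and aggregation rule are ours, not the authors' published implementation.

    # Minimal sketch (hypothetical names): turn ontology subclass assertions
    # into yes/no crowd verification questions and aggregate worker votes.
    from collections import Counter

    def make_verification_task(child, parent):
        """Phrase a hierarchy assertion as a yes/no microtask question."""
        return f"Is every {child} also a {parent}?"

    def aggregate_votes(votes):
        """Majority vote over worker answers ('yes'/'no') for one task."""
        answer, _ = Counter(votes).most_common(1)[0]
        return answer

    # Example: verify two (child, parent) assertions drawn from an ontology.
    assertions = [("myocardial infarction", "disorder of heart"),
                  ("finger", "upper limb")]
    for child, parent in assertions:
        question = make_verification_task(child, parent)
        # Real answers would come from a crowdsourcing platform; simulated here.
        simulated_votes = ["yes", "yes", "no", "yes", "yes"]
        print(question, "->", aggregate_votes(simulated_votes))

In practice, the aggregated answers would be compared against assertions already in the ontology to flag candidate errors for expert review, which is the quality-assurance step the abstract evaluates against expert judgments.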

Citation (APA)
Mortensen, J. M., Musen, M. A., & Noy, N. F. (2013). Ontology Quality Assurance with the Crowd. In Proceedings of the 1st AAAI Conference on Human Computation and Crowdsourcing, HCOMP 2013 (pp. 54–55). AAAI Press. https://doi.org/10.1609/hcomp.v1i1.13111
