Towards a generic infrastructure for sustainable management of quality controlled primary data

Abstract

Primary data in scientific research are currently collected in numerous repositories. Frequently, these repositories have not been designed to support the long-term evolution of data, processes, and tools. Furthermore, many repositories are set up for the specific needs of a single research project and are no longer maintained once the project ends. Finally, quality control and data provenance are not addressed sufficiently. Based on the experiences gained in a joint project with biologists in the domain of biodiversity informatics, we propose a generic infrastructure for the sustainable management of quality controlled primary data. The infrastructure encompasses both project and institutional repositories and provides a process for migrating project data into institutional repositories. Evolution and adaptability are supported through a generic approach with respect to underlying data schemas, processes, and tools. Specific emphasis is placed on quality assurance and data provenance. © 2010 Springer-Verlag Berlin Heidelberg.
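
To make the abstract's architecture more concrete, the sketch below models a quality-controlled primary data record with attached provenance events and a migration step that only admits records into an institutional repository after they pass quality control. This is a minimal illustration under stated assumptions: all class names, fields, and the migration rule are hypothetical and are not the schema or process defined in the paper; the schema-agnostic payload dictionary merely stands in for the generic treatment of underlying data schemas.

# Hypothetical sketch, not the authors' implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List


@dataclass
class ProvenanceEvent:
    """One step in a record's history: who did what, and when."""
    actor: str
    action: str  # e.g. "collected", "validated", "migrated"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


@dataclass
class PrimaryDataRecord:
    """A primary observation plus its quality and provenance metadata."""
    record_id: str
    payload: Dict[str, str]  # schema-agnostic key/value data
    quality_checked: bool = False
    provenance: List[ProvenanceEvent] = field(default_factory=list)


def migrate_to_institutional_repository(record: PrimaryDataRecord,
                                        institutional_repo: list) -> None:
    """Accept a record into the long-term repository only after quality
    control, appending a provenance event documenting the migration."""
    if not record.quality_checked:
        raise ValueError(
            f"record {record.record_id} has not passed quality control"
        )
    record.provenance.append(
        ProvenanceEvent(actor="archive-service", action="migrated")
    )
    institutional_repo.append(record)

Under these assumptions, a project repository would hand records to migrate_to_institutional_repository once curators mark them as quality checked, so the institutional repository only ever holds validated data with a traceable history.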

Citation (APA)

Buchmann, T., Jablonski, S., Volz, B., & Westfechtel, B. (2010). Towards a generic infrastructure for sustainable management of quality controlled primary data. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6428 LNCS, pp. 130–138). https://doi.org/10.1007/978-3-642-16961-8_29
