
Towards a generic infrastructure for sustainable management of quality controlled primary data

by Thomas Buchmann, Stefan Jablonski, Bernhard Volz, Bernhard Westfechtel
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract

Primary data in scientific research are currently collected in numerous repositories. Frequently, these repositories have not been designed to support the long-term evolution of data, processes, and tools. Furthermore, in many cases repositories have been set up for the specific needs of a single research project and are no longer maintained once the project ends. Finally, quality control and data provenance are not addressed to a sufficient extent. Based on the experience gained in a joint project with biologists in the domain of biodiversity informatics, we propose a generic infrastructure for the sustainable management of quality-controlled primary data. The infrastructure encompasses both project and institutional repositories and provides a process for migrating project data into institutional repositories. Evolution and adaptability are supported through a generic approach with respect to the underlying data schemas, processes, and tools. Specific emphasis is placed on quality assurance and data provenance.
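The abstract only outlines the architecture, so the following minimal Python sketch is offered purely as an illustration of the ideas it names: a schema-agnostic record, a quality-control status, provenance entries, and a gated migration step from a project repository into an institutional one. None of the names (DataRecord, ProvenanceEntry, migrate, the quality_status values) come from the paper; they are assumptions for illustration only.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class ProvenanceEntry:
    """Records who acted on a data record, what they did, and when."""
    actor: str
    action: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class DataRecord:
    """A schema-agnostic primary-data record: attributes live in a generic
    key/value mapping so the underlying schema can evolve over time."""
    record_id: str
    attributes: dict[str, Any]
    quality_status: str = "unchecked"   # assumed states: "unchecked", "reviewed", "approved"
    provenance: list[ProvenanceEntry] = field(default_factory=list)

def migrate(record: DataRecord, reviewer: str) -> DataRecord:
    """Move a record from a project repository into an institutional one,
    but only after quality control has approved it; the migration itself
    is logged as a provenance entry."""
    if record.quality_status != "approved":
        raise ValueError(f"record {record.record_id} has not passed quality control")
    record.provenance.append(
        ProvenanceEntry(actor=reviewer, action="migrated to institutional repository")
    )
    return record

Under these assumptions, each record carries its own audit trail across the project-to-institutional migration that the abstract describes, and quality control acts as the gate for that migration.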


Readership Statistics

10 Readers on Mendeley

By academic status: 40% Ph.D. Student, 10% Librarian, 10% Student (Master)
By country: 20% United States, 20% Denmark, 10% Switzerland
