Towards a generic infrastructure for sustainable management of quality controlled primary data

by Thomas Buchmann, Stefan Jablonski, Bernhard Volz, Bernhard Westfechtel
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2010


Primary data in scientific research are currently collected in numerous repositories. Frequently, these repositories have not been designed to support the long-term evolution of data, processes, and tools. Furthermore, in many cases repositories have been set up for the specific needs of a single research project and are no longer maintained once the project ends. Finally, quality control and data provenance are not addressed to a sufficient extent. Based on experiences gained in a joint project with biologists in the domain of biodiversity informatics, we propose a generic infrastructure for the sustainable management of quality-controlled primary data. The infrastructure encompasses both project and institutional repositories and provides a process for migrating project data into institutional repositories. Evolution and adaptability are supported through a generic approach with respect to the underlying data schemas, processes, and tools. Specific emphasis is placed on quality assurance and data provenance. © 2010 Springer-Verlag Berlin Heidelberg.
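To make the abstract's core ideas concrete, the following is a minimal sketch of what a quality-controlled primary-data record with attached provenance might look like. All names (`PrimaryDataRecord`, `ProvenanceEntry`, the status values, the migration check) are illustrative assumptions, not taken from the paper; the sketch only shows the general pattern of coupling quality status, provenance logging, and a migration step from a project repository into an institutional one.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch only -- names and structure are assumptions,
# not the paper's actual data model.

@dataclass
class ProvenanceEntry:
    actor: str      # who performed the step (field worker, curator, tool)
    action: str     # e.g. "collected", "validated", "migrated"
    timestamp: str  # ISO 8601 time of the step

@dataclass
class PrimaryDataRecord:
    record_id: str
    payload: dict                      # schema-generic observation data
    quality_status: str = "unchecked"  # "unchecked" -> "validated"
    provenance: list = field(default_factory=list)

    def log(self, actor: str, action: str) -> None:
        """Append a provenance entry documenting a processing step."""
        self.provenance.append(ProvenanceEntry(
            actor, action, datetime.now(timezone.utc).isoformat()))

    def migrate_to_institutional(self, curator: str) -> None:
        """Only quality-controlled data may leave the project repository."""
        if self.quality_status != "validated":
            raise ValueError("record has not passed quality control")
        self.log(curator, "migrated")

# Usage: collect, validate, then migrate into the institutional repository.
rec = PrimaryDataRecord("obs-001", {"species": "Quercus robur", "count": 3})
rec.log("field-worker", "collected")
rec.quality_status = "validated"
rec.log("curator", "validated")
rec.migrate_to_institutional("curator")
```

The design choice sketched here is that migration is gated on quality status, so the institutional repository receives only validated records, each carrying its full processing history.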


