A time-constrained algorithm for integration testing in a data warehouse environment


Abstract

A data warehouse should be tested for data quality on a regular basis, preferably as part of each ETL cycle. That way, a certain degree of confidence in the data warehouse reports can be achieved, and potential data errors are more likely to be corrected in a timely manner. In this paper, we present an algorithm primarily intended for integration testing in the data warehouse environment, though it is more widely applicable. It is a generic, time-constrained, metadata-driven algorithm that compares large database tables in order to attain the best global overview of the data sets' differences within a given time frame. When there is not enough time available, the algorithm produces coarse, less precise estimates of all data set differences; if allowed enough time, it pinpoints the exact differences. This paper presents the algorithm in detail, evaluates it on data from a real project and on the TPC-H data set, and comments on its usability. The tests show that the algorithm outperforms the relational engine when the percentage of differences in the database is relatively small, which is typical for data warehouse ETL environments.
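The abstract describes a comparison that degrades gracefully under a time budget: coarse difference estimates when time runs out, exact row differences when time permits. The following is only a minimal illustrative sketch of that general idea, not the authors' published algorithm; it assumes the two tables are materialized as aligned, key-ordered row lists, and the block-hashing scheme and function names are invented for illustration.

```python
import hashlib
import time

def block_hash(rows):
    # Hash a contiguous block of rows (illustrative representation-based hash).
    h = hashlib.md5()
    for r in rows:
        h.update(repr(r).encode())
    return h.hexdigest()

def timed_diff(source, target, deadline, lo=0, hi=None, report=None):
    """Compare aligned row lists block by block. While the deadline has not
    passed, recursively split mismatching blocks to pinpoint exact rows;
    once time runs out, report the current coarse range instead."""
    if hi is None:
        hi = max(len(source), len(target))
    if report is None:
        report = []
    if block_hash(source[lo:hi]) == block_hash(target[lo:hi]):
        return report  # blocks identical, nothing to report
    if hi - lo <= 1 or time.monotonic() > deadline:
        report.append((lo, hi))  # exact row, or coarse estimate if out of time
        return report
    mid = (lo + hi) // 2
    timed_diff(source, target, deadline, lo, mid, report)
    timed_diff(source, target, deadline, mid, hi, report)
    return report

# With an ample budget, a single corrupted row is isolated exactly:
src = [("k%d" % i, i) for i in range(8)]
tgt = list(src)
tgt[5] = ("k5", -1)
print(timed_diff(src, tgt, time.monotonic() + 5.0))
```

Passing an already-expired deadline instead yields one wide range covering the whole mismatching block, which mirrors the coarse-when-rushed, exact-when-allowed behavior the abstract attributes to the algorithm.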

Citation (APA)
Brkić, L., & Mekterović, I. (2018). A time-constrained algorithm for integration testing in a data warehouse environment. Information Technology and Control, 47(1), 5–25. https://doi.org/10.5755/j01.itc.47.1.18171
