Manufacturing companies are increasingly exposed to volatile market conditions. In this environment, ensuring reliable adherence to promised delivery dates offers a considerable competitive advantage. However, due to dynamically changing production circumstances and high variety in production programs, manufacturing companies regularly fail to reach this logistical target. A main prerequisite for mastering this challenge is excellent Production Planning and Control (PPC) processes. The quality of transactional data from production processes is a commonly ignored root cause of inadequate detailed scheduling plans, although a vast volume of these data is used for updating production job statuses and short-term production plans, deriving conclusions for immediate control interventions, and monitoring production efficiency. Typically, measures for improving data quality involve implementing integrity constraints in databases and setting up data quality processes as well as dedicated organizational structures. Evidently, these classic approaches do not successfully prevent manufacturing companies from having to deal with inadequate data quality in their PPC processes. Consequently, this paper presents a model for increasing the quality of data relevant to production processes by adapting data mining algorithms. This new approach makes it possible to estimate probable values for typical data inconsistencies in transactional data of PPC processes. Several adapted algorithms are benchmarked on real-world data sets from German mid-sized manufacturing companies and evaluated with respect to their predictive power and efficiency.
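To illustrate the general idea of estimating probable values for inconsistent transactional records, the following minimal sketch applies a simple nearest-neighbour estimation to hypothetical production feedback data. The record layout, field names, and distance measure are assumptions for illustration only and do not reflect the specific algorithms adapted in the paper.

```python
from math import sqrt

# Hypothetical production feedback records: (setup_time_min, run_time_min, quantity).
# One record has a missing run time (None), a typical transactional inconsistency.
records = [
    (30, 120, 100),
    (28, 115, 95),
    (35, 140, 120),
    (32, 130, 110),
    (29, None, 98),  # inconsistent entry: run time not reported
]

def estimate_missing_run_time(records, target, k=3):
    """Estimate a probable run time for `target` as the mean run time of the
    k most similar complete records (Euclidean distance over setup time and
    quantity). A stand-in for the adapted data mining algorithms."""
    complete = [r for r in records if r[1] is not None]

    def dist(r):
        return sqrt((r[0] - target[0]) ** 2 + (r[2] - target[2]) ** 2)

    neighbours = sorted(complete, key=dist)[:k]
    return sum(r[1] for r in neighbours) / len(neighbours)

estimate = estimate_missing_run_time(records, records[4])
print(f"Estimated run time: {estimate:.1f} min")
```

In practice, such an estimate would be compared against the recorded (possibly implausible) value to flag or correct the inconsistency before the data feed into detailed scheduling.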
Reuter, C., Brambring, F., Weirich, J., & Kleines, A. (2016). Improving Data Consistency in Production Control by Adaptation of Data Mining Algorithms. In Procedia CIRP (Vol. 56, pp. 545–550). Elsevier B.V. https://doi.org/10.1016/j.procir.2016.10.107