A Review of Statistical-Based Fault Detection and Diagnosis with Probabilistic Models

Abstract

As industrial processes grow increasingly complex, fault identification becomes challenging, and even minor errors can significantly impact both productivity and system safety. Fault detection and diagnosis (FDD) has emerged as a crucial strategy for managing this challenge, maintaining system reliability and safety through condition monitoring and abnormality recovery. Statistical-based FDD methods, which rely on large-scale process data and their features, have been developed for detecting faults. This paper reviews recent investigations and developments in statistical-based FDD methods, focusing on probabilistic models. The theoretical background of these models is presented, including Bayesian learning and maximum likelihood estimation. We then discuss various techniques and methodologies, e.g., probabilistic principal component analysis (PPCA), probabilistic partial least squares (PPLS), probabilistic independent component analysis (PICA), probabilistic canonical correlation analysis (PCCA), and probabilistic Fisher discriminant analysis (PFDA). Several test statistics are analyzed to evaluate the discussed methods. In industrial processes, these methods require complex matrix operations and incur a substantial computational load. Finally, we discuss the current challenges and future trends in FDD.
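To make the PCA-family monitoring scheme concrete, the following is a minimal, hedged sketch of PCA-based fault detection with a Hotelling T² test statistic, one of the test statistics commonly paired with (P)PCA-type models. The synthetic data, the 99th-percentile threshold, and the injected fault are illustrative assumptions, not the paper's experimental setup; scikit-learn's `PCA` is used as a stand-in (its `score_samples` is in fact derived from the probabilistic PCA model of Tipping and Bishop).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "normal operating" data: 500 samples of 5 correlated
# variables generated from a 2-dimensional latent source (assumption).
latent = rng.normal(size=(500, 2))
W = rng.normal(size=(2, 5))
X_train = latent @ W + 0.1 * rng.normal(size=(500, 5))

# Fit a 2-component PCA model on the normal data.
pca = PCA(n_components=2).fit(X_train)

def t2_statistic(X, model):
    """Hotelling T^2 in the retained subspace: sum_i t_i^2 / lambda_i,
    where t_i are the component scores and lambda_i the component variances."""
    scores = model.transform(X)
    return np.sum(scores**2 / model.explained_variance_, axis=1)

# Empirical control limit: 99th percentile of T^2 on normal data
# (an illustrative choice; chi-square or F-based limits are also common).
threshold = np.percentile(t2_statistic(X_train, pca), 99)

# Simulate a fault by shifting one process variable of a normal sample.
x_fault = X_train[:1].copy()
x_fault[0, 0] += 5.0
print("T^2 of faulty sample:", t2_statistic(x_fault, pca)[0],
      "limit:", threshold)
```

A sample whose T² exceeds the control limit is flagged as faulty; diagnosis methods then attribute the violation to specific process variables (e.g., via contribution analysis).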

Citation (APA)

Zhu, Y., Zhao, S., Zhang, Y., Zhang, C., & Wu, J. (2024, April 1). A Review of Statistical-Based Fault Detection and Diagnosis with Probabilistic Models. Symmetry. Multidisciplinary Digital Publishing Institute (MDPI). https://doi.org/10.3390/sym16040455
