A comparative study of linear and nonlinear regression models for outlier detection

Abstract

Artificial Neural Networks (ANNs) provide models for a large class of natural and artificial phenomena that are difficult to handle with classical parametric techniques. They offer a potential way to fit all of the data, including any outliers, rather than removing them. This paper compares the predictive performance of linear and nonlinear models for outlier detection. The best-subsets regression algorithm is used to select a minimal set of variables for the linear regression model, removing predictors that are irrelevant to the task to be learned. An ANN, implemented as a Multi-Layer Perceptron, is then trained to improve on the classification and prediction of the linear model using the nonlinear activation functions inherent in ANNs. The linear and nonlinear models were compared by analyzing their Receiver Operating Characteristic (ROC) curves in terms of accuracy and misclassification rates. The linear and nonlinear models achieved accuracies of 68% and 93%, respectively, indicating a better fit for the nonlinear model.
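For illustration only, the pipeline described in the abstract can be sketched roughly as follows. This is a minimal sketch assuming scikit-learn and a synthetic dataset with a binary "outlier" label; the logistic-regression baseline, the cross-validated AUC selection criterion, the MLP settings, and all names are assumptions for the example, not the authors' implementation.

# Minimal sketch only - not the authors' code. Assumes scikit-learn, synthetic
# data, and logistic regression as the linear baseline; the paper's own data,
# linear model, and best-subsets criterion may differ.
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: y = 1 flags an "outlier" observation.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Best-subsets search: evaluate every predictor subset with the linear model
# and keep the subset with the highest cross-validated AUC (this criterion is
# an assumption standing in for the paper's best-subsets regression criterion).
best_score, best_cols = -np.inf, None
for k in range(1, X_train.shape[1] + 1):
    for subset in combinations(range(X_train.shape[1]), k):
        cols = list(subset)
        lin = LogisticRegression(max_iter=1000)
        score = cross_val_score(lin, X_train[:, cols], y_train,
                                scoring="roc_auc", cv=3).mean()
        if score > best_score:
            best_score, best_cols = score, cols

# Fit the linear baseline and an MLP on the selected predictors, then compare
# accuracy / misclassification and ROC AUC on held-out data.
lin = LogisticRegression(max_iter=1000).fit(X_train[:, best_cols], y_train)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_train[:, best_cols], y_train)
for name, model in [("linear", lin), ("MLP", mlp)]:
    acc = accuracy_score(y_test, model.predict(X_test[:, best_cols]))
    auc = roc_auc_score(y_test, model.predict_proba(X_test[:, best_cols])[:, 1])
    print(f"{name}: accuracy={acc:.2f}, misclassification={1 - acc:.2f}, AUC={auc:.2f}")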

Citation (APA)

Dalatu, P. I., Fitrianto, A., & Mustapha, A. (2017). A comparative study of linear and nonlinear regression models for outlier detection. In Advances in Intelligent Systems and Computing (Vol. 549 AISC, pp. 316–326). Springer Verlag. https://doi.org/10.1007/978-3-319-51281-5_32
