A comparative study of linear and nonlinear regression models for outlier detection
Main Authors:
Format: Conference or Workshop Item
Published: 2016
Subjects:
Online Access: https://doi.org/10.1007/978-3-319-51281-5_32
Summary: Artificial Neural Networks provide models for a large class of natural and artificial phenomena that are difficult to handle using classical parametric techniques. They offer a potential solution that fits all the data, including any outliers, instead of removing them. This paper compares the predictive performance of linear and nonlinear models in outlier detection. The best-subsets regression algorithm is used to select a minimal set of variables for the linear regression model, removing predictors that are irrelevant to the task to be learned. The ANN is then trained as a MultiLayer Perceptron to improve on the classification and prediction of the linear model, drawing on the standard nonlinear functions inherent in ANNs. The linear and nonlinear models were compared by analyzing their Receiver Operating Characteristic curves in terms of accuracy and misclassification rates. The linear and nonlinear models achieved accuracies of 68% and 93%, respectively, with the nonlinear model providing the better fit.
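The pipeline described in the abstract (best-subsets variable selection for a linear model, an MLP trained on the selected predictors, and a comparison of accuracy and misclassification rates) can be sketched roughly as below. This is a minimal illustration assuming scikit-learn, not the paper's own code: the dataset is synthetic, the network architecture is a guess, and logistic regression stands in for the paper's best-subsets linear regression so that ROC and accuracy metrics can be computed.

```python
# Minimal sketch, not the paper's code: best-subsets variable selection for a
# linear model, an MLP on the selected predictors, and an accuracy comparison.
# The data are synthetic; logistic regression stands in for the paper's
# best-subsets linear regression so ROC/accuracy can be computed.
from itertools import combinations

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the outlier-labelled data used in the study.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Best-subsets selection: exhaustively score every predictor subset with the
# linear model and keep the subset with the highest held-out ROC AUC.
best_auc, best_cols = -np.inf, None
for k in range(1, X_train.shape[1] + 1):
    for subset in combinations(range(X_train.shape[1]), k):
        cols = list(subset)
        lin = LogisticRegression(max_iter=1000).fit(X_train[:, cols], y_train)
        auc = roc_auc_score(y_test, lin.predict_proba(X_test[:, cols])[:, 1])
        if auc > best_auc:
            best_auc, best_cols = auc, cols

# Linear model restricted to the selected predictors.
lin = LogisticRegression(max_iter=1000).fit(X_train[:, best_cols], y_train)
lin_acc = accuracy_score(y_test, lin.predict(X_test[:, best_cols]))

# MultiLayer Perceptron on the same predictors (architecture is a guess).
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X_train[:, best_cols], y_train)
mlp_acc = accuracy_score(y_test, mlp.predict(X_test[:, best_cols]))

print(f"selected predictors: {best_cols}")
print(f"linear model:    accuracy {lin_acc:.2f}, misclassification {1 - lin_acc:.2f}")
print(f"nonlinear model: accuracy {mlp_acc:.2f}, misclassification {1 - mlp_acc:.2f}")
```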