A comparison of CasPer against other ML techniques for stress recognition

Abstract

When developing multi-layer neural networks (MLNNs), determining an appropriate network size can be computationally intensive. Cascade correlation algorithms such as CasPer attempt to address this; however, the associated research often uses artificially constructed data, and few papers compare their effectiveness with standard MLNNs. This paper takes the ANUstressDB database and applies a genetic-algorithm autoencoder to reduce the number of features. The efficiency and accuracy of CasPer on this dataset are then compared to CasCor, MLNN, KNN, and SVM. Results indicate that the training time for CasPer was much lower than for the MLNNs, at a small cost in prediction accuracy. CasPer also had training efficiency similar to simple algorithms such as SVM, yet higher predictive ability. This indicates that CasPer would be a good choice for difficult problems that require short training times. Furthermore, the cascading construction of the network makes it better at fitting unknown problems while remaining almost as accurate as standard MLNNs.
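The cascade-constructive idea behind CasCor and CasPer is to start with no hidden units and add frozen units one at a time, each trained to correlate with the current residual error, so the network size is grown rather than fixed in advance. The following is an illustrative stdlib-only toy on XOR, not the paper's setup: the random-search candidate pool, learning rate, and stopping thresholds are assumptions, and CasPer's actual Progressive RPROP training is replaced here by plain SGD on the output weights.

```python
import math
import random

random.seed(0)

# Toy XOR task: a purely linear model cannot fit it, so cascade growth is needed.
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
T = [-1.0, 1.0, 1.0, -1.0]

def features(x, hidden):
    """Inputs + bias, then each frozen hidden unit reads everything before it."""
    f = [x[0], x[1], 1.0]
    for w in hidden:
        f.append(math.tanh(sum(wi * fi for wi, fi in zip(w, f))))
    return f

def predict(out_w, hidden, x):
    f = features(x, hidden)
    return sum(wi * fi for wi, fi in zip(out_w, f))

def mse(out_w, hidden):
    return sum((predict(out_w, hidden, x) - t) ** 2 for x, t in zip(X, T)) / len(X)

def train_output(out_w, hidden, epochs=2000, lr=0.1):
    """Retrain only the linear output weights; hidden units stay frozen."""
    for _ in range(epochs):
        for x, t in zip(X, T):
            f = features(x, hidden)
            err = t - sum(wi * fi for wi, fi in zip(out_w, f))
            for i in range(len(out_w)):
                out_w[i] += lr * err * f[i]
    return out_w

def best_candidate(out_w, hidden, pool=500):
    """Crude stand-in for candidate training: random search for the unit
    whose output covaries most (in magnitude) with the residual error."""
    n_in = len(features(X[0], hidden))
    resid = [t - predict(out_w, hidden, x) for x, t in zip(X, T)]
    best, best_score = None, -1.0
    for _ in range(pool):
        w = [random.uniform(-3.0, 3.0) for _ in range(n_in)]
        outs = [math.tanh(sum(wi * fi for wi, fi in zip(w, features(x, hidden))))
                for x in X]
        m = sum(outs) / len(outs)
        score = abs(sum((o - m) * r for o, r in zip(outs, resid)))
        if score > best_score:
            best, best_score = w, score
    return best

hidden = []
out_w = train_output([0.0, 0.0, 0.0], hidden)
start = mse(out_w, hidden)          # error of the initial, unit-free network
while mse(out_w, hidden) > 0.05 and len(hidden) < 5:
    hidden.append(best_candidate(out_w, hidden))  # freeze the new unit
    out_w.append(0.0)                             # it starts with zero output weight
    out_w = train_output(out_w, hidden)

print(len(hidden), round(start, 3), round(mse(out_w, hidden), 3))
```

Because each new unit is frozen after creation and only the output weights are retrained, each growth step is cheap, which is the source of the short training times the abstract reports for CasPer.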

APA

Sekoranja, J. M. H. (2019). A comparison of CasPer against other ML techniques for stress recognition. In Communications in Computer and Information Science (Vol. 1142 CCIS, pp. 707–714). Springer. https://doi.org/10.1007/978-3-030-36808-1_77
