When developing multi-layer neural networks (MLNNs), determining an appropriate network size can be computationally intensive. Cascade correlation algorithms such as CasPer attempt to address this; however, the associated research often relies on artificially constructed data, and few papers compare their effectiveness against standard MLNNs. This paper takes the ANUstressDB database and applies a genetic-algorithm autoencoder to reduce the number of features. The efficiency and accuracy of CasPer on this dataset are then compared with those of CasCor, an MLNN, KNN, and SVM. Results indicate that the training time of CasPer was much lower than that of the MLNNs, at a small cost in prediction accuracy. CasPer also matched the training efficiency of simple algorithms such as SVM while achieving higher predictive ability. This suggests CasPer is a good choice for difficult problems that require short training times. Furthermore, the cascading construction of the network makes it better suited to fitting unknown problems, while remaining almost as accurate as standard MLNNs.
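The kind of time-versus-accuracy comparison the abstract describes can be sketched with scikit-learn for the baseline classifiers it names (KNN, SVM, MLNN). This is an illustrative outline only: it uses synthetic data in place of ANUstressDB, which is not publicly bundled, and omits CasPer and CasCor, which have no scikit-learn implementation.

```python
# Hedged sketch: time the training of each baseline classifier and
# record its held-out accuracy, mirroring the paper's comparison axes.
# Synthetic data stands in for ANUstressDB; CasPer/CasCor are omitted.
import time

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Placeholder dataset (binary stress/no-stress labels are assumed).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

results = {}
for name, clf in [
    ("KNN", KNeighborsClassifier()),
    ("SVM", SVC()),
    ("MLNN", MLPClassifier(max_iter=500, random_state=0)),
]:
    start = time.perf_counter()
    clf.fit(X_tr, y_tr)               # training time is the quantity compared
    elapsed = time.perf_counter() - start
    results[name] = (elapsed, clf.score(X_te, y_te))

for name, (elapsed, acc) in results.items():
    print(f"{name}: trained in {elapsed:.3f}s, accuracy {acc:.2f}")
```

On real data, the feature-reduction step (the genetic-algorithm autoencoder) would precede this loop, so every classifier sees the same reduced feature set.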
CITATION STYLE
Sekoranja, J. M. H. (2019). A comparison of CasPer against other ML techniques for stress recognition. In Communications in Computer and Information Science (Vol. 1142 CCIS, pp. 707–714). Springer. https://doi.org/10.1007/978-3-030-36808-1_77