Bagging Technique Using Temporal Expansion Functions

Abstract

The Bootstrap aggregating (Bagging) technique is widely used in Machine Learning to reduce the prediction error of unstable predictors. The method trains many predictors on bootstrap samples and combines them, yielding a more powerful learning tool. However, if the training data has temporal dependency, the technique is not directly applicable. One of the most efficient models for treating time series is the Recurrent Neural Network (RNN). In this article, we use an RNN to encode the temporal dependency of the input data; in the new encoding space, the Bagging technique can then be applied. We analyze the behavior of various neural activation functions for encoding the input data. We use three simulated and three real time-series datasets to evaluate our approach. © Springer International Publishing Switzerland 2014.
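The following sketch illustrates the general idea described in the abstract, not the authors' actual method or code: a fixed random recurrent network expands a scalar time series into a state space (a "temporal expansion"), and bagging is then applied over bootstrap samples of the expanded states. The reservoir sizes, the ridge readout, and all function names are illustrative assumptions.

# Minimal sketch (assumed details, not the paper's implementation):
# an RNN-style expansion of a time series followed by plain bagging.
import numpy as np

rng = np.random.default_rng(0)

def expand(series, n_states=50, spectral_radius=0.9, activation=np.tanh):
    """Drive a fixed random RNN with the series and collect its states."""
    W_in = rng.uniform(-0.5, 0.5, size=n_states)
    W = rng.uniform(-0.5, 0.5, size=(n_states, n_states))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # stabilise the recurrence
    x = np.zeros(n_states)
    states = []
    for u in series:
        x = activation(W_in * u + W @ x)   # activation function under study
        states.append(x.copy())
    return np.array(states)

def bagged_readouts(X, y, n_models=20, ridge=1e-6):
    """Train linear readouts on bootstrap samples of the expanded states."""
    models = []
    n = len(y)
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)          # bootstrap sample
        Xb, yb = X[idx], y[idx]
        w = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(X.shape[1]), Xb.T @ yb)
        models.append(w)
    return models

def predict(models, X):
    """Average the ensemble's predictions (the bagging combination step)."""
    return np.mean([X @ w for w in models], axis=0)

# Toy usage on a simulated series: one-step-ahead prediction.
series = np.sin(0.1 * np.arange(500)) + 0.05 * rng.standard_normal(500)
X = expand(series[:-1])          # expanded inputs, temporal dependency encoded
y = series[1:]                   # next-value targets
ensemble = bagged_readouts(X, y)
print("train MSE:", np.mean((predict(ensemble, X) - y) ** 2))

Because the expansion absorbs the temporal dependency, each row of X can be resampled independently, which is what makes the standard bootstrap valid again in the encoding space.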

Citation (APA)

Basterrech, S., & Mesa, A. (2014). Bagging Technique Using Temporal Expansion Functions. In Advances in Intelligent Systems and Computing (Vol. 303, pp. 395–404). Springer Verlag. https://doi.org/10.1007/978-3-319-08156-4_39
