Ensemble models trained by supervised learning are effective in many fields, but they cannot tell us how to modify an input vector so that the objective variable increases above, or decreases below, a given threshold. In this paper, we propose TRANS-AM, a method that discovers an input vector satisfying such a condition on the objective variable in regression problems by exploiting a property of regression trees. A regression tree splits the input space into subspaces, and some of these subspaces have corresponding objective-variable values that satisfy the condition. By transforming the original input vector into a vector belonging to one of those subspaces, we can discover a new input vector that satisfies the condition while minimizing its distance from the original. Minimality matters because of cost: the farther the new input vector is from the original, the more expensive it is to modify the original into the new one. We evaluated the proposed method through numerical simulations and confirmed that it works well; for datasets generated with a logistic function, 60% of the discovered input vectors satisfy the condition.
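The core idea described above can be sketched in code. The following is a minimal illustration, not the authors' exact TRANS-AM algorithm: fit a regression tree, enumerate the axis-aligned box of each leaf whose predicted value meets the threshold, and project the original input onto the nearest qualifying box (all function names and the toy dataset here are assumptions for illustration).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def leaf_boxes(tree, n_features):
    """Return (lower, upper, value) for every leaf of a fitted regression tree."""
    t = tree.tree_
    boxes = []
    def walk(node, lo, hi):
        if t.children_left[node] == -1:  # -1 marks a leaf in sklearn's tree arrays
            boxes.append((lo, hi, t.value[node][0, 0]))
            return
        f, thr = t.feature[node], t.threshold[node]
        hi_left = hi.copy(); hi_left[f] = min(hi[f], thr)   # left child: x[f] <= thr
        walk(t.children_left[node], lo.copy(), hi_left)
        lo_right = lo.copy(); lo_right[f] = max(lo[f], thr)  # right child: x[f] > thr
        walk(t.children_right[node], lo_right, hi.copy())
    walk(0, np.full(n_features, -np.inf), np.full(n_features, np.inf))
    return boxes

def transform(x, tree, threshold, eps=1e-7):
    """Nearest point (L2) to x inside any leaf box whose value >= threshold."""
    best, best_d = None, np.inf
    for lo, hi, v in leaf_boxes(tree, x.size):
        if v < threshold:
            continue
        # Project x onto the box; the small eps keeps the point strictly
        # above lower bounds, which are exclusive in sklearn's splits.
        z = np.clip(x, lo + eps, hi)
        d = np.linalg.norm(z - x)
        if d < best_d:
            best, best_d = z, d
    return best

# Toy usage on a hypothetical dataset: y = x0 + x1 on the unit square.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (500, 2))
y = X.sum(axis=1)
reg = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X, y)
x_new = transform(np.array([0.1, 0.1]), reg, threshold=1.5)
```

Because the projected point lies inside a leaf whose value meets the threshold, the tree's prediction at `x_new` satisfies the condition by construction, and the box projection (a coordinate-wise clip) gives the minimum-distance point within that leaf.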
Tanaka, H., Suzuki, Y., Yoshino, K., & Nakamura, S. (2018). TRANS-AM: Discovery method of optimal input vectors corresponding to objective variables. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11031 LNCS, pp. 216–228). Springer Verlag. https://doi.org/10.1007/978-3-319-98539-8_17