Attention based LSTM with multi tasks learning for predictive process monitoring


Abstract

Today, Deep Learning (DL) is one of the fastest-growing techniques in artificial intelligence research because of its power to learn features that provide a higher level of abstraction over raw attributes, and related work on Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks has shown exemplary results in neural machine translation, neural image caption generation, natural language processing (NLP), and other areas. Our research focuses on predictive process monitoring (PPM), which predicts business behaviour from historical event logs in order to detect potential problems and facilitate proactive management. In recent research, LSTM networks have gained attention in PPM and have been shown to substantially improve prediction accuracy. From the literature, we have learned that PPM resembles an early sequence classification problem in NLP, and in recent DL-based NLP work the attention mechanism is commonly embedded in neural networks. Inspired by these results, this paper is the first to propose an attention-based LSTM with multi-task learning for PPM.
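The architecture the abstract names can be sketched roughly as follows: an LSTM encodes the prefix of a running case (a sequence of activity labels), an attention layer pools the hidden states into a context vector, and two task heads share that context, one classifying the next activity and one regressing the remaining time. This is a toy numpy illustration under assumed dimensions and an assumed additive-attention form, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyAttentionLSTM:
    """Hypothetical minimal model: LSTM encoder + additive attention
    + two task heads (next-activity classification, remaining-time regression)."""

    def __init__(self, n_acts, d_emb=8, d_hid=16):
        s = 0.1
        self.E = rng.normal(0, s, (n_acts, d_emb))             # activity embeddings
        self.W = rng.normal(0, s, (4 * d_hid, d_emb + d_hid))  # fused LSTM gate weights
        self.b = np.zeros(4 * d_hid)
        self.Wa = rng.normal(0, s, (d_hid, d_hid))             # attention projection
        self.va = rng.normal(0, s, d_hid)                      # attention scoring vector
        self.Wc = rng.normal(0, s, (n_acts, d_hid))            # head 1: next activity
        self.wr = rng.normal(0, s, d_hid)                      # head 2: remaining time
        self.d_hid = d_hid

    def forward(self, trace):
        d = self.d_hid
        h, c, hs = np.zeros(d), np.zeros(d), []
        for a in trace:                          # encode the prefix event by event
            z = self.W @ np.concatenate([self.E[a], h]) + self.b
            i, f = sigmoid(z[:d]), sigmoid(z[d:2 * d])
            o, g = sigmoid(z[2 * d:3 * d]), np.tanh(z[3 * d:])
            c = f * c + i * g
            h = o * np.tanh(c)
            hs.append(h)
        H = np.stack(hs)                         # (T, d_hid) hidden states
        scores = np.tanh(H @ self.Wa.T) @ self.va
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                     # softmax attention weights over events
        ctx = alpha @ H                          # attention-weighted context vector
        logits = self.Wc @ ctx                   # task 1: next-activity distribution
        p = np.exp(logits - logits.max())
        p /= p.sum()
        t_rem = float(self.wr @ ctx)             # task 2: remaining time (unscaled)
        return p, t_rem, alpha

model = TinyAttentionLSTM(n_acts=5)
probs, t_rem, alpha = model.forward([0, 2, 1, 3])  # a 4-event prefix
```

In a multi-task training setup the two heads would be optimized jointly, e.g. cross-entropy for the next activity plus a weighted regression loss for the remaining time; the shared encoder and attention layer are what the two tasks learn together.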

Citation (APA)

Hnin, T., & Oo, K. K. (2019). Attention based LSTM with multi tasks learning for predictive process monitoring. In Proceedings of 2019 the 9th International Workshop on Computer Science and Engineering, WCSE 2019 SPRING (pp. 165–170). International Workshop on Computer Science and Engineering (WCSE). https://doi.org/10.18178/wcse.2019.03.028
