Abstract
Perioperative critical adverse events degrade the quality of medical services and threaten patient safety, so assessing the risk of critical illness during the perioperative period with scientific methods is of great significance for improving care quality and ensuring patient safety. However, the diagnosis and treatment data of perioperative patients are multi-source and irregular, and a single kind of physiological information cannot accurately reflect a patient's condition. Previous studies have found that multiple kinds of physiological information together convey whether the human body is healthy, and can therefore be used to evaluate critical illness and physical condition. This paper therefore fuses preoperative clinical structured data, intraoperative vital-sign monitoring time series, and intraoperative anesthesia-event time series. Based on deep learning, the patient's multimodal data are embedded and mapped into the same latent semantic space to enable real-time tracking and early warning of critical events, reduce postoperative complications, and improve the efficiency of early diagnosis of critical adverse events. The results show that the model based on multimodal data outperforms models based on a single modality.
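The fusion scheme the abstract describes, mapping each modality into one shared latent space and scoring risk from the fused representation, can be illustrated with a minimal sketch. All dimensions, the mean-pooling of the time series, the additive fusion, and the logistic output head are illustrative assumptions, not the paper's actual architecture; the learned encoders are stood in for by random linear projections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): illustration only.
D_STRUCT, D_VITALS, D_EVENTS, D_LATENT = 16, 8, 12, 32
T = 60  # time steps of intraoperative monitoring

# Random projection matrices standing in for learned per-modality encoders.
W_s = rng.normal(size=(D_STRUCT, D_LATENT))
W_v = rng.normal(size=(D_VITALS, D_LATENT))
W_e = rng.normal(size=(D_EVENTS, D_LATENT))

# One patient's data: preoperative structured features, vital-sign
# time series, and anesthesia-event time series.
x_struct = rng.normal(size=(D_STRUCT,))
x_vitals = rng.normal(size=(T, D_VITALS))
x_events = rng.normal(size=(T, D_EVENTS))

# Map each modality into the same latent space; pool time series by mean.
z_s = x_struct @ W_s
z_v = (x_vitals @ W_v).mean(axis=0)
z_e = (x_events @ W_e).mean(axis=0)

# Fuse by summation (one simple choice) and score risk with a logistic head.
z = z_s + z_v + z_e
w_out = rng.normal(size=(D_LATENT,))
risk = 1.0 / (1.0 + np.exp(-(z @ w_out)))
print(f"early-warning risk score: {float(risk):.3f}")
```

In a real system the projections would be replaced by trained encoders (e.g. recurrent or attention-based models for the time series), and the pooled vectors could be concatenated rather than summed; the key point the abstract makes is that all three modalities end up in a common space before the risk score is computed.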
Chen, Y., Li, Y., Huang, W., Zhang, J., Yi, B., & Qin, X. (2021). Early-Warning of Peri-operative Critical Event Based on Multimodal Information Fusion. In ACM International Conference Proceeding Series (pp. 71–78). Association for Computing Machinery. https://doi.org/10.1145/3484377.3484389