Supervised learning on molecules has great potential to be useful in chemistry, drug discovery, and materials science. Fortunately, several promising and closely related neural network models invariant to molecular symmetries have already been described in the literature. These models learn a message passing algorithm and an aggregation procedure to compute a function of their entire input graph. In this chapter, we describe a common framework for learning representations on graph data, called message passing neural networks (MPNNs), and show how several prior neural network models for graph data fit into this framework. This chapter overlaps substantially with Gilmer et al. (International Conference on Machine Learning, pp. 1263–1272, 2017) and has been modified to highlight more recent extensions to the MPNN framework.
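The message passing and aggregation procedure mentioned above can be sketched in a few lines. The sketch below is illustrative only: the concatenation-based message function, sum aggregation, and ReLU update are hypothetical stand-ins for the learned functions, not the specific variants studied in the chapter.

```python
import numpy as np

def message(h_w, e_vw):
    # Illustrative message function: concatenate the neighbor's hidden
    # state with the feature vector of the connecting edge.
    return np.concatenate([h_w, e_vw])

def message_passing_step(h, neighbors, e, W):
    """One round of message passing: every node v sums the messages from
    its neighbors, then updates its state with a linear map + ReLU
    (a stand-in for the learned update function)."""
    new_h = {}
    for v in h:
        m_v = sum(message(h[w], e[(v, w)]) for w in neighbors[v])
        new_h[v] = np.maximum(0.0, W @ m_v)
    return new_h

# Toy graph: two nodes joined by a single undirected edge.
h = {0: np.ones(4), 1: np.ones(4)}          # 4-dim node states
neighbors = {0: [1], 1: [0]}
e = {(0, 1): np.ones(2), (1, 0): np.ones(2)}  # 2-dim edge features
W = np.eye(4, 6)  # maps 6-dim messages back to 4-dim node states

h = message_passing_step(h, neighbors, e, W)
```

After a fixed number of such rounds, a readout function aggregates the final node states into a single graph-level prediction; because both the sum over neighbors and the readout are permutation-invariant, the whole computation respects molecular symmetries.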
Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., & Dahl, G. E. (2020). Message Passing Neural Networks. In Lecture Notes in Physics (Vol. 968, pp. 199–214). Springer. https://doi.org/10.1007/978-3-030-40245-7_10