Graph neural networks (GNNs) are a popular tool for learning lower-dimensional representations of graphs. They make machine learning tasks on graphs practical by incorporating domain-specific features. There are many possible choices for the underlying procedures (such as optimization functions, activation functions, etc.) used when implementing a GNN. However, most existing tools are confined to a single choice without any comparative analysis. As a result, this emerging field lacks robust implementations that account for the highly irregular structure of real-world graphs. In this paper, we attempt to fill this gap by studying alternative functions for each module on a diverse set of benchmark datasets. Our empirical results suggest that the commonly used techniques do not always capture the overall structure of a set of graphs well.
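To make the kind of module-level comparison described above concrete, the sketch below shows a single Graph Isomorphism Network (GIN) layer, following the standard GIN update h_v' = MLP((1 + ε)·h_v + Σ_{u∈N(v)} h_u), with the activation function passed in as a swappable argument. This is an illustrative PyTorch sketch only, not the paper's implementation: the class name `GINLayer`, the dense adjacency representation, and the specific activations compared (ReLU vs. Tanh) are assumptions made for the example.

```python
import torch
import torch.nn as nn


class GINLayer(nn.Module):
    """One GIN layer: h_v' = MLP((1 + eps) * h_v + sum of neighbour features.

    The activation used inside the MLP is a constructor argument so that
    alternatives (ReLU, Tanh, ELU, ...) can be compared side by side,
    mirroring the module-level sensitivity study described in the abstract.
    """

    def __init__(self, in_dim, out_dim, activation=nn.ReLU()):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))  # learnable epsilon
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, out_dim),
            activation,
            nn.Linear(out_dim, out_dim),
        )

    def forward(self, adj, h):
        # adj: dense (n, n) adjacency matrix; h: (n, in_dim) node features
        agg = adj @ h                            # sum over neighbours
        return self.mlp((1 + self.eps) * h + agg)


# Toy usage: a 4-node path graph, comparing two activation choices.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
h = torch.randn(4, 8)
for act in (nn.ReLU(), nn.Tanh()):
    layer = GINLayer(8, 16, activation=act)
    print(type(act).__name__, layer(adj, h).shape)
```

Swapping the `activation` argument (or, analogously, the optimizer used to train the layer) is one simple way to reproduce the style of comparison the paper reports across benchmark datasets.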
CITATION STYLE
Rahman, M. K. (2020). Training Sensitivity in Graph Isomorphism Network. In International Conference on Information and Knowledge Management, Proceedings (pp. 2181–2184). Association for Computing Machinery. https://doi.org/10.1145/3340531.3412089