Efficient Graph Convolutional Networks Update for Expanded Single Large Graph 


Vol. 14,  No. 12, pp. 1084-1090, Dec.  2025
10.3745/TKIPS.2025.14.12.1084


  Abstract

Graph Neural Networks (GNNs) are typically trained on static graphs whose structures remain unchanged. However, in real-world scenarios, graphs expand as new nodes and edges are added, requiring the model to be retrained on the entire graph to reflect these updates. The fine-tuning method, often used to mitigate the inefficiency of full retraining, also faces limitations when applied to GNNs: due to the message passing mechanism, gains in computational efficiency are limited, and performance on the original nodes often declines. To address these challenges, this paper proposes an efficient Graph Convolutional Networks (GCN) update method for expanded single large graphs. The proposed method maximizes computational efficiency by decomposing the message passing operation into two components: information pre-computed during pre-training and newly required computations. Furthermore, it alleviates performance degradation on the original nodes by setting all nodes in the expanded graph as training targets. Experimental results on real-world datasets demonstrate that the proposed method significantly reduces training time compared to full retraining and fine-tuning, while maintaining a performance level comparable to that of full retraining.
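The decomposition described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; it only shows the general idea for a single GCN aggregation step (Â X, with symmetric normalization): messages for rows whose normalized neighborhood is unchanged are reused from a cache, and only the rows affected by the new nodes and edges are recomputed. The function names and the caller-supplied `affected` set are illustrative assumptions.

```python
import numpy as np

def norm_adj(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def full_aggregate(A, X):
    """One GCN message-passing step over the whole graph (full recomputation)."""
    return norm_adj(A) @ X

def incremental_aggregate(A_new, X_new, cached, affected):
    """Reuse cached messages for unaffected original rows; recompute only
    the rows in `affected` (new nodes, plus original nodes whose normalized
    neighborhood changed -- note that with symmetric normalization this
    includes neighbors of any node whose degree changed)."""
    A_hat = norm_adj(A_new)
    n_old = cached.shape[0]
    out = np.zeros((A_new.shape[0], X_new.shape[1]))
    out[:n_old] = cached                 # pre-computed messages from pre-training
    rows = sorted(affected)              # rows requiring new computation
    out[rows] = A_hat[rows] @ X_new      # recompute only these rows
    return out
```

For example, on a 3-node path graph 0-1-2 expanded with a new node 3 attached to node 2, only rows 1, 2, and 3 need recomputation; row 0's normalized neighborhood is untouched, so its cached message remains valid.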



  Cite this article

[IEEE Style]

S. J. Yeon and L. K. Yong, "Efficient Graph Convolutional Networks Update for Expanded Single Large Graph," The Transactions of the Korea Information Processing Society, vol. 14, no. 12, pp. 1084-1090, 2025. DOI: 10.3745/TKIPS.2025.14.12.1084.

[ACM Style]

Song Jee Yeon and Lee Ki Yong. 2025. Efficient Graph Convolutional Networks Update for Expanded Single Large Graph. The Transactions of the Korea Information Processing Society, 14, 12, (2025), 1084-1090. DOI: 10.3745/TKIPS.2025.14.12.1084.