Ppormer: A PPR-based Diffusion Transformer for Effective Node Representation Learning on Weighted Graphs 


Vol. 14, No. 5, pp. 372-378, May 2025
https://doi.org/10.3745/TKIPS.2025.14.5.372


  Abstract

This paper proposes Ppormer (PPR-based Diffusion Transformer), a novel graph neural network designed to improve node representation learning on weighted graphs. Ppormer incorporates three complementary types of information into the node representation process: (1) global contextual information, captured via diffusion-based self-attention that models semantic relationships across the entire graph; (2) local structural information, obtained through Graph Convolutional Network (GCN)-based message passing over immediate neighbors; and (3) global structural information, derived from Personalized PageRank (PPR) to reflect long-range topological relevance based on edge strength. These heterogeneous signals are adaptively fused by the proposed FusionAttention module. Experiments on the Cora dataset, with edge weights computed using Jaccard, Canberra, and Euclidean similarities, show that Ppormer achieves accuracy scores of 84.00%, 83.94%, and 85.54%, respectively, outperforming all baseline models. These results validate the effectiveness and generalizability of Ppormer across various weighted-graph scenarios.
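The abstract names two computable ingredients: PPR scores over a weighted graph and an attention-based fusion of several node-representation views. The sketch below illustrates both in plain NumPy; it is not the authors' implementation. The power-iteration PPR follows the standard formulation, and `fusion_attention` is a hypothetical per-node softmax gate standing in for the paper's FusionAttention module, whose exact design the abstract does not specify (the scoring vector `w` would be learned in practice).

```python
import numpy as np

def personalized_pagerank(W, alpha=0.15, iters=50):
    """Dense PPR via power iteration on a weighted adjacency matrix W (n x n).

    Row i of the result holds the PPR scores personalized to node i,
    so stronger edges pull more probability mass along their paths.
    """
    n = W.shape[0]
    deg = W.sum(axis=1, keepdims=True)
    # Row-normalize W into a transition matrix (zero rows stay zero).
    P = np.divide(W, deg, out=np.zeros_like(W, dtype=float), where=deg > 0)
    Pi = np.full((n, n), 1.0 / n)          # start from the uniform distribution
    restart = np.eye(n)                     # each node restarts to itself
    for _ in range(iters):
        Pi = alpha * restart + (1 - alpha) * (Pi @ P)
    return Pi

def fusion_attention(views, w):
    """Hypothetical adaptive fusion of k node-representation views.

    views: list of k (n x d) matrices (e.g. diffusion-attention, GCN, PPR views);
    w: (d,) scoring vector. Each node gets a softmax weight per view.
    """
    scores = np.stack([v @ w for v in views], axis=0)   # (k, n)
    a = np.exp(scores - scores.max(axis=0))             # stable softmax over views
    a = a / a.sum(axis=0)
    return sum(a[k][:, None] * v for k, v in enumerate(views))
```

For example, on a small weighted path graph, each row of `personalized_pagerank(W)` is a valid probability distribution concentrated near the personalization node, and `fusion_attention` returns an (n x d) matrix whose rows are convex combinations of the corresponding rows of the input views.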

  Cite this article

[IEEE Style]

S. Park and K. Y. Lee, "Ppormer: A PPR-based Diffusion Transformer for Effective Node Representation Learning on Weighted Graphs," The Transactions of the Korea Information Processing Society, vol. 14, no. 5, pp. 372-378, 2025. DOI: https://doi.org/10.3745/TKIPS.2025.14.5.372.

[ACM Style]

Siyeon Park and Ki Yong Lee. 2025. Ppormer: A PPR-based Diffusion Transformer for Effective Node Representation Learning on Weighted Graphs. The Transactions of the Korea Information Processing Society, 14, 5 (2025), 372-378. DOI: https://doi.org/10.3745/TKIPS.2025.14.5.372.