graphdeeplearning/graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Adapts transformer attention to leverage graph neighborhood structure, and uses Laplacian eigenvectors for positional encoding instead of sinusoidal embeddings. Supports edge representations for tasks with rich relational information (molecular graphs, knowledge graphs). Built on the benchmarking-gnns framework, with batch normalization replacing layer normalization for improved graph learning.
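The Laplacian positional encoding mentioned above can be sketched in a few lines: node positions come from the smallest nontrivial eigenvectors of the symmetric normalized graph Laplacian, which play the role sinusoidal embeddings play in sequence transformers. This is a minimal NumPy illustration of the idea, not the repository's implementation; the function name and signature are hypothetical.

```python
import numpy as np

def laplacian_pe(adj: np.ndarray, k: int) -> np.ndarray:
    """Return the k smallest nontrivial Laplacian eigenvectors as node positional encodings.

    A sketch of Laplacian PE, assuming a dense symmetric adjacency matrix;
    names are illustrative, not the repo's API.
    """
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    # Skip the trivial first eigenvector (constant on a connected graph)
    return eigvecs[:, 1 : k + 1]
```

In practice the sign of each eigenvector is ambiguous, so the paper randomly flips signs during training; a sparse eigensolver would be used for large graphs.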
1,019 stars. No commits in the last 6 months.
Stars: 1,019
Forks: 150
Language: Python
License: MIT
Last pushed: Jul 27, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/graphdeeplearning/graphtransformer"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
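The same endpoint can be called from Python. This is a minimal sketch using only the standard library; the URL structure is taken from the curl command above, but the response fields and the authorization header name for keyed access are assumptions, not documented here.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository (matches the curl example)."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, api_key=None) -> dict:
    """Fetch quality data as a dict; the Bearer header for keyed access is an assumption."""
    req = urllib.request.Request(quality_url(category, owner, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")  # hypothetical header
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `fetch_quality("transformers", "graphdeeplearning", "graphtransformer")` requests the same URL as the curl command shown above.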
Related models
vijaydwivedi75/gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional...
snap-stanford/relgt
Relational Graph Transformer
SamsungSAILMontreal/nino
Code for "Accelerating Training with Neuron Interaction and Nowcasting Networks" [ICLR 2025]
SamsungSAILMontreal/ghn3
Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023]
chaitjo/gated-graph-transformers
Transformers are Graph Neural Networks!