# SGFormer: Simplified Graph Transformers
The official implementation of the NeurIPS 2023 paper "Simplifying and Empowering Transformers for Large-Graph Representations".
Related material: [[Paper](https://arxiv.org/pdf/2306.10759.pdf)], [[Blog](https://zhuanlan.zhihu.com/p/674548352)]
SGFormer is a graph encoder backbone that efficiently computes all-pair interactions with one-layer attentive propagation.
SGFormer is built upon our previous works on scalable graph Transformers with linear complexity [NodeFormer](https://github.com/qitianwu/NodeFormer) (NeurIPS22, spotlight) and [DIFFormer](https://github.com/qitianwu/DIFFormer) (ICLR23, spotlight).
## What's New
[2023.10.28] We released the code for the model on large-graph benchmarks. More detailed info will be updated soon.
[2023.12.20] We added more details on how to run the code.
## Model and Results
The model adopts a simple architecture, comprising a one-layer global attention module and a shallow GNN.
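As a rough illustration of why a single layer of global attention can scale to large graphs, the sketch below (a hypothetical NumPy function, not the repository's actual implementation, which differs in normalization and residual weighting) uses the associativity of matrix products to compute all-pair attention in time linear in the number of nodes:

```python
import numpy as np

def linear_global_attention(x, wq, wk, wv):
    """Sketch of one-layer all-pair attention in O(N * d^2) time.

    Instead of materializing the N x N similarity matrix (q @ k.T) @ v,
    we exploit associativity and compute q @ (k.T @ v), where k.T @ v
    is only d x d. Illustration only; see the repository for the
    actual SGFormer attention.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    # L2-normalize queries and keys so pairwise similarities are bounded.
    q = q / (np.linalg.norm(q, axis=1, keepdims=True) + 1e-8)
    k = k / (np.linalg.norm(k, axis=1, keepdims=True) + 1e-8)
    n = x.shape[0]
    kv = k.T @ v                                  # (d, d): aggregated key-value products
    numerator = v + q @ kv / n                    # residual plus averaged all-pair messages
    denominator = 1.0 + (q @ k.sum(axis=0, keepdims=True).T) / n  # (n, 1) normalizer
    return numerator / denominator
```

The key point is that the expensive N x N attention matrix is never formed, which is what makes a single global-attention layer tractable on large graphs.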
The following tables present the results for standard node classification tasks on medium-sized and large-sized graphs.
## Dataset
One can download the datasets (Planetoid, Deezer, Pokec, Actor/Film) from the Google Drive link below:
https://drive.google.com/drive/folders/1rr3kewCBUvIuVxA6MJ90wzQuF-NnCRtf?usp=drive_link
For Chameleon and Squirrel, we use the [new splits](https://github.com/yandex-research/heterophilous-graphs/tree/main) that filter out the overlapped nodes.
The OGB datasets are downloaded automatically when running the code.
## Run the code
Please refer to the bash script `run.sh` in each folder for running the training and evaluation pipeline.
## Citation
If you find our code and model useful, please cite our work. Thank you!
```bibtex
@inproceedings{wu2023sgformer,
title={Simplifying and Empowering Transformers for Large-Graph Representations},
author={Qitian Wu and Wentao Zhao and Chenxiao Yang and Hengrui Zhang and Fan Nie and Haitian Jiang and Yatao Bian and Junchi Yan},
booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
year={2023}
}
```