
Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks

Open-source implementation of our TNNLS 2023 paper.

Abstract

  1. We propose a graph gradual pruning framework, named CGP, that reduces the training and inference cost of GNN models while preserving their accuracy (a sketch of the gradual-pruning idea follows this list).

  2. We comprehensively sparsify all elements of a GNN, including the graph structure, the node feature dimension, and the model parameters, to significantly improve the efficiency of GNN models.

  3. Experimental results on various GNN models and datasets consistently validate the effectiveness and efficiency of the proposed CGP.
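This repository contains the full CGP implementation. Purely as an illustration of what "gradual pruning" means, the sketch below applies generic magnitude pruning under the cubic sparsity schedule of Zhu & Gupta (2017); the schedule, the threshold criterion, and the training-loop usage are assumptions for illustration, not CGP's actual pruning rule.

```python
import torch

def sparsity_at(step, total_steps, s_init=0.0, s_final=0.9):
    """Cubic gradual-pruning schedule (Zhu & Gupta, 2017):
    sparsity ramps from s_init to s_final over total_steps."""
    frac = min(step / total_steps, 1.0)
    return s_final + (s_init - s_final) * (1.0 - frac) ** 3

def magnitude_mask(weight, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of entries."""
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

# Hypothetical usage inside a training loop:
# for step in range(total_steps):
#     mask = magnitude_mask(layer.weight.data, sparsity_at(step, total_steps))
#     layer.weight.data *= mask
```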

Python Dependencies

Our proposed CGP is implemented in Python 3.7, and the major libraries include:

  • PyTorch == 1.11.0+cu113
  • PyG (torch-geometric) == 2.2.0

More dependencies are provided in requirements.txt.
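A quick sanity check that the pinned versions are active (assumes both packages are already installed):

```python
import torch
import torch_geometric

print(torch.__version__)            # expected: 1.11.0+cu113
print(torch_geometric.__version__)  # expected: 2.2.0
```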

To Run

Once the requirements are installed, run:

```sh
sh xx.sh
```

Datasets

All datasets used in the paper can be downloaded automatically through PyG.
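For example, a Planetoid citation graph can be fetched in one call (Cora is used here purely as an illustration; the paper specifies the exact dataset list):

```python
from torch_geometric.datasets import Planetoid

# Downloads to ./data on first use.
dataset = Planetoid(root="data", name="Cora")
print(dataset[0])  # a torch_geometric.data.Data graph object
```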