PyTorch implementation for Graph Contrastive Learning with Augmentations [poster] [appendix]
Yuning You*, Tianlong Chen*, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
In NeurIPS 2020.
For the automated version of GraphCL, please refer to https://github.com/Shen-Lab/GraphCL_Automated.
For the extension of GraphCL to hypergraphs, please refer to https://github.com/weitianxin/HyperGCL.
For the most comprehensive collection of graph SSL papers, please refer to https://github.com/ChandlerBang/awesome-self-supervised-gnn.
In this repository, we develop contrastive learning with augmentations for GNN pre-training (GraphCL, Figure 1) to address the challenge of data heterogeneity in graphs. A systematic study (Figure 2) assesses how contrasting different augmentations performs on various types of datasets; a minimal sketch of the contrastive objective is given after the list below.
- The Role of Data Augmentation
- Semi-supervised learning [TU Datasets] [MNIST and CIFAR10]
- Unsupervised representation learning [TU Datasets] [Cora and Citeseer]
- Transfer learning [MoleculeNet and PPI]
- Adversarial robustness [Component Graphs]
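For orientation, here is a minimal sketch of a GraphCL-style pre-training step, not the repository's exact code: each graph is augmented twice, both views are encoded, and the two batches of embeddings are contrasted with an NT-Xent loss. The `encoder`, `augment`, and hyperparameters below are illustrative placeholders, not names from this repository.

```python
# Minimal sketch of a GraphCL-style pre-training step (illustrative only).
# `encoder` and `augment` stand in for the repository's GNN encoder and
# graph augmentations (node dropping, edge perturbation, attribute masking,
# subgraph sampling).
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.2):
    """NT-Xent loss between two batches of graph-level embeddings."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                 # (2n, d)
    sim = z @ z.t() / temperature                  # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))              # exclude self-pairs
    # positives: view i in z1 matches view i in z2 (offset by n), and vice versa
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def graphcl_step(encoder, augment, graphs, optimizer):
    """One pre-training step: augment twice, encode, contrast."""
    z1 = encoder(augment(graphs))                  # embeddings of view 1, shape (n, d)
    z2 = encoder(augment(graphs))                  # embeddings of view 2, shape (n, d)
    loss = nt_xent(z1, z2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

As in the paper, the contrastive loss is typically computed on the output of a projection head, while the encoder's representations are what get reused for downstream tasks.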
Some issues might occur due to version mismatches. They are collected below (kept updated).
- KeyError: 'num_nodes' in unsupervised_TU: Shen-Lab#36, Shen-Lab#41
- AttributeError: 'Data' object has no attribute 'cat_dim' in transferLearning_MoleculeNet_PPI: Shen-Lab#13
- Bugs in subgraph implementation: Shen-Lab#24
- Loss of negative values in transfer learning: Shen-Lab#50
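Since most of these issues trace back to version mismatches, a quick first step is to print the installed versions (a generic check, not a script from this repository):

```python
# Print the installed versions relevant to this repository.
import torch
import torch_geometric

print("torch:", torch.__version__)
print("torch_geometric:", torch_geometric.__version__)
```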
If you use this code for your research, please cite our paper.
@inproceedings{You2020GraphCL,
author = {You, Yuning and Chen, Tianlong and Sui, Yongduo and Chen, Ting and Wang, Zhangyang and Shen, Yang},
booktitle = {Advances in Neural Information Processing Systems},
editor = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
pages = {5812--5823},
publisher = {Curran Associates, Inc.},
title = {Graph Contrastive Learning with Augmentations},
url = {https://proceedings.neurips.cc/paper/2020/file/3fe230348e9a12c13120749e3f9fa4cd-Paper.pdf},
volume = {33},
year = {2020}
}