This repository hosts the official implementation of the paper "Mitigating Emergent Robustness Degradation on Graphs while Scaling-up", as presented at ICLR 2024.
Our work addresses the challenge of emergent robustness degradation in graph neural networks as they scale up. This repository is structured to facilitate both reproduction of our results and application of our methods to new datasets and problems.
- Python 3.x
- Pip package manager
Clone the repository and install the required Python packages:
```bash
git clone https://github.com/chunhuizng/emergent-degradation.git
cd emergent-degradation
pip install -r requirements.txt
```
To train the model on a specific task, navigate to the appropriate directory and run the desired script. For example, to perform link prediction:
```bash
python training.py
```
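As a rough illustration of what link prediction involves (the function name, shapes, and scoring rule here are illustrative assumptions, not the repo's API), candidate edges can be scored by the dot product of their endpoint embeddings:

```python
import numpy as np

def link_scores(embeddings: np.ndarray, edges: np.ndarray) -> np.ndarray:
    """Score candidate edges by the dot product of their endpoint embeddings.

    Illustrative sketch only -- not the interface of training.py.
    """
    src, dst = edges[:, 0], edges[:, 1]
    return np.sum(embeddings[src] * embeddings[dst], axis=1)

# Toy example: 4 nodes with 2-dimensional embeddings.
emb = np.array([[1.0, 0.0],
                [1.0, 0.0],
                [0.0, 1.0],
                [0.0, 1.0]])
candidates = np.array([[0, 1], [0, 2]])
scores = link_scores(emb, candidates)
# Nodes 0 and 1 point in the same direction, so edge (0, 1) scores
# higher than (0, 2); a threshold or ranking turns scores into predictions.
```

In practice the embeddings would come from a trained GNN encoder rather than being fixed by hand.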
To perform adversarial training or to simulate attacks using the `moedp` model:

```bash
python adv_training.py
python pipeline*.py
```

Replace `adv_training.py` with `attack.py` to execute an attack.
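The scripts above implement the paper's actual setup; purely as a hedged sketch of the general idea behind gradient-based feature attacks (not the contents of `attack.py` — the function name and step rule here are illustrative), an FGSM-style perturbation moves node features in the direction that increases the loss:

```python
import numpy as np

def fgsm_perturb(features: np.ndarray, grad: np.ndarray,
                 epsilon: float = 0.1) -> np.ndarray:
    """FGSM-style step: shift each feature by epsilon in the sign of the
    loss gradient w.r.t. that feature. Illustrative sketch only."""
    return features + epsilon * np.sign(grad)

# Toy example: 3 nodes with 2-dimensional features and a made-up gradient.
feats = np.zeros((3, 2))
grad = np.array([[ 1.0, -2.0],
                 [ 0.0,  3.0],
                 [-1.0,  1.0]])
perturbed = fgsm_perturb(feats, grad, epsilon=0.1)
# Each entry moves by +/-0.1 (or stays put where the gradient is zero).
```

Adversarial training then feeds such perturbed inputs back into the training loop so the model learns to resist them.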
The repository follows the structure of GRB (the Graph Robustness Benchmark) to facilitate easy extensions and modifications. Feel free to adapt the code to your requirements and contribute back any useful changes.
Please cite our work if it helps your research:
```bibtex
@inproceedings{yuan2024mitigating,
  title={Mitigating Emergent Robustness Degradation on Graphs while Scaling-up},
  author={Xiangchi Yuan and Chunhui Zhang and Yijun Tian and Yanfang Ye and Chuxu Zhang},
  booktitle={International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=Koh0i2u8qX}
}
```
For any questions about the code or contributions you'd like to make, please open an issue or a pull request.
This project is licensed under the MIT License - see the LICENSE file for details.