Large-scale Knowledge Graph Construction with Prompting across tasks (predictive and generative) and modalities (language, image, vision + language, etc.)
GenKGC: link prediction as sequence-to-sequence generation for fast inference
KG-Prompt: data-efficient prompt learning-based knowledge graph completion
- [Model Release] January, 2022: GenKGC - A sequence-to-sequence approach for knowledge graph completion.
- [Model Release] January, 2022: KG-Prompt - A prompt learning-based approach for few-shot knowledge graph completion.
***** January, 2022: GenKGC | KG-Prompt release *****
- GenKGC (January 31, 2022): GenKGC converts knowledge graph completion into sequence-to-sequence generation with a pre-trained language model, using relation-guided demonstration and entity-aware hierarchical decoding. It obtains better or comparable performance than baselines and achieves faster inference than previous methods based on pre-trained language models (see the sketch after this list). "From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer"
- KG-Prompt (January 31, 2022): A prompt-tuning approach (knowledge collaborative fine-tuning) for low-resource knowledge graph completion. KG-Prompt leverages structured knowledge to construct the initial prompt template and learns the optimal templates, labels, and model parameters through a collaborative fine-tuning algorithm. It obtains state-of-the-art few-shot performance on FB15K-237, WN18RR, and UMLS (a second sketch follows this list). "Knowledge Collaborative Fine-tuning for Low-resource Knowledge Graph Completion", Journal of Software 2022
For help or issues using the models, please submit a GitHub issue.
For other communications, please contact Ningyu Zhang.