Structured Self-Supervised Pretraining for Commonsense Knowledge Graph Completion

Jiayuan Huang, Yangkai Du, Shuting Tao, Kun Xu, Pengtao Xie


To develop commonsense-grounded NLP applications, a comprehensive and accurate commonsense knowledge graph (CKG) is needed. Manually constructing CKGs is time-consuming, so many research efforts have been devoted to their automatic construction. Previous approaches focus on generating concepts that have direct and obvious relationships with existing concepts, and lack the capability to generate non-obvious concepts. In this work, we aim to bridge this gap. We propose a general graph-to-paths pretraining framework that leverages high-order structures in CKGs to capture high-order relationships between concepts. We instantiate this general framework as four special cases: long path, path-to-path, router, and graph-node-path. Experiments on two datasets demonstrate the effectiveness of our methods. The code will be released via a public GitHub repository.
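As a rough illustration of the idea behind graph-to-paths pretraining, the sketch below enumerates multi-hop relational paths from a toy, ConceptNet-style triple set and linearizes them into token sequences that a sequence model could be pretrained on. This is a minimal sketch under assumed data structures, not the authors' implementation; the toy triples, the `extract_paths` routine, and the linearization scheme are all hypothetical.

```python
from collections import defaultdict

# Hypothetical toy CKG: (head, relation, tail) triples, ConceptNet-style.
TRIPLES = [
    ("rain", "Causes", "wet ground"),
    ("wet ground", "Causes", "slippery road"),
    ("slippery road", "Causes", "accident"),
    ("umbrella", "UsedFor", "rain"),
]

def build_adjacency(triples):
    """Map each head concept to its outgoing (relation, tail) edges."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((r, t))
    return adj

def extract_paths(adj, start, max_hops):
    """Enumerate simple relational paths of 2..max_hops edges from `start`.

    Multi-hop paths expose high-order (non-obvious) relationships
    between concepts that individual triples miss.
    """
    paths = []
    def dfs(node, path, visited):
        if len(path) >= 2:  # at least 2 hops: a high-order relationship
            paths.append(list(path))
        if len(path) == max_hops:
            return
        for r, t in adj.get(node, []):
            if t not in visited:
                dfs(t, path + [(node, r, t)], visited | {t})
    dfs(start, [], {start})
    return paths

def linearize(path):
    """Serialize a path into a token sequence for pretraining."""
    tokens = [path[0][0]]
    for h, r, t in path:
        tokens += [r, t]
    return " ".join(tokens)

adj = build_adjacency(TRIPLES)
for p in extract_paths(adj, "rain", max_hops=3):
    print(linearize(p))
```

A model pretrained to reconstruct or generate such linearized paths would, in the spirit of the long-path instantiation, learn relationships between concepts that are several hops apart rather than directly connected.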


Copyright (c) 2021 Association for Computational Linguistics

This work is licensed under a Creative Commons Attribution 4.0 International License.