Rspkgc: Relational Soft Prompt Pre-Trained Language Model for Knowledge Graph Completion
Abstract
Knowledge graph completion (KGC) aims to infer valid triples through link prediction over the entities and relations in a knowledge graph, and is typically studied under closed-world and open-world assumptions. Traditional models, designed for closed-world KGC, rely on static data and label sequences, which hampers their ability to represent new entities. Open-world KGC overcomes these limitations by incorporating textual descriptions of entities and employing text encoders to integrate new entities into the existing graph. The advent of pre-trained language models (PLMs) has further advanced KGC by enabling prompt engineering. This paper introduces a novel relational soft prompt template that leverages PLMs to improve performance in both open-world and closed-world KGC. Compared with manually crafted prompt templates, our approach is less sensitive to minor wording changes and yields more stable results. Experiments on the WN18RR, FB15k-237, and Wikidata5M datasets show that our method significantly outperforms the baselines on KGC tasks under both open-world and closed-world assumptions.
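To make the soft-prompt idea concrete, the following is a minimal sketch, not the authors' released code: relation-specific learnable prompt vectors are prepended to the PLM's embeddings of a triple's textual description, replacing hand-written template words. All names here (RelationalSoftPrompt, n_prompt_tokens, the linear score head) are illustrative assumptions, and the scoring choice is only one plausible design.

```python
# Illustrative sketch of a relational soft prompt for triple scoring.
# Assumes PyTorch and HuggingFace transformers; hyperparameters and
# the scoring head are hypothetical, not taken from the paper.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class RelationalSoftPrompt(nn.Module):
    def __init__(self, model_name="bert-base-uncased",
                 num_relations=237, n_prompt_tokens=4):
        super().__init__()
        self.plm = BertModel.from_pretrained(model_name)
        hidden = self.plm.config.hidden_size
        # One bank of continuous prompt vectors per relation, learned
        # end to end instead of being written by hand.
        self.soft_prompts = nn.Embedding(num_relations,
                                         n_prompt_tokens * hidden)
        self.n_prompt_tokens = n_prompt_tokens
        self.score = nn.Linear(hidden, 1)  # plausibility of the triple

    def forward(self, input_ids, attention_mask, relation_ids):
        # Word embeddings of the tokenized triple text.
        tok_emb = self.plm.embeddings.word_embeddings(input_ids)
        b, _, h = tok_emb.shape
        prompt = self.soft_prompts(relation_ids).view(
            b, self.n_prompt_tokens, h)
        # Prepend the relation-specific soft prompt to the text tokens.
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)
        prompt_mask = torch.ones(b, self.n_prompt_tokens,
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        attention_mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds,
                       attention_mask=attention_mask)
        # Score from the original [CLS] token, which now sits right
        # after the prompt block.
        cls = out.last_hidden_state[:, self.n_prompt_tokens]
        return self.score(cls)

# Usage: encode the textual descriptions of (head, relation, tail)
# and score the triple's plausibility.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
enc = tokenizer("dog [SEP] hypernym [SEP] canine", return_tensors="pt")
model = RelationalSoftPrompt()
logit = model(enc["input_ids"], enc["attention_mask"],
              relation_ids=torch.tensor([0]))
```

Because the prompt vectors are continuous parameters rather than discrete template words, small rephrasings of the entity descriptions do not change the prompt itself, which is one way to read the stability claim above.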
Keywords: Knowledge graph completion, Prompt, Pre-trained language model