Rspkgc: Relational Soft Prompt Pre-Trained Language Model for Knowledge Graph Completion

19 Pages Posted: 30 Mar 2024

Yafei Liu

Southwest University

Li Li

Southwest University

Abstract

Knowledge graph completion (KGC) aims to infer valid triples through link prediction over the entities and relations of a knowledge graph, and is commonly divided into closed-world and open-world settings. Traditional models, designed for closed-world KGC, rely on static data and label sequences, which limits their ability to represent new entities. Open-world KGC overcomes these limitations by incorporating textual descriptions of entities and employing text encoders to integrate new entities into the existing graph. The advent of pre-trained language models (PLMs) has further advanced KGC by enabling prompt engineering. This paper introduces a novel relational soft prompt template that leverages PLMs to improve performance in both open-world and closed-world KGC. Compared with manually crafted prompt templates, our approach is less sensitive to minor wording changes and yields more stable results. Experiments on the WN18RR, FB15k-237, and Wikidata5M datasets show that our method significantly outperforms the baselines on KGC tasks under both the open-world and closed-world assumptions.
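The abstract does not spell out Rspkgc's architecture. As a rough illustration of what a relational soft prompt for KGC can look like, the sketch below prepends a small set of learnable, relation-specific embedding vectors to the PLM input in place of a hand-written textual template and scores a (head, relation, tail) candidate. The class name, the number of prompt tokens, and the linear scoring head are assumptions made for this example, not the authors' implementation.

# Illustrative sketch only: a generic "relational soft prompt + PLM" triple
# scorer; RelationalSoftPromptScorer, n_prompt_tokens, and the scoring head
# are hypothetical, not the Rspkgc code.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class RelationalSoftPromptScorer(nn.Module):
    def __init__(self, plm_name="bert-base-uncased", n_relations=237, n_prompt_tokens=5):
        super().__init__()
        self.plm = AutoModel.from_pretrained(plm_name)
        hidden = self.plm.config.hidden_size
        # One learnable soft prompt (n_prompt_tokens vectors) per relation,
        # standing in for a hand-written textual template of that relation.
        self.soft_prompts = nn.Parameter(torch.randn(n_relations, n_prompt_tokens, hidden) * 0.02)
        self.score_head = nn.Linear(hidden, 1)  # plausibility score for a (h, r, t) candidate

    def forward(self, input_ids, attention_mask, relation_ids):
        # Embed the textual part (entity descriptions) with the PLM's own
        # word-embedding table, then prepend the relation's soft prompt vectors.
        tok_emb = self.plm.get_input_embeddings()(input_ids)     # (B, L, H)
        prompts = self.soft_prompts[relation_ids]                 # (B, P, H)
        inputs_embeds = torch.cat([prompts, tok_emb], dim=1)      # (B, P+L, H)
        prompt_mask = torch.ones(prompts.shape[:2], dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds, attention_mask=mask)
        # The original [CLS] token now sits just after the P prompt vectors.
        pooled = out.last_hidden_state[:, prompts.shape[1]]
        return self.score_head(pooled).squeeze(-1)                # higher = more plausible triple

# Example: score one candidate triple from the entities' textual descriptions
# (illustrative relation index; FB15k-237 has 237 relation types).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = RelationalSoftPromptScorer()
enc = tokenizer("Barack Obama, 44th U.S. president. [SEP] Honolulu, city in Hawaii.",
                return_tensors="pt")
score = model(enc["input_ids"], enc["attention_mask"], relation_ids=torch.tensor([0]))

Because the relation is represented by learned vectors rather than fixed wording, a setup of this kind avoids the sensitivity to template phrasing noted in the abstract, and new entities can be scored from their descriptions alone.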

Keywords: Knowledge graph completion, Prompt, Pre-trained language model

Suggested Citation

Liu, Yafei and Li, Li, Rspkgc: Relational Soft Prompt Pre-Trained Language Model for Knowledge Graph Completion. Available at SSRN: https://ssrn.com/abstract=4778440 or http://dx.doi.org/10.2139/ssrn.4778440

Yafei Liu

Southwest University (email)

Chongqing, 400715
China

Li Li (Contact Author)

Southwest University (email)

Chongqing, 400715
China

