Prompting Disentangled Embeddings for Knowledge Graph Completion with Pre-Trained Language Model

15 Pages · Posted: 10 Apr 2024

Yuxia Geng

Hangzhou Dianzi University

Jiaoyan Chen

The University of Manchester

Yuhang Zeng

Hangzhou Dianzi University

Zhuo Chen

Zhejiang University

Wen Zhang

Zhejiang University

Jeff Z. Pan

University of Edinburgh - School of Informatics

Yuxiang Wang

Hangzhou Dianzi University

Xiaoliang Xu

Hangzhou Dianzi University

Abstract

Both graph structures and textual information play a critical role in Knowledge Graph Completion (KGC). With the success of Pre-trained Language Models (PLMs) such as BERT, they have been applied to text encoding for KGC. However, most current methods fine-tune the PLMs, leading to high training costs and limited scalability to larger PLMs. In contrast, we propose to utilize prompts and perform KGC on a frozen PLM with only the prompts trained. Accordingly, we propose a new KGC method named PDKGC with two prompts --- a hard task prompt which adapts the KGC task to the PLM pre-training task of token prediction, and a disentangled structure prompt which learns disentangled graph representations so that the PLM can combine the more relevant structural knowledge with the textual information. With the two prompts, PDKGC builds a textual predictor and a structural predictor, respectively, and their combination leads to more comprehensive entity prediction. Solid evaluation on two widely used KGC datasets shows that PDKGC often outperforms the baselines including the state-of-the-art, and that all of its components are effective. Our code and data are available at https://github.com/genggengcss/PDKGC.
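To make the frozen-PLM prompting idea concrete, the sketch below is a minimal illustration, not the authors' implementation (the class name, the bert-base-uncased checkpoint, and the example triple are assumptions for illustration; the actual code is in the repository above). It casts tail-entity prediction as masked-token prediction via a hard task prompt and prepends trainable soft-prompt vectors standing in for the disentangled structure prompt, keeping the PLM itself frozen:

import torch
import torch.nn as nn
from transformers import BertTokenizer, BertForMaskedLM

class SoftPromptKGC(nn.Module):
    def __init__(self, plm_name="bert-base-uncased", n_prompt_tokens=8):
        super().__init__()
        self.tokenizer = BertTokenizer.from_pretrained(plm_name)
        self.plm = BertForMaskedLM.from_pretrained(plm_name)
        for p in self.plm.parameters():
            p.requires_grad = False  # keep the PLM frozen; only prompts are trained
        hidden = self.plm.config.hidden_size
        # trainable soft prompt; in PDKGC this slot would carry the disentangled
        # structure representation rather than free parameters (assumption here)
        self.soft_prompt = nn.Parameter(torch.randn(n_prompt_tokens, hidden) * 0.02)

    def forward(self, head_text, relation_text):
        # hard task prompt: phrase tail-entity prediction as [MASK] prediction
        text = f"{head_text} {relation_text} {self.tokenizer.mask_token} ."
        enc = self.tokenizer(text, return_tensors="pt")
        tok_emb = self.plm.bert.embeddings.word_embeddings(enc["input_ids"])
        # prepend the soft prompt in embedding space
        inputs_embeds = torch.cat([self.soft_prompt.unsqueeze(0), tok_emb], dim=1)
        attn = torch.cat(
            [torch.ones(1, self.soft_prompt.size(0), dtype=enc["attention_mask"].dtype),
             enc["attention_mask"]], dim=1)
        logits = self.plm(inputs_embeds=inputs_embeds, attention_mask=attn).logits
        mask_pos = (enc["input_ids"][0] == self.tokenizer.mask_token_id).nonzero()[0, 0]
        # vocabulary scores at the [MASK] position, shifted by the soft prompt length
        return logits[0, mask_pos + self.soft_prompt.size(0)]

# usage: rank candidate tail entities by their token scores at [MASK]
model = SoftPromptKGC()
scores = model("Barack Obama", "place of birth")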

Keywords: Knowledge Graph Completion, Pre-trained Language Model, Prompt Tuning, Disentangled Embedding

Suggested Citation

Geng, Yuxia and Chen, Jiaoyan and Zeng, Yuhang and Chen, Zhuo and Zhang, Wen and Pan, Jeff Z. and Wang, Yuxiang and Xu, Xiaoliang, Prompting Disentangled Embeddings for Knowledge Graph Completion with Pre-Trained Language Model. Available at SSRN: https://ssrn.com/abstract=4790015 or http://dx.doi.org/10.2139/ssrn.4790015

Yuxia Geng (Contact Author)

Hangzhou Dianzi University

China

Jiaoyan Chen

The University of Manchester

United Kingdom

Yuhang Zeng

Hangzhou Dianzi University

China

Zhuo Chen

Zhejiang University

38 Zheda Road
Hangzhou, 310058
China

Wen Zhang

Zhejiang University

38 Zheda Road
Hangzhou, 310058
China

Jeff Z. Pan

University of Edinburgh - School of Informatics

Old College
South Bridge
Edinburgh, Scotland EH8 9JY
United Kingdom

Yuxiang Wang

Hangzhou Dianzi University

China

Xiaoliang Xu

Hangzhou Dianzi University

China
