Relational Soft Prompt Pre-trained Language Model for Knowledge Graph Completion
Abstract
Knowledge graph completion (KGC) aims to infer valid triples through link prediction over the entities and relations of a knowledge graph, and is commonly divided into closed-world and open-world settings. Traditional models, designed for closed-world KGC, rely on static data and label sequences, which hampers their ability to represent new entities. Open-world KGC overcomes these limitations by incorporating textual descriptions of entities and employing text encoders to integrate new entities into the existing graph. The advent of pre-trained language models (PLMs) has further advanced KGC by enabling prompt engineering. This paper introduces a novel relation soft prompt template that leverages PLMs to improve performance in both open-world and closed-world KGC. Compared with manually crafted prompt templates, our approach is more robust to minor wording changes and yields more stable results. Our method significantly outperforms the baselines on KGC tasks, as validated through experiments on the WN18RR, FB15k-237, Wikidata5M, and Wiki27K datasets, under both open-world and closed-world assumptions.
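As a rough illustration of the idea described in the abstract (not the authors' released implementation), the sketch below shows one way a per-relation bank of trainable soft prompt vectors could be spliced into a PLM's input embeddings in place of a hand-written relation phrase, so the encoder scores a (head, relation, tail) triple from entity descriptions. The class name `RelationSoftPromptScorer`, the prompt length `k_prompt_tokens`, the prepend position, and the scoring head are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class RelationSoftPromptScorer(nn.Module):
    """Hypothetical sketch: score triples with per-relation soft prompts."""

    def __init__(self, plm_name="bert-base-uncased", num_relations=237,
                 k_prompt_tokens=5):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(plm_name)
        self.plm = AutoModel.from_pretrained(plm_name)
        hidden = self.plm.config.hidden_size
        # One bank of k trainable prompt vectors per relation, standing in
        # for a manually crafted textual relation template.
        self.rel_prompts = nn.Embedding(num_relations, k_prompt_tokens * hidden)
        self.k = k_prompt_tokens
        self.hidden = hidden
        self.score_head = nn.Linear(hidden, 1)

    def forward(self, head_texts, rel_ids, tail_texts):
        # Encode the head/tail descriptions as a sentence pair.
        enc = self.tokenizer(head_texts, tail_texts, return_tensors="pt",
                             padding=True, truncation=True)
        tok_embeds = self.plm.get_input_embeddings()(enc["input_ids"])
        # Look up the relation's soft prompt and reshape to (batch, k, hidden).
        prompts = self.rel_prompts(rel_ids).view(-1, self.k, self.hidden)
        # Simplest splice choice (an assumption): prepend the soft prompts.
        inputs_embeds = torch.cat([prompts, tok_embeds], dim=1)
        prompt_mask = torch.ones(prompts.shape[:2],
                                 dtype=enc["attention_mask"].dtype)
        attention_mask = torch.cat([prompt_mask, enc["attention_mask"]], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds,
                       attention_mask=attention_mask)
        # Score the triple from the first (prompt) position's representation.
        return self.score_head(out.last_hidden_state[:, 0]).squeeze(-1)


# Toy usage: one triple, relation id 42 (illustrative values).
scorer = RelationSoftPromptScorer()
logit = scorer(["Albert Einstein, a theoretical physicist ..."],
               torch.tensor([42]),
               ["Ulm, a city in Germany ..."])
```

Because the prompt vectors are learned embeddings rather than discrete words, small changes in template wording cannot perturb them, which is one plausible reading of the stability claim; the paper's actual prompt placement and scoring function may differ from this sketch.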