Enhancing Person-Centric Relation Extraction through Multi-Task Learning

Abstract

Relation extraction (RE) plays a vital role in deriving structured knowledge from unstructured text. Despite the prevalence and importance of person-centric relations in real-world applications, they remain underexplored in current research. Existing models often struggle to capture the implicit semantic cues associated with person entities, resulting in suboptimal performance on such relations. To address this limitation, we propose SELF, a novel multi-task learning framework designed to improve extraction of person-centric relations. Specifically, SELF introduces an auxiliary task that explicitly models the semantic space of person-related entities, enabling the model to capture the crucial yet sparse semantic information inherent in person-centric contexts. SELF also incorporates a hierarchical entity fusion module with a multi-layer routing mechanism that adaptively integrates shallow and deep semantic representations, further enriching the model's understanding of person-centric relations. Extensive experiments on the TACRED dataset, which is rich in person-centric relations, and on two auxiliary datasets demonstrate that SELF significantly outperforms existing models in classifying person-centric relations.
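
The abstract describes the architecture only at a high level, and no implementation is shown. The sketch below illustrates, in PyTorch, the general shape of such a setup: a shared encoder, a main relation-classification head, an auxiliary head over person-entity tokens, and a softmax routing gate that fuses shallow and deep layer outputs. All names and hyperparameters (MultiTaskRE, route_logits, alpha, layer counts) are illustrative assumptions, not the authors' actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskRE(nn.Module):
    """Illustrative multi-task relation extractor: a main relation head,
    an auxiliary head over person-entity tokens, and a learned routing
    gate that mixes shallow and deep encoder layers (all hypothetical)."""

    def __init__(self, vocab_size=30522, hidden=256, num_layers=4,
                 num_relations=42, num_entity_types=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.layers = nn.ModuleList([
            nn.TransformerEncoderLayer(d_model=hidden, nhead=4,
                                       batch_first=True)
            for _ in range(num_layers)])
        # One routing logit per encoder layer (shallow -> deep).
        self.route_logits = nn.Parameter(torch.zeros(num_layers))
        self.rel_head = nn.Linear(hidden, num_relations)     # main task
        self.aux_head = nn.Linear(hidden, num_entity_types)  # auxiliary task

    def forward(self, token_ids, person_mask):
        # person_mask: (batch, seq) float tensor, 1.0 on person-entity tokens.
        h = self.embed(token_ids)
        layer_outputs = []
        for layer in self.layers:
            h = layer(h)
            layer_outputs.append(h)
        # Multi-layer routing: softmax-weighted fusion over all layers.
        weights = F.softmax(self.route_logits, dim=0)
        fused = sum(w * out for w, out in zip(weights, layer_outputs))
        sent = fused.mean(dim=1)  # crude sentence representation
        mask = person_mask.unsqueeze(-1)
        # Mean-pool the fused representation over person-entity tokens.
        ent = (fused * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
        return self.rel_head(sent), self.aux_head(ent)

def multi_task_loss(rel_logits, aux_logits, rel_y, aux_y, alpha=0.3):
    """Joint objective: main relation loss plus a down-weighted
    auxiliary entity-typing loss (alpha is an assumed weight)."""
    return (F.cross_entropy(rel_logits, rel_y)
            + alpha * F.cross_entropy(aux_logits, aux_y))
```

The paper's actual auxiliary task and fusion module are presumably more elaborate; this sketch only captures the joint-training pattern and the layer-routing idea behind adaptively combining shallow and deep representations.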
