Prompt-based contextualized phrase embedding

Abstract

Word2vec gave us word embeddings; ELMo and BERT made them context-dependent. Word embeddings then evolved into whole-sentence embedding models such as the Universal Sentence Encoder (USE). At the heart of this line of work is the idea that a span of text can be represented as a single vector in a high-dimensional space, together with a similarity metric that reflects semantic relatedness. We introduce a straightforward prompt-based contextualized phrase embedding (PCPE) method that transforms a phrase within a broader context into an embedding that preserves meaningful similarity. Evaluation on the Phrase-in-Context (PiC) similarity dataset demonstrates the advantages of this contextualized phrase embedding approach.
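To make the idea concrete, here is a minimal sketch of prompt-based contextualized phrase embedding built on a Hugging Face Transformers encoder. The choice of bert-base-uncased, the prompt template, and mean pooling over the phrase's subword tokens are all illustrative assumptions, not the paper's exact recipe.

```python
# A minimal sketch of prompt-based contextualized phrase embedding.
# Assumptions: a BERT-style encoder, a hypothetical prompt template,
# and mean pooling over the target phrase's subword tokens.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def phrase_embedding(sentence: str, phrase: str) -> torch.Tensor:
    """Embed `phrase` as it is used inside `sentence`, via a simple prompt."""
    # Hypothetical prompt that points the encoder at the target phrase.
    prompt = f'{sentence} In this sentence, "{phrase}" means'
    enc = tokenizer(prompt, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0]  # character span of each subword
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden_dim)
    # Mean-pool the subword tokens that fall inside the quoted phrase span.
    start = prompt.rindex(phrase)  # the quoted occurrence in the prompt
    end = start + len(phrase)
    mask = [(s < end and e > start and e > s) for s, e in offsets.tolist()]
    return hidden[torch.tensor(mask)].mean(dim=0)

# Cosine similarity serves as the semantic-relatedness metric.
a = phrase_embedding("He sat on the bank of the river.", "bank")
b = phrase_embedding("She deposited cash at the bank.", "bank")
print(torch.cosine_similarity(a, b, dim=0).item())
```

In this sketch the appended prompt supplies the "broader context" signal, and the embedding is read off the phrase's own tokens rather than a sentence-level pooled vector; the published method may differ in both the template and the pooling.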
