Task Embedding Adapter: A Parameter-Efficient Meta-Learning Method for Cold-Start Recommendation

Abstract

Personalized recommendation systems have been widely applied across domains, but the cold-start problem remains a major challenge because new users or items lack sufficient interaction data. To address this, we propose a novel Task Embedding Adapter method that combines meta-learning with parameter-efficient fine-tuning (PEFT). Specifically, a learnable task embedding is introduced for each user and injected into the backbone model through adapter modules, enabling efficient, personalized feature adjustment. During training, only the adapter and task embedding parameters are optimized while the backbone remains frozen, which significantly reduces computational cost and speeds up adaptation. Experimental results on public datasets show that our approach achieves better performance and generalization than existing meta-learning and PEFT methods.
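To make the described architecture concrete, the sketch below illustrates one plausible way to condition a bottleneck adapter on a per-user task embedding while keeping the backbone frozen. It is a minimal illustration under stated assumptions, not the authors' implementation: the class names (TaskEmbeddingAdapter, AdaptedRecommender), dimensions, and the concatenation-based injection are all assumptions made for the example.

```python
# Minimal sketch (PyTorch). All module names and hyperparameters here are
# illustrative assumptions, not the paper's exact implementation.
import torch
import torch.nn as nn


class TaskEmbeddingAdapter(nn.Module):
    """Bottleneck adapter conditioned on a per-user task embedding."""

    def __init__(self, hidden_dim: int, task_dim: int, bottleneck_dim: int = 32):
        super().__init__()
        self.down = nn.Linear(hidden_dim + task_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, h: torch.Tensor, task_emb: torch.Tensor) -> torch.Tensor:
        # Inject the task embedding by concatenation, then apply a
        # residual bottleneck transform to the backbone features.
        z = torch.cat([h, task_emb], dim=-1)
        return h + self.up(self.act(self.down(z)))


class AdaptedRecommender(nn.Module):
    """Frozen backbone plus trainable task embeddings, adapter, and head."""

    def __init__(self, backbone: nn.Module, num_users: int,
                 hidden_dim: int, task_dim: int = 16):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad_(False)  # backbone stays fixed during adaptation
        self.task_embeddings = nn.Embedding(num_users, task_dim)
        self.adapter = TaskEmbeddingAdapter(hidden_dim, task_dim)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, user_id: torch.Tensor, item_features: torch.Tensor) -> torch.Tensor:
        h = self.backbone(item_features)       # frozen item representation
        t = self.task_embeddings(user_id)      # per-user task embedding
        return self.head(self.adapter(h, t)).squeeze(-1)


# Only the adapter, task embedding, and head parameters are optimized,
# which is where the parameter-efficiency comes from.
# model = AdaptedRecommender(backbone, num_users=1000, hidden_dim=64)
# trainable = [p for p in model.parameters() if p.requires_grad]
# optimizer = torch.optim.Adam(trainable, lr=1e-3)
```

In this sketch, parameter efficiency follows from passing only the parameters with requires_grad=True to the optimizer; how the task embedding is injected (concatenation, gating, or FiLM-style modulation) is a design choice the abstract does not specify.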
