A Hybrid LLM and Graph-Enhanced Transformer Framework for Cold-Start Session-Based Fashion Recommendation

Abstract

Session-based recommendation in domains such as fashion is challenging because of cold-start conditions, rapidly shifting user preferences, and the difficulty of exploiting product attributes. Conventional approaches, including sequence models and graph-based methods, tend to degrade when interaction data are sparse or when semantic understanding of items and intents is required. To address these challenges, we present RecAgent-LLaMA, a multi-stage conversational recommendation framework built on LLaMA-2-7B. The framework combines prompt-based semantic retrieval, session modeling with Transformer-XL and a graph attention network (GAT), fine-grained interest detection, and cross-attention re-ranking. By coupling the semantic capabilities of large language models with structured sequential and graph modeling, the system improves generalization and user intent understanding and scales to practical recommendation workloads.
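The cross-attention re-ranking stage mentioned above can be pictured as scoring each retrieved candidate item against a pooled session representation. The following minimal PyTorch sketch illustrates that idea only; the module name, dimensions, and linear scoring head are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): cross-attention re-ranking
# of candidate items conditioned on a session representation. Dimensions and
# the scoring head are hypothetical choices for illustration.
import torch
import torch.nn as nn


class CrossAttentionReRanker(nn.Module):
    def __init__(self, dim: int = 256, num_heads: int = 4):
        super().__init__()
        # Candidate embeddings attend over the session representation.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.score_head = nn.Linear(dim, 1)  # hypothetical relevance head

    def forward(self, session_repr: torch.Tensor, candidates: torch.Tensor) -> torch.Tensor:
        # session_repr: (batch, 1, dim) pooled output of a session encoder
        #               (e.g. Transformer-XL + GAT in the paper's pipeline)
        # candidates:   (batch, num_candidates, dim) retrieved item embeddings
        attended, _ = self.cross_attn(query=candidates, key=session_repr, value=session_repr)
        # Score each candidate from its session-conditioned representation.
        return self.score_head(attended).squeeze(-1)  # (batch, num_candidates)


if __name__ == "__main__":
    reranker = CrossAttentionReRanker()
    session = torch.randn(2, 1, 256)   # two sessions
    items = torch.randn(2, 20, 256)    # 20 retrieved candidates per session
    print(reranker(session, items).shape)  # torch.Size([2, 20])
```

In this reading, the semantic retrieval stage supplies the candidate set and the session encoder supplies the query, so the re-ranker only has to reorder a short list rather than score the full catalog.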
