DSA-GNAS: Graph Neural Architecture Search with Deep Semantic Adaption of Large Language Models
Abstract
The remarkable success of pre-trained large language models (LLMs) in natural language processing has established a new paradigm that combines LLMs with graph neural networks (GNNs) for modeling text-attributed graphs (TAGs). However, manually designing the optimal model architecture to adapt the deep semantics of an LLM to different graphs is non-trivial and demands substantial expert knowledge. Although graph neural architecture search (GNAS) provides a feasible solution for automatically designing optimal GNN architectures for different graphs, prior research is mainly built on shallow embedding methods, which ignore the difference between the deep semantic space and shallow embeddings. To address these issues, we propose DSA-GNAS, an effective TAG learning framework with automatic graph neural architecture search and deep semantic adaption. To better leverage the deep semantic space, we propose a novel Structure-Semantic Fusion (S2F) search space. Model architectures are sampled from the S2F space and assembled into a dual-path adapter that fine-tunes the semantic embeddings generated by the LLM, sufficiently adapting the semantic space to the downstream graph task. The architectures are optimized automatically through a genetic search strategy, which is global and not restricted by gradients, offering promising efficiency in searching for the optimal model architecture. Experimental results show that DSA-GNAS significantly outperforms other baselines on graph tasks, demonstrating that DSA-GNAS can effectively design optimal model architectures for adapting deep semantics to graph-related tasks.