Evaluating Pre-trained Large Language Models on Zero Shot Prompts for Parallelization of Source Code


Abstract

Large Language Models (LLMs) have recently gained prominence as intelligent software tools and are used in most phases of the software development life cycle (SDLC). While source code generation has been widely studied, the generation of performant source code has received little research attention. Although automatic code parallelization tools exist in the non-LLM setting, their analogue in the context of transformation using pre-trained models requires investigation. In this work, we perform a comparative study of 23 pre-trained LLMs, using the state-of-the-art Intel C Compiler (icc) as the baseline auto-parallelization tool. Our motivation is to investigate how well LLMs play the role of an auto-parallelizer when given sequential C source code. Our experiments show that, in some instances, LLM-based zero-shot source code parallelization outperforms icc in non-functional aspects such as speedup. Using 30 kernels from the popular PolyBench C benchmark suite, we evaluated 23 pre-trained LLMs, generating 667 parallelized code versions. After filtering out versions with compilation errors or data races through comprehensive memory and threading error analysis, we identified notable performance gains: across all tested LLMs, 26.66% of cases achieved a speedup surpassing that of the state-of-the-art icc. The best-performing LLM-generated code achieved a parallelization speedup of 7.5× compared to the icc-parallelized version (speedup of 1.08×). Our study shows that pre-trained LLMs tasked with zero-shot sequential-to-parallel code translation, i.e., without fine-tuning, chain-of-thought prompting, or feedback prompts, perform competitively with non-LLM compiler technology that has been developed over the past decades.