ChatPORT: Fine-Tuned LLM for Easy Code {PORT}ing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Fine-tuning existing LLMs for specialized tasks has become an attractive alternative to training from scratch because of its low cost and short development cycle. With many pre-trained LLMs available, choosing the right model as the starting point, or base model, is an increasingly complex task. In this work we present ChatPORT, a specialized fine-tuned LLM geared towards correctly translating code from one programming model to another. We evaluate a number of base models and compare and contrast the features and characteristics that make them viable starting points. In this paper, we focus on the OpenMP offload porting capabilities of ChatPORT. We build our training data using kernels from the Heterogeneous Computing Benchmarks (HeCBench) [12] and the OpenMP Validation and Verification suite [5] to fine-tune the base models, and then test the models on unseen kernels extracted from the HeCBench benchmark suite. Our results show that: (1) not all open LLMs geared towards HPC are aware of programming models like OpenMP; (2) although all base models benefit from fine-tuning, they learn differently and produce different correctness rates; (3) depending on the memory size and compute resources available, different base models can be used for fine-tuning without significantly affecting the quality of the transpiled code they generate; (4) fine-tuning improved the correctness rate of the LLMs by an average of 43.2%; and (5) feedback-based training data further increased the correctness rate by an average of 6% over the LLMs tested.

Original language: English
Title of host publication: OpenMP
Subtitle of host publication: Balancing Productivity and Performance Portability - 21st International Workshop on OpenMP, IWOMP 2025, Proceedings
Editors: Yonghong Yan, Erik Saule, Michael Klemm, Bronis R. de Supinski, Jannis Klinkenberg, Swaroop Pophale
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 197-211
Number of pages: 15
ISBN (Print): 9783032063427
DOIs
State: Published - 2026
Event: 21st International Workshop on OpenMP, IWOMP 2025 - Charlotte, United States
Duration: Oct 1 2025 - Oct 3 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 16123 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 21st International Workshop on OpenMP, IWOMP 2025
Country/Territory: United States
City: Charlotte
Period: 10/1/25 - 10/3/25

Funding

This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, through “Advancements in Artificial Intelligence for Science”, DE-FOA-0003264, under award number DE-SC0025645 and contract number ERKJ442.

Keywords

  • Code Porting
  • LLM
  • OpenMP offloading
