Abstract
The prediction of electric and magnetic field amplitudes from atomic spectral data is critical for plasma control in fusion devices such as tokamaks. Conventional approaches that rely on physics-based models are computationally expensive and unsuitable for real-time applications. In this work, we develop and benchmark three machine learning algorithms—simulation-based inference (SBI), fully connected neural networks (FCNN), and histogram-based gradient boosting regression (GBR-Hist)—to infer field intensities directly from Doppler-free saturation spectroscopy (DFSS) spectra. Synthetic datasets of spectra were generated using the EZSSS code and evaluated both with and without added Poisson noise to mimic experimental conditions. We find that SBI achieves the highest accuracy and robustness, FCNN provides a strong balance of accuracy and computational efficiency for real-time applications, and GBR-Hist offers the fastest inference but is more sensitive to noise. These results demonstrate the potential of machine learning to accelerate DFSS analysis and enhance its utility for plasma diagnostics and control.
| Original language | English |
|---|---|
| Article number | 109710 |
| Journal | Journal of Quantitative Spectroscopy and Radiative Transfer |
| Volume | 348 |
| DOIs | |
| State | Published - Jan 2026 |
Funding
This work was supported by the U.S. Department of Energy under contract number DE-AC02-09CH11466. The United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
Keywords
- Doppler-free saturation spectroscopy
- Machine learning
- Tokamak plasmas