Recent advances in artificial intelligence (AI) models for plasma heating have revealed capabilities beyond previous expectations. These models not only speed up predictions by a factor of 10 million while maintaining accuracy, but also successfully forecast plasma heating in scenarios where traditional numerical codes have failed. The findings will be presented on October 11 at the 66th Annual Meeting of the American Physical Society Division of Plasma Physics in Atlanta.
“With our intelligence, we can train the AI to surpass the limitations of existing numerical models,” said Álvaro Sánchez-Villar, an associate research physicist at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL). Sánchez-Villar is the lead author of a newly published peer-reviewed article in Nuclear Fusion on this research, a collaboration spanning five research institutions.
The AI models use machine learning to predict how electrons and ions behave in plasma under ion cyclotron range of frequencies (ICRF) heating during fusion experiments. They are trained on data generated by a numerical code. While most of the generated data matched previous findings, in certain extreme cases the results deviated from expectations.
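The surrogate idea described above can be sketched in a few lines: fit a fast regression model to input/output pairs produced by a slower numerical code, then evaluate the cheap model in its place. The sketch below is purely illustrative, assuming a toy stand-in for the heating code and made-up plasma parameters; it is not the actual ICRF solver or the authors' model architecture.

```python
# Minimal sketch (hypothetical): training a neural-network surrogate on
# data generated by a numerical heating code. The "code" and its inputs
# are illustrative stand-ins, not real ICRF physics.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def heating_code(params):
    """Toy stand-in for an expensive numerical code: parameters -> heating value."""
    density, temperature = params[:, 0], params[:, 1]
    return np.sin(density) * np.exp(-0.5 * temperature)

X = rng.uniform(0.0, 3.0, size=(2000, 2))   # sampled "plasma parameters"
y = heating_code(X)                          # outputs of the slow code

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a small neural network to reproduce the code's input->output map.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=0)
surrogate.fit(X_train, y_train)

# Once trained, evaluating the surrogate is orders of magnitude cheaper
# than rerunning the original code for every new parameter set.
print(f"R^2 on held-out samples: {surrogate.score(X_test, y_test):.3f}")
```

In this setup, the accuracy of the surrogate is checked on held-out samples the model never saw during training, mirroring how a surrogate is validated against the code it replaces.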
“We discovered a parametric regime within which the heating profiles showed unpredictable spikes in random positions,” Sánchez-Villar explained. “There was no physical reason to account for those spikes.”
“Effectively, our surrogate model functioned as if it repaired the original code, guided by careful data curation,” Sánchez-Villar continued. “With the right application, AI can not only expedite problem-solving but also enhance how we tackle challenges, pushing beyond our human limitations.”
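As a toy illustration of what “careful data curation” can mean in practice, the sketch below flags nonphysical spikes in a synthetic heating profile by comparing each point with a local median. The filter, threshold, and profile are hypothetical assumptions for illustration, not the authors' actual curation pipeline.

```python
# Minimal sketch (hypothetical): curating code-generated training data by
# flagging points that jump far from a local median ("spikes") before a
# surrogate model is fit. Threshold and window are illustrative choices.
import numpy as np

def curate_profile(profile, window=5, threshold=0.5):
    """Return a boolean mask: True where data look physical."""
    pad = window // 2
    padded = np.pad(profile, pad, mode="edge")
    local_median = np.array([np.median(padded[i:i + window])
                             for i in range(len(profile))])
    residual = np.abs(profile - local_median)
    return residual < threshold

# Smooth synthetic heating profile with two injected nonphysical spikes.
x = np.linspace(0.0, 1.0, 100)
profile = np.exp(-((x - 0.5) ** 2) / 0.02)
profile[30] += 5.0
profile[70] -= 4.0

mask = curate_profile(profile)
print(f"Flagged {np.count_nonzero(~mask)} spike(s) out of {len(profile)} points")
```

A robust local-median filter leaves the smooth profile untouched while isolating the erratic points, which can then be excluded or regenerated before training.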
As expected, the models also cut computation times for ICRF heating, from approximately 60 seconds to just 2 microseconds. This speedup enables much faster simulations without compromising accuracy, helping scientists and engineers identify optimal pathways toward making fusion a viable energy source.
The project also involved researchers Zhe Bai, Nicola Bertelli, E. Wes Bethel, Julien Hillairet, Talita Perciano, Syun’ichi Shiraiwa, Gregory M. Wallace, and John C. Wright. The research was funded by the U.S. Department of Energy under Contract Number DE-AC02-09CH11466 and used resources of the National Energy Research Scientific Computing Center (NERSC), operated under Contract No. DE-AC02-05CH11231, with NERSC Award FES m3716 for 2023.