AI may help develop clean, limitless fusion energy
Artificial intelligence (AI) may help develop safe, clean and virtually limitless fusion energy for generating electricity, scientists say. A team, including researchers from Princeton University and Harvard University, is applying deep learning to forecast sudden disruptions that can halt fusion reactions and damage the doughnut-shaped devices, known as tokamaks, that house the reactions.
Deep learning is a powerful new version of the machine learning form of AI, according to the findings published in the journal Nature.
“This research opens a promising new chapter in the effort to bring unlimited energy to Earth,” said Steven Cowley, director of the US Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL). “Artificial intelligence is exploding across the sciences and now it’s beginning to contribute to the worldwide quest for fusion power,” Cowley said in a statement.
Fusion, which drives the Sun and stars, is the fusing of light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates energy. Scientists are seeking to replicate fusion on Earth for an abundant supply of power for the production of electricity.
“Artificial intelligence is the most intriguing area of scientific growth right now, and to marry it to fusion science is very exciting,” said William Tang, a principal research physicist at PPPL. “We have accelerated the ability to predict with high accuracy the most dangerous challenge to clean fusion energy,” Tang said.
Unlike traditional software, which carries out prescribed instructions, deep learning learns from its mistakes. Accomplishing this seeming magic are neural networks, layers of interconnected nodes — mathematical algorithms — that are “parameterised,” or weighted by the programme to shape the desired output.
For any given input the nodes seek to produce a specified output, such as correct identification of a face or accurate forecasts of a disruption. Training kicks in when a node fails to achieve this task: the weights automatically adjust themselves for fresh data until the correct output is obtained.
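The training loop described above can be sketched in a few lines of code. The example below is a minimal illustration, not the researchers’ actual model: a single artificial neuron learns the logical AND function by repeatedly comparing its output with the correct answer and nudging its weights to reduce the error. The data, learning rate and number of passes are all illustrative choices.

```python
from math import exp

def sigmoid(x):
    """Squash a value into the range 0..1, the node's activation."""
    return 1.0 / (1.0 + exp(-x))

def train(samples, labels, epochs=5000, lr=0.5):
    """Train one sigmoid neuron with simple gradient descent."""
    w = [0.0, 0.0]   # one weight per input feature
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            pred = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = pred - target             # how wrong the node was
            grad = err * pred * (1 - pred)  # error scaled by sigmoid slope
            # The weights "automatically adjust themselves" toward
            # the correct output, as the article describes.
            w[0] -= lr * grad * x[0]
            w[1] -= lr * grad * x[1]
            b -= lr * grad
    return w, b

# Learn logical AND: the output should be 1 only when both inputs are 1.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train(X, y)
preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x in X]
print(preds)  # → [0, 0, 0, 1]
```

A real disruption forecaster stacks many layers of such nodes and trains on large volumes of plasma diagnostic data, but the underlying principle of weight adjustment from error is the same.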