September 12, 2024 by Naomi Dinmore, CERN
Collected at: https://phys.org/news/2024-09-boosting-particle-efficiency-ai-machine.html
As particle accelerator technology moves into the high-luminosity era, the need for extreme precision and unprecedented collision energy keeps growing. Given also the Laboratory’s desire to reduce energy consumption and costs, the design and operation of CERN’s accelerators must constantly be refined in order to be as efficient as possible.
To address this, the Efficient Particle Accelerators project (EPA) has been established—a team of people from different accelerator, equipment and control groups across CERN who are working together to improve accelerator efficiency.
A think-tank was set up following a 2022 workshop to plan upgrades for the High Luminosity LHC (HL-LHC), and it came up with seven recommendations on efficiency for the EPA to work on.
“The idea was to look at efficiency in the broadest terms,” says Alex Huschauer, engineer-in-charge of the CERN PS and member of the EPA. “We wanted a framework that could be applied to each machine in the accelerator complex.”
To do this, the team created nine work packages on efficiency to be deployed over the years leading up to the beginning of the HL-LHC run.
“It emerged from our discussions in the efficiency think-tank that automation is the way forward,” says the EPA project leader, Verena Kain. “This means using automation both in the conventional way and using AI and machine learning.”
For example, AI can help physicists combat accelerator magnet hysteresis. This occurs when the field of the iron-dominated accelerator magnets can no longer be described by a simple mapping from the current in the electromagnet to the field it produces.
If this is not taken into account, it leads to discrepancies between the programmed and actual fields and to detrimental effects on beam quality, such as reduced stability and precision of the beam’s trajectory. Today, operators correct these field errors by tuning the magnets manually, a process that costs both time and energy.
“Hysteresis happens because the actual magnetic field is not defined just by the current in the power supply, but also by the magnet’s history,” says Kain. “What’s difficult is that we can’t model it analytically—we can’t work out exactly what current is needed to create the correct field for the beam in the accelerator magnet—at least not with the precision required. But AI can learn from the magnet’s historical data and elaborate a precise model.”
The team has done initial tests using magnets in the SPS and hopes to train the AI on all of CERN’s accelerator magnets over the coming years.
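To illustrate what learning from a magnet’s history could look like in practice, here is a minimal sketch in Python: it trains a small neural-network regressor to predict the measured field from a window of recent current values. The synthetic data, the 20-sample history window and the choice of model are hypothetical stand-ins for illustration only; the article does not describe the models or datasets actually used at CERN.

```python
# Illustrative sketch only: learning a history-dependent current-to-field map.
# The data, window length and model below are hypothetical, not CERN's approach.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic logging data: a wandering programmed current (A) and a "measured"
# field (T) that depends on the recent current history, mimicking hysteresis.
currents = 500.0 + np.cumsum(rng.normal(0.0, 5.0, size=2000))
fields = 0.002 * currents + 0.0005 * np.convolve(
    currents, np.ones(20) / 20, mode="same"
)

WINDOW = 20  # how many past current samples the model is allowed to see

# Each training example is the last WINDOW currents; the target is the present field.
X = np.array([currents[i - WINDOW:i] for i in range(WINDOW, len(currents))])
y = fields[WINDOW:]

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
model.fit(X, y)

# Given the most recent current history, predict the field the magnet will produce.
recent_history = currents[-WINDOW:].reshape(1, -1)
print("predicted field:", model.predict(recent_history)[0])
```

A model along these lines could then be used, for example, in a feed-forward correction to choose the current that yields the desired field, the kind of task manual tuning performs today.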
While the experiments across the CERN accelerator complex already use automation, AI and machine learning to assist with data-taking, much of the beam and accelerator control has, until now, been done manually.
“Most of the lower-energy machines, like the PS, were built in an era when automation as we know it today was simply not possible,” Kain continues.

Another area where automation can revolutionize efficiency is scheduling.
“The different beams in the accelerator complex are produced one after the other and this has to be orchestrated so that the beam can be extracted from one machine and injected into the next at the right moment,” she says. “Sometimes we have to change the schedule between 20 and 40 times a day, and it can take around 5 minutes each time. That task, currently done manually, accounts for much of the work of people in the control center.”
By automating this process, control center operators will be able to spend more time working on the beams than on scheduling.
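To give a rough sense of the bookkeeping an automated scheduler would take over, the sketch below lays a few requested cycles end to end into a supercycle and prints when each machine runs, so that extraction from one machine can be lined up with injection into the next. The machine names are real, but the cycle lengths, the 1.2-second basic period used here and the simple end-to-end rule are simplifications invented for illustration, not a description of CERN’s actual scheduling system.

```python
# Hypothetical sketch of supercycle bookkeeping: line up cycles so that the
# moment a beam is extracted from one machine matches the moment the next
# machine expects it. Cycle lengths and the scheduling rule are invented.
from dataclasses import dataclass

BASIC_PERIOD = 1.2  # seconds per basic period (assumed cycle granularity)

@dataclass
class Cycle:
    machine: str   # which accelerator runs this cycle
    beam: str      # which beam it produces
    periods: int   # cycle length in basic periods

    @property
    def length(self) -> float:
        return self.periods * BASIC_PERIOD

def build_supercycle(requests: list[Cycle]) -> list[tuple[float, float, Cycle]]:
    """Lay the requested cycles end to end and return (start, end, cycle) triples."""
    timeline, t = [], 0.0
    for cycle in requests:
        timeline.append((t, t + cycle.length, cycle))
        t += cycle.length
    return timeline

# A made-up request list: an injector chain feeding the SPS with an LHC-type beam.
requests = [
    Cycle("PSB", "LHC-type", 1),
    Cycle("PS", "LHC-type", 2),
    Cycle("SPS", "LHC filling", 8),
]

for start, end, cycle in build_supercycle(requests):
    print(f"{cycle.machine:>4} runs {cycle.beam:<12} from {start:5.1f} s to {end:5.1f} s")
```

In this toy picture, rescheduling would simply mean regenerating such a timeline; the real system has to respect many more constraints, which is why doing it by hand takes minutes each time.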
Other areas of focus for the EPA are automated LHC filling, autopilots, automatic fault recovery and prevention, automatic testing and sequencing, and automatic parameter control and optimization. The team hopes to continue this research over the next five years, using LHC Run 3 and Long Shutdown 3 to conduct tests.
“Thanks to the EPA project, for the first time we will be using AI and automation for the accelerators on a large scale,” continues Huschauer. “If we can produce beams with better quality, we will be able to run the complex for less time, creating better physics data and reducing overall energy consumption.”