Software
Control Frameworks for Accelerator & Experiment Control
Paper Title Page
MO3BCO03 Control System Development at the South African Isotope Facility 160
 
  • J.K. Abraham, H. Anderson
    iThemba LABS, Somerset West, South Africa
  • W. Duckitt
    Stellenbosch University, Matieland, South Africa
 
  The South African Isotope Facility (SAIF) at iThemba LABS is well into its commissioning phase. SAIF is intended to free up the existing Separated Sector Cyclotron for more physics research and to increase radioisotope production and research capacity. An EPICS-based control system, primarily utilising EtherCAT hardware, has been developed that spans the control of beamline equipment, target handling and bombardment stations, vault clearance and ARMS systems. Various building and peripheral services, such as cooling water and gases, HVAC and UPS, have also been integrated into the control system via Modbus and OPCUA to allow seamless control and monitoring. An overview of the SAIF facility and the EPICS-based control system is presented, together with the control strategies, hardware and the various EPICS and web-based software and tools utilised.
slides icon Slides MO3BCO03 [3.511 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO03  
About • Received ※ 06 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO3BCO05 Online Models for X-ray Beamlines Using Sirepo-Bluesky 165
 
  • J.A. Einstein-Curtis, D.T. Abell, M.V. Keilman, P. Moeller, B. Nash, I.V. Pogorelov
    RadiaSoft LLC, Boulder, Colorado, USA
  • Y. Du, A. Giles, J. Lynch, T. Morris, M. Rakitin, A.L. Walter
    BNL, Upton, New York, USA
 
  Funding: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Science, under Award Number DE-SC0020593.
Synchrotron radiation beamlines transport X-rays from the electron beam source to the experimental sample. Precise alignment of the beamline optics is required to achieve adequate beam properties at the sample. This process is often done manually and can be quite time-consuming. Further, we would like to know the properties at the sample in order to provide metadata for X-ray experiments. Diagnostics may provide some of this information, but important properties may remain unmeasured. To solve both of these problems, we are developing tools to create fast online models (also known as digital twins). For this purpose, we are creating reduced models that fit into a hierarchy of X-ray models of varying degrees of complexity and runtime. These are implemented within a software framework called Sirepo-Bluesky* that allows the computation of the model from within a Bluesky session which may control a real beamline. This work is done in collaboration with NSLS-II. We present the status of the software development and beamline measurements, including results from the TES beamline. Finally, we present an outlook for continuing this work and applying it to more beamlines at NSLS-II and other synchrotron facilities around the world.
*https://github.com/NSLS-II/sirepo-bluesky
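As an illustration of the "hierarchy of X-ray models of varying degrees of complexity and runtime" described above, the fastest tier can be as simple as an analytic drift propagation of the RMS beam size. The sketch below is a hypothetical stand-in for such a reduced model, not the Sirepo-Bluesky API; the source size and divergence values are invented.

```python
import math

def sigma_at(z, sigma0, divergence):
    """Free-space (drift) propagation of an RMS beam size:
    sigma(z) = sqrt(sigma0^2 + (z * divergence)^2).
    A stand-in for the fastest tier in a hierarchy of beamline models."""
    return math.sqrt(sigma0 ** 2 + (z * divergence) ** 2)

def sample_spot_sizes(distances, sigma0=3e-6, divergence=10e-6):
    """Evaluate the reduced model at each source-to-sample distance (metres)."""
    return [sigma_at(z, sigma0, divergence) for z in distances]

# e.g. spot size 10 m downstream of a 3 um source with 10 urad divergence
spot = sigma_at(10.0, 3e-6, 10e-6)
```

In a full online model, a Bluesky plan would call a fast tier like this between scan points and fall back to slower, higher-fidelity simulations only when needed.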
 
slides icon Slides MO3BCO05 [3.747 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO05  
About • Received ※ 13 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 09 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO3BCO06 Web Technology Enabling Fast and Easy Implementation of Large Experimental Facility Control System 171
 
  • W. Zheng, H.B. Ma, L.Y. Wang, X.H. Xie, W.J. Ye, M. Zhang, P.L. Zhang
    HUST, Hubei, People’s Republic of China
 
  Funding: This work is supported by the National Magnetic Confinement Fusion Science Program (No. 2017YFE0301803) and by the National Natural Science Foundation of China (No.51821005).
Large experimental facilities are essential for pushing the frontier of fundamental research, and the control system is key to their smooth operation. Recently, many new types of facilities have emerged; in the fusion community especially, new machines with completely different designs are being built. These are not as mature as accelerators and need flexible control systems to accommodate frequent changes in hardware and experiment workflow. The ability to quickly integrate new devices and sub-systems into the control system, as well as to easily adopt new operation modes, is an important requirement for the control system. Here we present a control system framework built with standard web technology. The key is using an HTTP RESTful web API as the fundamental protocol for maximum interoperability. This enables the framework to be integrated into the already well-developed ecosystem of web technology, and many existing tools can be integrated with little or no development: for instance, InfluxDB can be used as the archiver, Node-RED as the scripter and Docker for quick deployment. It has also made the integration of in-house-developed embedded devices much easier. In this paper we present the capabilities of this control system framework, as well as a control system for a field-reversed configuration fusion experiment facility implemented with it.
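To illustrate the idea of a plain HTTP RESTful API as the fundamental control protocol, the sketch below (not the authors' framework; the device state and attribute names are invented) exposes an in-memory "device" over HTTP with only the standard library, and reads an attribute back with a generic client call:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# In-memory "device" state; a real framework would proxy hardware here.
STATE = {"temperature": 21.5, "status": "idle"}

class DeviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        key = self.path.strip("/")
        if key in STATE:
            body = json.dumps({key: STATE[key]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep demo output quiet
        pass

def read_attribute(port, name):
    """Generic REST client: any HTTP-capable tool could do the same."""
    with urlopen(f"http://127.0.0.1:{port}/{name}") as resp:
        return json.loads(resp.read())[name]

server = HTTPServer(("127.0.0.1", 0), DeviceHandler)  # ephemeral port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
value = read_attribute(port, "temperature")
server.shutdown()
```

Because the interface is just HTTP and JSON, off-the-shelf tools (a browser, curl, Node-RED, InfluxDB scrapers) can talk to such a device with no custom protocol stack.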
 
slides icon Slides MO3BCO06 [5.831 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO06  
About • Received ※ 04 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO3BCO07 Fast Beam Delivery for Flash Irradiations at the HZB Cyclotron 178
 
  • J. Bundesmann, A. Denker, G. Kourkafas
    HZB, Berlin, Germany
  • J. Heufelder, A. Weber
    Charite, Berlin, Germany
  • P. Mühldorfer
    BHT, Berlin, Germany
 
  In the context of radiotherapy, Flash irradiation means the delivery of high dose rates of more than 40 Gy/s in a short time of less than one second. The radio-oncologists' expectation is fewer side effects while maintaining tumour control when using Flash. Clinically acceptable deviations of the applied dose from the prescribed dose are less than 3%. Our accelerator control system is well suited for the standard treatment of ocular melanomas, with irradiation times of 30 s to 60 s. However, it is too slow for the short times required in Flash. Thus, a dedicated beam delivery control system has been developed, permitting irradiation times down to 7 ms with a maximal dose variation of less than 3%.
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO07  
About • Received ※ 24 August 2023 — Revised ※ 07 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO01 Extending the Coverage of Automated Testing in ITER’s Control System Software Distribution 338
 
  • R. Lange, H. Kim, A. Žagar
    ITER Organization, St. Paul lez Durance, France
  • V. Costa, J. Nieto, M. Ruiz
    UPM-I2A2, Madrid, Spain
 
  Funding: Partially funded by PID2019-108377RB-C33/MCIN/AEI (Agencia Estatal de Investigación) /10.13039/501100011033 and PID2022-137680OB-C33/MCIN/AEI /10.13039/501100011033 / FEDER/ and the European Union.
As part of the effort to standardize the control system environment of ITER's in-kind delivered >170 plant systems, the Controls Division publishes CODAC Core System (CCS), a complete Linux-based control system software distribution. In the past, a large part of the integrated and end-to-end software testing for CCS was executed manually, using many long and complex test plan documents. As the project's progress introduces increasing scope and higher quality requirements, this approach is not maintainable in the long term. ITER CODAC and its partners have started a multi-year effort to convert manual tests to automated tests inside the so-called Framework for Integration Testing (FIT), which is itself being developed and gradually extended as part of this effort. This software framework is complemented by a dedicated hardware test stand setup, comprising specimens of the different controllers and I/O hardware supported by CCS. FIT and the test stand will allow running fully scripted hardware-in-the-loop (HIL) tests and enable functional verification of specific software modules as well as different end-to-end use cases.
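A minimal sketch of what converting one manual test-plan step into a scripted check might look like; the simulated-controller class and channel name here are invented for illustration and are not FIT's actual API:

```python
class SimulatedController:
    """Stand-in for a test-stand controller specimen (hypothetical)."""
    def __init__(self):
        self._setpoints = {}

    def write(self, channel, value):
        self._setpoints[channel] = value

    def read(self, channel):
        # A real HIL test would read the hardware readback here.
        return self._setpoints.get(channel, 0.0)

def scripted_test(controller, channel, setpoint, tolerance):
    """One automated step: write a setpoint, read back, compare within
    tolerance -- replacing a manually executed test-plan paragraph."""
    controller.write(channel, setpoint)
    readback = controller.read(channel)
    return abs(readback - setpoint) <= tolerance

ctrl = SimulatedController()
ok = scripted_test(ctrl, "PS1:CURRENT", 12.5, 0.01)
```

The same scripted step can then run unattended against either the simulation or the real test stand, which is the essence of moving from manual test documents to automated HIL testing.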
 
slides icon Slides TUMBCMO01 [1.306 MB]  
poster icon Poster TUMBCMO01 [10.356 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO01  
About • Received ※ 04 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 28 November 2023 — Issued ※ 09 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO02 EPICS Java Developments 342
 
  • KS. Saintin, P. Lotrus
    CEA-IRFU, Gif-sur-Yvette, France
  • L. Caouën
    CEA-DRF-IRFU, France
 
  The IRFU*/DIS software control team is involved from feasibility studies to the deployment of equipment, covering low level (hardware, PLC) to high level (GUI supervision). For our experiments, we use two main frameworks: MUSCADE, an in-house, full-Java embedded SCADA solution dedicated to small and compact experiments controlled by PLCs (Programmable Logic Controllers), compatible only with the Windows Operating System (OS) on the server side; and EPICS**, a distributed control system used to operate devices such as particle accelerators, large facilities and major telescopes, mostly deployed in Linux OS environments. The EPICS framework provides bindings and server interfaces for several languages, such as C/C++, Python and Java. However, most of the servers, also called IOCs***, developed in the community are based on C/C++ and Linux. EPICS also provides extensions developed in Java, such as the EPICS Archiver Appliance, Phoebus Control-Studio**** (GUI) and Display Web Runtime (web client). All these tools depend on CAJ (a pure-Java implementation of the Channel Access library). Today, MUSCADE users are used to working under Windows and need intuitive tools that provide the same features as MUSCADE. Research and development activities therefore focus mainly on adapting EPICS solutions, aiming to explore the CAJ library further, especially its server-side aspects. To achieve this goal, several developments have been carried out since 2018.
* IRFU https://irfu.cea.fr/en
** EPICS https://epics-controls.org/
*** IOC Input Output Controller
**** Phoebus Control-Studio https://control-system-studio.readthedocs.io/
 
slides icon Slides TUMBCMO02 [1.381 MB]  
poster icon Poster TUMBCMO02 [2.202 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO02  
About • Received ※ 30 September 2023 — Revised ※ 08 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 30 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO04 Real-Time Visualization and Peak Fitting of Time-of-Flight Neutron Diffraction at VULCAN 346
 
  • B.A. Sobhani, Y. Chen
    ORNL, Oak Ridge, Tennessee, USA
 
  In neutron scattering experiments at the VULCAN beamline at SNS, Gaussian fitting of dspace peaks can be used to summarize certain material properties of a sample. If this can be done in real time, it can also assist scientists in mid-experiment decision-making. This paper describes a system developed in EPICS for visualizing dspace evolution and fitting dspace peaks in real time at the VULCAN beamline.
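One fast way to summarize a single peak in real time, shown here purely as an illustration, is a method-of-moments estimate of the Gaussian parameters (amplitude, centre, width); a proper least-squares Gaussian fit, as described above, would refine such an estimate:

```python
import math

def gaussian_moments(x, y):
    """Summarize a single peak by weighted moments: amplitude, centre, width.
    A quick real-time estimate; a production system would refine this with
    a least-squares Gaussian fit."""
    total = sum(y)
    centre = sum(xi * yi for xi, yi in zip(x, y)) / total
    var = sum(yi * (xi - centre) ** 2 for xi, yi in zip(x, y)) / total
    sigma = math.sqrt(var)
    amplitude = max(y)
    return amplitude, centre, sigma

# synthetic dspace histogram: a unit Gaussian peak at 2.0 A, sigma = 0.02 A
xs = [1.90 + 0.005 * i for i in range(41)]  # 1.90 .. 2.10
ys = [math.exp(-((xi - 2.0) ** 2) / (2 * 0.02 ** 2)) for xi in xs]
amp, mu, sig = gaussian_moments(xs, ys)
```

Because the moments are a single pass over the histogram, such a summary can keep up with live data and feed mid-experiment decisions while a slower fit runs in the background.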
slides icon Slides TUMBCMO04 [0.433 MB]  
poster icon Poster TUMBCMO04 [0.338 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO04  
About • Received ※ 05 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 28 November 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP023 Control System for X-ray Imaging Experiments at CFEL
 
  • D. Egorov, S. Bajt, H. Chapman, I. De Gennaro Aquino, H. Fleckenstein, P. Middendorf
    CFEL, Hamburg, Germany
 
  The Coherent Imaging Division of the Center for Free-Electron Laser Science (CFEL) at DESY develops innovative methods for imaging using X-ray Free-Electron Laser (XFEL) and synchrotron sources, with an emphasis on bioparticles and macromolecules. The determination of the structure of such objects is particularly sensitive to radiation damage, which can be overcome by using ultrafast X-ray pulses that outrun this damage. The use of X-ray imaging techniques in scientific research has significantly increased in recent years, resulting in a growing demand for advanced control systems that can enhance the accuracy, efficiency and reliability of the experiments. The development and implementation of such systems allow researchers to automate and customize the various components involved in X-ray imaging experiments, including detectors and motor stages. The current implementation of the control system is based on the Kamzik3 framework, which was developed especially for these experiments. There is ongoing work to migrate the existing system to the Tango Controls framework, utilizing macros executed by Sardana. This will simplify the integration of the experimental setup into beamlines at different synchrotron sources and allow the use of community-developed tools.
poster icon Poster TUPDP023 [2.416 MB]  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP033 Applying Model Predictive Control to Regulate Thermal Stability of a Hard X-ray Monochromator Using the Karabo SCADA Framework 579
 
  • M.A. Smith, G. Giovanetti, S. Hauf, I. Karpics, A. Parenti, A. Samadli, L. Samoylova, A. Silenzi, F. Sohn, P. Zalden
    EuXFEL, Schenefeld, Germany
 
  Model Predictive Control (MPC) is an advanced method of process control whereby a model is developed for a real-life system and an optimal control solution is calculated and applied to control the system. At each time step, the MPC controller uses the system model and system state to minimize a cost function for optimal control. The Karabo SCADA Framework is a distributed control system developed specifically for the European XFEL facility, consisting of tens of thousands of hardware and software devices and over two million attributes to track system state. This contribution describes the application of the Python MPC Toolbox within the Karabo SCADA Framework to solve a monochromator temperature control problem. Additionally, the experience gained in this solution has led to a generic method to apply MPC to any group of Karabo SCADA devices.
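The receding-horizon loop described above can be sketched with a toy first-order thermal model and a brute-force search over candidate heater powers. All model constants, limits and the setpoint below are invented for illustration; a real controller (such as the Python MPC Toolbox mentioned above) would use a proper optimizer and an identified plant model:

```python
def predict(T, u, steps, dt=1.0, k=0.1, T_amb=20.0, gain=0.5):
    """Toy first-order thermal model: dT/dt = -k (T - T_amb) + gain * u."""
    traj = []
    for _ in range(steps):
        T = T + dt * (-k * (T - T_amb) + gain * u)
        traj.append(T)
    return traj

def mpc_step(T, setpoint, horizon=10, r=0.01):
    """One MPC step: search candidate constant inputs over the horizon,
    minimize tracking error plus a small control-effort penalty, and
    return only the first move (receding horizon)."""
    best_u, best_cost = 0.0, float("inf")
    for i in range(0, 41):               # candidate inputs 0.0 .. 10.0
        u = 0.25 * i
        cost = sum((Tk - setpoint) ** 2 for Tk in predict(T, u, horizon))
        cost += r * horizon * u ** 2     # penalize heater effort
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# closed-loop simulation: regulate from ambient to a 35 degC setpoint
T, setpoint = 20.0, 35.0
for _ in range(60):
    u = mpc_step(T, setpoint)
    T = predict(T, u, 1)[-1]
```

At each time step the optimization is re-solved from the latest measured state, which is what gives MPC its robustness to model mismatch and disturbances.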
poster icon Poster TUPDP033 [0.337 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP033  
About • Received ※ 05 October 2023 — Revised ※ 18 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 11 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP045 Monitoring the SKA Infrastructure for CICD 622
 
  • M. Di Carlo, M. Dolci
    INAF - OAAB, Teramo, Italy
  • P. Harding, U.Y. Yilmaz
    SKAO, Macclesfield, United Kingdom
  • J.B. Morgado
    Universidade do Porto, Faculdade de Ciências, Porto, Portugal
  • P. Osorio
    Atlar Innovation, Pampilhosa da Serra, Portugal
 
  Funding: INAF
The Square Kilometre Array (SKA) is an international effort to build two radio interferometers, in South Africa and Australia, forming one Observatory monitored and controlled from the global headquarters (GHQ), based at Jodrell Bank in the United Kingdom. The selected solution for monitoring the SKA CICD (continuous integration and continuous deployment) infrastructure is Prometheus, with the help of Thanos. Thanos provides high availability, resilience and long-term storage retention for monitoring data. For data visualisation, Grafana emerged as an important tool for displaying data, enabling reasoning about and debugging of particular aspects of the infrastructure. In this paper, the monitoring platform is presented, considering quality aspects such as performance, scalability and data preservation.
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP045  
About • Received ※ 27 September 2023 — Revised ※ 18 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 19 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP062 Development of EPICS for RF Fundamental Power Coupler Experimental Test Bench
 
  • H. Do, Y.H. Kim, M.J. Park
    IBS, Daejeon, Republic of Korea
 
  Funding: This work was supported by the Rare Isotope Science Project of Institute for Basic Science funded by Ministry of Science and ICT and NRF of Korea (2013M7A1A1075764)
An RF fundamental power coupler operating at 7 kW, 325 MHz applies RF power to the superconducting cavity of the high-energy linac of RAON. A prototype coupler was developed and a test bench was built to test the coupler alone. An Experimental Physics and Industrial Control System (EPICS) based control system was developed to control, monitor and protect the test bench. In addition, a Graphical User Interface (GUI) was built and the sequence of operation was designed. The coupler's input and reflected RF power, signals from electron pickup probes, vacuum levels and temperature data are monitored in real time and stored in the DAQ. An interlock protects the coupler from abnormal conditions such as multipacting (MP) occurring inside the coupler: when MP occurs during the experiment, the RF power must be cut off immediately. Signals such as the vacuum level, the electron pickup probe signals, temperature values and temperature rise rates per second make up the interlock. The GUI design and results are presented in detail.
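The interlock logic described (trip on poor vacuum, electron pickup activity suggesting multipacting, over-temperature, or too fast a temperature rise) can be sketched as a pure function. All limit values below are illustrative placeholders, not the RAON settings:

```python
def interlock_tripped(vacuum_mbar, pickup_current, temperature_c,
                      temp_rise_per_s,
                      vac_limit=1e-6, pickup_limit=1.0e-3,
                      temp_limit=60.0, rise_limit=0.5):
    """Return True if any interlock condition is violated, meaning the
    RF power must be cut off immediately. Limits are invented examples."""
    return (vacuum_mbar > vac_limit          # vacuum degraded
            or pickup_current > pickup_limit # multipacting indicator
            or temperature_c > temp_limit    # over-temperature
            or temp_rise_per_s > rise_limit) # heating too fast

ok = interlock_tripped(5e-8, 1e-5, 35.0, 0.1)    # healthy: no trip
trip = interlock_tripped(5e-8, 5e-3, 35.0, 0.1)  # pickup spike: trip
```

In an EPICS implementation, each argument would be a monitored PV and the trip output would drive the RF drive-inhibit record.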
 
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP072 Overview of Observation Preparation and Scheduling on the MeerKAT Radio Telescope 669
 
  • L.P. Williams, R.L. Schwartz
    SARAO, Cape Town, South Africa
 
  Funding: National Research Foundation (South Africa)
The MeerKAT radio telescope performs a wide variety of scientific observations. Observation durations range from a few minutes to many hours and may form part of observing campaigns that span many weeks. Static observation requirements, such as resources or array configuration, may be determined and verified months in advance. Other requirements, however, such as atmospheric conditions, can only be verified hours before the planned observation event. This wide variety of configuration, scheduling and control parameters is managed with features provided by the MeerKAT software. The short-term scheduling functionality has expanded from simple queues to support for automatic scheduling (queuing). To support long-term schedule planning, the MeerKAT telescope includes an Observation Planning Tool, which provides configuration checking as well as dry-run environments that can interact with the production system. Observations are atomized to support simpler specification, facilitating machine learning projects and more flexible scheduling around engineering and maintenance events. This paper provides an overview of observation specification, configuration and scheduling on the MeerKAT telescope. The support for integration with engineering subsystems is also described; these include User Supplied Equipment, hardware and computing resources integrated to expand the MeerKAT telescope's capabilities.
 
poster icon Poster TUPDP072 [1.546 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP072  
About • Received ※ 05 October 2023 — Revised ※ 09 November 2023 — Accepted ※ 20 December 2023 — Issued ※ 21 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP075 OPC UA EPICS Bridge 681
 
  • W. Duckitt
    Stellenbosch University, Matieland, South Africa
  • J.K. Abraham
    iThemba LABS, Somerset West, South Africa
 
  OPC UA is a service-orientated communication architecture that supports platform-independent data exchange between embedded microcontrollers, PLCs or PCs and cloud-based infrastructure. This makes OPC UA ideal for developing manufacturer-independent communication with vendor-specific PLCs, for example. With this in mind, we present an OPC UA to EPICS bridge that has been containerized with Docker to provide a micro-service for communicating between EPICS and OPC UA variables.
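The core of such a bridge is a mapping from OPC UA node ids to EPICS PV names plus a transfer loop. The sketch below keeps the read/write endpoints injectable so the logic runs without live servers; in production the two callables would wrap, for example, `opcua.Client` node reads and `epics.caput` writes. The node ids and PV names here are invented:

```python
# Hypothetical node-to-PV mapping; real node ids and PV names are site-specific.
MAPPING = {
    "ns=2;s=Plant.Cooling.FlowRate": "LABS:COOL:FLOW",
    "ns=2;s=Plant.Cooling.Temperature": "LABS:COOL:TEMP",
}

def bridge_once(read_node, write_pv, mapping):
    """Copy each OPC UA node value to its EPICS PV; returns what was written.
    read_node / write_pv are injected so the logic is testable offline."""
    written = {}
    for node_id, pv in mapping.items():
        value = read_node(node_id)
        write_pv(pv, value)
        written[pv] = value
    return written

# stand-in endpoints for demonstration (dicts instead of live servers)
plant = {"ns=2;s=Plant.Cooling.FlowRate": 12.0,
         "ns=2;s=Plant.Cooling.Temperature": 18.5}
ioc = {}
result = bridge_once(plant.get, ioc.__setitem__, MAPPING)
```

A containerized micro-service would run this transfer loop periodically (or on OPC UA subscriptions) in both directions, with the mapping supplied as configuration.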
poster icon Poster TUPDP075 [0.681 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP075  
About • Received ※ 03 October 2023 — Revised ※ 20 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP109 Tickit: An Event-Based Multi-Device Simulation Framework 823
 
  • A. Emery, T.M. Cobb, C.A. Forrester, G. O’Donnell
    DLS, Oxfordshire, United Kingdom
 
  Tickit is an event-based multi-device simulation framework providing configuration and orchestration of complex simulations. It was developed at Diamond Light Source to overcome limitations presented by some of our existing hardware simulations. With the Tickit framework, simulations can be addressed with a compositional approach: devices are simulated individually while still maintaining the interconnected behaviour exhibited by their hardware counterparts. This is achieved by modelling the interactions between devices, such as electronic signals. Devices can be collated into larger simulated systems, providing a layer of simulated hardware against which to test the full stack of Data Acquisition and Controls tools. We aim to use this framework to extend the scope and improve the interoperability of our simulations, enabling us to further improve the testing of current systems and providing a preferred platform to assist in the development of new Acquisition and Controls tools.
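The event-based core of such a framework can be sketched as a priority queue of timed signals routed between device handlers through a wiring table. This is a toy illustration of the compositional approach, not Tickit's actual API; the device names and signals are invented:

```python
import heapq

def run_simulation(events, wiring, handlers, until=100):
    """Minimal event-driven core: pop the earliest (time, device, signal)
    event, let the device's handler produce (output, delay) pairs, and
    route them to downstream devices via the wiring table."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, device, signal = heapq.heappop(queue)
        if time > until:
            break
        log.append((time, device, signal))
        for out_signal, delay in handlers[device](signal):
            for target in wiring.get(device, []):
                heapq.heappush(queue, (time + delay, target, out_signal))
    return log

# two toy devices: a clock that emits 'pulse' and a counter that consumes it
handlers = {
    "clock": lambda s: [("pulse", 0)],
    "counter": lambda s: [],          # terminal device, no outputs
}
wiring = {"clock": ["counter"]}
# three clock ticks at t = 0, 10, 20
log = run_simulation([(t, "clock", "tick") for t in (0, 10, 20)],
                     wiring, handlers)
```

Composing a larger simulated system then amounts to adding entries to the handler and wiring tables, which mirrors how individually simulated devices keep their interconnected behaviour.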
poster icon Poster TUPDP109 [0.703 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP109  
About • Received ※ 29 September 2023 — Revised ※ 21 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP110 Control System Design of the CHIMERA Fusion Test Facility 827
 
  • P.T. Smith, A. Greer, B.A. Roberts, P.B. Taylor
    OSL, St Ives, Cambridgeshire, United Kingdom
  • D.J.N. McCubbin, M. Roberts
    JCE, Warrington, United Kingdom
 
  Funding: Observatory Sciences Ltd
CHIMERA is an experimental nuclear fusion test facility which aims to simulate the intense magnetic fields and temperature gradients found within a tokamak fusion reactor. The control system at CHIMERA is based on EPICS and will have approximately 30 input/output controllers (IOCs) when it comes online in 2024. It will make heavy use of CSS Phoebus for its user interface, sequencer and alarm system. CHIMERA will use the EPICS Archiver Appliance for data archiving and EPICS areaDetector to acquire high-speed data, which is stored in the HDF5 format. The control philosophy at CHIMERA emphasises PLC-based control logic, using mostly Siemens S7-1500 PLCs and OPCUA to communicate with EPICS. EPICS AUTOSAVE is used both for manually setting lists of process variables (PVs) and for automatic restoration of PVs if an IOC must be restarted.
 
poster icon Poster TUPDP110 [1.711 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP110  
About • Received ※ 03 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 12 October 2023 — Issued ※ 17 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP122 Fast Wire Scanner Motion Control Software Upgrade For LCLS-II 869
 
  • Z. Huang, N. Balakrishnan, J.D. Bong, M.L. Campell, T.C. Thayer
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by U.S. Department of Energy under contract number DE- AC02-76SF00515
LCLS-II is the first XFEL based on continuous-wave superconducting accelerator technology (CW-SCRF), with X-ray pulses at repetition rates of up to 1 MHz. LCLS-II's wire scanner motion control is based on the Aerotech Ensemble controller. The position feedback and the beam loss monitor readings during a wire scan are combined to measure the beam profile. To meet the measurement requirements under both low and high beam repetition rates, we redesigned the software for the EPICS IOC and the Aerotech controller, and developed a new User Interface (UI) based on PyDM. This paper describes the software development details and the commissioning results under LCLS-II's production environment.
 
poster icon Poster TUPDP122 [1.248 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP122  
About • Received ※ 05 October 2023 — Revised ※ 20 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP123 SLAC ATCA Scope - Upgrading the EPICS Support Package 873
 
  • D. Alnajjar, M.P. Donadio, K.H. Kim, R. Ruckman
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by US DOE contract DE-AC02-76SF00515
The SLAC ATCA Scope, a 4-channel dual scope, has an EPICS support package that runs on top of SLAC's Common Platform software and firmware and communicates with several high-performance systems in LCLS running on the 7-slot Advanced Telecommunications Computing Architecture (ATCA) crate. The software was completely refactored to improve usability for IOC engineers. Once linked with an EPICS IOC, it initializes the scope hardware and instantiates the upper software stack, providing a set of PVs to control the API and hardware and to operate the oscilloscope. The exported PVs provide seamless means to configure triggers and obtain data acquisitions, similar to a real oscilloscope. The ATCA scope probes are configured dynamically by the user to probe up to four inputs of the ATCA ADC daughter cards. The EPICS support package automatically manages the available ATCA carrier board DRAM resources based on the number of samples requested by the user, allowing acquisitions of up to 8 GBytes per trigger. The user can also specify a desired sampling rate: the ATCA Scope will estimate the nearest possible sampling rate from the current sampling frequency and perform downsampling to try to match that rate. Adding the EPICS module to an IOC is simple and straightforward. The ATCA Scope support package works for all high-performance systems that have the scope common hardware implemented in their FPGAs. Generic interfaces developed in PyDM are also provided to control the oscilloscope and enrich the user's overall experience.
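The "nearest possible sampling rate" behaviour can be illustrated with one plausible heuristic, shown here as a sketch only: pick an integer decimation factor of the fixed ADC clock and keep every n-th sample. The real firmware constraints and algorithm may differ:

```python
def plan_downsampling(base_hz, requested_hz):
    """Choose an integer decimation factor n so the achieved rate
    (base_hz / n) approximates the requested rate; a fixed ADC clock
    can only provide integer submultiples of its frequency."""
    n = max(1, round(base_hz / requested_hz))
    return n, base_hz / n

def decimate(samples, factor):
    """Keep every factor-th sample."""
    return samples[::factor]

# e.g. a 1 MS/s ADC with a user request for 333 kS/s
factor, achieved = plan_downsampling(1.0e6, 333e3)
reduced = decimate(list(range(12)), factor)
```

Reporting the achieved rate back to the user (rather than silently approximating) is what makes the estimated-nearest-rate behaviour predictable from the operator's side.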
 
poster icon Poster TUPDP123 [0.984 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP123  
About • Received ※ 03 October 2023 — Accepted ※ 30 November 2023 — Issued ※ 08 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUSDSC01 BLISS: ESRF All-In-One, Python-based Experiment Control System
 
  • M. Guijarro, G. Berruyer, L. Claustre, W. De Nolf, L. Felix, A. Götz, P. Guillou, C. Guilloud, J.M. Meyer, E. Papillon, S. Petitdemange, L. Pithan, V. Valls
    ESRF, Grenoble, France
 
  BLISS is an all-in-one experiment control system designed to address the complex challenges of synchronized data acquisition and management, for synchrotrons and other labs. Written in Python, BLISS provides a comprehensive solution for hardware control (BLISS native, Tango and EPICS control systems are supported), experiment control sequences, data acquisition and data visualization. Its modular design makes it easy to configure and customize for different setups. One of the key features of BLISS is its decoupling of data acquisition from data storage, achieved through the use of Redis as a temporary buffer. Thanks to a companion Python library called "blissdata", clients can access data without perturbing the acquisition, alleviating real-time constraints for display, saving, or online data analysis. On top of blissdata, BLISS ships with Flint, a powerful data visualization tool to display and interact with experimental data in real time, providing an efficient solution for quality control and immediate feedback. BLISS comes with handy web applications, including a configuration tool and a web terminal; users can easily configure the system and interact with it. It is designed to interface with Daiquiri for more advanced web applications. Additionally, BLISS includes a full simulation environment, which can be used to learn about the system and to try it out. In summary, BLISS is a complete solution for laboratory data acquisition and management that provides a user-friendly interface and supports online data analysis and data display.
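The acquisition/storage decoupling via a temporary buffer can be illustrated with a cursor-based reader, where each client catches up at its own pace without blocking the writer. This in-memory class mimics the idea behind the Redis buffer and blissdata described above, not their actual APIs:

```python
class ScanBuffer:
    """In-memory stand-in for the temporary buffer between acquisition and
    clients: the writer appends points as they are acquired, while each
    reader keeps its own cursor, so reading never perturbs acquisition."""
    def __init__(self):
        self._points = []

    def publish(self, point):
        self._points.append(point)

    def read_from(self, cursor):
        """Return (new_points, new_cursor) for a client-side cursor."""
        new = self._points[cursor:]
        return new, len(self._points)

buf = ScanBuffer()
for i in range(5):                        # acquisition loop writes points
    buf.publish({"motor": i * 0.1, "det": i * i})
first, cur = buf.read_from(0)             # a display client catches up
buf.publish({"motor": 0.5, "det": 25})    # acquisition continues meanwhile
more, cur = buf.read_from(cur)            # the client polls again later
```

With a shared server such as Redis in place of this class, any number of display, saving, or analysis clients can hold independent cursors against the same scan stream.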
poster icon Poster TUSDSC01 [2.538 MB]  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUSDSC02 Integrating Online Analysis with Experiments to Improve X-Ray Light Source Operations 921
 
  • N.M. Cook, E.G. Carlin, J.A. Einstein-Curtis, R. Nagler, R. O’Rourke
    RadiaSoft LLC, Boulder, Colorado, USA
  • A.M. Barbour, M. Rakitin, L. Wiegart, H. Wijesinghe
    BNL, Upton, New York, USA
 
  Funding: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research under Award Number DE-SC00215553.
The design, execution, and analysis of light source experiments requires the use of sophisticated simulation, controls and data management tools. Existing workflows require significant specialization to accommodate specific beamline operations and data pre-processing steps necessary for more intensive analysis. Recent efforts to address these needs at the National Synchrotron Light Source II (NSLS-II) have resulted in the creation of the Bluesky data collection framework, an open-source library for coordinating experimental control and data collection. Bluesky provides high level abstraction of experimental procedures and instrument readouts to encapsulate generic workflows. We present a prototype data analysis platform for integrating data collection with real time analysis at the beamline. Our application leverages Bluesky in combination with a flexible run engine to execute user configurable Python-based analyses with customizable queueing and resource management. We discuss initial demonstrations to support X-ray photon correlation spectroscopy experiments and future efforts to expand the platform’s features.
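Bluesky publishes experiments as a stream of (name, document) pairs (start, event, stop), so online analysis can be written as a callback that consumes event documents as they arrive. The sketch below keeps a running mean of one detector field; the field name and documents are invented for illustration:

```python
class RunningMean:
    """A Bluesky-style callback: consume (name, document) pairs and keep a
    running mean of one detector field from 'event' documents, so analysis
    results are available while the scan is still in progress."""
    def __init__(self, field):
        self.field = field
        self.n = 0
        self.total = 0.0

    def __call__(self, name, doc):
        if name == "event" and self.field in doc.get("data", {}):
            self.n += 1
            self.total += doc["data"][self.field]

    @property
    def mean(self):
        return self.total / self.n if self.n else None

# feed the callback a synthetic document stream, as the RunEngine would
cb = RunningMean("xpcs_det")
cb("start", {"uid": "abc"})
for v in (1.0, 2.0, 3.0):
    cb("event", {"data": {"xpcs_det": v}})
cb("stop", {"exit_status": "success"})
```

In a live session the same object would be attached with `RE.subscribe(cb)`; heavier analyses would instead hand documents to a queued worker, as the platform described above does.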
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUSDSC02  
About • Received ※ 06 October 2023 — Revised ※ 22 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO01 Five years of EPICS 7 - Status Update and Roadmap 1087
 
  • R. Lange
    ITER Organization, St. Paul lez Durance, France
  • L.R. Dalesio, M.A. Davidsaver, G.S. McIntyre
    Osprey DCS LLC, Ocean City, USA
  • S.M. Hartman, K.-U. Kasemir
    ORNL, Oak Ridge, Tennessee, USA
  • A.N. Johnson, S. Veseli
    ANL, Lemont, Illinois, USA
  • H. Junkes
    FHI, Berlin, Germany
  • T. Korhonen, S.C.F. Rose
    ESS, Lund, Sweden
  • M.R. Kraimer
    Self Employment, Private address, USA
  • K. Shroff
    BNL, Upton, New York, USA
  • G.R. White
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported in part by the U.S. Department of Energy under contracts DE-AC02-76SF00515 and DE-AC05-00OR22725.
After its first release in 2017, EPICS version 7 has been introduced into production at several sites. The central feature of EPICS 7, the support of structured data through the new pvAccess network protocol, has been proven to work in large production systems. EPICS 7 facilitates the implementation of new functionality, including developing AI/ML applications in controls, managing large data volumes, interfacing to middle-layer services, and more. Other features, such as support for the IPv6 protocol and enhancements to access control, have been implemented. Future work includes integrating a refactored API into the core distribution, adding modern network security features, and developing new services, as well as enhancing existing ones, that take advantage of these new capabilities. The talk gives an overview of the status of deployments, new additions to the EPICS Core, and the planned future development.
 
slides icon Slides TH1BCO01 [0.562 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO01  
About • Received ※ 04 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 19 November 2023 — Issued ※ 24 November 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO02 Development of Laser Accelerator Control System Based on EPICS 1093
 
  • Y. Xia, K.C. Chen, L.W. Feng, Z. Guo, Q.Y. He, F.N. Li, C. Lin, Q. Wang, X.Q. Yan, M.X. Zang, J. Zhao
    PKU, Beijing, People’s Republic of China
  • J. Zhao
    Peking University, Beijing, Haidian District, People’s Republic of China
 
  Funding: State Key Laboratory of Nuclear Physics and Technology, and Key Laboratory of HEDP of the Ministry of Education, CAPT, Peking University, Beijing 100871, China;
China’s Ministry of Science and Technology supports Peking University in constructing a proton radiotherapy device based on a petawatt (PW) laser accelerator. The control system’s functionality and performance are vital for the accelerator’s reliability, stability, and efficiency. The PW laser accelerator control system has a three-layer distributed architecture, comprising device control, front-end (input/output) control, and central control (data management and human-machine interface) layers. The software platform primarily uses EPICS, supplemented by PLC, Python, and Java, while the hardware platform comprises industrial control computers, servers, and private cloud configurations. The control system incorporates various subsystems that manage the laser, target field, beamline, safety interlocks, conditions, synchronization, and functionalities related to data storage, display, and more. This paper presents a control system implementation suitable for laser accelerators, providing valuable insights for future laser accelerator control system development.
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO02  
About • Received ※ 04 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO03 The Tango Controls Collaboration Status in 2023 1100
 
  • T. Juerges
    SKAO, Macclesfield, United Kingdom
  • G. Abeillé
    SOLEIL, Gif-sur-Yvette, France
  • R.J. Auger-Williams
    OSL, St Ives, Cambridgeshire, United Kingdom
  • B. Bertrand, V. Hardion, A.F. Joubert
    MAX IV Laboratory, Lund University, Lund, Sweden
  • R. Bourtembourg, A. Götz, D. Lacoste, N. Leclercq
    ESRF, Grenoble, France
  • T. Braun
    byte physics, Annaburg, Germany
  • G. Cuní, C. Pascual-Izarra, S. Rubio-Manrique
    ALBA-CELLS, Cerdanyola del Vallès, Spain
  • Yu. Matveev
    DESY, Hamburg, Germany
  • M. Nabywaniec, T.R. Noga, Ł. Żytniak
    S2Innovation, Kraków, Poland
  • L. Pivetta
    Elettra-Sincrotrone Trieste S.C.p.A., Basovizza, Italy
 
  Since 2021 the Tango Controls collaboration has improved and optimised its efforts in many areas. Not only have Special Interest Group (SIG) meetings been introduced to speed up the adoption of new technologies and improvements, but the kernel has also switched to a fixed six-month release cycle for quicker adoption of stable kernel versions by the community. CI/CD now provides early feedback on test failures and compatibility issues. Major code refactoring allowed for a much more efficient use of developer resources. Relevant bug fixes, improvements, and new features are now adopted at a much higher rate than ever before. Community participation has also noticeably improved. The kernel switched to C++14, and the logging system is undergoing a major refactoring. Among many new features and tools is jupyTango, Jupyter Notebooks on Tango Controls steroids. PyTango is now easy to install via binary wheels, old Python versions are no longer supported, the build system is switching to CMake, and releases are now made much closer to stable cppTango releases.  
slides icon Slides TH1BCO03 [1.357 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO03  
About • Received ※ 05 October 2023 — Revised ※ 24 October 2023 — Accepted ※ 21 November 2023 — Issued ※ 13 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO04 Asynchronous Execution of Tango Commands in the SKA Telescope Control System: An Alternative to the Tango Async Device 1108
 
  • B.A. Ojur, A.J. Venter
    SARAO, Cape Town, South Africa
  • D. Devereux
    CSIRO, Clayton, Australia
  • D. Devereux, S.N. Twum, S. Vrcic
    SKAO, Macclesfield, United Kingdom
 
  Equipment controlled by the Square Kilometre Array (SKA) Control System will have a TANGO interface for control and monitoring. Commands on TANGO device servers have a 3000-millisecond window to complete their execution and return to the client. This timeout places a limitation on some commands used on SKA TANGO devices which take longer than the 3000-millisecond window to complete; the threshold is even stricter in the SKA Control System (CS) Guidelines. Such a command, identified as a Long Running Command (LRC), needs to be executed asynchronously to circumvent the timeout. TANGO has support for an asynchronous device, which allows commands to run longer than 3000 milliseconds by using a coroutine to put the task on an event loop. During the exploration of this, a decision was made to implement a custom approach in our base repository, on which all devices depend. In this approach, every command annotated as "long running" is handed over to a thread that completes the task, and its progress is tracked through attributes. These attributes report the queued commands along with their progress, status, and results. The client is provided with a unique identifier which can be used to track the execution of the LRC and take further action based on the outcome of that command. LRCs can be aborted safely using a custom TANGO command. We present the reference design and implementation of Long Running Commands for the SKA Control System.  
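The thread-based mechanism described in this abstract can be illustrated with a minimal, framework-free sketch. The class and method names below are hypothetical and stand in for the real SKA base-device implementation; a plain dict replaces the TANGO tracking attributes, and a `threading.Event` plays the role of the custom abort command:

```python
import threading
import uuid

class LongRunningCommands:
    """Sketch of the LRC pattern: each command runs on its own thread,
    the client receives a unique identifier, and progress/status are
    tracked in dicts standing in for the TANGO tracking attributes."""

    def __init__(self):
        self.status = {}          # command id -> QUEUED/RUNNING/COMPLETED/ABORTED
        self.results = {}         # command id -> result value
        self._abort_events = {}   # command id -> threading.Event
        self._threads = {}

    def invoke(self, command, *args):
        command_id = str(uuid.uuid4())  # unique ID returned to the client
        abort = threading.Event()
        self._abort_events[command_id] = abort
        self.status[command_id] = "QUEUED"

        def run():
            self.status[command_id] = "RUNNING"
            result = command(abort, *args)
            if abort.is_set():
                self.status[command_id] = "ABORTED"
            else:
                self.status[command_id] = "COMPLETED"
                self.results[command_id] = result

        thread = threading.Thread(target=run)
        self._threads[command_id] = thread
        thread.start()
        return command_id

    def abort(self, command_id):
        # Mirrors the custom TANGO abort command: cooperative cancellation.
        self._abort_events[command_id].set()

    def wait(self, command_id):
        self._threads[command_id].join()
        return self.status[command_id]

# A toy long-running command that checks the abort flag as it works.
def slow_double(abort, x):
    for _ in range(10):
        if abort.is_set():
            return None
    return 2 * x

lrc = LongRunningCommands()
cid = lrc.invoke(slow_double, 21)
print(lrc.wait(cid), lrc.results.get(cid))  # COMPLETED 42
```

In the real system the status and result dictionaries would be exposed as TANGO attributes so any client can poll them by command ID, and abort handling would need to be safe at every cancellation point; the sketch only shows the hand-off and tracking structure.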
slides icon Slides TH1BCO04 [0.674 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO04  
About • Received ※ 06 October 2023 — Revised ※ 24 October 2023 — Accepted ※ 20 December 2023 — Issued ※ 22 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO05 Diamond Light Source Athena Platform 1115
 
  • J. Shannon, C.A. Forrester, K.A. Ralphs
    DLS, Oxfordshire, United Kingdom
 
  The Athena Platform aims to replace, upgrade and modernise the capabilities of Diamond Light Source’s acquisition and controls tools, providing an environment for better integration with information management and analysis functionality. It is a service-based experiment orchestration system built on top of NSLS-II’s Python-based Bluesky/Ophyd data collection framework, providing a managed and extensible software deployment local to the beamline. By using industry-standard infrastructure provision, security and interface technologies, we hope to provide a platform flexible and adaptable enough to meet the wide spectrum of science use cases and beamline operation models in a reliable and maintainable way. In addition to a system design overview, we describe here some initial test deployments of core capabilities to a number of Diamond beamlines, as well as some of the technologies developed to support the overall delivery of the platform.  
slides icon Slides TH1BCO05 [1.409 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO05  
About • Received ※ 05 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 16 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO06 The Karabo Control System 1120
 
  • S. Hauf, N. Anakkappalla, J.T. Bin Taufik, V. Bondar, R. Costa, W. Ehsan, S.G. Esenov, G. Flucke, A. García-Tabarés Valdivieso, G. Giovanetti, D. Goeries, D.G. Hickin, I. Karpics, A. Klimovskaia, A. Parenti, A. Samadli, H. Santos, A. Silenzi, M.A. Smith, F. Sohn, M. Staffehl, C. Youngman
    EuXFEL, Schenefeld, Germany
 
The Karabo distributed control system has been developed to address the challenging requirements of the European X-ray Free Electron Laser facility*, which include custom-made hardware and high data rates and volumes. Karabo implements a broker-based SCADA environment**. Extensions to the core framework, called devices, provide control of hardware, monitoring, data acquisition and online processing on distributed hardware. Services for data logging and for configuration management exist. The framework exposes Python and C++ APIs, which enable developers to respond quickly to requirements within an efficient development environment. An AI-driven device code generator facilitates prototyping. Karabo’s GUI features an intuitive, coding-free control panel builder, which allows non-software engineers to create synoptic control views. This contribution introduces the Karabo Control System from the perspective of application users and software developers. Emphasis is given to Karabo’s asynchronous Python environment. We share experience of running the European XFEL using a clean-sheet developed control system, and discuss the availability of the system as free and open source software.
* Tschentscher et al., "Photon beam transport and scientific instruments at the European XFEL", Appl. Sci. 7.6 (2017): 592
** Hauf et al., "The Karabo distributed control system", J. Synchrotron Rad. 26.5 (2019): 1448ff
 
slides icon Slides TH1BCO06 [5.878 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO06  
About • Received ※ 06 October 2023 — Accepted ※ 03 December 2023 — Issued ※ 12 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)