Keyword: EPICS
Paper Title Other Keywords Page
MO1BCO03 LCLS-II Accelerator Control System Status controls, MMI, linac, undulator 12
 
  • D. Rogind, S. Kwon
    SLAC, Menlo Park, California, USA
 
  Funding: US DOE
The Linac Coherent Light Source complex at the SLAC National Accelerator Laboratory has been upgraded with a new superconducting accelerator providing beam rates up to 1 MHz. Though the majority of the more than twenty accelerator control systems are based on LCLS designs, to accommodate the increase in repetition rate from 120 Hz to 1 MHz, many of the diagnostics and global control systems were upgraded to high-performance platforms with standalone CPUs running linuxRT to host the EPICS-based controls. With installation and checkout of the control systems completed in 2022, the phased approach to integration and commissioning recently concluded with demonstration of the threshold key performance parameters and first light in the summer of 2023. This paper provides an overview of the LCLS-II accelerator control system architecture, the upgrades, the multi-year installation, checkout, integration, and commissioning, and lessons learned.
 
slides icon Slides MO1BCO03 [2.380 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO1BCO03  
About • Received ※ 02 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 19 December 2023 — Issued ※ 21 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO1BCO04 EIC Controls System Architecture Status and Plans controls, software, interface, operation 19
 
  • J.P. Jamilkowski, S.L. Clark, M.R. Costanzo, T. D’Ottavio, M. Harvey, K. Mernick, S. Nemesure, F. Severino, K. Shroff
    BNL, Upton, New York, USA
  • L.R. Dalesio
    Osprey DCS LLC, Ocean City, USA
  • K. Kulmatycski, C. Montag, V.H. Ranjbar, K.S. Smith
    Brookhaven National Laboratory (BNL), Electron-Ion Collider, Upton, New York, USA
 
  Funding: Contract Number DE-AC02-98CH10886 with the auspices of the US Department of Energy
Preparations are underway to build the Electron Ion Collider (EIC) once Relativistic Heavy Ion Collider (RHIC) beam operations end in 2025, providing an enhanced probe into the building blocks of nuclear physics for decades into the future. With commissioning of the new facility in mind, Accelerator Controls will require modernization in order to keep up with recent improvements in the field as well as to match the fundamental requirements of the accelerators that will be constructed. We describe the status of the Controls System architecture that has been developed and prototyped for EIC, as well as plans for future work. Major influences on the requirements are discussed, including EIC Common Platform applications as well as our expectation that we will need to support a hybrid environment covering both the proprietary RHIC Accelerator Device Object (ADO) environment and EPICS.
 
slides icon Slides MO1BCO04 [1.458 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO1BCO04  
About • Received ※ 05 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 11 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO2AO02 A Beamline and Experiment Control System for the SLS 2.0 controls, interface, experiment, data-acquisition 71
 
  • K. Wakonig, C. Appel, A. Ashton, S. Augustin, M. Holler, I. Usov, J. Wyzula, X. Yao
    PSI, Villigen PSI, Switzerland
 
  The beamlines of the Swiss Light Source (SLS) predominantly rely on EPICS standards as their control interface, but in contrast to many other facilities, there is as yet no standardized user-interfacing component to orchestrate, monitor and provide feedback on data acquisition. As a result, the beamlines have either adapted community solutions or developed their own high-level orchestration systems. For the upgrade project SLS 2.0, a sub-project was initiated to facilitate a unified beamline and experiment control system. During a pilot phase and a first development cycle, libraries of the Bluesky project were used, combined with additional in-house-developed services, and embedded in a service-based approach with a message broker and an in-memory database. Leveraging community solutions paired with industry standards enabled the development of a highly modular system which provides the flexibility needed for a constantly changing scientific environment. One year after development started, the system had already been tested during many weeks of user operation and recently received official approval from the involved divisions to be rolled out as part of the SLS 2.0 upgrade.  
slides icon Slides MO2AO02 [3.119 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO2AO02  
About • Received ※ 05 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 12 October 2023 — Issued ※ 14 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
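The plan-and-engine orchestration pattern that the Bluesky libraries provide (and which the SLS 2.0 system above builds on) can be sketched without the library itself: a plan is a Python generator yielding messages, and an engine consumes them. All names below are illustrative stand-ins, not Bluesky API.

```python
# Toy sketch of the plan-as-generator pattern used by Bluesky:
# a plan yields (command, device) messages; a minimal engine
# executes them and collects readings.

def count_plan(detector, num):
    """Yield 'trigger' and 'read' messages for `num` exposures."""
    for _ in range(num):
        yield ("trigger", detector)
        yield ("read", detector)

class MiniEngine:
    def __init__(self):
        self.readings = []

    def run(self, plan):
        for command, device in plan:
            if command == "trigger":
                device.trigger()
            elif command == "read":
                self.readings.append(device.read())
        return self.readings

class FakeDetector:
    def __init__(self):
        self._value = 0

    def trigger(self):
        self._value += 1   # pretend an exposure happened

    def read(self):
        return {"counts": self._value}
```

In the real system the engine additionally publishes documents to the message broker and in-memory database mentioned in the abstract, which is what makes the service-based decomposition possible.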
 
MO2AO04 Experimental Data Taking and Management: The Upgrade Process at BESSY II and HZB experiment, controls, data-acquisition, MMI 84
 
  • R. Müller, H. Görzig, G. Hartmann, K. Kiefer, R. Ovsyannikov, W. Smith, S. Vadilonga, J. Viefhaus
    HZB, Berlin, Germany
  • D.B. Allan
    BNL, Upton, New York, USA
 
  The endeavor of modernizing science data acquisition at BESSY II started in 2019 [*]. Significant achievements have been made: the Bluesky software ecosystem is now the accepted framework for data acquisition, flow control and automation. It is operational at an increasing number of HZB beamlines, endstations and instruments. Participation in the global Bluesky collaboration is an extremely empowering experience. Promoting FAIR data principles at all levels has developed a unifying momentum, providing guidance on less obvious design considerations. Now a joint demonstrator project of DESY, HZB, HZDR and KIT, named ROCK-IT (Remote Operando Controlled Knowledge-driven, IT-based), aims at portable solutions for fully automated measurements in the catalysis area of materials science and is spearheading common developments. The foundation there is laid by Bluesky data acquisition, AI/ML support and analysis, modular sample environments, robotics and FAIR data handling. This paper puts present HZB controls projects, as well as detailed HZB contributions to this conference [**], into context. It outlines strategies for providing appropriate digital tools at the successor 4th-generation light source BESSY III.
[*] R. Müller et al., https://doi.org/10.18429/JACoW-ICALEPCS2019-MOCPL02
[**] covering digital twins, Bluesky, sample environment, motion control, remote access, meta data
 
slides icon Slides MO2AO04 [2.522 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO2AO04  
About • Received ※ 05 October 2023 — Revised ※ 26 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 16 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO2AO05 Deployment of ADTimePix3 areaDetector Driver at Neutron and X-ray User Facilities detector, neutron, controls, software 90
 
  • K.J. Gofron, J. Wlodek
    BNL, Upton, New York, USA
  • S.C. Chong, F. Fumiaki, SG. Giles, G.S. Guyotte, SDL. Lyons
    ORNL, Oak Ridge, Tennessee, USA
  • B. Vacaliuc
    ORNL RAD, Oak Ridge, Tennessee, USA
 
  Funding: This work was supported by the U.S. Department of Energy, Office of Science, Scientific User Facilities Division under Contract No. DE-AC05-00OR22725.
Timepix3 is a 65k-pixel hybrid pixel readout chip with simultaneous Time-of-Arrival (ToA) and Time-over-Threshold (ToT) recording in each pixel*. The chip operates without a trigger signal, with a sparse readout in which only pixels containing events are read out. The flexible architecture allows 40 Mhits/s/cm² readout throughput, using simultaneous readout and acquisition by sharing readout logic with the transport logic of a superpixel matrix formed from 2x4 structures. The ToA is recorded with 1.5625 ns time resolution. X-ray and charged-particle events are counted directly. Indirect neutron counting, however, uses 6Li fission in a scintillator matrix such as ZnS(Ag). The fission space-charge region is limited to 5-9 um. A photon from the scintillator material excites a photocathode electron, which is further multiplied in a dual-stack MCP, so a neutron event appears at the chip as a cluster of electron events. We report on the EPICS areaDetector** ADTimePix3 driver, which controls Serval*** using JSON commands. The driver configures the chip and directs data both to storage and to a real-time processing pipeline. The time-stamped data are stored in the raw .tpx3 file format and passed through a socket, where clustering software identifies individual neutron events. Conventional 2D images are available for each exposure frame, and a preview is useful for sample alignment. The areaDetector driver allows integration of the time-enhanced capabilities of this detector into SNS beamline controls with unprecedented time resolution.
* T. Poikela et al., 2014 JINST 9 C05013.
**https://github.com/areaDetector
*** Software provided by the vendor (ASI) that interfaces the detector (10GE) with the EPICS data acquisition IOC ADTimePix3
 
slides icon Slides MO2AO05 [3.379 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO2AO05  
About • Received ※ 04 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 28 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
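The clustering step described in the abstract — merging a burst of electron hits at the chip into a single neutron event — can be illustrated with a toy routine that groups hits close in time and position and takes their centroid. The window and radius values, and the routine itself, are illustrative only, not the SNS clustering software.

```python
# Toy illustration of neutron-event clustering: raw Timepix3-style
# hits (x, y, toa_ns), sorted by arrival time, are merged into one
# event at their centroid when they fall within a short time window
# and a small spatial radius. Thresholds are hypothetical.

def _centroid(cluster):
    n = len(cluster)
    cx = sum(h[0] for h in cluster) / n
    cy = sum(h[1] for h in cluster) / n
    return (cx, cy, cluster[0][2])   # position centroid, first-hit time

def cluster_hits(hits, window_ns=100.0, radius_px=3):
    """hits: list of (x, y, toa_ns) sorted by toa_ns.
    Returns one (cx, cy, t0) tuple per identified event."""
    events, current = [], []
    for x, y, t in hits:
        if current and (t - current[-1][2] > window_ns
                        or abs(x - current[0][0]) > radius_px
                        or abs(y - current[0][1]) > radius_px):
            events.append(_centroid(current))
            current = []
        current.append((x, y, t))
    if current:
        events.append(_centroid(current))
    return events
```

A production pipeline would additionally weight the centroid by ToT and handle overlapping clusters, but the grouping idea is the same.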
 
MO3BCO03 Control System Development at the South African Isotope Facility controls, target, PLC, network 160
 
  • J.K. Abraham, H. Anderson
    iThemba LABS, Somerset West, South Africa
  • W. Duckitt
    Stellenbosch University, Matieland, South Africa
 
  The South African Isotope Facility (SAIF) at iThemba LABS is well into its commissioning phase. The intention of SAIF is to free up our existing Separated Sector Cyclotron for more physics research and to increase our radioisotope production and research capacity. An EPICS-based control system, primarily utilising EtherCAT hardware, has been developed that spans the control of beamline equipment, target handling and bombardment stations, vault clearance and ARMS systems. Various building and peripheral services, such as cooling water and gases, HVAC and UPS, have also been integrated into the control system via Modbus and OPC UA to allow seamless control and monitoring. An overview of the SAIF facility and its EPICS-based control system is presented, along with the control strategies, hardware, and the various EPICS- and web-based software and tools utilised.  
slides icon Slides MO3BCO03 [3.511 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO03  
About • Received ※ 06 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 14 November 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO3BCO06 Web Technology Enabling Fast and Easy Implementation of Large Experimental Facility Control System controls, experiment, framework, interface 171
 
  • W. Zheng, H.B. Ma, L.Y. Wang, X.H. Xie, W.J. Ye, M. Zhang, P.L. Zhang
    HUST, Hubei, People’s Republic of China
 
  Funding: This work is supported by the National Magnetic Confinement Fusion Science Program (No. 2017YFE0301803) and by the National Natural Science Foundation of China (No.51821005).
Large experimental facilities are essential for pushing the frontier of fundamental research, and the control system is key to their smooth operation. Recently, many new types of facility have emerged; in the fusion community especially, new machines with completely different designs are being built. These are not as mature as accelerators and need flexible control systems to accommodate frequent changes in hardware and experiment workflow. The ability to quickly integrate new devices and sub-systems into the control system, as well as to easily adopt new operation modes, is an important requirement. Here we present a control system framework built with standard web technology. The key is using an HTTP RESTful web API as the fundamental protocol for maximum interoperability. This enables the framework to be integrated into the well-developed ecosystem of web technology, so that many existing tools can be integrated with little or no development: for instance, InfluxDB can be used as the archiver, Node-RED as the scripter, and Docker for quick deployment. It has also made integration of in-house-developed embedded devices much easier. In this paper we present the capabilities of this control system framework, as well as a control system for a field-reversed-configuration fusion experiment facility implemented with it.
 
slides icon Slides MO3BCO06 [5.831 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO3BCO06  
About • Received ※ 04 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
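The RESTful-device idea at the heart of the framework above can be sketched with the standard library alone: each device parameter gets a path, GET reads it, PUT writes it. The routing is factored into a plain function so it can back any HTTP server; all device and parameter names here are hypothetical, not the HUST framework's actual API.

```python
# Minimal sketch of a REST-style device interface: a parameter is
# addressed as /devices/<name>/<param>, read with GET, written with
# PUT. The dispatch function returns (status, json_body) and could
# be wrapped in http.server or any WSGI/ASGI stack.

import json

DEVICES = {"psu1": {"current": 0.0, "voltage": 12.0}}  # fake device pool

def handle(method, path, body=None):
    """Dispatch 'GET'/'PUT' on /devices/<name>/<param>."""
    parts = path.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "devices":
        return 404, json.dumps({"error": "not found"})
    _, dev, param = parts
    if dev not in DEVICES or param not in DEVICES[dev]:
        return 404, json.dumps({"error": "unknown device/parameter"})
    if method == "GET":
        return 200, json.dumps({"value": DEVICES[dev][param]})
    if method == "PUT":
        DEVICES[dev][param] = json.loads(body)["value"]
        return 200, json.dumps({"value": DEVICES[dev][param]})
    return 405, json.dumps({"error": "method not allowed"})
```

Because everything speaks HTTP+JSON, tools like Node-RED or an InfluxDB scraper can talk to such endpoints with no bespoke driver, which is the interoperability argument the paper makes.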
 
MO4BCO02 Lessons from Using Python GraphQL Libraries to Develop an EPICS PV Server for Web UIs controls, status, ECR, factory 191
 
  • R.J. Auger-Williams
    OSL, St Ives, Cambridgeshire, United Kingdom
  • A.L. Alexander, T.M. Cobb, M.J. Gaughran, A.J. Rose, A.W.R. Wells, A.A. Wilson
    DLS, Oxfordshire, United Kingdom
 
  Diamond Light Source is currently developing a web-based EPICS control system User Interface (UI). This will replace the use of EDM and the Eclipse-based CS-Studio at Diamond, and it will integrate with future Acquisition and Analysis software. For interoperability, it will use the Phoebus BOB file format. The architecture consists of a back-end application using EPICS Python libraries to obtain PV data and the query language GraphQL to serve these data to a React-based front end. A prototype was built in 2021, and we are now developing it further to meet the first use cases. Our current work focuses on the back-end application, Coniql; for the query interface we have selected the Strawberry GraphQL implementation from the many GraphQL libraries available. We discuss the reasons for this decision, highlight the issues that arose with GraphQL, and outline our solutions. We also demonstrate how well these libraries perform within the context of the EPICS web UI requirements, using a set of performance metrics. Finally, we provide a summary of our development plans.  
slides icon Slides MO4BCO02 [4.243 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO4BCO02  
About • Received ※ 29 September 2023 — Accepted ※ 13 October 2023 — Issued ※ 20 October 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO4BCO05 Apples to Oranges: A Comparison of EPICS Build and Deployment Systems site, LLRF, controls, MMI 205
 
  • S.C.F. Rose, D.H.C. Araujo, L.A. Mello Magalhães, A.L. Olsson
    ESS, Lund, Sweden
 
  ESS currently uses two different systems for managing the build and deployment of EPICS modules. Both use modules that are packaged and prepared to be dynamically loaded into soft IOCs, based on the require module developed at PSI. The difference is the deployment: for the accelerator, we use a custom utility to define and build an EPICS environment, which is then distributed on a global shared filesystem to the production and lab networks. For the neutron instrumentation side, in contrast, we use conda to build individual EPICS environments for each IOC, with the underlying packages stored on a shared Artifactory server. In each case, the goal is to provide a repeatable and controllable mechanism to produce a consistent EPICS environment for IOCs in use at ESS. The difference (other than the tools and storage) is in some sense philosophical: should a software environment be defined at build time or at run time? In this presentation we provide an overview of some of the challenges, contrasts, and lessons learned from these two different but related approaches to EPICS module deployment.  
slides icon Slides MO4BCO05 [0.819 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO4BCO05  
About • Received ※ 06 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 24 October 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
MO4AO01 Xilinx Zynq UltraScale+ MPSoC Used as Embedded IOC for a Beam Position Monitor (BPM) System software, FEL, Linux, controls 210
 
  • G.M. Marinkovic, D. Anicic, R. Ditter, B. Keil, J. Purtschert, M. Roggli
    PSI, Villigen PSI, Switzerland
 
  At PSI we are combining the hardware, firmware, operating system, control system, embedded event system, operation and supervision in a Beam Position Monitor (BPM) system for 24/7 accelerator operation, using a Multi-Processing System-on-Chip (MPSoC) of type Xilinx Zynq UltraScale+. We presently use MPSoCs for our latest generic BPM electronics platform, called "DBPM3", in the Athos soft X-ray branch, as well as for new BPMs and general controls hardware and devices for SLS 2.0, a major upgrade of the Swiss Light Source. We are also in the process of upgrading our previous "MBU" (Modular BPM Unit) platform for the SwissFEL linac and hard X-ray "Aramis" branch from external VMEbus-based IOCs to integrated add-on cards with MPSoC IOCs. On all these MPSoCs, we integrate an EPICS IOC, an event receiver, and real-time processing of measurement and feedback data on a single chip. In this contribution, we describe our experience with the tight integration and daily operation of the various firmware and software components and features on the MPSoC, using the BPM system also to discuss general aspects relevant to other systems and components presented in other PSI contributions at this conference.  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-MO4AO01  
About • Received ※ 06 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 23 November 2023 — Issued ※ 11 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TU1BCO01 A Workflow for Training and Deploying Machine Learning Models to EPICS controls, GPU, framework, software 244
 
  • M.F. Leputa, K.R.L. Baker, M. Romanovschi
    STFC/RAL/ISIS, Chilton, Didcot, Oxon, United Kingdom
 
  The transition to EPICS as the control system for the ISIS Neutron and Muon Source accelerators is an opportunity to more easily integrate machine learning into operations. But developing high-quality machine learning (ML) models is insufficient: integration into critical operations requires good development practices to ensure stability and reliability during deployment and to allow robust and easy maintenance. For these reasons we implemented a workflow for training and deploying models that utilizes off-the-shelf, industry-standard tools such as MLflow. We discuss how adoption of these tools can make developers' lives easier during the training phase of a project, and describe how they may be used in an automated deployment pipeline that allows the ML model to interact with our EPICS ecosystem through Python-based IOCs within a containerized environment. This reduces the developer effort required to produce GUIs for interacting with the models in the ISIS Main Control Room, since tools familiar to operators, such as Phoebus, may be used.  
slides icon Slides TU1BCO01 [3.370 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TU1BCO01  
About • Received ※ 05 October 2023 — Accepted ※ 12 October 2023 — Issued ※ 19 October 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TU2BCO04 Accelerator Systems Cyber Security Activities at SLAC controls, network, simulation, operation 292
 
  • G.R. White, A.L. Edelen
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported in part by the U.S. Department of Energy under contract number DE-AC02-76SF00515.
We describe four cyber-security-related activities of SLAC and collaborators. First, a broad review of accelerator computing cyber and mission reliability: our analysis method, findings, and outcomes. Second, lab-wide and accelerator penetration testing, in particular methods to control, coordinate, and trap potentially hazardous scans. Third, a summary gap analysis of recent US regulatory orders against common practice at accelerators, and our plans to address these in collaboration with the US Dept. of Energy. Finally, a summary of EPICS attack vectors, and technical plans to add authentication and encryption to EPICS itself.
 
slides icon Slides TU2BCO04 [1.677 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TU2BCO04  
About • Received ※ 04 October 2023 — Revised ※ 13 October 2023 — Accepted ※ 15 November 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TU2AO06 Accelerator Control Class for Graduate Students in SOKENDAI, KEK controls, distributed, GUI, factory 335
 
  • N. Kamikubota, K. Furukawa, M. Satoh, S. Yamada, N. Yamamoto
    KEK, Ibaraki, Japan
 
  The Graduate University for Advanced Studies, known as SOKENDAI, provides educational opportunities for graduate students in collaboration with national research institutions in Japan. KEK is one of these institutes and has a program, "Accelerator Science". In 2019 we started two classes: a one-semester lecture course, "Introduction to accelerator control system", and a two-day seminar, "Control of distributed devices for large systems". The former consists of 12 lectures on various topics in accelerator controls, followed by a presentation day for the students. The latter consists of a lecture and hands-on sessions in which students practice EPICS with Raspberry Pi-based devices. In this paper, the status of the accelerator control classes is reported.
1) SOKENDAI, https://www.soken.ac.jp/en/
 
slides icon Slides TU2AO06 [2.813 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TU2AO06  
About • Received ※ 02 October 2023 — Revised ※ 13 October 2023 — Accepted ※ 29 November 2023 — Issued ※ 13 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO02 EPICS Java Developments controls, experiment, software, framework 342
 
  • KS. Saintin, P. Lotrus
    CEA-IRFU, Gif-sur-Yvette, France
  • L. Caouën
    CEA-DRF-IRFU, France
 
  The IRFU*/DIS software control team is involved from feasibility studies through to the deployment of equipment, covering low-level (hardware, PLC) to high-level (GUI supervision) systems. For our experiments, we use two main frameworks: MUSCADE, a full-Java in-house embedded SCADA solution dedicated to small, compact experiments controlled by PLCs (Programmable Logic Controllers), whose server side is only compatible with the Windows operating system (OS); and EPICS**, a distributed control system used to operate devices such as particle accelerators, large facilities and major telescopes, mostly deployed in Linux OS environments. The EPICS framework provides bindings and server interfaces for several languages, such as C/C++, Python and Java. However, most of the servers, also called IOCs***, developed in the community are based on C/C++ and Linux. EPICS also provides extensions developed in Java, such as the EPICS Archiver Appliance, Phoebus Control-Studio**** (GUI) and Display Web Runtime (web client). All of these tools depend on CAJ (a pure-Java implementation of the Channel Access library). Today, MUSCADE users are accustomed to working under Windows and need intuitive tools that provide the same features as MUSCADE. Research and development activities therefore mainly focus on adapting the EPICS solution, aiming to explore the CAJ library further, especially its server-side aspect. Several developments have been carried out toward this goal since 2018.
* IRFU https://irfu.cea.fr/en
** EPICS https://epics-controls.org/
*** IOC Input Output Controller
**** Phoebus Control-Studio https://control-system-studio.readthedocs.io/
 
slides icon Slides TUMBCMO02 [1.381 MB]  
poster icon Poster TUMBCMO02 [2.202 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO02  
About • Received ※ 30 September 2023 — Revised ※ 08 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 30 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO04 Real-Time Visualization and Peak Fitting of Time-of-Flight Neutron Diffraction at VULCAN lattice, neutron, detector, experiment 346
 
  • B.A. Sobhani, Y. Chen
    ORNL, Oak Ridge, Tennessee, USA
 
  In neutron scattering experiments at the VULCAN beamline at SNS, Gaussian fitting of d-space peaks can be used to summarize certain material properties of a sample. If this can be done in real time, it can also assist scientists in mid-experiment decision making. This paper describes a system developed in EPICS for visualizing d-space evolution and fitting d-space peaks in real time at the VULCAN beamline.  
slides icon Slides TUMBCMO04 [0.433 MB]  
poster icon Poster TUMBCMO04 [0.338 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO04  
About • Received ※ 05 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 28 November 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
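For a single well-separated d-space peak, a Gaussian fit can even be done in closed form by fitting a parabola to the log of the intensity (Caruana's method) — a sketch of the fitting step only; the VULCAN system's actual fitter, background handling and weighting are not shown here.

```python
# Closed-form Gaussian peak fit via a quadratic fit to log(y):
# log y = c2*d^2 + c1*d + c0  maps to  A*exp(-(d-mu)^2/(2*sigma^2))
# with sigma = sqrt(-1/(2 c2)), mu = -c1/(2 c2), A = exp(c0 - c1^2/(4 c2)).

import numpy as np

def fit_gaussian(d, y):
    """Fit y ~ A*exp(-(d-mu)^2/(2 sigma^2)); returns (A, mu, sigma)."""
    mask = y > 0                       # log requires positive intensities
    c2, c1, c0 = np.polyfit(d[mask], np.log(y[mask]), 2)
    sigma = np.sqrt(-1.0 / (2.0 * c2))
    mu = -c1 / (2.0 * c2)
    A = np.exp(c0 - c1**2 / (4.0 * c2))
    return A, mu, sigma
```

Noisy data would normally be fitted iteratively (e.g. least squares on the Gaussian itself) with the closed-form result as the starting guess.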
 
TUMBCMO05 PyDM Development Update framework, interface, feedback, network 349
 
  • J.J. Bellister, Y.G. Yazar
    SLAC, Menlo Park, California, USA
 
  PyDM is a PyQt-based framework for building user interfaces for control systems. It provides a no-code, drag-and-drop system to make simple screens, as well as a straightforward Python framework to build complex applications. Recent updates include expanded EPICS PVAccess support using the P4P module. A new widget has been added for displaying data received from NTTables. Performance improvements have been implemented to enhance the loading time of displays, particularly those that heavily utilize template repeaters. Additionally, improved documentation and tutorial materials, accompanied by a sample template application, make it easier for users to get started.  
slides icon Slides TUMBCMO05 [0.345 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO05  
About • Received ※ 06 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 24 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO08 Extending Phoebus Data Browser to Alternative Data Sources database, controls, interface, experiment 355
 
  • M. Romanovschi, I.D. Finch, G.D. Howells
    STFC/RAL/ISIS, Chilton, Didcot, Oxon, United Kingdom
 
  The Phoebus user interface to EPICS is an integral part of the new control system for the ISIS Neutron and Muon Source accelerators and targets. Phoebus can use the EPICS Archiver Appliance, which has been deployed as part of the transition to EPICS, to display the history of PVs. However, ISIS data has been, and continues to be, stored in the InfluxDB time-series database. To enable access to these data, a Python application has been developed to interface between Phoebus and other databases. Our implementation utilises Quart, an asynchronous web framework, to allow multiple simultaneous data requests. Google Protocol Buffers, natively supported by Phoebus, are used for communication between Phoebus and the database. By employing subclassing, our system can in principle adapt to different databases, allowing flexibility and extensibility. Our open-source approach enhances Phoebus's capabilities, enabling the community to integrate it within a wider range of applications.  
slides icon Slides TUMBCMO08 [0.799 MB]  
poster icon Poster TUMBCMO08 [0.431 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO08  
About • Received ※ 06 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 21 November 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
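The asynchronous-proxy idea above — one handler fielding many concurrent history requests against a time-series back end — can be sketched with the standard library alone. Quart and the Protocol Buffer encoding are replaced here by plain asyncio and dicts for illustration; the PV names and fake database are hypothetical.

```python
# Stdlib sketch of an async history proxy: each request queries a
# (faked) time-series backend, and asyncio.gather serves several
# requests concurrently -- the reason an async framework was chosen
# over a blocking WSGI stack.

import asyncio

FAKE_DB = {"TEMP:1": [(0.0, 20.1), (1.0, 20.3)],
           "TEMP:2": [(0.0, 19.8)]}

async def query_backend(pv):
    """Stand-in for an InfluxDB query; yields to the event loop."""
    await asyncio.sleep(0)          # simulate I/O wait
    return FAKE_DB.get(pv, [])

async def get_history(pv):
    samples = await query_backend(pv)
    return {"pv": pv, "samples": samples}

async def serve(pvs):
    # All PV histories are fetched concurrently, not one by one.
    return await asyncio.gather(*(get_history(pv) for pv in pvs))
```

Swapping the dict for a real database client and the dict response for a protobuf `PayloadInfo`-style message recovers the architecture the paper describes; the subclassing point in the abstract corresponds to making `query_backend` an overridable method.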
 
TUMBCMO11 Upgrading and Adapting to CS-Studio Phoebus at Facility for Rare Isotope Beams controls, operation, interface, linac 364
 
  • T. Ashwarya, M. Ikegami, J. LeTourneau, A.C. Morton
    FRIB, East Lansing, Michigan, USA
 
  Funding: Work supported by the U.S. Department of Energy Office of Science under Cooperative Agreement DE-SC0000661
For more than a decade, the Eclipse-based Control System Studio has provided FRIB with a rich user interface to its EPICS-based control system. At FRIB, we use the Alarm Handler, BOY Display Manager, Scan Monitor/Editor, Channel Client, Save-and-Restore, and Data Browser to monitor and control various parts of the beamline. Our engineers have developed over 3000 displays using the BOY display manager, mapping various segments and areas of the FRIB beamline. CS-Studio Phoebus is the next-generation upgrade to the Eclipse-based CS-Studio; it is built on modern JavaFX graphics and aims to provide the existing functionality and more. FRIB has already transitioned away from the old BEAST alarm servers to the new Kafka-based Phoebus alarm servers, which have been monitoring thousands of our EPICS PVs with robust monitoring and notification capabilities. We faced certain challenges with the conversion of FRIB's thousands of displays; to address these, we deployed scripts to support bulk conversion of screens with automated mapping between BOY and Display Builder, and also continually improved the Phoebus auto-conversion tool. This paper details the ongoing transition of FRIB from the Eclipse-based CS-Studio to Phoebus and the various adaptations and solutions we used to ease this transition for our users. Moving to the new Phoebus-based services and clients has provided us with an opportunity to rectify and improve certain issues known to have existed with the Eclipse-based CS-Studio and its services.
 
slides icon Slides TUMBCMO11 [0.872 MB]  
poster icon Poster TUMBCMO11 [2.190 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO11  
About • Received ※ 03 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 30 November 2023 — Issued ※ 16 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO12 Multi-Dimensional Spectrogram Application for Live Visualization and Manipulation of Large Waveforms cavity, controls, proton, real-time 368
 
  • B.E. Bolling, A.A. Gorzawski, J. Peterson
    ESS, Lund, Sweden
 
  The European Spallation Source (ESS) is a research facility under construction that aims to be the world's most powerful pulsed neutron source. It is powered by a complex particle accelerator designed to provide a 2.86 ms long proton pulse at 2 GeV with a repetition rate of 14 Hz. Protons are accelerated via cavity fields through various accelerating structures driven by Radio-Frequency (RF) power. As cavity fields may break down for various reasons, the post-mortem data of such events usually contain the information needed about the cause; in other events, the underlying cause may have been visible on the beam pulses preceding the interlock-triggering event. The Multi-Dimensional Spectrogram Application is designed to collect, manipulate and visualize large waveforms, for example cavity fields, at high repetition rates (the ESS goal being 14 Hz), revealing otherwise unnoticed temporary breakdowns that may explain the sometimes-unknown reason for increased power demand (compensating for those invisible temporary breakdowns). The first physical event recorded with the tool was the quenching of a superconducting RF cavity, in real time and in 3D. This paper describes the application, developed in Python using the pure-Python graphics and GUI libraries PyQtGraph and PyQt5 with Python OpenGL bindings.  
slides icon Slides TUMBCMO12 [2.932 MB]  
poster icon Poster TUMBCMO12 [11.475 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO12  
About • Received ※ 04 October 2023 — Accepted ※ 23 November 2023 — Issued ※ 23 November 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO27 EPICS IOC Integration with Rexroth Controller for a T-Zero Chopper controls, neutron, interface, PLC 429
 
  • B.K. Krishna, M. Ruiz Rodriguez
    ORNL, Oak Ridge, Tennessee, USA
 
  A neutron chopper is typically used not as a filter but as a way to modulate a beam of neutrons, to select a certain energy range or to enable time-of-flight measurements. T-Zero neutron choppers have been incorporated into several beamlines at SNS and are operated via a Rexroth controller. However, the current OPC server is only compatible with Windows XP, which has led to the continued use of an XP machine to run both IndraDrive (the Rexroth interface) and the EPICS IOC. This setup has caused issues with integration into our data acquisition server and requires separate maintenance. As a result, for a new beamline project we opted to switch to the Rexroth XM22 controller with a T-Zero chopper, which allows the use of drivers provided by Rexroth in various programming languages. This paper details the XM22 controller drivers and explains how to utilize them to read PLC parameters from the controller into the EPICS application and its Phoebus/CSS interface.  
slides icon Slides TUMBCMO27 [0.389 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO27  
About • Received ※ 08 October 2023 — Revised ※ 12 December 2023 — Accepted ※ 15 December 2023 — Issued ※ 19 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO30 EPICS Based Tool for LLRF Operation Support and Testing cavity, controls, LLRF, operation 432
 
  • K. Klys, W. Cichalewski
    TUL-DMCS, Łódż, Poland
  • P. Pierini
    ESS, Lund, Sweden
 
  Interruptions in the functioning of the LLRF (Low-Level Radio Frequency) systems of superconducting linear accelerators can result in significant downtime, leading to lost productivity and revenue. Accelerators are foreseen to operate under various conditions and in different operating modes, so flexibility in their operation is crucial to adapt to demands. Automation is a potential solution to these challenges, reducing the need for human intervention and improving the quality of control over the accelerator. This paper describes EPICS-based tools for LLRF control system testing, optimization, and operations support. The proposed software implements procedures and applications that are usually extensions to the core LLRF system functionalities and are otherwise performed by operators. This facilitates maintenance of the accelerator, increases its flexibility in adapting to various working conditions, and can increase its availability. The paper focuses on the architecture of the solution and describes its components for identifying superconducting cavity parameters and the elements responsible for their tuning. Since the proposed solution is intended for the European Spallation Source control system, the application takes the form of multiple IOCs (Input/Output Controllers) wrapped into E3 (ESS EPICS Environment) modules. Nevertheless, its logic is universal and, after adaptation, applicable to other LLRF control systems with superconducting cavities.  
slides icon Slides TUMBCMO30 [0.466 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO30  
About • Received ※ 06 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 28 November 2023 — Issued ※ 30 November 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO31 Use of EPICS in Small Laboratories controls, experiment, software, interface 437
 
  • H. Junkes
    FHI, Berlin, Germany
 
  For some time now, we* have also been using the EPICS** control system in small laboratories in order to guarantee data recording and processing in accordance with the FAIR*** guidelines and thus increase the overall quality of the data. Many reservations had to be overcome, above all the prejudice that such systems are only suitable for large-scale installations. We are now trying to communicate the ideas behind this kind of data acquisition (distributed systems, open protocols, open file formats, etc.) in the education of physicists, chemists, and engineers, and are extending our activities to universities. We also hope that in the future users of the individual user facilities will be able to make optimal use of the options available there. In our talk we present the use of EPICS in small laboratories.
* https://epics.mpg.de
** https://epics-controls.org
*** https://www.fair-di.eu/fairmat/about-fairmat/consortium-fairmat
 
slides icon Slides TUMBCMO31 [0.788 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO31  
About • Received ※ 06 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 28 November 2023 — Issued ※ 06 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUMBCMO35 The SILF Accelerator Controls Plan controls, interface, feedback, software 449
 
  • Z.Z. Zhou, L. Hu, M.T. Kang, G.M. Liu, T. Liu, T. Yu, J.H. Zhu
    IASF, Shenzhen, Guangdong, People’s Republic of China
 
  The Shenzhen Innovation Light Source Facility (SILF) is an accelerator-based multidiscipline user facility planned to be constructed in Shenzhen, Guangdong, China. This paper introduces the controls design outline and progress. Some technical plans and schedules are also discussed.  
slides icon Slides TUMBCMO35 [0.747 MB]  
poster icon Poster TUMBCMO35 [0.545 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUMBCMO35  
About • Received ※ 28 September 2023 — Revised ※ 08 October 2023 — Accepted ※ 06 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP015 Test Bench for Motor and Motion Controller Characterization controls, experiment, GUI, data-acquisition 522
 
  • D.K. Kraft, M. Brendike
    HZB, Berlin, Germany
 
  To maximize beamtime usage, motorization of beamline equipment is crucial. Choosing the correct motor is complex, since performance depends largely on the combination of motor and motion controller [1]. This challenge, alongside the renewal of the twenty-year-old infrastructure at BESSY II, led to the demand for a motor test bench. The test bench was designed to be modular, so it fits different motors, loads, and sensors. It allows independent performance verification and enables us to find a fitting combination of motor and motion controller. The test bench is operated via EPICS and Bluesky, allowing us to use Python for automated data acquisition and testing. An overview of the mechanical and electrical setup, as well as data from different performance tests, will be presented.
[1] A. Hughes, B. Drury, ’Electric Motors and Drives: Fundamentals, Types and Applications’, Fifth Edition, Kidlington, United Kingdom, 2019, pp. 41-86.
 
poster icon Poster TUPDP015 [1.295 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP015  
About • Received ※ 06 October 2023 — Revised ※ 13 October 2023 — Accepted ※ 02 December 2023 — Issued ※ 13 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP016 Migrating from Alarm Handler to Phoebus Alarm-Server at BESSY II controls, network, GUI, ISOL 526
 
  • M. Gotz, T. Birke
    HZB, Berlin, Germany
 
  The BESSY II light source has been in operation at Helmholtz-Center Berlin (HZB) for 25 years and is expected to operate for more than the next decade. The EPICS Alarm Handler (alh) has served as the basis for a reliable alarm system for BESSY II as well as for other facilities and laboratories operated by HZB. To preempt software obsolescence and enable a centralized architecture for the other Alarm Handlers running throughout HZB, the alarm system is being migrated to the alarm-server developed within the Control System Studio/Phoebus ecosystem. To keep the Alarm Handler operational while evaluating the new system, tools were developed to automate creation of the Phoebus alarm-server configuration files in the control system’s build process. Additionally, tools and configurations were devised to mirror the old system’s key features in the new one. This contribution presents the tools developed and the infrastructure deployed to use the Phoebus alarm-server at HZB.  
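Generating the alarm configuration in the build process can be sketched as below. The element names follow the Phoebus alarm-server XML configuration format (nested `config`/`component`/`pv` elements); the PV names and component layout here are invented for illustration, and the HZB tooling is not shown.

```python
import xml.etree.ElementTree as ET

def build_alarm_config(config_name: str, components: dict) -> str:
    """Build a Phoebus alarm-server XML configuration from a mapping of
    component name -> list of (pv_name, description) pairs."""
    root = ET.Element("config", name=config_name)
    for component, pvs in components.items():
        comp = ET.SubElement(root, "component", name=component)
        for pv_name, description in pvs:
            pv = ET.SubElement(comp, "pv", name=pv_name)
            ET.SubElement(pv, "description").text = description
            ET.SubElement(pv, "enabled").text = "true"
    return ET.tostring(root, encoding="unicode")
```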
poster icon Poster TUPDP016 [0.343 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP016  
About • Received ※ 29 September 2023 — Accepted ※ 06 December 2023 — Issued ※ 11 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP022 DALI Control System Considerations TANGO, controls, software, interface 547
 
  • K. Zenker, M. Justus, R. Steinbrück
    HZDR, Dresden, Germany
 
  The Dresden Advanced Light Infrastructure (DALI) is part of the German national Helmholtz Photon Science Roadmap. It will be a high-field source of intense terahertz radiation based on accelerated electrons and the successor of the Center for High-Power Radiation Sources (ELBE) operated at HZDR since 2002. In the current phase of DALI the conceptual design report is in preparation, and considerations are ongoing as to which control system is best to use. We will present the status of those considerations, which include defining the requirements for the control system and a discussion of control system candidates. In this early conceptual phase we are still open to any control system that can fulfill our requirements. Besides pure technical performance, features, and security, the requirements encompass modernity, well-established support by community and companies, and long-term availability, as well as collaboration potential and benefit. To collect opinions from the community on the optimal control system we have prepared a survey; in this way we hope to benefit as much as possible from the community’s experience with different types of control systems.  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP022  
About • Received ※ 05 October 2023 — Revised ※ 13 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP028 Challenges of the COSY Synchrotron Control System Upgrade to EPICS controls, synchrotron, power-supply, timing 561
 
  • C. Böhme, C. Deliege, M. Simon, M. Thelen
    FZJ, Jülich, Germany
  • V. Kamerdzhiev
    GSI, Darmstadt, Germany
  • R. Modic, Z. Oven
    Cosylab, Ljubljana, Slovenia
 
  The COSY (COoler SYnchrotron) at the Forschungszentrum Jülich is a hadron accelerator built in the early 90s, with work started in the late 80s. At that time the whole control system was based on a self-developed real-time operating system for Motorola m68k boards, utilizing IP networks as the transport layer, which was unusual for the time. The GUI was based entirely on Tcl/Tk. After 25 years of operation, it was decided in 2016 to upgrade the control system to EPICS and the GUI to CS-Studio, in order, for example, to allow better automation and automated archiving of operational parameters. This was done together with Cosylab d.d. bit by bit while the synchrotron was in operation and, because of the complexity, is still ongoing. The experiences of the stepwise upgrade process will be presented and lessons learned will be emphasized.  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP028  
About • Received ※ 06 October 2023 — Accepted ※ 12 October 2023 — Issued ※ 14 October 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP038 Status of Vacuum Control System Upgrade of ALPI Accelerator controls, vacuum, PLC, interface 595
 
  • L. Antoniazzi, A. Conte, C.R. Roncolato, G. Savarese
    INFN/LNL, Legnaro (PD), Italy
 
  The vacuum system of the ALPI (Acceleratore Lineare Per Ioni) accelerator at LNL (Laboratori Nazionali di Legnaro), including around 40 pumping groups, was installed in the 90s. The control and supervision systems, composed of about 14 control racks, were developed in the same period by an external company, which produced custom solutions for the hardware and software parts. Control devices are based on custom PLCs, while the supervision system is developed in C and C#. The communication network is composed of multiple levels, from serial standards to Ethernet, passing through different devices to collect the data. The obsolescence of the hardware, the rigid system infrastructure, the deficit of spare parts, and the lack of external support impose a complete renovation of the vacuum system and its controls. In 2022 the legacy high-level control system was replaced with a new one developed in EPICS (Experimental Physics and Industrial Control System) and CSS (Control System Studio)*. After that, we started the renovation of the hardware with the installation and integration of two new flexible and configurable low-level control system racks running on a Siemens PLC and exploiting serial servers to control the renewed pumping groups and pressure gauges. The plan for the next years is to replace the legacy hardware with new equipment, retrieve spare parts, provide service continuity, improve the PLC software, and extend the EPICS control system with new features. This paper describes the adopted strategy and the upgrade status.
* G. Savarese et al., “Vacuum Control System Upgrade for ALPI Accelerator”, in Proc. IPAC’22, Bangkok, Thailand, doi:10.18429/JACoW-IPAC2022-MOPOMS045
 
poster icon Poster TUPDP038 [3.286 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP038  
About • Received ※ 04 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 17 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP039 Integrating EPICS Control System in VR Environment: Proof of Concept controls, interface, cyclotron, framework 599
 
  • L. Pranovi, M. Montis
    INFN/LNL, Legnaro (PD), Italy
 
  Preliminary activities were performed to verify the feasibility of Virtual Reality (VR) and Augmented Reality (AR) technologies applied to nuclear physics laboratories, using them for different purposes: scientific dissemination events, data collection, training, and machine maintenance*. This last field has been particularly fascinating, since it lets developers explore the possibility of redesigning the concept of the Human-Machine Interface. Based on that experience, it was natural to try to provide the final user (such as system operators and maintainers) with the full set of information describing the machine and control system parameters. For this reason, we tried to integrate the accelerator’s control system environment with a VR/AR application. In this contribution, the integration of an EPICS-based control system and a VR environment is described.
* L.Pranovi et al., "VIRTUAL REALITY AND CONTROL SYSTEMS: HOW A 3D SYSTEM LOOKS LIKE", ICALEPCS 2021, Shanghai, China
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP039  
About • Received ※ 03 October 2023 — Revised ※ 08 October 2023 — Accepted ※ 01 December 2023 — Issued ※ 11 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP048 The Upgrade of Pulsed Magnet Control System Using PXIe Devices at KEK LINAC controls, linac, real-time, operation 635
 
  • D. Wang, M. Satoh
    KEK, Ibaraki, Japan
 
  In the KEK electron-positron injector LINAC, the pulsed magnet control system modulates the magnetic field at intervals of 20 ms, enabling simultaneous injection into four distinct target rings: the 2.5 GeV PF, 6.5 GeV PF-AR, 4 GeV SuperKEKB LER, and 7 GeV SuperKEKB HER. The system operates on a trigger signal delivered from the event timing system. Upon receiving a specified event code, the PXI DAC board outputs a waveform to the pulse driver, which in turn determines the current of the pulsed magnet. A combination of Windows 8.1 and LabVIEW had been used to implement the control system since 2017. However, with the end of support for Windows 8.1, a system upgrade became imperative. Linux has been selected as the replacement for Windows, and an EPICS driver for the PXIe devices is thus required. This paper introduces the development of the new Linux-based pulsed magnet control system.  
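The event-code dispatch described above can be sketched as a lookup from timing event to per-ring waveform. Everything concrete here is invented for illustration: the real KEK event codes, DAC waveform shapes, and amplitudes differ.

```python
# Hypothetical event codes and flat-top amplitudes; the real timing-system
# codes and pulsed-magnet waveforms differ.
RING_SETTINGS = {
    0x10: ("PF", 2.5),
    0x11: ("PF-AR", 6.5),
    0x12: ("SuperKEKB LER", 4.0),
    0x13: ("SuperKEKB HER", 7.0),
}

def select_waveform(event_code: int, n_samples: int = 8):
    """On receipt of a timing event code, pick the target ring and build
    the (greatly simplified, flat) waveform the DAC board would output."""
    if event_code not in RING_SETTINGS:
        raise ValueError(f"unknown event code: {event_code:#x}")
    ring, amplitude = RING_SETTINGS[event_code]
    return ring, [amplitude] * n_samples
```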
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP048  
About • Received ※ 06 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 14 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP049 15 Years of the J-PARC Main Ring Control System Operation and Its Future Plan controls, operation, network, software 639
 
  • S. Yamada
    J-PARC, KEK & JAEA, Ibaraki-ken, Japan
 
  The accelerator control system of the J-PARC MR started operation in 2008. Most of the components of the control computers, such as servers, disks, operation terminals, front-end computers, and software, which were introduced during the construction phase, have gone through one or two generational changes in the last 15 years. Alongside, the policies for the operation of the control computers have changed. This paper reviews the renewal of those components and discusses the philosophy behind the configuration and operational policy. It also discusses the approach to matters that did not exist at the beginning of the project, such as virtualization and cyber security.  
poster icon Poster TUPDP049 [0.489 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP049  
About • Received ※ 05 October 2023 — Revised ※ 25 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP050 Development and Test Operation of the Prototype of the New Beam Interlock System for Machine Protection of the RIKEN RI Beam Factory controls, operation, FPGA, experiment 645
 
  • M. Komiyama, M. Fujimaki, N. Fukunishi, A. Uchiyama
    RIKEN Nishina Center, Wako, Japan
  • M. Hamanaka, K. Kaneko, R. Koyama, M. Nishimura, H. Yamauchi
    SHI Accelerator Service Ltd., Tokyo, Japan
  • A. Kamoshida
    National Instruments Japan Corporation, MInato-ku, Tokyo, Japan
 
  We have been operating the beam interlock system (BIS) for machine protection of the RIKEN RI Beam Factory (RIBF) since 2006. It stops beams approximately 15 ms after receiving an alert signal from the accelerator and beam line components. The BIS continues to operate successfully; however, we are developing a successor system that can stop a beam within 1 ms, considering that the beam intensity of the RIBF will continue to increase in the future. After comparing multiple systems, CompactRIO, a product of National Instruments, was selected for the successor system. The interlock logic for signal input/output is implemented on a field-programmable gate array (FPGA) because fast processing is required. Signal condition setting and monitoring, which do not require the same speed as the interlock logic, are implemented on the RT-OS and controlled through the Experimental Physics and Industrial Control System (EPICS) by setting up an EPICS server on the RT-OS. As a first step in development, a prototype consisting of two stations handling only digital alert signals (224 input contacts) was developed and installed in part of the RIBF in the summer of 2022. The signal response time of the prototype, measured with an oscilloscope, averaged 0.52 ms across both stations (separated by approximately 75 m). Furthermore, by additionally installing a pull-up circuit at each signal input contact, the system response time was successfully reduced to approximately 0.13 ms.  
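The latching behaviour of such an interlock can be sketched in a few lines. This is a pure-Python behavioural model for illustration, not the FPGA implementation: any alarmed input drops the beam permit, and the permit returns only after an explicit reset once every input is healthy again.

```python
class BeamInterlock:
    """Behavioural model of a latching beam interlock."""

    def __init__(self, n_inputs: int):
        self._alarmed = [False] * n_inputs
        self.beam_permit = True

    def set_input(self, index: int, alarmed: bool) -> None:
        """Update one alert contact; any alarm latches the beam off."""
        self._alarmed[index] = alarmed
        if alarmed:
            self.beam_permit = False  # latched: cleared only via reset()

    def reset(self) -> bool:
        """Re-grant the beam permit, but only if all inputs are healthy."""
        if not any(self._alarmed):
            self.beam_permit = True
        return self.beam_permit
```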
poster icon Poster TUPDP050 [0.816 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP050  
About • Received ※ 03 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP052 The Progress and Status of HEPS Beamline Control System controls, detector, experiment, synchrotron 650
 
  • G. Li, X.B. Deng, X.W. Dong, Z.H. Gao, G. Lei, Y. Liu, C.X. Yin, Z.Y. Yue, D.S. Zhang, Q. Zhang, Z. Zhao, A.Y. Zhou
    IHEP, Beijing, People’s Republic of China
  • N. Xie
    IMP/CAS, Lanzhou, People’s Republic of China
 
  HEPS will be the first high-energy (6 GeV) synchrotron radiation light source in China, mainly composed of an accelerator, beamlines, and end-stations. In Phase I, 14+1 beamlines and the corresponding experimental stations will be constructed. The beamline control system design, based on EPICS, has been completed and will soon enter the stage of engineering construction and combined commissioning. Here, the progress and status of the beamline control system are presented.  
poster icon Poster TUPDP052 [4.531 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP052  
About • Received ※ 01 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 05 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP065 Introduction to the Control System of the PAL-XFEL Beamlines FEL, controls, experiment, network 655
 
  • G.S. Park, S-M. Hwang, M.Z. Jeong, W.U. Kang, C.Y. Lim
    PAL, Pohang, Republic of Korea
 
  The PAL-XFEL beamlines are composed of two different types: a hard X-ray beamline and a soft X-ray beamline. The hard X-ray beamline generates free-electron laser pulses with photon energies ranging from 2-15 keV, pulse lengths of 10-35 fs, and arrival time errors of less than 20 fs from 4-11 GeV electron beams, for X-ray Scattering & Spectroscopy (XSS) and Nano Crystallography & Coherent Imaging (NCI) experiments. The soft X-ray beamline, on the other hand, generates free-electron laser pulses with photon energies ranging from 0.25-1.25 keV and more than 10¹² photons, from 3 GeV electron beams, for soft X-ray Scattering & Spectroscopy (SSS) experiments. To conduct experiments using the XFEL, precise beam alignment, diagnostics, and control of the experimental devices are necessary. The devices of the three beamlines are controlled by systems based on the Experimental Physics and Industrial Control System (EPICS), a widely used open-source software framework for distributed control systems. The beam diagnostic devices include the QBPM (Quad Beam Position Monitor), photodiode, pop-in monitor, and inline spectrometer, among others. Additionally, there are other systems, such as the CRL (Compound Refractive Lenses), KB mirror (Kirkpatrick-Baez mirror), attenuator, and vacuum, used in the PAL-XFEL beamlines. We introduce the control system, event timing, and network configuration for PAL-XFEL experiments.  
poster icon Poster TUPDP065 [1.116 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP065  
About • Received ※ 10 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 29 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP075 OPC UA EPICS Bridge PLC, controls, software, embedded 681
 
  • W. Duckitt
    Stellenbosch University, Matieland, South Africa
  • J.K. Abraham
    iThemba LABS, Somerset West, South Africa
 
  OPC UA is a service-orientated communication architecture that supports platform-independent data exchange between embedded microcontrollers, PLCs, or PCs and cloud-based infrastructure. This makes OPC UA ideal for developing manufacturer-independent communication with vendor-specific PLCs, for example. With this in mind, we present an OPC UA to EPICS bridge that has been containerized with Docker to provide a micro-service for communicating between EPICS and OPC UA variables.  
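The essence of such a bridge is a mapping between OPC UA node ids and EPICS PV names, plus a sync step. The sketch below uses plain dicts as stand-ins for an OPC UA client session and an EPICS server; the node ids and PV names are invented, and the paper's actual bridge (and its Docker packaging) is not shown.

```python
# Invented node ids and PV names, purely illustrative.
NODE_TO_PV = {
    "ns=2;s=Cyclotron.Magnet.Current": "CYC:MAG:CURRENT",
    "ns=2;s=Cyclotron.Vacuum.Pressure": "CYC:VAC:PRESSURE",
}

def sync_once(opcua_values: dict, epics_pvs: dict) -> list:
    """One bridge polling cycle: copy each mapped OPC UA value into the
    corresponding EPICS PV, returning the names of PVs that changed."""
    updated = []
    for node_id, pv_name in NODE_TO_PV.items():
        if node_id in opcua_values and epics_pvs.get(pv_name) != opcua_values[node_id]:
            epics_pvs[pv_name] = opcua_values[node_id]
            updated.append(pv_name)
    return updated
```

A production bridge would subscribe to OPC UA data-change notifications rather than poll, and serve the PVs over Channel Access or PVAccess.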
poster icon Poster TUPDP075 [0.681 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP075  
About • Received ※ 03 October 2023 — Revised ※ 20 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP085 EPICS at FREIA Laboratory controls, PLC, cavity, software 718
 
  • K.J. Gajewski, K. Fransson
    Uppsala University, Uppsala, Sweden
 
  The FREIA laboratory is a Facility for REsearch Instrumentation and Accelerator development at Uppsala University, Sweden. It was officially opened in 2013 to test and develop superconducting accelerating cavities and their high-power RF sources. The laboratory focuses on superconducting technology and accelerator development and conducts research on beam physics and light generation with charged particles, accelerator technology, and instrumentation. From the very beginning EPICS* was chosen as the control system for all the infrastructure and equipment in the lab. The use of EPICS allowed us to build a robust, expandable, and maintainable control system with very limited manpower. The paper presents the choices we made and the problems we solved to achieve this goal. We show the current status of the control system and the strategy for the future.
* https://epics-controls.org/
 
poster icon Poster TUPDP085 [2.305 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP085  
About • Received ※ 27 September 2023 — Revised ※ 09 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP094 EPICS NTTables for Machine Timing Configuration timing, MMI, MEBT, controls 767
 
  • A.A. Gorzawski, J.P.S. Martins, N. Milas
    ESS, Lund, Sweden
 
  The European Spallation Source (ESS), currently under construction and initial commissioning in Lund, Sweden, will be the brightest spallation neutron source in the world when its driving proton linac achieves the design power of 5 MW at 2 GeV. Such high power requires production, efficient acceleration, and almost loss-free transport of a high-current beam, making the design and beam commissioning of this machine challenging. The recent commissioning runs (2021-2023) showed an increased need for a consistent and robust way of setting up the machine for beam production. One of the big challenges for ESS beam operations is aligning the machine setup and the timing setup while limiting the need for operator actions. In this paper, we present a concept for using EPICS 7 NTTables to ensure this consistency of machine settings. We also highlight a few related challenges with other EPICS tools, such as Save-and-Restore and the Archiver.  
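For readers unfamiliar with NTTables: an EPICS 7 NTTable is a normative-type structure carrying column labels plus one equal-length array per column, which makes it a natural container for tabulated machine/timing settings. The sketch below mirrors that structure in plain Python dicts for illustration; the column names and values are invented and do not reflect the ESS configuration.

```python
def make_nttable(columns: dict) -> dict:
    """Build a plain-dict mirror of an EPICS 7 NTTable: a 'labels' array
    plus a 'value' structure with one equal-length array per column."""
    lengths = {len(vals) for vals in columns.values()}
    if len(lengths) > 1:
        raise ValueError("all NTTable columns must have the same length")
    return {
        "type": "epics:nt/NTTable:1.0",   # normative type identifier
        "labels": list(columns.keys()),
        "value": {name: list(vals) for name, vals in columns.items()},
    }
```

In a real system the same structure would be served and consumed over PVAccess rather than built as a local dict.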
poster icon Poster TUPDP094 [0.682 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP094  
About • Received ※ 04 October 2023 — Accepted ※ 06 December 2023 — Issued ※ 08 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP105 The SLS 2.0 Beamline Control System Upgrade Strategy controls, experiment, MMI, network 807
 
  • T. Celcer, X. Yao, E. Zimoch
    PSI, Villigen PSI, Switzerland
 
  After more than 20 years of successful operation the SLS facility will undergo a major upgrade, replacing the entire storage ring, which will result in significantly improved beam emittance and brightness. In order to make use of the improved beam characteristics, beamline upgrades will also play a crucial part in the SLS 2.0 project. Offering our users an optimal beamtime experience will, however, strongly depend on our ability to bring the beamline control and data acquisition tools to a new level. It is therefore necessary to upgrade and modernize the majority of our current control system stack. This article provides an overview of the planned beamline control system upgrade from the technical as well as the project management perspective. A portfolio of selected technical solutions for the main control system building blocks is discussed. Currently, the controls hardware at SLS is based on the VME platform running the VxWorks operating system; digital/analog I/O, a variety of motion solutions, scalers, high-voltage power supplies, and the timing and event system are all provided on this platform. A sensible migration strategy is being developed for each individual system, along with an overall strategy to deliver a modern high-level experiment orchestration environment. The article also focuses on the challenges of the phased upgrade, coupled with the unavoidable coexistence with existing VME-based legacy systems due to time, budget, and resource constraints.  
poster icon Poster TUPDP105 [4.148 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP105  
About • Received ※ 04 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 05 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP106 SwissFEL Resonant Kicker Control System kicker, controls, electron, FEL 813
 
  • R.A. Krempaská, A.D. Alarcon, S. Dordevic, C.H. Gough, M. Paraliev, W. Portmann
    PSI, Villigen PSI, Switzerland
 
  The SwissFEL X-ray Free Electron Laser at the Paul Scherrer Institute is a user facility designed to run in two-bunch mode in order to serve two experimental beamline stations simultaneously. Two closely spaced (28 ns) electron bunches are accelerated in one RF macro pulse up to 3 GeV. A high-stability resonant kicker system and a Lambertson septum magnet are used to separate the bunches and send them to the respective beamlines [1]. The resonant kicker control system consists of various hardware and software components whose tasks are synchronization of the kickers with the electron beam, pulse-to-pulse amplitude and phase measurement, generation of the pulsed RF power that excites a resonant deflection current, and movement of the mechanical tuning vanes of the resonant kickers. The feedback software monitors and controls all the important parameters. We present the integration of these components into EPICS.  
poster icon Poster TUPDP106 [2.025 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP106  
About • Received ※ 03 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 13 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP108 Progress of the EPICS Transition at the Isis Accelerators controls, network, operation, PLC 817
 
  • I.D. Finch, B.R. Aljamal, K.R.L. Baker, R. Brodie, J.-L. Fernández-Hernando, G.D. Howells, M.F. Leputa, S.A. Medley, M. Romanovschi
    STFC/RAL/ISIS, Chilton, Didcot, Oxon, United Kingdom
  • A. Kurup
    Imperial College of Science and Technology, Department of Physics, London, United Kingdom
 
  The ISIS Neutron and Muon Source accelerators have been controlled using Vsystem running on OpenVMS / Itanium servers, while the beamlines and instruments are controlled using EPICS. We outline the work of migrating the accelerator controls to EPICS using the PVAccess protocol, with a mixture of conventional EPICS IOCs and custom Python-based IOCs primarily deployed in containers on Linux servers. The challenges of maintaining operations with two control systems running in parallel are discussed, including the work of migrating data archives and maintaining their continuity. The semi-automated conversion of the existing Vsystem HMIs to EPICS and the creation of new EPICS control screens required by the Target Station 1 upgrade are reported. The existing organisation of our controls network, the constraints this imposes on remote access via EPICS, and the solution implemented are described. The successful deployment of an end-to-end EPICS system to control the post-upgrade Target Station 1 PLCs at ISIS is discussed as a highlight of the migration.  
poster icon Poster TUPDP108 [0.510 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP108  
About • Received ※ 02 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 17 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP109 Tickit: An Event-Based Multi-Device Simulation Framework simulation, framework, controls, hardware 823
 
  • A. Emery, T.M. Cobb, C.A. Forrester, G. O’Donnell
    DLS, Oxfordshire, United Kingdom
 
  Tickit is an event-based multi-device simulation framework providing configuration and orchestration of complex simulations. It was developed at Diamond Light Source in order to overcome limitations presented by some of our existing hardware simulations. With the Tickit framework, simulations can be addressed with a compositional approach: devices are simulated individually while still maintaining the interconnected behaviour exhibited by their hardware counterparts. This is achieved by modelling the interactions between devices, such as electronic signals. Devices can be collated into larger simulated systems, providing a layer of simulated hardware against which to test the full stack of Data Acquisition and Controls tools. We aim to use this framework to extend the scope and improve the interoperability of our simulations, enabling us to further improve the testing of current systems and providing a preferred platform to assist in the development of new Acquisition and Controls tools.  
poster icon Poster TUPDP109 [0.703 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP109  
About • Received ※ 29 September 2023 — Revised ※ 21 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP110 Control System Design of the CHIMERA Fusion Test Facility controls, experiment, PLC, SCADA 827
 
  • P.T. Smith, A. Greer, B.A. Roberts, P.B. Taylor
    OSL, St Ives, Cambridgeshire, United Kingdom
  • D.J.N. McCubbin, M. Roberts
    JCE, Warrington, United Kingdom
 
  Funding: Observatory Sciences Ltd
CHIMERA is an experimental nuclear fusion test facility which aims to simulate the intense magnetic fields and temperature gradients found within a tokamak fusion reactor. The control system at CHIMERA is based on EPICS and will have approximately 30 input/output controllers (IOCs) when it comes online in 2024. It will make heavy use of CSS Phoebus for its user interface, sequencer and alarm system. CHIMERA will use the EPICS Archiver Appliance for data archiving and EPICS areaDetector to acquire high-speed data, which is stored in the HDF5 format. The control philosophy at CHIMERA emphasises PLC-based control logic, mostly using Siemens S7-1500 PLCs and OPC UA to communicate with EPICS. EPICS AUTOSAVE is used both for manually setting lists of process variables (PVs) and for automatic restoration of PVs if an IOC must be restarted.
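As context for how AUTOSAVE is typically wired into an IOC of this kind, a minimal startup-script and request-file sketch follows; the paths and PV names are invented for illustration, not taken from CHIMERA, and the field list is abbreviated:

```
# st.cmd fragment (hypothetical paths and PV names)
set_savefile_path("/data/autosave")
set_requestfile_path("/data/autosave/req")
set_pass0_restoreFile("chimera_settings.sav")   # restore before record init
iocInit()
create_monitor_set("chimera_settings.req", 30)  # save listed PVs every 30 s

# chimera_settings.req -- the list of PVs to save/restore
CHIM:MAG:01:CurrentSetpoint
CHIM:HTR:01:TempSetpoint
```

The same request-file mechanism serves both purposes the abstract mentions: the file is a manually maintained list of settings, and AUTOSAVE replays the saved values automatically when the IOC restarts.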
 
poster icon Poster TUPDP110 [1.711 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP110  
About • Received ※ 03 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 12 October 2023 — Issued ※ 17 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP113 A Flexible EPICS Framework for Sample Alignment at Neutron Beamlines controls, framework, neutron, operation 836
 
  • J.P. Edelen, M.J. Henderson, M.C. Kilpatrick
    RadiaSoft LLC, Boulder, Colorado, USA
  • S. Calder, B. Vacaliuc
    ORNL RAD, Oak Ridge, Tennessee, USA
  • R.D. Gregory, G.S. Guyotte, C.M. Hoffmann, B.K. Krishna
    ORNL, Oak Ridge, Tennessee, USA
 
  Funding: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Science under Award Number DE-SC0021555.
RadiaSoft has been developing a flexible front-end framework, written in Python, for rapidly developing and testing automated sample alignment IOCs at Oak Ridge National Laboratory. We utilize YAML-formatted configuration files to construct a thin abstraction layer of custom classes which provide an internal representation of the external hardware within a controls system. The abstraction layer takes advantage of the PCASPy and PyEpics libraries in order to serve EPICS process variables & respond to read/write requests. Our framework allows users to build a new IOC that has access to information about the sample environment in addition to user-defined machine learning models. The IOC then monitors for user inputs, performs user-defined operations on the beamline, and reports on its status back to the control system. Our IOCs can be booted from the command line, and we have developed command line tools for rapidly running and testing alignment processes. These tools can also be accessed through an EPICS GUI or in separate Python scripts. This presentation provides an overview of our software structure and showcases its use at two beamlines at ORNL.
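To make the "config-driven abstraction layer" idea concrete, here is a minimal stdlib-only sketch; the class names, config structure, and PV names are invented, and in the real framework the configuration would come from a YAML file with the PVs served via PCASPy and read via PyEpics:

```python
# Sketch of a config-driven hardware abstraction layer (illustrative only,
# not RadiaSoft's actual classes or configuration schema).
from dataclasses import dataclass

# In the real framework this dict would be parsed from a YAML config file.
CONFIG = {
    "motors": {
        "sample_x": {"pv": "BL:MOT:X", "egu": "mm", "limits": (-10.0, 10.0)},
        "sample_y": {"pv": "BL:MOT:Y", "egu": "mm", "limits": (-5.0, 5.0)},
    }
}

@dataclass
class Motor:
    name: str
    pv: str
    egu: str
    limits: tuple
    setpoint: float = 0.0

    def move(self, value: float) -> float:
        # Clamp requests to the configured soft limits; a real IOC would
        # then issue the corresponding caput to self.pv.
        lo, hi = self.limits
        self.setpoint = min(max(value, lo), hi)
        return self.setpoint

def build_devices(config: dict) -> dict:
    """Construct the internal hardware representation from the parsed config."""
    return {name: Motor(name=name, **spec)
            for name, spec in config["motors"].items()}

devices = build_devices(CONFIG)
devices["sample_x"].move(25.0)  # request beyond the limit is clamped
```

The point of the pattern is that alignment logic only ever touches `Motor` objects, so the same IOC code can be retargeted to a new beamline by editing the configuration file alone.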
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP113  
About • Received ※ 06 October 2023 — Revised ※ 22 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 16 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP121 Conceptual Design of the Matter in Extreme Conditions Upgrade (MEC-U) Rep-Rated Laser Control System controls, laser, timing, hardware 865
 
  • B.T. Fishler, F. Batysta, J. Galbraith, V.K. Gopalan, J. Jimenez, L.S. Kiani, E.S. Koh, J.F. McCarrick, A.K. Patel, R.E. Plummer, B. Reagan, E. Sistrunk, T.M. Spinka, K. Terzi, K.M. Velas
    LLNL, Livermore, California, USA
  • M.Y. Cabral, T.A. Wallace, J. Yin
    SLAC, Menlo Park, California, USA
 
  Funding: This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
The Lawrence Livermore National Laboratory (LLNL) is delivering the Dual-mode Energetic Laser for Plasma and High Intensity Science (DELPHI) system to SLAC as part of the MEC-U project to create an unprecedented platform for high energy density experiments. The DELPHI control system is required to deliver short and/or long pulses at a 10 Hz firing rate with femtosecond/picosecond accuracy, sustained over fourteen 12-hour operator shifts, to a common shared target chamber. The MEC-U system requires the integration of the control system with SLAC-provided controls related to personnel safety, machine safety, precision timing, data analysis and visualization, amongst others. To meet these needs along with the system’s reliability, availability, and maintainability requirements, LLNL is delivering an EPICS-based control system leveraging proven SLAC technology. This talk presents the conceptual design of the DELPHI control system and the methods planned to ensure its successful commissioning and delivery to SLAC.
 
poster icon Poster TUPDP121 [1.610 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP121  
About • Received ※ 02 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP122 Fast Wire Scanner Motion Control Software Upgrade For LCLS-II controls, software, linac, MMI 869
 
  • Z. Huang, N. Balakrishnan, J.D. Bong, M.L. Campell, T.C. Thayer
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by U.S. Department of Energy under contract number DE- AC02-76SF00515
LCLS-II is the first XFEL based on continuous-wave superconducting accelerator technology (CW-SCRF), with X-ray pulses at repetition rates of up to 1 MHz. LCLS-II’s wire scanner motion control is based on the Aerotech Ensemble controller. The beam profile is measured from the position feedback and the beam loss monitor readings taken during a wire scan. To meet the measurement requirements under both low and high beam repetition rates, we redesigned the software for the EPICS IOC and the Aerotech controller, and developed a new user interface (UI) based on PyDM. This paper describes the software development details and the commissioning results under LCLS-II’s production environment.
 
poster icon Poster TUPDP122 [1.248 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP122  
About • Received ※ 05 October 2023 — Revised ※ 20 October 2023 — Accepted ※ 04 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP123 SLAC ATCA Scope - Upgrading the EPICS Support Package software, controls, interface, FPGA 873
 
  • D. Alnajjar, M.P. Donadio, K.H. Kim, R. Ruckman
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by US DOE contract DE-AC02-76SF00515
The SLAC ATCA Scope, a 4-channel dual scope, has an EPICS support package that runs on top of SLAC’s Common Platform software and firmware and communicates with several high-performance systems in LCLS running on the 7-slot Advanced Telecommunications Computing Architecture (ATCA) crate. The software was completely refactored to improve usability for IOC engineers. Once linked with an EPICS IOC, it initializes the scope hardware and instantiates the upper software stack, providing a set of PVs to control the API and hardware and to operate the oscilloscope. The exported PVs provide seamless means to configure triggers and obtain data acquisitions, similar to a real oscilloscope. The ATCA scope probes are configured dynamically by the user to probe up to four inputs of the ATCA ADC daughter cards. The EPICS support package automatically manages the available ATCA carrier board DRAM resources based on the number of samples requested by the user, allowing acquisitions of up to 8 GBytes per trigger. The user can also specify a desired sampling rate; the ATCA Scope estimates the nearest achievable rate from the current sampling frequency and performs downsampling to approach that rate. Adding the EPICS module to an IOC is simple and straightforward. The ATCA Scope support package works for all high-performance systems that have the common scope hardware implemented in their FPGAs. Generic interfaces developed in PyDM are also provided to control the oscilloscope and give the user a seamless overall experience.
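The nearest-achievable-rate estimate described above can be pictured with a short sketch: with a fixed base sampling clock, only integer decimation factors are possible, so the achievable rates are base_hz / n. This is an illustrative reconstruction, not the actual SLAC implementation:

```python
# Pick the integer decimation factor whose resulting rate is closest to
# the requested sampling rate (illustrative sketch).

def nearest_rate(base_hz: float, requested_hz: float) -> tuple[int, float]:
    """Return (decimation factor, achieved rate) closest to the request."""
    if requested_hz >= base_hz:
        # Cannot exceed the base clock; no downsampling.
        return 1, base_hz
    n = max(1, round(base_hz / requested_hz))
    # Rounding the ratio is not always optimal in achieved-rate distance,
    # so compare the neighbouring factors as well.
    candidates = [m for m in (n - 1, n, n + 1) if m >= 1]
    best = min(candidates, key=lambda m: abs(base_hz / m - requested_hz))
    return best, base_hz / best
```

For example, with a 1 MHz base clock and a 300 kHz request, decimation by 3 (giving about 333 kHz) is closer than decimation by 4 (250 kHz).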
 
poster icon Poster TUPDP123 [0.984 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP123  
About • Received ※ 03 October 2023 — Accepted ※ 30 November 2023 — Issued ※ 08 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP125 Design and Implementation of the LCLS-II Machine Protection System software, database, controls, interface 877
 
  • J.A. Mock, Z.A. Domke, R.T. Herbst, P. Krejcik, R. Ruckman, L. Sapozhnikov
    SLAC, Menlo Park, California, USA
 
  The linear accelerator complex at the SLAC National Accelerator Laboratory has been upgraded to include LCLS-II, a new linac capable of producing beam power as high as several hundred kW with CW beam rates up to 1 MHz, while maintaining the existing capabilities of the copper machine. Because of these high-power beams, a new Machine Protection System (MPS) with a latency of less than 100 µs was designed and installed to prevent damage to the machine when a fault or beam loss is detected. The new LCLS-II MPS must work in parallel with the existing MPS from the respective sources all the way through the user hutches, providing a mechanism to reduce the beam rate or shut down operation in one beamline without impacting the neighboring beamline when a fault condition is detected. Because either beamline can use either accelerator as its source, and each accelerator has different operating requirements, great care was taken in the overall system design to ensure the necessary operation can be achieved with a seamless experience for the accelerator operators. The overall design of the LCLS-II MPS software, including its interaction with existing systems and the tools developed for the control room to support user operation, will be described.  
poster icon Poster TUPDP125 [1.360 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP125  
About • Received ※ 04 October 2023 — Revised ※ 30 November 2023 — Accepted ※ 04 December 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP127 SLAC LINAC Mode Manager Interface linac, interface, timing, undulator 882
 
  • T. Summers, C. Bianchini Mattison, M. Gibbs, T.J. Kabana, P. Krejcik, J.A. Mock
    SLAC, Menlo Park, California, USA
 
  With the successful commissioning of the new superconducting (SC) LINAC, the LINAC Coherent Light Source (LCLS) now has the capability of interleaving beams from either the normal conducting (NC) LINAC or the SC LINAC to two different destinations, the soft (SXR) and hard (HXR) x-ray undulator beamlines. A mode manager user interface has been created to manage the beamline configuration to transport beam pulses to multiple destinations, which include the numerous intermediate tune-up dumps and safety dumps between the injectors and the final beam dumps. The mode manager interfaces with the timing system which controls the bunch patterns to the various locations, and the machine protection system which prevents excess beam power from being sent to the wrong destination. This paper describes the implementation method for handling the mode switching, as well as the operator user interface which allows users to graphically select the desired beam paths.  
poster icon Poster TUPDP127 [1.191 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP127  
About • Received ※ 05 October 2023 — Revised ※ 22 October 2023 — Accepted ※ 30 November 2023 — Issued ※ 09 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP130 PyDM Archive Viewer feedback, GUI, target, controls 892
 
  • Y.G. Yazar, J.J. Bellister, Z.A. Domke, T. Summers
    SLAC, Menlo Park, California, USA
  • F.M. Osman
    Santa Clara University, Santa Clara, California, USA
 
  A new open-source PyQt-based archive viewer application has been developed at SLAC National Accelerator Laboratory. The viewer’s main purpose is to visualize both live values and historical Process Variable (PV) data retrieved from the EPICS Archiver Appliances. It is designed as both a stand-alone application and to be easily launched from widgets on PyDM operator interfaces. In addition to providing standard configurability for things like traces, formulas, style and data exporting, it provides post-processing capabilities for filtering and curve fitting. The current release supports standard enumerated and analog data types as well as waveforms. Extension of this to support EPICS 7 normative data types such as NTTable and NTNDArray is under development.  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP130  
About • Received ※ 06 October 2023 — Revised ※ 22 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 20 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUPDP132 Temperature Control of Crystal Optics for Ultrahigh-Resolution Applications controls, optics, power-supply, lattice 899
 
  • K.J. Gofron
    ORNL, Oak Ridge, Tennessee, USA
  • Y.Q. Cai, D.S. Coburn, A. Suvorov
    BNL, Upton, New York, USA
 
  Funding: This work was supported by the U.S. Department of Energy, Office of Science, Scientific User Facilities Division under Contract No. DE-AC05-00OR22725
The temperature control of crystal optics is critical for ultrahigh-resolution applications such as meV-resolved inelastic scattering. Due to the low count rate and long acquisition time of these experiments, for 1-meV energy resolution the absolute temperature stability of the crystal optics must be maintained below 4 mK to ensure the required stability of the lattice constant, and thereby the energy stability of the optics. Furthermore, temperature control with sub-mK precision enables setting the absolute temperature of individual crystals, making it possible to align the reflection energy of each crystal’s rocking curve with sub-meV resolution, thereby maximizing the combined efficiency of the crystal optics. In this contribution, we report the details of an EPICS control system using PT1000 sensors, a Keithley 3706A 7.5-digit sensor scanner, and a Wiener MPOD LV power supply for the analyzer crystals of the Inelastic X-ray Scattering (IXS) beamline 10-ID at NSLS-II. We were able to achieve absolute temperature stability below 1 mK and sub-meV energy alignment for several asymmetrically cut analyzer crystals. The EPICS ePID record was used to control the power supplies based on the PT1000 sensor input, read with 7.5-digit accuracy from the Keithley 3706A scanner. The system enhances the performance of the meV-resolved IXS spectrometer, which currently has a 1.4 meV total energy resolution and unprecedented spectral sharpness for studies of atomic dynamics in a broad range of materials.
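The ePID-record loop described above can be sketched as an EPICS database fragment; the PV names, gains, and drive limits here are hypothetical placeholders, and the field set is abbreviated:

```
# Hypothetical database fragment: an epid record closing the loop between
# a PT1000 temperature readback and a power-supply setpoint.
record(epid, "IXS:CRYS1:PID") {
    field(DESC, "Crystal temperature PID")
    field(SCAN, "1 second")
    field(INP,  "IXS:CRYS1:TEMP_RB CP MS")   # controlled value: PT1000 readback
    field(OUTL, "IXS:CRYS1:PS:CURR_SP CA")   # output: heater supply setpoint
    field(KP,   "0.5")
    field(KI,   "0.1")
    field(KD,   "0.0")
    field(DRVH, "2.0")    # clamp the output drive range
    field(DRVL, "0.0")
}
```

The record's VAL field holds the temperature setpoint; on each SCAN tick the record reads INP, evaluates the PID terms, clamps the result between DRVL and DRVH, and writes it through OUTL.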
 
poster icon Poster TUPDP132 [0.809 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUPDP132  
About • Received ※ 28 September 2023 — Revised ※ 09 October 2023 — Accepted ※ 30 November 2023 — Issued ※ 10 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TUSDSC08 Phoebus Tools and Services controls, framework, interface, site 944
 
  • K. Shroff
    BNL, Upton, New York, USA
  • T. Ashwarya
    FRIB, East Lansing, Michigan, USA
  • T.M. Ford
    LBNL, Berkeley, California, USA
  • K.-U. Kasemir
    ORNL, Oak Ridge, Tennessee, USA
  • R. Lange
    ITER Organization, St. Paul lez Durance, France
  • G. Weiss
    ESS, Lund, Sweden
 
  The Phoebus toolkit consists of a variety of control system applications providing user interfaces to control systems and middle-layer services. Phoebus is the latest incarnation of Control System Studio (CS-Studio), which has been redesigned, replacing the underlying Eclipse RCP framework with standard Java alternatives like SPI, preferences, etc. Additionally, the GUI toolkit was switched from SWT to JavaFX. This new architecture has not only simplified the development process while preserving the extensible and pluggable aspects of RCP, but also improved the performance and reliability of the entire toolkit. The Phoebus technology stack includes a set of middle-layer services that provide functionality like archiving, creating and restoring system snapshots, consolidating and organizing alarms, user logging, name lookup, etc. Designed around modern and widely used web and storage technologies like Spring Boot, Elastic, MongoDB, and Kafka, the Phoebus middle-layer services are thin, scalable, and can be easily incorporated in CI/CD pipelines. The clients in Phoebus leverage the toolkit’s integration features, including common interfaces and utility services like adapter and selection, to provide users with a seamless experience when interacting with multiple services and control systems. This presentation aims to provide an overview of the Phoebus technology stack, highlighting the benefits of integrated tools in Phoebus and the microservices architecture of the Phoebus middle-layer services.  
poster icon Poster TUSDSC08 [0.816 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TUSDSC08  
About • Received ※ 06 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 23 November 2023 — Issued ※ 30 November 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE1BCO03 Design of the HALF Control System controls, network, timing, operation 958
 
  • G. Liu, L.G. Chen, C. Li, X.K. Sun, K. Xuan, D.D. Zhang
    USTC/NSRL, Hefei, Anhui, People’s Republic of China
 
  The Hefei Advanced Light Facility (HALF) is a 2.2-GeV 4th-generation synchrotron radiation light source scheduled to start construction in Hefei, China in 2023. HALF comprises an injector, a 480-m diffraction-limited storage ring, and 10 beamlines for phase one. The HALF control system is EPICS based, with integrated application and data platforms for the entire facility, including accelerator and beamlines. A unified infrastructure and network architecture are designed to build the control system. The infrastructure provides resources for EPICS development and operation through virtualization technology, and resources for the storage and processing of experimental data through distributed storage and computing clusters. The network is divided into the control network and a dedicated high-speed data network by physical separation; the control network is subdivided into multiple subnets by VLAN technology. Based on estimates of the scale of the control system, the 10-Gbps control backbone network and the data network, which can be expanded to 100 Gbps, fully meet the communication requirements of the control system. This paper reports the control system architecture design and the development of some key technologies in detail.  
slides icon Slides WE1BCO03 [2.739 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE1BCO03  
About • Received ※ 02 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 26 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE1BCO04 The LCLS-II Experiment System Vacuum Controls Architecture vacuum, controls, interface, experiment 962
 
  • M. Ghaly, T.A. Wallace
    SLAC, Menlo Park, California, USA
 
  Funding: This work is supported by Department of Energy contract DE-AC02-76SF00515.
The LCLS-II Experiment System Vacuum Controls Architecture is a collection of vacuum system design templates, interlock logics, supported components (e.g. gauges, pumps, valves), interface I/O, and associated software libraries which implement a baseline functionality and simulation. The architecture also includes a complement of engineering and deployment tools, including cable test boxes or hardware simulators, as well as some automatic configuration tools. Vacuum controls at LCLS span from rough vacuum in complex pumping manifolds, to protection of highly sensitive x-ray optics using fast shutters, to maintenance of ultra-high vacuum in experimental sample delivery setups, and beyond. Often, the vacuum standards for LCLS systems exceed what most vendors are experienced with. The system must maintain high availability while also remaining flexible and handling ongoing modifications. This paper will review the comprehensive architecture and the requirements of the LCLS systems, and introduce how to use the architecture for new vacuum system designs. The architecture is meant to influence all phases of a vacuum system lifecycle, and ideally could become a shared project for installations beyond LCLS-II.
 
slides icon Slides WE1BCO04 [3.154 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE1BCO04  
About • Received ※ 31 October 2023 — Revised ※ 20 November 2023 — Accepted ※ 08 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE1BCO07 The LCLS-II Precision Timing Control System laser, timing, controls, interface 966
 
  • T.K. Johnson, M.C. Browne, C.B. Pino
    SLAC, Menlo Park, California, USA
 
  The LCLS-II precision timing system is responsible for the synchronization of optical lasers with the LCLS-II XFEL. The system uses both RF and optical references for synchronization. In contrast to previous systems used at LCLS the optical lasers are shared resources, and must be managed during operations. The timing system consists of three primary functionalities: RF reference distribution, optical reference distribution, and a phase-locked loop (PLL). This PLL may use either the RF or the optical reference as a feedback source. The RF allows for phase comparisons over a relatively wide range, albeit with limited resolution, while the optical reference enables very fine phase comparison (down to attoseconds), but with limited operational range. These systems must be managed using high levels of automation. Much of this automation is done via high-level applications developed in EPICS. The beamline users are presented with relatively simple interfaces that streamline operation and abstract much of the system complexity away. The system provides both PyDM GUIs as well as python interfaces to enable time delay scanning in the LCLS-II DAQ.  
slides icon Slides WE1BCO07 [3.734 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE1BCO07  
About • Received ※ 06 November 2023 — Revised ※ 09 November 2023 — Accepted ※ 14 December 2023 — Issued ※ 20 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE2BCO04 Maintaining a Hybrid Control System at ISIS with a Vsystem/EPICS Bridge controls, software, hardware, target 986
 
  • K.R.L. Baker, I.D. Finch, M. Romanovschi
    STFC/RAL/ISIS, Chilton, Didcot, Oxon, United Kingdom
 
  The migration of the controls system for the ISIS accelerator from Vsystem to EPICS presents a significant challenge and risk to day-to-day operations. To minimise this impact throughout the transition, a software bridge between the two control systems has been developed that allows the phased porting of HMIs and hardware. The hybrid Vsystem and EPICS system also allows the continued use of existing feedback control applications that now require interaction between both control systems, for example the halo steering operation in Target Station 1. This work describes the implementation of this bridge, referred to as PVEcho, for the mapping of Vsystem channels to EPICS PVs and vice versa. The position within the wider ISIS controls software stack is outlined as well as how it utilises Python libraries for EPICS. Finally, we will discuss the software development practices applied that have allowed the bridge to run reliably for months at a time.  
slides icon Slides WE2BCO04 [2.757 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE2BCO04  
About • Received ※ 05 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 11 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE2BCO05 Continuous Modernization of Control Systems for Research Facilities controls, network, software, operation 993
 
  • K. Vodopivec, K.S. White
    ORNL, Oak Ridge, Tennessee, USA
 
  Funding: This work was supported by the U.S. Department of Energy under contract DE-AC0500OR22725.
The Spallation Neutron Source at Oak Ridge National Laboratory has been in operation since 2006. In order to achieve the high operating reliability and availability mandated by the sponsor, all systems participating in the production of neutrons need to be maintained to the highest achievable standard. This includes the SNS integrated control system, comprising specialized hardware and software as well as computing and networking infrastructure. While machine upgrades extend the control system with new and modern components, the established part of the control system requires continuous modernization due to hardware obsolescence, the limited lifetime of electronic components, and software updates that can break backwards compatibility. This article discusses the challenges of sustaining control system operations through decades of a facility lifecycle and presents a methodology for continuous control system improvement, developed at SNS by analyzing operational data and experience.
 
slides icon Slides WE2BCO05 [1.484 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE2BCO05  
About • Received ※ 05 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE2BCO06 EPICS Deployment at Fermilab controls, Linux, network, software 997
 
  • P.M. Hanlet, J.S. Diamond, M. Gonzalez, K.S. Martin
    Fermilab, Batavia, Illinois, USA
 
  Fermilab has traditionally not been an EPICS house; as such, expertise in EPICS is limited and scattered. However, PIP-II will be using EPICS for its control system. Furthermore, when PIP-II is operating, it must interface with the existing, though modernized (see ACORN), legacy control system. We have developed and deployed a software pipeline that addresses these needs and presents to developers a tested and robust software framework, including template IOCs from which new developers can quickly gain experience. In this presentation, we will discuss the motivation for this work, the implementation of a continuous integration/continuous deployment pipeline, testing, template IOCs, and the deployment of user applications. We will also discuss how this is used with the current PIP-II test stand and lessons learned.  
slides icon Slides WE2BCO06 [2.860 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE2BCO06  
About • Received ※ 06 October 2023 — Revised ※ 23 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
WE3BCO01 Modular and Scalable Archiving for EPICS and Other Time Series Using ScyllaDB and Rust database, FEL, MMI, operation 1008
 
  • D. Werder, T. Humar
    PSI, Villigen PSI, Switzerland
 
  At PSI we currently run too many different products with the common goal of archiving timestamped data. These include the EPICS Channel Archiver as well as the Archiver Appliance for EPICS IOCs, a buffer storage for beam-synchronous data at SwissFEL, and more. This number of monolithic solutions is too large to maintain, and they overlap in functionality. Each solution brings its own storage engine, file format and centralized design, which is hard to scale. In this talk I report on how we factored the system into modular components with clean interfaces. At the core, the different storage engines and file formats have been replaced by ScyllaDB, an open-source product with enterprise support and remarkable adoption in industry. We gain from its distributed, fault-tolerant and scalable design. The ingest of data into ScyllaDB is factored into components according to the protocols of the different sources, e.g. Channel Access. Here we build upon the Rust language and achieve robust, maintainable and performant services. One interface to access and process the recorded data is the HTTP retrieval service, which offers search among the channels by various criteria, as well as full event data and aggregated and binned data in either JSON or binary formats. This service can also run user-defined data transformations and act as a source for Grafana for a first view into recorded channel data. Our setup for SwissFEL ingests ~370k EPICS updates/s from ~220k PVs (scalar and waveform) with rates between 0.1 and 100 Hz.  
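The storage core of such a design can be pictured with a small CQL sketch. Everything here (keyspace, table and column names, the time-binning scheme) is an assumption for illustration, not the actual PSI schema:

```
-- Hypothetical CQL sketch of a time-partitioned event table for a
-- ScyllaDB-backed archiver.
CREATE TABLE archive.events_scalar_f64 (
    series_id bigint,      -- one id per archived channel/PV
    ts_bin    int,         -- coarse time bin, keeps partitions bounded
    ts        timestamp,   -- event timestamp
    pulse     bigint,      -- beam-synchronous pulse id, if any
    value     double,
    PRIMARY KEY ((series_id, ts_bin), ts, pulse)
) WITH CLUSTERING ORDER BY (ts ASC, pulse ASC);
```

Partitioning on (series_id, ts_bin) spreads a channel's history across the cluster while bounding partition size; clustering on the timestamp makes the time-range scans needed by a retrieval and binning service efficient.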
slides icon Slides WE3BCO01 [1.179 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-WE3BCO01  
About • Received ※ 04 October 2023 — Revised ※ 09 November 2023 — Accepted ※ 14 December 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO01 Five years of EPICS 7 - Status Update and Roadmap controls, network, site, status 1087
 
  • R. Lange
    ITER Organization, St. Paul lez Durance, France
  • L.R. Dalesio, M.A. Davidsaver, G.S. McIntyre
    Osprey DCS LLC, Ocean City, USA
  • S.M. Hartman, K.-U. Kasemir
    ORNL, Oak Ridge, Tennessee, USA
  • A.N. Johnson, S. Veseli
    ANL, Lemont, Illinois, USA
  • H. Junkes
    FHI, Berlin, Germany
  • T. Korhonen, S.C.F. Rose
    ESS, Lund, Sweden
  • M.R. Kraimer
    Self Employment, Private address, USA
  • K. Shroff
    BNL, Upton, New York, USA
  • G.R. White
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported in part by the U.S. Department of Energy under contracts DE-AC02-76SF00515 and DE-AC05-00OR22725.
After its first release in 2017, EPICS version 7 has been introduced into production at several sites. The central feature of EPICS 7, the support of structured data through the new pvAccess network protocol, has been proven to work in large production systems. EPICS 7 facilitates the implementation of new functionality, including developing AI/ML applications in controls, managing large data volumes, interfacing to middle-layer services, and more. Other features like support for the IPv6 protocol and enhancements to access control have been implemented. Future work includes integrating a refactored API into the core distribution, adding modern network security features, as well as developing new and enhancing existing services that take advantage of these new capabilities. The talk will give an overview of the status of deployments, new additions to the EPICS Core, and an overview of its planned future development.
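To make "structured data" concrete: where Channel Access moves flat scalars and arrays, pvAccess moves whole structures (value plus alarm, timestamp and metadata) as one unit. The sketch below is a simplified, hypothetical stand-in for an NTScalar-like structure; the real normative types and wire format are defined by EPICS 7's pvData, not by this code.

```python
from dataclasses import dataclass, asdict, field

# Hypothetical, simplified stand-in for an NTScalar-like structure.
@dataclass
class Alarm:
    severity: int = 0
    status: int = 0
    message: str = ""

@dataclass
class TimeStamp:
    secondsPastEpoch: int = 0
    nanoseconds: int = 0

@dataclass
class NTScalarLike:
    value: float = 0.0
    alarm: Alarm = field(default_factory=Alarm)
    timeStamp: TimeStamp = field(default_factory=TimeStamp)

pv = NTScalarLike(value=3.14, alarm=Alarm(1, 3, "HIGH"))
d = asdict(pv)   # the whole structure travels as one unit, not a flat scalar
```

A middle-layer service receiving such a structure gets the value and its context atomically, which is what enables the AI/ML and large-data use cases mentioned above.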
 
slides icon Slides TH1BCO01 [0.562 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO01  
About • Received ※ 04 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 19 November 2023 — Issued ※ 24 November 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO02 Development of Laser Accelerator Control System Based on EPICS controls, laser, operation, proton 1093
 
  • Y. Xia, K.C. Chen, L.W. Feng, Z. Guo, Q.Y. He, F.N. Li, C. Lin, Q. Wang, X.Q. Yan, M.X. Zang, J. Zhao
    PKU, Beijing, People’s Republic of China
  • J. Zhao
    Peking University, Beijing, Haidian District, People’s Republic of China
 
  Funding: State Key Laboratory of Nuclear Physics and Technology, and Key Laboratory of HEDP of the Ministry of Education, CAPT, Peking University, Beijing 100871, China;
China’s Ministry of Science and Technology supports Peking University in constructing a proton radiotherapy device based on a petawatt (PW) laser accelerator. The control system’s functionality and performance are vital for the accelerator’s reliability, stability, and efficiency. The PW laser accelerator control system has a three-layer distributed architecture, including device control, front-end (input/output) control and central control (data management and human-machine interface) layers. The software platform primarily uses EPICS, supplemented by PLC, Python, and Java, while the hardware platform comprises industrial control computers, servers, and private cloud configurations. The control system incorporates various subsystems that manage the laser, target field, beamline, safety interlocks, conditions, synchronization, and functionalities related to data storage, display, and more. This paper presents a control system implementation suitable for laser accelerators, providing valuable insights for future laser accelerator control system development.
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO02  
About • Received ※ 04 October 2023 — Revised ※ 09 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH1BCO05 Diamond Light Source Athena Platform software, controls, framework, experiment 1115
 
  • J. Shannon, C.A. Forrester, K.A. Ralphs
    DLS, Oxfordshire, United Kingdom
 
  The Athena Platform aims to replace, upgrade and modernise the capabilities of Diamond Light Source’s acquisition and controls tools, providing an environment for better integration with information management and analysis functionality. It is a service-based experiment orchestration system built on top of NSLS-II’s Python-based Bluesky/Ophyd data collection framework, providing a managed and extensible software deployment local to the beamline. By using industry-standard infrastructure provision, security and interface technologies we hope to provide a sufficiently flexible and adaptable platform to meet the wide spectrum of science use cases and beamline operation models in a reliable and maintainable way. In addition to a system design overview, we describe here some initial test deployments of core capabilities to a number of Diamond beamlines, as well as some of the technologies developed to support the overall delivery of the platform.  
slides icon Slides TH1BCO05 [1.409 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH1BCO05  
About • Received ※ 05 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 16 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH2AO05 Secure Role-Based Access Control for RHIC Complex controls, operation, software, network 1150
 
  • A. Sukhanov, J. Morris
    BNL, Upton, New York, USA
 
  Funding: Work supported by Brookhaven Science Associates, LLC under Contract No. DE-SC0012704 with the U.S. Department of Energy.
This paper describes the requirements, design, and implementation of Role-Based Access Control (RBAC) for the RHIC Complex. The system is designed to protect against accidental, unauthorized access to equipment of the RHIC Complex, but it can also provide significant protection against malicious attacks. Role assignment is dynamic. Roles are primarily based on user ID, but elevated roles may be assigned for limited periods of time. Protection at the device manager level may be provided for an entire server or for individual device parameters. A prototype version of the system has been deployed at the RHIC Complex since 2022. Authentication is performed on a dedicated device manager, which generates an encrypted token based on user ID, expiration time, and role level. Device managers are equipped with an authorization mechanism, which supports three methods of authorization: Static, Local and Centralized. Transactions with the token manager take place ’atomically’, during secured set() or get() requests. The system has small overhead: ~0.5 ms for token processing and ~1.5 ms for network round trip. Only Python-based device managers participate in the prototype system. Testing has begun with C++ device managers, including those that run on VxWorks platforms. For easy transition, dedicated intermediate shield managers can be deployed to protect access to device managers which do not directly support authorization.
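As a simplified illustration of the token scheme: the paper's tokens are encrypted, whereas this sketch substitutes an HMAC-signed token, and the names, payload layout and key handling are invented for the example. A token manager could issue and check expiring role tokens roughly like this:

```python
import base64, hashlib, hmac, json, time

SECRET = b"site-shared-secret"   # stand-in for the real key material

def issue_token(user, role, ttl_s, now=None):
    """Issue a signed token carrying user id, role level and expiry."""
    now = time.time() if now is None else now
    payload = json.dumps({"user": user, "role": role, "exp": now + ttl_s}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(payload) + b"." + base64.urlsafe_b64encode(sig)

def check_token(token, now=None):
    """Return the claims if the signature is valid and unexpired, else None."""
    now = time.time() if now is None else now
    p64, s64 = token.split(b".")
    payload, sig = base64.urlsafe_b64decode(p64), base64.urlsafe_b64decode(s64)
    good = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, good):
        return None
    claims = json.loads(payload)
    return claims if claims["exp"] > now else None
```

A device manager would run such a check inside each secured set()/get(), which is consistent with the sub-millisecond token-processing overhead quoted above.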
 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH2AO05  
About • Received ※ 04 October 2023 — Revised ※ 14 November 2023 — Accepted ※ 19 December 2023 — Issued ※ 22 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH2BCO02 Open Source EtherCAT Motion Control Rollout for Motion Applications at SLS-2.0 Beamlines controls, PLC, hardware, framework 1166
 
  • A.S. Acerbo, T. Celcer, A. Sandström
    PSI, Villigen PSI, Switzerland
 
  The SLS-2.0 upgrade project comprises a new storage ring and magnet lattice and will improve emittance and brightness by two orders of magnitude. Paired with these upgrades is a generational upgrade of the motion control system, away from VME-based hardware and towards a more modern framework. For SLS-2.0 beamlines, the EtherCAT Motion Control (ECMC) open-source framework has been chosen as the de facto beamline motion control system for simple motion, analog/digital input/output and simple data collection. The ECMC framework comprises a feature-rich implementation of the EtherCAT protocol and supports a broad range of Beckhoff hardware, with the ability to add further EtherCAT devices. ECMC provides soft-PLC functionality supported by the C++ Mathematical Expression Toolkit Library (ExprTk), which runs at a fixed frequency on the EtherCAT master at a rate up to the EtherCAT frame rate. This PLC approach allows for implementing complex motion, such as forward and backward kinematics of multi-positioner systems, i.e. roll, yaw, and pitch in a 5-axis mirror system. Additional logic can be loaded in the form of plugins written in C. Further work is ongoing to provide flexible Position Compare functionality at a frequency of 1 kHz coupled with event triggering as a way to provide basic fly-scan functionality for medium-performance applications with the use of standardized SLS-2.0 beamline hardware. We provide an overview of these and related ECMC activities currently ongoing for the SLS-2.0 project.  
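The forward/backward kinematics idea can be illustrated with a minimal, hypothetical two-jack mirror: two vertical positioners a distance d apart map to a (height, pitch) coordinate pair and back. In ECMC such transformations are written as ExprTk expressions; the Python below is only a sketch of the underlying math.

```python
import math

# Hypothetical two-jack mirror: vertical positioners y1, y2
# separated by a horizontal distance d support the mirror.
def forward(y1, y2, d):
    """Positioner space -> user space (height, pitch)."""
    height = (y1 + y2) / 2.0
    pitch = math.atan2(y2 - y1, d)
    return height, pitch

def backward(height, pitch, d):
    """User space (height, pitch) -> positioner space (y1, y2)."""
    dy = math.tan(pitch) * d / 2.0
    return height - dy, height + dy
```

Moving "pitch" as a single virtual axis then resolves to coordinated moves of both real positioners via backward().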
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH2BCO02  
About • Received ※ 06 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 12 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH2BCO03 The LCLS-II Experiment Control System controls, PLC, experiment, vacuum 1172
 
  • T.A. Wallace, D.L. Flath, M. Ghaly, T.K. Johnson, K.R. Lauer, Z.L. Lentz, R.S. Tang-Kong, J. Yin
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by the U.S. Department of Energy under contract number DE-AC02-76SF00515.
The Linac Coherent Light Source (LCLS) has been undergoing upgrades for several years now through at least two separate major projects: LCLS-II, a DOE 413.3b project responsible for upgrading the accelerator, undulators and some front-end beam delivery systems, and the LCLS-II Strategic Initiative (L2SI) project, which assumed responsibility for upgrading the experiment endstations to fully utilize the new XFEL machine capabilities to be delivered by LCLS-II. Both projects included scope to design, install and commission a control system prepared to handle the risks associated with the tenfold increase in beam power we will eventually achieve. This paper provides an overview of the new control system architecture from the LCLS-II and L2SI projects and the status of its commissioning.
 
slides icon Slides TH2BCO03 [2.700 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH2BCO03  
About • Received ※ 04 November 2023 — Accepted ※ 11 December 2023 — Issued ※ 16 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
TH2BCO06 The SNS PLC Based Controls Solution for Stepper Motors controls, PLC, hardware, Ethernet 1187
 
  • D.C. Williams, F.C. Medio
    ORNL, Oak Ridge, Tennessee, USA
 
  Funding: SNS is managed by UT-Battelle, LLC, under contract DE-AC05-00OR22725 for the U.S. Department of Energy
The Spallation Neutron Source (SNS) at Oak Ridge National Laboratory has been operating for over 15 years, and many electronic components are now obsolete and require replacement to assure reliability and sustainability. SNS uses stepper motors to control accelerator components throughout the facility, including the cryomodule tuners, beam scrapers, and the primary and secondary stripper foils. The original motor controls were implemented with VME controllers, custom power supplies, and various types of motor drivers. As these components became less reliable and obsolete, a new control solution was needed that could be applied to multiple motion control systems. Fast performance requirements are not crucial for these stepper motors, so PLC technology was selected. The first system replaced was the Ring stripper foil control system, and plans are underway to replace the beam scrapers. This paper provides an overview of the commercial off-the-shelf (COTS) hardware used to control stepper motors at SNS. Details of the design, and the challenges of converting a control system during short maintenance periods without disrupting beam operation, are also covered.
 
slides icon Slides TH2BCO06 [1.914 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-TH2BCO06  
About • Received ※ 19 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 25 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO08 whatrecord: A Python-Based EPICS File Format Tool database, controls, HOM, PLC 1206
 
  • K.R. Lauer
    SLAC, Menlo Park, California, USA
 
  Funding: This work is supported by Department of Energy contract DE-AC02-76SF00515.
whatrecord is a Python-based parsing tool for interacting with a variety of EPICS file formats, including R3 and R7 database files. The project aims for compliance with epics-base by using Lark grammars that closely reflect the original Lex/Yacc grammars. It offers a suite of tools for working with its supported file formats, with convenient Python-facing dataclass object representations and easy JSON serialization. A prototype backend web server for hosting IOC and record information is also included as well as a Vue.js-based frontend, an EPICS build system Makefile dependency inspector, a static analyzer-of-sorts for startup scripts, and a host of other things that the author added at whim to this side project.
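whatrecord itself uses Lark grammars that mirror the epics-base Lex/Yacc grammars; as a much-simplified illustration of the idea (parse record instances into dataclasses, then serialize to JSON), a toy regex-based parser might look like the following. The record syntax shown is standard EPICS database syntax, but the parser here is only a sketch and handles none of the edge cases a real grammar must.

```python
import re, json
from dataclasses import dataclass, field, asdict

@dataclass
class Record:
    rtype: str
    name: str
    fields: dict = field(default_factory=dict)

REC = re.compile(r'record\s*\(\s*(\w+)\s*,\s*"([^"]+)"\s*\)\s*\{([^}]*)\}', re.S)
FLD = re.compile(r'field\s*\(\s*(\w+)\s*,\s*"([^"]*)"\s*\)')

def parse_db(text):
    """Extract record instances and their fields from a .db snippet."""
    return [Record(rtype, name, dict(FLD.findall(body)))
            for rtype, name, body in REC.findall(text)]

db = '''
record(ai, "TEMP:1") {
    field(DESC, "Vessel temperature")
    field(EGU, "C")
}
'''
recs = parse_db(db)
print(json.dumps(asdict(recs[0])))
```

The dataclass-plus-JSON round trip is the part that matches whatrecord's convenient Python-facing representations.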
 
slides icon Slides THMBCMO08 [1.442 MB]  
poster icon Poster THMBCMO08 [1.440 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO08  
About • Received ※ 03 October 2023 — Revised ※ 24 October 2023 — Accepted ※ 14 December 2023 — Issued ※ 21 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO10 SECoP Integration for the Ophyd Hardware Abstraction Layer hardware, interface, controls, status 1212
 
  • P. Wegmann, K. Kiefer, O. Mannix, L. Rossa, W. Smith
    HZB, Berlin, Germany
  • E. Faulhaber
    MLZ, Garching, Germany
  • M. Zolliker
    PSI, Villigen PSI, Switzerland
 
  At the core of the Bluesky experimental control ecosystem, the ophyd hardware abstraction, a consistent high-level interface layer, is extremely powerful for complex device integration. It introduces the device data model to EPICS and eases integration of alien control protocols. This paper focuses on the integration of the Sample Environment Communication Protocol (SECoP)* into the ophyd layer, enabling seamless incorporation of sample environment hardware into beamline experiments at photon and neutron sources. The SECoP integration was designed to have a simple interface and provide plug-and-play functionality while preserving all metadata and structural information about the controlled hardware. Leveraging the self-describing characteristics of SECoP, automatic generation and configuration of ophyd devices is facilitated upon connecting to a Sample Environment Control (SEC) node. This work builds upon a modified SECoP client provided by the Frappy framework**, intended for programming SEC nodes with a SECoP interface. This paper presents an overview of the architecture and implementation of the ophyd-SECoP integration and includes examples for better understanding.
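The plug-and-play generation can be pictured as follows: the SEC node's self-description lists modules and their accessibles, and device objects are built from it automatically. The describe structure below only loosely mirrors SECoP terminology (modules/accessibles/datainfo); the classes are hypothetical and not the actual ophyd integration.

```python
# Hypothetical self-description as a SEC node might report it.
describe = {
    "modules": {
        "cryostat": {
            "accessibles": {
                "temperature": {"datainfo": {"type": "double", "unit": "K"},
                                "readonly": True},
                "target": {"datainfo": {"type": "double", "unit": "K"},
                           "readonly": False},
            }
        }
    }
}

class Parameter:
    def __init__(self, name, unit, readonly):
        self.name, self.unit, self.readonly = name, unit, readonly
        self.value = None

class Device:
    def __init__(self, name, params):
        self.name = name
        for p in params:            # expose each parameter as an attribute
            setattr(self, p.name, p)

def build_devices(desc):
    """Auto-generate device objects from a self-describing structure,
    in the spirit of creating ophyd devices on connecting to a SEC node."""
    return [Device(mod, [Parameter(n, a["datainfo"].get("unit", ""), a["readonly"])
                         for n, a in spec["accessibles"].items()])
            for mod, spec in desc["modules"].items()]
```

The key point is that no per-device Python class needs to be written by hand; the structure drives the object generation.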
*Klaus Kiefer et al. "An introduction to SECoP - the sample environment communication protocol".
**Markus Zolliker and Enrico Faulhaber url: https://github.com/sampleenvironment/Frappy.
 
slides icon Slides THMBCMO10 [0.596 MB]  
poster icon Poster THMBCMO10 [0.809 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO10  
About • Received ※ 05 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 14 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO11 Full Stack PLC to EPICS Integration at ESS PLC, controls, software, factory 1216
 
  • A. Rizzo, E.E. Foy, D. Hasselgren, A.Z. Horváth, A. Petrushenko, J.A. Quintanilla, S.C.F. Rose, A. Simelio
    ESS, Lund, Sweden
 
  The European Spallation Source is one of the largest science and technology infrastructure projects being built today. The control system at ESS is thus essential for the synchronisation and day-to-day running of all the equipment responsible for the production of neutrons for the experimental programs. The standardised PLC platform at ESS for handling slower signals comes from Siemens*, while faster data interchange with deterministic timing and higher processing power is handled by Beckhoff/EtherCAT**. All control systems based on the above technologies are integrated using the EPICS framework***. We will present how the full-stack integration from PLC to EPICS is done at ESS using our standard Configuration Management Ecosystem.
* https://www.siemens.com/global/en/products/automation/systems/industrial/plc.html
** https://www.beckhoff.com/en-en/products/i-o/ethercat/
*** https://epics-controls.org/
 
slides icon Slides THMBCMO11 [0.178 MB]  
poster icon Poster THMBCMO11 [0.613 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO11  
About • Received ※ 05 October 2023 — Revised ※ 25 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO21 Development of Standard MicroTCA Deployment at ESS controls, interface, ion-source, GUI 1238
 
  • F. Chicken, J.J. Jamróz, J.P.S. Martins
    ESS, Lund, Sweden
 
  At the European Spallation Source, over 300 MicroTCA systems will be deployed across the accelerator, target area and instruments. Covering integrations for RF, Beam Instrumentation, Machine Protection and Timing Distribution systems, ESS has developed a method to standardise the deployment of the basic MicroTCA system configuration using a combination of Python scripts and Ansible playbooks, with a view to ensuring long-term maintainability of the systems and future upgrades. Python scripts set up the MicroTCA Carrier Hub (MCH), registering it on the network and updating its firmware to the chosen version; Ansible playbooks register the Concurrent Technologies CPU on the ESS network and install the chosen Linux OS before a second playbook installs the ESS EPICS Environment (E3). This ensures that all new systems have identical setup procedures and all the necessary packages before on-site integration starts.  
slides icon Slides THMBCMO21 [0.686 MB]  
poster icon Poster THMBCMO21 [2.560 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO21  
About • Received ※ 05 October 2023 — Revised ※ 25 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 16 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO29 Motion Controls for ORNL Neutron Science Experimental Beamlines controls, HOM, software, experiment 1261
 
  • X. Geng, A. Groff, M.R. Pearson, G. Taufer
    ORNL, Oak Ridge, Tennessee, USA
 
  Funding: ORNL is managed by UT-Battelle, LLC, under contract DE-AC05-00OR22725 for the U. S. Department of Energy
This paper presents a comprehensive overview of the motion control systems employed within the neutron science user facilities at Oak Ridge National Laboratory (ORNL). The Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) at ORNL have a total of 35 neutron beam lines with numerous motors for motion control. The motion systems vary in complexity from a linear sample positioning stage to multi-axis end stations. To enhance the capabilities of these motion systems, a concerted effort has been made to establish standardized hardware and flexible software that improve performance, increase reliability and provide the capability for automated experiments. The report discusses the various motion controllers used, the EPICS-based IOCs (Input Output Controllers), high-level motion software, and plans for ongoing upgrades and new projects.
 
slides icon Slides THMBCMO29 [1.893 MB]  
poster icon Poster THMBCMO29 [6.483 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO29  
About • Received ※ 05 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 13 December 2023 — Issued ※ 22 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO30 Using ArUco Codes for Beam Spot Analysis with a Camera at an Unknown Position detector, HOM, MMI, controls 1264
 
  • W. Smith, M. Arce, M. Bär, M. Gorgoi, C.E. Jimenez, I. Rudolph
    HZB, Berlin, Germany
 
  Measuring the focus size and position of an X-ray beam at the interaction point of a synchrotron beamline is a critical parameter used when planning experiments and when determining whether a beamline is achieving its design goals. Commonly this is performed using a dedicated UHV "focus chamber" comprising a fluorescent screen at an adjustable calibrated distance from the mounting flange and a camera on the same axis as the beam. Having to install a large piece of hardware makes regular checks prohibitively time consuming. Alternatively, a fluorescent screen can be mounted to a sample holder and moved using a manipulator in the existing end-station, with a camera pointed at it showing a warped version of the beam spot at the interaction point. The warping of the image is caused by the relative position of the camera to the screen, which is difficult to determine and can change, and go out of camera focus, as the manipulator is moved. This paper proposes a solution to this problem using ArUco codes printed onto a fluorescent screen, which provide a reference in the image. Reference points from the ArUco codes are recovered from an image and used to correct the warping and provide a calibration in real time, using an EPICS areaDetector plugin based on OpenCV. This analysis is presently in commissioning and aims to characterise the beam spots at the dual-colour beamline of the EMIL laboratory at BESSY II.  
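The unwarping step can be sketched independently of the marker detection: once the marker corner positions are known in the image (here simply assumed given; in the real plugin OpenCV's ArUco detector finds them), a homography maps image coordinates back to the screen plane. A minimal direct-linear-transform (DLT) estimate with NumPy:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (4+ point pairs, DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of this system (last right-singular vector).
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Apply a homography to a 2-D point (homogeneous divide included)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

Given the known physical corner positions on the printed screen as dst, every image pixel can be remapped into calibrated screen coordinates, which is the correction the plugin performs in real time.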
slides icon Slides THMBCMO30 [4.674 MB]  
poster icon Poster THMBCMO30 [0.942 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO30  
About • Received ※ 16 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 22 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THMBCMO36 Video Compression for areaDetector detector, neutron, scattering, controls 1290
 
  • B.A. Sobhani
    ORNL, Oak Ridge, Tennessee, USA
 
  At neutron sources such as SNS and HFIR, neutrons collide with neutron detectors at a much lower rate than light would for an optical detector. Additionally, the image typically does not pan or otherwise move. This means that the incremental element-by-element differences between frames will be small. This makes neutron imaging data an ideal candidate for video-level compression where the incremental differences between frames are compressed and sent, as opposed to image-level compression where the entire frame is compressed and sent. This paper describes an EPICS video compression plugin for areaDetector that was developed at SNS.  
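The frame-delta idea is easy to sketch: transmit only the indices and values of the pixels that changed since the previous frame, which for sparse neutron counting data is a small set. This illustrates the principle only and is not the plugin's actual codec.

```python
import numpy as np

def encode_delta(prev, cur):
    """Encode only the pixels that changed since the previous frame."""
    idx = np.flatnonzero(cur != prev)     # flat indices of changed pixels
    return idx, cur.ravel()[idx]

def decode_delta(prev, idx, vals):
    """Reconstruct the current frame from the previous one plus the delta."""
    out = prev.copy().ravel()
    out[idx] = vals
    return out.reshape(prev.shape)
```

In practice the (index, value) stream would be further compressed; the point is that the per-frame payload scales with the count rate rather than the image size.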
slides icon Slides THMBCMO36 [0.312 MB]  
poster icon Poster THMBCMO36 [0.221 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THMBCMO36  
About • Received ※ 05 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 13 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP001 New Generation Qt Control Components for Hi Level Software controls, storage-ring, framework, TANGO 1291
 
  • G. Strangolino, G. Gaio, R. Passuello
    Elettra-Sincrotrone Trieste S.C.p.A., Basovizza, Italy
 
  A new generation of Qt graphical components, namely cumbia-qtcontrols-ng, is under development at ELETTRA. A common engine allows each component to be rendered on traditional QWidgets and scalable QGraphicsItems alike. The latter technology makes it possible to integrate live controls with static SVG in order to realize any kind of synoptic with touch and scaling capabilities. A pluggable zoomer can be installed on any widget or graphics item. Numeric apply controls and Cartesian and circular (radar) plots are the first components realized.  
poster icon Poster THPDP001 [0.935 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP001  
About • Received ※ 29 September 2023 — Revised ※ 14 November 2023 — Accepted ※ 20 December 2023 — Issued ※ 20 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP013 EPICS Integration for Rapid Control Prototyping Hardware from Speedgoat hardware, controls, real-time, interface 1317
 
  • L. Rossa, M. Brendike
    HZB, Berlin, Germany
 
  To exploit the full potential of fourth-generation synchrotron sources, new beamline instrumentation is increasingly developed with a mechatronics approach [*,**,***]. Implementing this approach raises the need for Rapid Control Prototyping (RCP) and Hardware-In-the-Loop (HIL) simulations. To integrate such RCP and HIL systems into every-day beamline operation, we developed an interface from a Speedgoat real-time performance machine - programmable via MATLAB Simulink - to EPICS. The interface was developed to be simple to use and still flexible. The Simulink software developer uses dedicated Simulink blocks to export model information and real-time data into structured UDP Ethernet frames. The corresponding EPICS IOC listens to the UDP frames and auto-generates a database file to fit the data stream from the Simulink model. The EPICS IOC can run either on a beamline measurement PC or, to keep things spatially close, on a mini PC (such as a Raspberry Pi) attached to the Speedgoat machine. An overview of the interface idea, architecture and implementation, together with some simple examples, will be presented.
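As a sketch of the frame exchange (the real layout is defined by the interface's Simulink blocks; the header fields and byte order here are invented for illustration), a structured UDP payload can be packed and parsed with the struct module:

```python
import struct

# Hypothetical frame layout: uint32 model id, uint32 sample counter,
# followed by one little-endian float64 per signal.
HEADER = struct.Struct("<II")

def pack_frame(model_id, counter, values):
    """Serialize one sample of real-time model data into a binary frame."""
    return HEADER.pack(model_id, counter) + struct.pack(f"<{len(values)}d", *values)

def unpack_frame(frame, nsignals):
    """Parse a frame back into (model id, counter, signal values)."""
    model_id, counter = HEADER.unpack_from(frame)
    values = struct.unpack_from(f"<{nsignals}d", frame, HEADER.size)
    return model_id, counter, list(values)
```

On the IOC side, each received datagram would be parsed this way and the values pushed into the auto-generated records.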
* https://doi.org/10.18429/JACoW-MEDSI2016-MOPE19
** https://doi.org/10.18429/JACoW-ICALEPCS2019-TUCPL05
*** https://orbi.uliege.be/bitstream/2268/262789/1/TUIO02.pdf
 
poster icon Poster THPDP013 [1.143 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP013  
About • Received ※ 29 September 2023 — Revised ※ 25 October 2023 — Accepted ※ 13 December 2023 — Issued ※ 18 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP020 Management of EPICS IOCs in a Distributed Network Environment Using Salt controls, monitoring, network, hardware 1340
 
  • E. Blomley, J. Gethmann, A.-S. Müller, M. Schuh
    KIT, Karlsruhe, Germany
  • S. Marsching
    Aquenos GmbH, Baden-Baden, Germany
 
  An EPICS-based control system typically consists of many individual IOCs, which can be distributed across many computers in a network. Managing hundreds of deployed IOCs, keeping track of where they are running, and providing operators with basic interaction capabilities can easily become a maintenance nightmare. At the Institute for Beam Physics and Technology (IBPT) of the Karlsruhe Institute of Technology (KIT), we operate separate networks for our accelerators KARA and FLUTE and use the Salt Project to manage the IT infrastructure. Custom Salt states take care of deploying our IOCs across multiple servers directly from the code repositories, integrating them into the host operating system and monitoring infrastructure. In addition, this allows the integration into our GUI in order to enable operators to monitor and control the process for each IOC without requiring any specific knowledge of where and how that IOC is deployed. Therefore, we can maintain and scale to any number of IOCs on any number of hosts nearly effortlessly. This paper presents the design of this system, discusses the tools and overall setup required to make it work, and shows off the integration into our GUI and monitoring systems.  
poster icon Poster THPDP020 [0.431 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP020  
About • Received ※ 04 October 2023 — Accepted ※ 10 December 2023 — Issued ※ 14 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP028 Particle Swarm Optimization Techniques for Automatic Beam Transport at the Lnl Superconducting Linac Accelerators controls, cavity, beam-transport, linac 1370
 
  • M. Montis, L. Bellan
    INFN/LNL, Legnaro (PD), Italy
 
  The superconductive quarter-wave-cavity hadron Linac ALPI is the final acceleration stage at the Legnaro National Laboratories, and it is going to be used as the re-acceleration line of the radioactive ion beams for the SPES (Selective Production of Exotic Species) project. The Linac was designed in the ’90s with the techniques available then, and it was one of the peak technologies of its kind in Europe at the time, controls included. In the last decade, the controls of all the functional systems composing the accelerator have been upgraded to an EPICS-based solution. This upgrade has given us the opportunity to design and test new possible solutions for automatic beam transport. The work described in this paper is based on the experience and results (in terms of time, costs, and manpower) obtained using Particle Swarm Optimization (PSO) techniques for beam transport optimization applied to the ALPI accelerator. Due to the flexibility and robustness of this method, this tool will be extended to other parts of the facility.  
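For readers unfamiliar with the method, a minimal PSO loop is sketched below on a toy objective. The actual beam-transport objective (for instance, transmitted beam current as a function of magnet settings) and the authors' implementation are of course not this code; this only shows the canonical update rule.

```python
import random

def pso(f, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm minimiser for an objective f over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```

The appeal for beam transport is that f can be any black-box figure of merit read back from the machine; no gradient is required.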
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP028  
About • Received ※ 06 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 10 December 2023 — Issued ※ 16 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP029 Alpi-Piave Beam Transport Control System Upgrade at Legnaro National Laboratories controls, power-supply, beam-transport, Ethernet 1374
 
  • M. Montis, F. Gelain, M.G. Giacchini
    INFN/LNL, Legnaro (PD), Italy
 
  During the last decade, the control system employed for the ALPI and PIAVE accelerators was upgraded to the new EPICS-based framework as part of the new standards adopted in the SPES project under construction in Legnaro. The control for beam transport was fully completed in 2015 and has been in production since then. Due to the power supply upgrade, and to optimize costs and maintenance time, the original controllers based on industrial PCs were substituted with dedicated serial-over-Ethernet devices and Virtual Machines (VMs). In this work we describe the solution designed and implemented for the ALPI-PIAVE accelerators.  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP029  
About • Received ※ 18 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 18 December 2023 — Issued ※ 21 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP030 ESS Drift Tube Linac Control System Commissioning: Results and Lessons Learned controls, DTL, hardware, site 1377
 
  • M. Montis, L. Antoniazzi, A. Baldo, M.G. Giacchini
    INFN/LNL, Legnaro (PD), Italy
  • A. Rizzo
    ESS, Lund, Sweden
 
  The European Spallation Source (ESS) will be a neutron source using a proton Linac with an expected beam power of 5 MW. Designed and implemented by INFN-LNL, the Drift Tube Linac (DTL) control system is based on the EPICS framework, as indicated by the project requirements. This document describes the results of the first part of the control system commissioning stage in 2022, where the INFN and ESS teams were involved in the final tests on site. This phase was the first step toward a complete deployment of the control system, whose installation is composed of three sequential stages according to the apparatus commissioning schedule. In this scenario, the first Site Acceptance Test (SAT) and Site Integrated Test (SIT) were crucial, and their results were the milestones for the other stages: the lessons learned can be important to speed up the future integration, calibration, and tuning of such a complex control system.

 
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP030  
About • Received ※ 18 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 26 October 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP032 Introduction of the Ethernet-Based Field Networks to Inter-Device Communication for RIBF Control System Ethernet, network, controls, PLC 1384
 
  • A. Uchiyama, N. Fukunishi, M. Komiyama
    RIKEN Nishina Center, Wako, Japan
 
  Internet Protocol (IP) networks are widely used to remotely control measurement instruments and controllers. In addition to proprietary protocols, common commands such as the standard commands for programmable instruments (SCPI) are used by manufacturers of measuring instruments. Many IP-network-based devices have been used in RIBF control systems constructed using the experimental physics and industrial control system (EPICS); these are commercial devices designed and developed independently. EPICS input/output controllers (IOCs) usually establish socket communications to send commands to IP-network-based devices. However, in the RIBF control system, reconnection between the EPICS IOC and the device is often not established after the loss of socket communication due to an unexpected power failure of the device or network switch. In this case, it is often difficult to determine whether the socket connection to the EPICS IOC is broken even after checking the communication by pinging. Using Ethernet as the field network in the physical layer between the device and EPICS IOC can solve these problems. Therefore, we are considering the introduction of field networks such as EtherCAT and Ethernet/IP, which use Ethernet in the physical layer. In the implementation of the prototype system, EPICS IOCs and devices are connected via EtherCAT and Soft PLCs are run on the machine running EPICS IOCs for sequence control.  
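  The failure mode above, an IOC holding a socket that is silently dead, can also be mitigated in client code. A minimal stdlib sketch (the newline-terminated ASCII protocol and the loopback "device" are hypothetical, not RIBF equipment):

```python
import socket
import threading


class ReconnectingClient:
    """TCP client that rebuilds its socket when a query fails.

    Sketches the application-level cure for the failure mode described
    above: a ping cannot reveal that the TCP session an IOC holds is
    dead, but a failed send, an empty read, or a timeout can, and the
    client can then reconnect transparently.
    """

    def __init__(self, host, port, timeout=2.0):
        self.host, self.port, self.timeout = host, port, timeout
        self.sock = None

    def query(self, command, retries=2):
        for _ in range(retries + 1):
            try:
                if self.sock is None:
                    self.sock = socket.create_connection(
                        (self.host, self.port), timeout=self.timeout)
                self.sock.sendall(command.encode() + b"\n")
                reply = self.sock.recv(4096)
                if not reply:                     # peer closed the session
                    raise ConnectionError("peer closed connection")
                return reply.decode().strip()
            except (OSError, ConnectionError):
                if self.sock is not None:
                    self.sock.close()
                self.sock = None                  # reconnect on next pass
        raise ConnectionError("device unreachable after retries")


def toy_device(server_sock):
    """Answer one request per connection, then hang up (the worst case
    for a client that assumes a persistent session)."""
    while True:
        conn, _ = server_sock.accept()
        with conn:
            data = conn.recv(4096)
            if data:
                conn.sendall(b"OK " + data.strip())


srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen()
threading.Thread(target=toy_device, args=(srv,), daemon=True).start()

client = ReconnectingClient("127.0.0.1", srv.getsockname()[1])
first = client.query("*IDN?")
```

  Every query after the first exercises the reconnect path, since the toy device drops each session; a field-bus solution such as EtherCAT removes the problem at a lower layer instead.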
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP032  
About • Received ※ 06 October 2023 — Revised ※ 11 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP033 Multi-User Virtual Accelerator at HEPS for High-Level Application Development and Beam Commissioning MMI, linac, framework, controls 1388
 
  • P. Zhu, Y. Jiao, J.Y. Li, N. Li, C. Meng, Y.M. Peng, G. Xu
    IHEP, Beijing, People’s Republic of China
  • X.H. Lu
    IHEP CSNS, Guangdong Province, People’s Republic of China
 
  At High Energy Photon Source (HEPS), a multi-user virtual accelerator system has been developed for testing the high-level applications (HLA) and simulating the effects of various errors on the results of beam commissioning. The virtual accelerator is based on the Pyapas development framework for HLA and is designed using a client/server (C/S) architecture. It uses Ocelot with custom multipole field models for physical calculations and supports error simulation for various magnets and beam instrumentation and diagnostics devices. Calculation results are published externally through EPICS PV channels. The multi-user virtual accelerator system was developed to meet the needs of different users within the same network segment who need to simultaneously call the virtual accelerator for software debugging and simulation research. Each user can open a unique virtual accelerator without affecting others, and can also start different virtual accelerators for different research content. The number of virtual accelerators that can be opened is not limited. The entire virtual accelerator system can be switched on and off as easily as opening an app, greatly simplifying its use. This article provides a detailed description of the design concept and implementation of the multi-user virtual accelerator system.  
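  The per-user isolation described above can be sketched as a small registry that hands out unique PV prefixes and ports. This illustrates only the pattern; the names are hypothetical and this is not the Pyapas API:

```python
import itertools
import threading


class VirtualAcceleratorPool:
    """Hand each user an isolated virtual accelerator instance.

    Every (user, purpose) pair gets its own PV prefix and port, so one
    user's simulation never disturbs another's, and the same user can
    run several instances for different studies.
    """

    def __init__(self, base_port=40000):
        self._ports = itertools.count(base_port)
        self._lock = threading.Lock()
        self._instances = {}

    def open(self, user, purpose="default"):
        """Return (creating if needed) this user's instance descriptor."""
        key = (user, purpose)
        with self._lock:
            if key not in self._instances:
                self._instances[key] = {
                    "pv_prefix": f"VA:{user}:{purpose}:",
                    "port": next(self._ports),
                }
            return self._instances[key]

    def close(self, user, purpose="default"):
        """Shut down one instance, leaving all others untouched."""
        with self._lock:
            self._instances.pop((user, purpose), None)


pool = VirtualAcceleratorPool()
alice = pool.open("alice")
```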
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP033  
About • Received ※ 11 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 13 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP036 Research on HALF Historical Data Archiver Technology database, controls, experiment, distributed 1394
 
  • X.K. Sun, D.D. Zhang
    USTC/NSRL, Hefei, Anhui, People’s Republic of China
  • H. Chen
    USTC, SNST, Anhui, People’s Republic of China
 
  The Hefei Advanced Light Facility (HALF) is a 2.2 GeV fourth-generation synchrotron radiation light source, scheduled to start construction in Hefei, China in 2023. HALF comprises an injector, a 480 m diffraction-limited storage ring, and 10 beamlines for phase one. The HALF historical data archiver system is responsible for storing operation data for the entire facility, including the accelerator and beamlines. It is necessary to choose a high-performance database for the massive structured data generated by HALF. A fair test platform was designed and built to test the performance of six databases commonly used in the accelerator field. The test metrics include read and write performance, availability, scalability, and software ecosystem. This paper introduces the design of the database test scheme, the construction of the test platform, and the future test plan in detail.  
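  The read/write metrics mentioned above can be illustrated with a generic benchmark harness; sqlite3 serves here only as a stand-in backend, since the six candidate databases are not named in the abstract:

```python
import sqlite3
import time


def benchmark_archive(db_path=":memory:", n_rows=10_000):
    """Time bulk writes and a filtered read of archiver-style rows.

    A stand-in for the kind of measurement the HALF test platform makes:
    the same harness shape can wrap any backend that exposes insert and
    select operations.
    """
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS samples "
                "(ts REAL, pv TEXT, value REAL)")
    # Synthetic (timestamp, PV name, value) samples, 100 distinct PVs.
    rows = [(i * 0.001, f"PV:{i % 100}", float(i)) for i in range(n_rows)]

    t0 = time.perf_counter()
    con.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
    con.commit()
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    count = con.execute("SELECT COUNT(*) FROM samples "
                        "WHERE pv = 'PV:7'").fetchone()[0]
    read_s = time.perf_counter() - t0
    con.close()
    return {"write_s": write_s, "read_s": read_s, "rows_read": count}


result = benchmark_archive()
```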
poster icon Poster THPDP036 [0.933 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP036  
About • Received ※ 28 September 2023 — Revised ※ 26 October 2023 — Accepted ※ 11 December 2023 — Issued ※ 12 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP037 The Alarm System at HLS-II monitoring, controls, status, distributed 1399
 
  • S. Xu, X.K. Sun
    USTC/NSRL, Hefei, Anhui, People’s Republic of China
 
  The control system of the Hefei Light Source II (HLS-II) is a distributed system based on the Experimental Physics and Industrial Control System. The alarm system of HLS-II is responsible for monitoring the alarm state of the facility and distributing alarm messages in time. The monitoring range of the alarm system covers the devices of the HLS-II technical groups and the server platform. Zabbix is an open-source software tool used to monitor the server platform. Custom metrics collection is achieved by implementing external scripts written in Python, and automated agent deployment discovers the monitored servers running Zabbix agents. The alarm distribution strategy for the front-end devices is designed to overcome alarm floods. The alarm system of HLS-II provides multiple messaging channels to notify the responsible staff, including WeChat, SMS and a web-based GUI. The alarm system of HLS-II has been in operation since December 2022. The results show that the alarm system helps operators troubleshoot problems efficiently, improving the availability of HLS-II.  
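  A Zabbix external script simply prints its metric to stdout. A hypothetical example of the kind of Python check such a platform collects, probing whether a service's TCP port answers (the real HLS-II scripts are not published here):

```python
#!/usr/bin/env python3
"""Zabbix external script sketch: print 1 if host:port accepts a TCP
connection, else 0. Zabbix reads the single value from stdout."""
import socket
import sys


def tcp_port_up(host, port, timeout=2.0):
    """Return 1 when a TCP connection to (host, port) succeeds, else 0."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return 1
    except OSError:
        return 0


if __name__ == "__main__":
    # Default target is hypothetical; Zabbix passes host/port as arguments.
    host = sys.argv[1] if len(sys.argv) > 1 else "127.0.0.1"
    port = int(sys.argv[2]) if len(sys.argv) > 2 else 5064
    print(tcp_port_up(host, port))
```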
poster icon Poster THPDP037 [0.653 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP037  
About • Received ※ 30 September 2023 — Accepted ※ 08 December 2023 — Issued ※ 13 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP053 Test Automation for Control Systems at the European Spallation Source controls, software, PLC, framework 1435
 
  • K. Vestin, F.S. Alves, L.J. Johansson, S. Pavinato, K.E. Rosengren, M.V. Vojneski
    ESS, Lund, Sweden
 
  This paper describes several control system test automation frameworks for the control systems at the European Spallation Source (ESS), a cutting-edge research facility that generates neutron beams for scientific experiments. The control system is a crucial component of ESS, responsible for regulating and monitoring the facility’s complex machinery, including a proton accelerator, target station, and several neutron instruments. The traditional approach to testing control systems largely relies on manual testing, which is time-consuming and error-prone. To enhance the testing process, several different test automation frameworks have been developed for various types of applications. Some of these frameworks are integrated with the ESS control system, enabling automated testing of new software releases and updates, as well as regression testing of existing functionality. The paper provides an overview of the various automation frameworks in use at ESS, including their architecture, tools, and development techniques. It discusses the benefits of the different frameworks, such as increased testing efficiency, improved software quality, and reduced testing costs. The paper concludes by outlining future development directions.  
poster icon Poster THPDP053 [1.020 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP053  
About • Received ※ 19 September 2023 — Revised ※ 10 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 14 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP070 Building, Deploying and Provisioning Embedded Operating Systems at PSI Linux, network, controls, hardware 1505
 
  • D. Anicic
    PSI, Villigen PSI, Switzerland
 
  In the scope of the Swiss Light Source (SLS) upgrade project, SLS 2.0, at Paul Scherrer Institute (PSI) two New Processing Platforms (NPP), both running RT Linux, have been added to the portfolio of existing VxWorks and Linux VME systems. At the lower end we have picked a variety of boards, all based on the Xilinx Zynq UltraScale+ MPSoC. Even though these devices have less processing power, due to the built-in FPGA and Real-time CPU (RPU) they can deliver strict, hard RT performance. For high-throughput, soft-RT applications we went for Intel Xeon based single-board PCs in the CPCI-S form factor. All platforms are operated as diskless systems. For the Zynq systems we have decided on building in-house a Yocto Kirkstone Linux distribution, whereas for the Xeon PCs we employ off-the-shelf Debian 10 Buster. In addition to these new NPP systems, in the scope of our new EtherCAT-based Motion project, we have decided to use small x86_64 servers, which will run the same Debian distribution as NPP. In this contribution we present the selected Operating Systems (OS) and discuss how we build, deploy and provision them to the diskless clients.  
poster icon Poster THPDP070 [0.758 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP070  
About • Received ※ 02 October 2023 — Accepted ※ 13 October 2023 — Issued ※ 19 October 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP076 Stream-based Virtual Device Simulation for Enhanced EPICS Integration and Automated Testing controls, interface, MMI, software 1522
 
  • M. Lukaszewski, K. Klys
    E9, London, United Kingdom
 
  Integrating devices into the Experimental Physics and Industrial Control System (EPICS) can often take a suboptimal path due to discrepancies between available documentation and real device behaviour. To address this issue, we introduce "vd" (virtual device), a software for simulating stream-based virtual devices that enables testing communication without connecting to the real device. It is focused on the communication layer rather than the device’s underlying physics. The vd listens to a TCP port for client commands and employs ASCII-based byte stream communication. It offers easy configuration through a user-friendly config file containing all necessary information to simulate a device, including parameters for the simulated device and information exchanged via TCP, such as commands and queries related to each parameter. Defining the protocol for data exchange through a configuration file allows users to simulate various devices without modifying the simulator’s code. The vd’s architecture enables its use as a library for creating advanced simulations, making it a tool for testing and validating device communication and integration into EPICS. Furthermore, the vd can be integrated into CI pipelines, facilitating automated testing and validation of device communication, ultimately improving the quality of the produced control system.  
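  The config-driven approach described above can be sketched in a few lines: a table maps query and command strings to simulated parameters, and a TCP server answers newline-terminated ASCII against it. The protocol and parameter names below are invented for illustration and are not vd's actual config format:

```python
import socket
import threading

# Hypothetical configuration in the spirit of vd's config file: each
# query string reads a simulated parameter, each command writes one.
CONFIG = {
    "params": {"VOLT": 0.0, "CURR": 0.0},
    "queries": {"VOLT?": "VOLT", "CURR?": "CURR"},
    "commands": {"VOLT": "VOLT", "CURR": "CURR"},
}


def handle_line(line, config):
    """Answer one ASCII command/query against the simulated parameters."""
    line = line.strip()
    if line in config["queries"]:
        return str(config["params"][config["queries"][line]])
    name, _, arg = line.partition(" ")
    if name in config["commands"] and arg:
        config["params"][config["commands"][name]] = float(arg)
        return "OK"
    return "ERR"


def serve(sock, config):
    """Accept clients one at a time and answer line by line."""
    while True:
        conn, _ = sock.accept()
        with conn, conn.makefile("rw") as f:
            for line in f:
                f.write(handle_line(line, config) + "\n")
                f.flush()


srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen()
threading.Thread(target=serve, args=(srv, CONFIG), daemon=True).start()
PORT = srv.getsockname()[1]


def query(port, text):
    """One-shot client: send a line, return the device's reply."""
    with socket.create_connection(("127.0.0.1", port)) as s, \
            s.makefile("rw") as f:
        f.write(text + "\n")
        f.flush()
        return f.readline().strip()
```

  Because the protocol lives entirely in `CONFIG`, swapping in a different device means editing data, not code, which is the property that also makes such a simulator easy to drive from a CI pipeline.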
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP076  
About • Received ※ 06 October 2023 — Accepted ※ 08 December 2023 — Issued ※ 12 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THPDP088 ATCA-Based Beam Line Data Software for SLAC’s LCLS-II Timing System software, Linux, network, FPGA 1560
 
  • D. Alnajjar, M.P. Donadio, K.H. Kim, M. Weaver
    SLAC, Menlo Park, California, USA
 
  Funding: Work supported by US DOE contract DE-AC02-76SF00515
Among the several acquisition services available with SLAC’s high beam rate accelerator, all of which are contemplated in the acquisition service EPICS support package, resides the new Advanced Telecommunications Computing Architecture (ATCA) Beam Line Data (BLD) service. BLD runs on top of SLAC’s common platform software and firmware, and communicates with several high-performance systems (e.g. MPS, BPM, LLRF, timing) in LCLS, running on a 7-slot ATCA crate. Once linked with an ATCA EPICS IOC and with the proper commands called in the IOC shell, it initializes the BLD FPGA logic and the upper software stack, and makes PVs available for controlling the BLD data acquisition rates and starting the BLD data acquisition. This service forwards acquired data to configured IP addresses and ports in the form of multicast network packets. Up to four BLD rates can be configured simultaneously, each accessible at its configured IP destination, with a maximum rate of 1 MHz. Users interested in acquiring any of the four BLD rates register at the corresponding IP destination to receive a copy of the multicast packets in their respective receiver software. BLD has allowed data to be transmitted over multicast packets for over a decade at SLAC, but always at a maximum rate of 120 Hz. The present work focuses on bringing this service to the high beam rate high-performance systems using ATCAs, allowing the reuse of many legacy in-house-developed client software infrastructures.
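Registering for a BLD stream amounts to joining its multicast group. A stdlib sketch of such a receiver, with placeholder group address and port (the real destinations are site configuration, not published here):

```python
import socket
import struct


def make_bld_receiver(group="239.255.24.0", port=10148, timeout=2.0):
    """Build a UDP socket subscribed to one multicast group.

    Each configured BLD rate is published to its own destination; a
    client receives a copy of every packet simply by joining the
    matching group, with no per-client state on the sender side.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # ip_mreq: multicast group followed by local interface (INADDR_ANY).
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(timeout)
    return sock


# Usage (not run here): data, addr = make_bld_receiver().recvfrom(9000)
```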
 
poster icon Poster THPDP088 [1.060 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THPDP088  
About • Received ※ 03 October 2023 — Accepted ※ 06 December 2023 — Issued ※ 17 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THSDSC03 Integrate EPICS 7 with MATLAB Using PVAccess for Python (P4P) Module controls, interface, experiment, status 1580
 
  • K.T. Kim, J.J. Bellister, K.H. Kim, E. Williams, S. Zelazny
    SLAC, Menlo Park, California, USA
 
  MATLAB is essential for accelerator scientists engaged in data analysis and processing across diverse fields, including particle physics experiments, synchrotron light sources, XFELs, and telescopes, due to its extensive range of built-in functions and tools. Scientists also depend on EPICS 7* to control and monitor complex systems. Since Python has gained popularity in the scientific community and many facilities have been migrating towards it, SLAC has developed matpva, a Python interface to integrate EPICS 7 with MATLAB. Matpva utilizes the Python P4P module** and EPICS 7 to offer a robust and reliable interface for MATLAB users that employ EPICS 7. The EPICS 7 PVAccess API allows higher-level scientific applications to get/set/monitor simple and complex structures from an EPICS 7-based control system. Moreover, matpva simplifies the process by handling the data type conversion from Python to MATLAB, making it easier for researchers to focus on their analyses and innovative ideas instead of technical data conversion. By leveraging matpva, researchers can work more efficiently and make discoveries in diverse fields, including particle physics and astronomy.
* See https://epics-controls.org/resources-and-support/base/epics-7/ to learn more about EPICS 7
** Visit https://mdavidsaver.github.io/p4p/ to learn more about the P4P
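A sketch of the two layers matpva combines, assuming the p4p package for the PVAccess side; the flattening helper is a hypothetical stand-in for matpva's actual conversion logic, shown only to illustrate the kind of Python-to-MATLAB type mapping involved:

```python
def fetch_pv(pvname):
    """Read one PV over PVAccess using the P4P thread client.

    Requires the p4p package; Context('pva') discovers servers the
    same way the pvget command-line tool does.
    """
    from p4p.client.thread import Context
    ctx = Context("pva")
    try:
        return ctx.get(pvname)
    finally:
        ctx.close()


def to_matlab_friendly(nt):
    """Flatten an NTScalar-like mapping into plain scalars.

    Hypothetical stand-in for the conversion matpva performs so MATLAB
    receives simple doubles and integers rather than Python objects.
    """
    return {
        "value": float(nt["value"]),
        "severity": int(nt.get("alarm", {}).get("severity", 0)),
        "seconds": int(nt.get("timeStamp", {}).get("secondsPastEpoch", 0)),
    }
```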
 
poster icon Poster THSDSC03 [0.865 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THSDSC03  
About • Received ※ 06 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 06 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THSDSC04 CamServer: Stream Processing at SwissFEL and SLS 2.0 FEL, controls, data-acquisition, monitoring 1585
 
  • A. Gobbo, A. Babic
    PSI, Villigen PSI, Switzerland
 
  CamServer is a Python package for data stream processing developed at Paul Scherrer Institute (PSI). It is a key component of SwissFEL’s data acquisition, where it is deployed on a cluster of servers and used for displaying and processing images from all cameras. It scales linearly with the number of servers and is capable of handling multiple high-resolution cameras at 100 Hz, as well as a variety of data types and sources. The processing unit, called a pipeline, runs in a private process that can be either permanent or spawned on demand. Pipelines consume and produce ZMQ streams, but input data can be arbitrary using an adapter layer (e.g. EPICS). A proxy server handles requests and creates pipelines on the cluster’s worker nodes according to rules. Some processing scripts are available out of the box (e.g. calculation of standard beam metrics) but users can upload custom ones. The system is managed via its REST API, using a client library or a GUI application. CamServer’s output data streams are consumed by a variety of client types such as data storage, image visualization, monitoring and DAQ applications. This work describes the use of CamServer, the status of the SwissFEL’s cluster and the development roadmap with plans for SLS 2.0.  
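  The pipeline idea, an arbitrary frame source feeding a per-frame metric computation, can be sketched without ZMQ or numpy. This illustrates the pattern only, not CamServer's API:

```python
def centroid(image):
    """Center of mass of a 2-D image given as nested lists.

    Stands in for the 'standard beam metrics' a pipeline computes per
    frame; returns (row, col) of the intensity centroid, or None for an
    empty frame.
    """
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    if total == 0:
        return None
    return (sy / total, sx / total)


def pipeline(frames, processor=centroid):
    """Adapter-layer sketch: any iterable of frames (a ZMQ stream, an
    EPICS channel, files) becomes a stream of processed metrics."""
    for frame in frames:
        yield processor(frame)


# One toy 2x2 frame with all intensity in the right column.
metrics = list(pipeline([[[0, 1], [0, 1]]]))
```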
poster icon Poster THSDSC04 [1.276 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THSDSC04  
About • Received ※ 03 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 06 December 2023 — Issued ※ 19 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
THSDSC06 Developing a Digital Twin for BESSY II Synchrotron Light Source Based on EPICS and Microservice Design synchrotron, controls, lattice, monitoring 1594
 
  • W. Sulaiman Khail, M. Ries, P. Schnitzer
    HZB, Berlin, Germany
 
  Digital twins, i.e. theory and design tools connected to the real devices and machine by mapping physics components to their technical counterparts, are powerful tools providing accelerators with commissioning predictions and feedback capabilities. This paper describes a new tool allowing for greater flexibility in configuring the modelling part combined with ease of adding new features. To enable the various components developed in EPICS, Python, C, and C++ to work together seamlessly, we adopt a microservice architecture, with REST API services providing the interfaces between the components. End user scripts are implemented as REST API services, allowing for better data analysis and visualization. Finally, the paper describes the integration of Dash and Plotly for enhanced data comparison and visualization. Overall, this workflow provides a powerful and flexible solution for managing and optimizing BESSY II digital twins, with the potential for further customization and extension to upcoming machines.  
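  In such an architecture each microservice exposes model data over a REST endpoint. A minimal stdlib sketch, with an invented /state resource and demo values (not the actual BESSY II service interface):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical twin state; real services would expose lattice/model data.
STATE = {"machine": "demo-ring", "energy_GeV": 1.7, "tunes": [17.84, 6.74]}


class TwinHandler(BaseHTTPRequestHandler):
    """One microservice endpoint: GET /state returns the model as JSON."""

    def do_GET(self):
        if self.path == "/state":
            body = json.dumps(STATE).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):   # keep the demo quiet
        pass


server = HTTPServer(("127.0.0.1", 0), TwinHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
PORT = server.server_address[1]

# Any component (EPICS glue, C++ model, end-user script) consumes the
# same interface with a plain HTTP GET.
with urlopen(f"http://127.0.0.1:{PORT}/state") as resp:
    state = json.loads(resp.read())
```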
poster icon Poster THSDSC06 [0.797 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-THSDSC06  
About • Received ※ 05 October 2023 — Revised ※ 27 October 2023 — Accepted ※ 05 November 2023 — Issued ※ 05 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
FR1BCO01 Status of the European Spallation Source Controls controls, PLC, operation, timing 1600
 
  • T. Korhonen
    ESS, Lund, Sweden
 
  The European Spallation Source has made substantial progress in recent years. Similarly, the control system has taken shape, has gone through the first commissioning, and is now in production use. While there are still features and services in preparation, the central features are already in place. The talk will give an overview of the areas where the control system is used, our use of and experience with the central technologies such as MTCA.4 and EPICS 7, plus an overview of the next steps. The talk will also look at what was planned and reported at ICALEPCS 2015, how our system of today compares with those plans, and the evolution from a green-field project to an operating organization.  
slides icon Slides FR1BCO01 [2.354 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-FR1BCO01  
About • Received ※ 06 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 12 December 2023 — Issued ※ 15 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
FR1BCO02 Controls at the Fermilab PIP-II Superconducting Linac controls, software, operation, cryomodule 1607
 
  • D.J. Nicklaus, P.M. Hanlet
    Fermilab, Batavia, Illinois, USA
 
  PIP-II is an 800 MeV superconducting RF linac under development at Fermilab. As the new first stage in our accelerator chain, it will deliver high-power beam to multiple experiments simultaneously and thus drive Fermilab’s particle physics program for years to come. In a pivot for Fermilab, controls for PIP-II are based on EPICS instead of ACNET, the legacy control system for accelerators at the lab. This paper discusses the status of the EPICS controls work for PIP-II. We describe the EPICS tools selected for our system and the experience of operators new to EPICS. We introduce our continuous integration / continuous deployment environment. We also describe some efforts at cooperation between EPICS and ACNET, and efforts to move towards a unified interface that can apply to both control systems.  
slides icon Slides FR1BCO02 [4.528 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-FR1BCO02  
About • Received ※ 04 October 2023 — Revised ※ 12 October 2023 — Accepted ※ 10 December 2023 — Issued ※ 11 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
FR1BCO04 The Controls and Science IT Project for the SLS 2.0 Upgrade controls, network, storage-ring, experiment 1616
 
  • A. Ashton, H.-H. Braun, S. Fries, X. Yao, E. Zimoch
    PSI, Villigen PSI, Switzerland
 
  Operation of the Swiss Light Source (SLS) at the Paul Scherrer Institute (PSI) in Switzerland began in 2000 and it quickly became one of the most successful synchrotron radiation facilities worldwide, providing academic and industry users with a suite of excellent beamlines covering a wide range of methods and applications. To maintain the SLS at the forefront of synchrotron user facilities and to exploit all of the improvement opportunities, PSI prepared a major upgrade project for SLS, named SLS 2.0. The Controls and Science IT (CaSIT) subproject was established to help instigate a project management structure to facilitate new concepts, increased communication, and clarified budgetary responsibility. This article focuses on the progress being made to exploit the current technological opportunities offered by a break in operations, whilst taking into consideration future growth opportunities and realistic operational support within an academic research facility.  
slides icon Slides FR1BCO04 [6.389 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-FR1BCO04  
About • Received ※ 05 October 2023 — Revised ※ 10 October 2023 — Accepted ※ 20 November 2023 — Issued ※ 17 December 2023
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)  
 
FR2BCO01 React Automation Studio: Modern Scientific Control with the Web controls, interface, GUI, framework 1643
 
  • W. Duckitt
    Stellenbosch University, Matieland, South Africa
  • J.K. Abraham
    iThemba LABS, Somerset West, South Africa
  • D. Marcato, G. Savarese
    INFN/LNL, Legnaro (PD), Italy
 
  React Automation Studio is a progressive web application framework that enables the control of large scientific equipment through EPICS from any smart device connected to a network. With built-in advanced features such as reusable widgets and components, macro substitution, OAuth 2.0 authentication, access rights administration, alarm handling with notifications, diagnostic probes and archived data viewing, it allows one to build modern, secure and fully responsive control user interfaces and overview screens for desktop, web browser, TV, mobile and tablet devices. A general overview of React Automation Studio and its features, as well as the system architecture, implementation, community involvement and future plans for the system, is presented.  
slides icon Slides FR2BCO01 [1.866 MB]  
DOI • reference for this paper ※ doi:10.18429/JACoW-ICALEPCS2023-FR2BCO01  
About • Received ※ 03 October 2023 — Accepted ※ 05 December 2023 — Issued ※ 13 December 2023  
Cite • reference for this paper using ※ BibTeX, ※ LaTeX, ※ Text/Word, ※ RIS, ※ EndNote (xml)