Paper | Title | Other Keywords | Page |
---|---|---|---|
MO2AO01 | Facing the Challenges of Experiment Control and Data Management at ESRF-EBS | experiment, SRF, GUI, framework | 66 |
In 2020 the new ESRF-EBS (Extremely Brilliant Source) took up operation. With the much higher photon flux, experiments are faster and produce more data. To meet these challenges, a complete revision of the data acquisition, management and analysis tools was undertaken. The result is a suite of advanced software tools, deployed today on more than 30 beamlines. The main packages are BLISS for experiment control and data acquisition, LIMA2 for high-speed detector control, EWOKS for data reduction and analysis workflows, and Daiquiri, the web GUI framework. BLISS is programmed in Python to allow easy sequence programming for scientists and easy integration of scientific software. BLISS offers configuration of hardware and experimental set-ups, a generic scanning engine for step-based and continuous data acquisition, live data display, and frameworks to handle 1D and 2D detectors, spectrometers, monochromators, diffractometers (HKL) and regulation loops. For detectors producing very high data rates, data reduction at the source is important. LIMA2 allows parallel data processing, adding the necessary computing power (CPU and GPU) for online data reduction in a flexible way. The EWOKS workflow system can use online or offline data to automate data reduction or analysis. Workflows can run locally or on a compute cluster, using CPUs or GPUs. Results are saved or fed back to the control system for display or to adapt the next data acquisition.
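The abstract's "generic scanning engine for step-based data acquisition" can be illustrated with a minimal sketch: move a motor through evenly spaced targets and count on a detector at each point. This is a stand-alone illustration, not the BLISS API; the `Motor`, `Detector` and `step_scan` names are invented here.

```python
# Illustrative sketch of a step-based scan loop, NOT the BLISS API.
# A real engine would wait for motion to settle and stream data live.

class Motor:
    def __init__(self, name):
        self.name = name
        self.position = 0.0

    def move(self, target):
        self.position = target  # stand-in for real hardware motion


class Detector:
    def count(self, exposure_time):
        # stand-in for a hardware readout: return a fake intensity
        return 100.0 * exposure_time


def step_scan(motor, detector, start, stop, npoints, exposure_time):
    """Acquire npoints evenly spaced points between start and stop."""
    points = []
    for i in range(npoints):
        target = start + i * (stop - start) / (npoints - 1)
        motor.move(target)
        points.append((motor.position, detector.count(exposure_time)))
    return points


scan = step_scan(Motor("omega"), Detector(), 0.0, 1.0, 5, 0.1)
```

A continuous scan differs mainly in that the motor moves at constant speed while the detector is triggered in flight, rather than stopping at each point.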
Slides MO2AO01 [2.766 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-MO2AO01
About • Received 03 October 2023 — Revised 07 October 2023 — Accepted 12 October 2023 — Issued 29 October 2023
MO2AO02 | A Beamline and Experiment Control System for the SLS 2.0 | controls, interface, experiment, EPICS | 71 |
The beamlines of the Swiss Light Source (SLS) predominantly rely on EPICS standards as their control interface, but in contrast to many other facilities, there has up to now been no standardized user-interfacing component to orchestrate, monitor and provide feedback on data acquisition. As a result, the beamlines have either adapted community solutions or developed their own high-level orchestration systems. For the upgrade project SLS 2.0, a sub-project was initiated to facilitate a unified beamline and experiment control system. During a pilot phase and a first development cycle, libraries of the Bluesky project were used, combined with additional in-house developed services, and embedded in a service-based approach with a message broker and an in-memory database. Leveraging community solutions paired with industry standards enabled the development of a highly modular system which provides the flexibility needed for a constantly changing scientific environment. One year after development started, the system had already been tested during many weeks of user operation and recently received official approval from the involved divisions to be rolled out as part of the SLS 2.0 upgrade.
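The service-based pattern described above — clients posting requests to a message broker, services consuming them, results landing in an in-memory store — can be sketched with the standard library alone. The broker and database here are deliberately crude stand-ins (a `queue.Queue` and a `dict`); the real SLS 2.0 system uses a dedicated message broker and in-memory database, and all names below are invented for illustration.

```python
# Minimal sketch of broker-mediated experiment orchestration.
# queue.Queue stands in for the message broker, a dict for the
# in-memory database; neither is the actual SLS 2.0 stack.
import queue
import threading

broker = queue.Queue()   # stand-in message broker
results = {}             # stand-in in-memory result store


def scan_worker():
    """A service that consumes scan requests until it sees a sentinel."""
    while True:
        request = broker.get()
        if request is None:  # shutdown sentinel
            break
        # "execute" the scan: here we only record the requested point count
        results[request["scan_id"]] = {"status": "done",
                                       "points": request["npoints"]}
        broker.task_done()


worker = threading.Thread(target=scan_worker)
worker.start()
broker.put({"scan_id": "scan-001", "npoints": 11})
broker.put(None)
worker.join()
```

The decoupling is the point: the client never calls the service directly, so services can be restarted, replaced or scaled without touching the clients — the modularity the abstract emphasises.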
Slides MO2AO02 [3.119 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-MO2AO02
About • Received 05 October 2023 — Revised 09 October 2023 — Accepted 12 October 2023 — Issued 14 October 2023
MO2AO04 | Experimental Data Taking and Management: The Upgrade Process at BESSY II and HZB | experiment, controls, EPICS, MMI | 84 |
The endeavor of modernizing science data acquisition at BESSY II started in 2019 [*]. Significant achievements have been made: the Bluesky software ecosystem is now the accepted framework for data acquisition, flow control and automation. It is operational at an increasing number of HZB beamlines, endstations and instruments. Participation in the global Bluesky collaboration is an extremely empowering experience. Promoting FAIR data principles at all levels has developed a unifying momentum, providing guidance on less obvious design considerations. Now a joint demonstrator project of DESY, HZB, HZDR and KIT, named ROCK-IT (Remote Operando Controlled Knowledge-driven, IT-based), aims at portable solutions for fully automated measurements in the catalysis area of materials science and is spearheading common developments. The foundation there is laid by Bluesky data acquisition, AI/ML support and analysis, modular sample environments, robotics and FAIR data handling. This paper puts present HZB controls projects, as well as detailed HZB contributions to this conference [**], into context. It outlines strategies for providing appropriate digital tools at a successor 4th-generation light source, BESSY III.
[*] R. Müller et al., https://doi.org/10.18429/JACoW-ICALEPCS2019-MOCPL02 [**] covering digital twins, Bluesky, sample environment, motion control, remote access, metadata
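One concrete consequence of "FAIR data handling" is that a measurement is only stored together with the descriptive metadata that keeps it findable and reusable. The sketch below enforces that at record-creation time; the field names are this sketch's assumptions, not an HZB or Bluesky schema.

```python
# Illustrative FAIR-style metadata guard: refuse to build a data record
# that lacks the descriptive fields needed for findability and reuse.
# The required-field set is an invented example, not an HZB standard.

REQUIRED_FIELDS = {"identifier", "creator", "instrument", "license"}


def make_record(data, **metadata):
    """Bundle raw data with metadata, rejecting incomplete descriptions."""
    missing = REQUIRED_FIELDS - metadata.keys()
    if missing:
        raise ValueError(f"missing FAIR metadata fields: {sorted(missing)}")
    return {"data": data, "metadata": metadata}


record = make_record(
    [0.1, 0.2, 0.3],
    identifier="demo-0001",
    creator="Example Author",
    instrument="demo-beamline",
    license="CC-BY-4.0",
)
```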
Slides MO2AO04 [2.522 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-MO2AO04
About • Received 05 October 2023 — Revised 26 October 2023 — Accepted 14 November 2023 — Issued 16 December 2023
TUPDP015 | Test Bench for Motor and Motion Controller Characterization | controls, experiment, EPICS, GUI | 522 |
To maximize beamtime usage, motorization of beamline equipment is crucial. Choosing the correct motor is complex, since performance depends largely on the combination of motor and motion controller [1]. This challenge, alongside the renewal of the twenty-year-old infrastructure at BESSY II, led to the demand for a motor test bench. The test bench was designed to be modular, so that it fits different motors, loads and sensors. It allows independent performance verification and enables us to find a fitting combination of motor and motion controller. The test bench is operated via EPICS and Bluesky, allowing the use of Python for automated data acquisition and testing. An overview of the mechanical and electrical setup, as well as data from different performance tests, will be presented.
[1] A. Hughes and B. Drury, 'Electric Motors and Drives: Fundamentals, Types and Applications', 5th ed., Kidlington, United Kingdom, 2019, pp. 41-86.
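The kind of automated test such a bench enables can be sketched as a scripted sweep: command a series of targets, read back the achieved position, and report the worst positioning error. Everything below is a simulation for illustration — the backlash model and its 0.02 mm value are invented, not measurements from the BESSY II bench.

```python
# Illustrative automated motor test: sweep targets, compare commanded vs
# read-back positions. The backlash model is invented for this sketch.

BACKLASH = 0.02  # mm, simulated mechanical error on direction reversal


def simulated_readback(target, previous):
    # moving in the negative direction overshoots by the backlash
    return target + (BACKLASH if target < previous else 0.0)


def sweep(targets):
    """Return the worst absolute positioning error over a target sequence."""
    errors, previous = [], targets[0]
    for t in targets:
        errors.append(abs(simulated_readback(t, previous) - t))
        previous = t
    return max(errors)


# forward then backward pass, so the direction reversal is exercised
worst_error = sweep([0.0, 1.0, 2.0, 1.0, 0.0])
```

On real hardware the read-back would come from an independent sensor (e.g. an encoder or interferometer), which is exactly why the bench supports swapping sensors.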
Poster TUPDP015 [1.295 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-TUPDP015
About • Received 06 October 2023 — Revised 13 October 2023 — Accepted 02 December 2023 — Issued 13 December 2023
TUPDP042 | Control and Data Acquisition System Upgrade in RFX-mod2 | controls, plasma, real-time, PLC | 607 |
RFX-mod2, currently under construction at Consorzio RFX, is an evolution of the former RFX-mod experiment, with an improved shell and a larger set of electromagnetic sensors. This set, including 192 saddle coils, allows exploring a wide range of plasma control schemes, but at the same time poses a challenge for its Control and Data Acquisition System (CODAS). RFX-mod2 CODAS is required to provide high-speed acquisition of a large set of signals and their inclusion in the Plasma Control System, which must provide a sub-millisecond response to plasma instabilities. While brand-new solutions are provided for the acquisition of the electromagnetic signals, involving Zynq-based ADC devices, other parts of the CODAS system have been retained from the former RFX-mod CODAS. The paper presents the solutions adopted in the new RFX-mod2 CODAS, belonging to three main categories: 1) plasma real-time control, including both hardware solutions based on Zynq and the integration of data acquisition and real-time frameworks for its software configuration (for this purpose MDSplus and MARTe2, two frameworks for data acquisition and real-time control, respectively, have been adopted, both widely used in the fusion community); 2) data acquisition, including upgrades to the former cPCI-based systems and new ad-hoc solutions based on Red Pitaya; 3) plant supervision, carried out in WinCC OA and integrated with the data acquisition system via a new WinCC OA database plugin.
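The sub-millisecond requirement above means each control cycle has a hard time budget: read sensors, compute a correction, write actuators, all within the deadline. The sketch below makes that budget explicit with a trivial proportional correction as a stand-in control law — the real system runs MARTe2 on dedicated Zynq hardware, and the gain and values here are invented.

```python
# Illustrative deadline-checked control cycle. The control law, gain and
# deadline bookkeeping are a sketch, not the RFX-mod2 implementation.
import time

DEADLINE_S = 0.001  # sub-millisecond budget per control cycle


def control_cycle(sensor_value, setpoint, gain=0.5):
    """One cycle: compute a correction and report whether it met the deadline."""
    start = time.perf_counter()
    correction = gain * (setpoint - sensor_value)  # stand-in proportional law
    elapsed = time.perf_counter() - start
    return correction, elapsed <= DEADLINE_S


correction, met_deadline = control_cycle(sensor_value=0.8, setpoint=1.0)
```

On a general-purpose OS a loop like this cannot *guarantee* the deadline, which is precisely why such systems move the hard real-time path onto FPGA/SoC hardware and frameworks like MARTe2.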
DOI • doi:10.18429/JACoW-ICALEPCS2023-TUPDP042
About • Received 05 October 2023 — Accepted 13 October 2023 — Issued 16 October 2023
TUPDP043 | Final Design of Control and Data Acquisition System for the ITER Heating Neutral Beam Injector Test Bed | controls, experiment, network, power-supply | 612 |
Funding: This work has been carried out within the framework of the EUROfusion Consortium, funded by the European Union via the Euratom Research and Training Programme (Grant Agreement No. 101052200 - EUROfusion).
Tokamaks use heating neutral beam (HNB) injectors to reach fusion conditions and drive the plasma current. ITER, the large international tokamak, will have three high-energy, high-power (1 MeV, 16.5 MW) HNBs. MITICA, the ITER HNB prototype, is being built at the ITER Neutral Beam Test Facility in Italy to develop and test the ITER HNB, whose requirements are far beyond current HNB technology. MITICA operates in pulsed mode with pulse durations up to 3600 s and a 25% duty cycle. It requires a complex control and data acquisition system (CODAS) to provide supervisory and plant control, monitoring, fast real-time control, data acquisition and archiving, data access, and an operator interface. The control infrastructure consists of two parts: central and plant system CODAS. The former provides high-level resources such as servers and a central archive for experimental data. The latter manages the MITICA plant units, i.e., components that generally execute a specific function, such as power supply, vacuum pumping, or scientific parameter measurements. CODAS integrates various technologies to implement the required functions and meet the associated requirements. Our paper presents the CODAS requirements and architecture based on the experience gained with SPIDER, the ITER full-size beam source in operation since 2018. It focuses on the most challenging topics, such as synchronization, fast real-time control, software development for long-lasting experiments, system commissioning, and integration.
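The pulsed-operation figures above fix the cycle timing: a duty cycle is the pulse length divided by the total cycle length, so a 3600 s pulse at 25% duty implies a minimum dwell between pulses. This is just the arithmetic made explicit, not a MITICA operating parameter beyond what the abstract states.

```python
# Duty-cycle arithmetic for pulsed operation:
#   duty = pulse / (pulse + dwell)  =>  dwell = pulse * (1/duty - 1)

def min_dwell_between_pulses(pulse_s, duty_cycle):
    """Dwell time (s) so that pulse_s/(pulse_s + dwell) equals duty_cycle."""
    return pulse_s * (1.0 / duty_cycle - 1.0)


# a full-length 3600 s MITICA pulse at 25% duty cycle
dwell = min_dwell_between_pulses(3600.0, 0.25)  # -> 10800 s between pulses
```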
Poster TUPDP043 [0.621 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-TUPDP043
About • Received 05 October 2023 — Accepted 10 December 2023 — Issued 19 December 2023
THMBCMO34 | Ultra-High Throughput Automated Macromolecular Crystallography Data Collection Using the Bluesky Framework | experiment, software, controls, hardware | 1280 |
At Diamond Light Source, several Macromolecular Crystallography (MX) beamlines focus on, or include, completely automated data collection. This is used primarily for high-throughput collection on samples with known or partially known structures, for example, screening a protein for drug or drug-fragment interactions. The automated data collection routines are currently built on legacy experiment orchestration software which includes a lot of redundancy, originally implemented for safety when human users are controlling the beamline, but which is inefficient when the beamline hardware occupies a smaller number of known states. Diamond is building its next-generation, service-based data acquisition platform, Athena, using NSLS-II's Bluesky experiment orchestration library. The Bluesky library facilitates optimising the orchestration of experiment control by simplifying the work necessary to parallelise and reorganise the steps of an experimental procedure. The MX data acquisition team at Diamond is using the Athena platform to increase the possible rate of automated MX data collection, both for immediate use and in preparation to take advantage of the upgraded Diamond-II synchrotron, due in several years. This project, named Hyperion, will include sample orientation and centring, fluorescence scanning, optical monitoring, collection strategy determination, and rotation data collection at multiple positions on a single sample pin.
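The speed-up from "parallelising and reorganising the steps of an experimental procedure" can be demonstrated with plain standard-library concurrency: independent preparation steps overlap, so total time approaches the longest single step instead of the sum. The step names and durations below are invented for illustration; this is not Bluesky or Hyperion code.

```python
# Illustration of sequential vs concurrent execution of independent
# experiment-preparation steps. Step names/durations are invented.
from concurrent.futures import ThreadPoolExecutor
import time


def step(name, duration_s):
    time.sleep(duration_s)  # stand-in for waiting on hardware
    return name


steps = [("move_goniometer", 0.05),
         ("open_shutter", 0.05),
         ("arm_detector", 0.05)]

# sequential: total time is roughly the sum of the step durations
t0 = time.perf_counter()
done_sequential = [step(n, d) for n, d in steps]
sequential_s = time.perf_counter() - t0

# concurrent: independent steps overlap, total approaches the longest step
t0 = time.perf_counter()
with ThreadPoolExecutor() as pool:
    done_parallel = list(pool.map(lambda s: step(*s), steps))
parallel_s = time.perf_counter() - t0
```

The hard part in a real beamline is knowing *which* steps are truly independent — which is exactly the orchestration knowledge a library like Bluesky makes explicit.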
Slides THMBCMO34 [1.002 MB]
Poster THMBCMO34 [3.445 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-THMBCMO34
About • Received 04 October 2023 — Revised 10 October 2023 — Accepted 13 December 2023 — Issued 19 December 2023
THPDP079 | Integration of Bespoke DAQ Software with Tango Controls in the SKAO Software Framework: From Problems to Progress | TANGO, controls, GPU, software | 1533 |
The Square Kilometre Array Observatory (SKAO) project is an international effort to build two radio interferometers, in South Africa and Australia, forming one Observatory monitored and controlled from the global headquarters at Jodrell Bank in the United Kingdom. The Monitoring, Control and Calibration System (MCCS) is the "front-end" management software for the Low telescope, providing monitoring and control capabilities as well as implementing calibration processes and providing complex diagnostics support. Once completed, the Low telescope will boast over 130,000 individual log-periodic antennas, so the scale of the data generated will be huge. It is estimated that an average of 8 terabits per second of data will be transferred from the SKAO telescopes in both countries to Central Processing Facilities (CPFs) located at the telescope sites. In order to keep pace with this magnitude of data production, an equally impressive data acquisition (DAQ) system is required. This paper outlines the challenges encountered and solutions adopted while incorporating a bespoke DAQ library within the SKAO's Kubernetes-Tango ecosystem in the MCCS subsystem, allowing high-speed data capture while maintaining a consistent deployment experience.
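To put the quoted 8 terabits per second in perspective, the back-of-the-envelope conversion below turns it into a daily volume. The per-day figure is simply this arithmetic (with decimal prefixes), not an SKAO-published number.

```python
# Convert a sustained data rate in terabits/s into petabytes/day.
# Decimal (SI) prefixes assumed: 1 Tb = 1e12 bits, 1 PB = 1e15 bytes.

TERABIT = 1e12           # bits
SECONDS_PER_DAY = 86400


def petabytes_per_day(rate_tbps):
    bits_per_day = rate_tbps * TERABIT * SECONDS_PER_DAY
    return bits_per_day / 8 / 1e15  # bits -> bytes -> petabytes


daily_pb = petabytes_per_day(8.0)  # 8 Tb/s sustained -> 86.4 PB/day
```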
Poster THPDP079 [0.981 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-THPDP079
About • Received 02 October 2023 — Accepted 08 December 2023 — Issued 19 December 2023
THPDP081 | Exploring Ethernet-Based CAMAC Replacements at ATLAS | controls, Ethernet, network, operation | 1542 |
Funding: This work was supported by the US Department of Energy, Office of Nuclear Physics, under Contract No. DE-AC02-06CH11357. This research used resources of ANL's ATLAS facility.
The Argonne Tandem Linear Accelerating System (ATLAS) facility at Argonne National Laboratory is researching ways to avoid a crisis caused by end-of-life issues with its 30-year-old CAMAC system. Replacement parts for CAMAC have long been unavailable, creating the potential for long periods of accelerator downtime once the limited CAMAC spares are exhausted. ATLAS has recently upgraded the Ethernet in the facility from a 100 Mbps (max) to a 1 Gbps network. Therefore, an Ethernet-based data acquisition system is desirable. The data acquisition replacement requires reliability, speed, and longevity to be a viable upgrade for the facility. In addition, the transition from CAMAC to a modern data acquisition system will be done with minimal interruption of operations.
DOI • doi:10.18429/JACoW-ICALEPCS2023-THPDP081
About • Received 10 October 2023 — Revised 11 October 2023 — Accepted 13 October 2023 — Issued 20 October 2023
THSDSC04 | CamServer: Stream Processing at SwissFEL and SLS 2.0 | FEL, controls, EPICS, monitoring | 1585 |
CamServer is a Python package for data stream processing developed at the Paul Scherrer Institute (PSI). It is a key component of SwissFEL's data acquisition, where it is deployed on a cluster of servers and used for displaying and processing images from all cameras. It scales linearly with the number of servers and is capable of handling multiple high-resolution cameras at 100 Hz, as well as a variety of data types and sources. The processing unit, called a pipeline, runs in a private process that can be either permanent or spawned on demand. Pipelines consume and produce ZMQ streams, but input data can be arbitrary through an adapter layer (e.g. EPICS). A proxy server handles requests and creates pipelines on the cluster's worker nodes according to rules. Some processing scripts are available out of the box (e.g. calculation of standard beam metrics), but users can upload custom ones. The system is managed via its REST API, using a client library or a GUI application. CamServer's output data streams are consumed by a variety of client types, such as data storage, image visualization, monitoring and DAQ applications. This work describes the use of CamServer, the status of the SwissFEL cluster and the development roadmap, with plans for SLS 2.0.
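A "standard beam metric" of the kind such pipelines compute is the intensity-weighted centre of mass of a camera image. The pure-Python function below illustrates the computation on a toy frame; it is not CamServer code — real pipelines consume ZMQ streams and would typically use NumPy for frames of this size at 100 Hz.

```python
# Illustrative beam-centre metric: intensity-weighted centre of mass of a
# 2D image, written with plain lists for self-containment (not CamServer).

def beam_center(image):
    """image: 2D list of pixel intensities -> (row, col) centre of mass."""
    total = sum(sum(row) for row in image)
    cy = sum(r * v for r, row in enumerate(image) for v in row) / total
    cx = sum(c * v for row in image for c, v in enumerate(row)) / total
    return cy, cx


# toy frame with all intensity concentrated at row 1, column 2
frame = [[0, 0, 0, 0],
         [0, 0, 9, 0],
         [0, 0, 0, 0]]
center = beam_center(frame)
```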
Poster THSDSC04 [1.276 MB]
DOI • doi:10.18429/JACoW-ICALEPCS2023-THSDSC04
About • Received 03 October 2023 — Revised 10 October 2023 — Accepted 06 December 2023 — Issued 19 December 2023