
G. Cerminara


DOI: 10.1007/s41781-018-0018-8
2019
Cited 114 times
A Roadmap for HEP Software and Computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volume of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
DOI: 10.3389/fdata.2020.598927
2021
Cited 41 times
Distance-Weighted Graph Neural Networks on FPGAs for Real-Time Particle Reconstruction in High Energy Physics
Graph neural networks have been shown to achieve excellent performance for several crucial tasks in particle physics, such as charged particle tracking, jet tagging, and clustering. An important domain for the application of these networks is the FPGA-based first layer of real-time data filtering at the CERN Large Hadron Collider, which has strict latency and resource constraints. We discuss how to design distance-weighted graph networks that can be executed with a latency of less than 1 $\mu\mathrm{s}$ on an FPGA. To do so, we consider a representative task associated with particle reconstruction and identification in a next-generation calorimeter operating at a particle collider. We use a graph network architecture developed for such purposes, and apply additional simplifications to match the computing constraints of Level-1 trigger systems, including weight quantization. Using the $\mathtt{hls4ml}$ library, we convert the compressed models into firmware to be implemented on an FPGA. Performance of the synthesized models is presented both in terms of inference accuracy and resource usage.
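The distance-weighted aggregation at the core of such networks can be illustrated with a minimal NumPy sketch. This is an illustrative toy, not the paper's exact architecture: the function name and the Gaussian distance potential are assumptions chosen for clarity.

```python
import numpy as np

def distance_weighted_aggregate(features, coords):
    """Toy distance-weighted message passing: each node collects the
    features of all nodes, weighted by a Gaussian of their pairwise
    distance in a coordinate space (learned, in a real network)."""
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2)                    # closer nodes contribute more
    w /= w.sum(axis=1, keepdims=True)  # normalise weights per node
    return w @ features                # weighted average of neighbour features

# with coincident nodes, every node simply receives the feature mean
feats = np.array([[1.0, 2.0], [3.0, 4.0]])
out = distance_weighted_aggregate(feats, np.zeros((2, 3)))
```

In a Level-1 trigger deployment, a dense pairwise computation like this is exactly what must be simplified and quantized before conversion to firmware.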
DOI: 10.1109/icmla.2019.00270
2019
Cited 56 times
Anomaly Detection with Conditional Variational Autoencoders
Exploiting the rapid advances in probabilistic inference, in particular variational Bayes and variational autoencoders (VAEs), for anomaly detection (AD) tasks remains an open research question. Previous works argued that training VAE models only with inliers is insufficient and that the framework should be significantly modified in order to discriminate anomalous instances. In this work, we exploit the deep conditional variational autoencoder (CVAE) and define an original loss function together with a metric that targets hierarchically structured data AD. Our motivating application is a real-world problem: monitoring the trigger system, a basic component of many particle physics experiments at the CERN Large Hadron Collider (LHC). In our experiments we show the superior performance of this method on classical machine learning (ML) benchmarks and on our application.
2006
Cited 41 times
CP Studies and Non-Standard Higgs Physics
There are many possibilities for new physics beyond the Standard Model that feature non-standard Higgs sectors. These may introduce new sources of CP violation, and there may be mixing between multiple Higgs bosons or other new scalar bosons. Alternatively, the Higgs may be a composite state, or there may even be no Higgs at all. These non-standard Higgs scenarios have important implications for collider physics as well as for cosmology, and understanding their phenomenology is essential for a full comprehension of electroweak symmetry breaking. This report discusses the most relevant theories which go beyond the Standard Model and its minimal, CP-conserving supersymmetric extension: two-Higgs-doublet models and minimal supersymmetric models with CP violation, supersymmetric models with an extra singlet, models with extra gauge groups or Higgs triplets, Little Higgs models, models in extra dimensions, and models with technicolour or other new strong dynamics. For each of these scenarios, this report presents an introduction to the phenomenology, followed by contributions on more detailed theoretical aspects and studies of possible experimental signatures at the LHC and other colliders.
DOI: 10.1088/1748-0221/16/04/t04002
2021
Cited 14 times
Construction and commissioning of CMS CE prototype silicon modules
As part of its HL-LHC upgrade program, the CMS collaboration is developing a High Granularity Calorimeter (CE) to replace the existing endcap calorimeters. The CE is a sampling calorimeter with unprecedented transverse and longitudinal readout for both electromagnetic (CE-E) and hadronic (CE-H) compartments. The calorimeter will be built with ∼30,000 hexagonal silicon modules. Prototype modules have been constructed with 6-inch hexagonal silicon sensors with cell areas of 1.1 cm², and the SKIROC2-CMS readout ASIC. Beam tests of different sampling configurations were conducted with the prototype modules at DESY and CERN in 2017 and 2018. This paper describes the construction and commissioning of the CE calorimeter prototype, the silicon modules used in the construction, their basic performance, and the methods used for their calibration.
DOI: 10.1088/1748-0221/18/08/p08014
2023
Cited 3 times
Performance of the CMS High Granularity Calorimeter prototype to charged pion beams of 20–300 GeV/c
The upgrade of the CMS experiment for the high luminosity operation of the LHC comprises the replacement of the current endcap calorimeter by a high granularity sampling calorimeter (HGCAL). The electromagnetic section of the HGCAL is based on silicon sensors interspersed between lead and copper (or copper tungsten) absorbers. The hadronic section uses layers of stainless steel as an absorbing medium and silicon sensors as an active medium in the regions of high radiation exposure, and scintillator tiles directly read out by silicon photomultipliers in the remaining regions. As part of the development of the detector and its readout electronic components, a section of a silicon-based HGCAL prototype detector along with a section of the CALICE AHCAL prototype was exposed to muons, electrons and charged pions in beam test experiments at the H2 beamline at the CERN SPS in October 2018. The AHCAL uses the same technology as foreseen for the HGCAL but with much finer longitudinal segmentation. The performance of the calorimeters in terms of energy response and resolution, longitudinal and transverse shower profiles is studied using negatively charged pions, and is compared to GEANT4 predictions. This is the first report summarizing results of hadronic showers measured by the HGCAL prototype using beam test data.
DOI: 10.1088/1748-0221/17/05/p05022
2022
Cited 7 times
Response of a CMS HGCAL silicon-pad electromagnetic calorimeter prototype to 20–300 GeV positrons
The Compact Muon Solenoid collaboration is designing a new high-granularity endcap calorimeter, HGCAL, to be installed later this decade. As part of this development work, a prototype system was built, with an electromagnetic section consisting of 14 double-sided structures, providing 28 sampling layers. Each sampling layer hosts a hexagonal module, in which a multipad large-area silicon sensor is glued between an electronics circuit board and a metal baseplate. The sensor pads of approximately 1.1 cm² are wire-bonded to the circuit board and are read out by custom integrated circuits. The prototype was extensively tested with beams at CERN's Super Proton Synchrotron in 2018. Based on the data collected with beams of positrons, with energies ranging from 20 to 300 GeV, measurements of the energy resolution and linearity, the position and angular resolutions, and the shower shapes are presented and compared to a detailed Geant4 simulation.
DOI: 10.48550/arxiv.hep-ph/0601013
2006
Cited 22 times
HERA and the LHC - A workshop on the implications of HERA for LHC physics: Proceedings - Part B
The HERA electron--proton collider has collected 100 pb$^{-1}$ of data since its start-up in 1992, and recently moved into a high-luminosity operation mode, with upgraded detectors, aiming to increase the total integrated luminosity per experiment to more than 500 pb$^{-1}$. HERA has been a machine of excellence for the study of QCD and the structure of the proton. The Large Hadron Collider (LHC), which will collide protons with a centre-of-mass energy of 14 TeV, will be completed at CERN in 2007. The main mission of the LHC is to discover and study the mechanisms of electroweak symmetry breaking, possibly via the discovery of the Higgs particle, and search for new physics in the TeV energy scale, such as supersymmetry or extra dimensions. Besides these goals, the LHC will also make a substantial number of precision measurements and will offer a new regime to study the strong force via perturbative QCD processes and diffraction. For the full LHC physics programme a good understanding of QCD phenomena and the structure function of the proton is essential. Therefore, in March 2004, a one-year-long workshop started to study the implications of HERA on LHC physics. This included proposing new measurements to be made at HERA, extracting the maximum information from the available data, and developing/improving the theoretical and experimental tools. This report summarizes the results achieved during this workshop.
DOI: 10.1103/physrevlett.101.171803
2008
Cited 20 times
Observation of ZZ Production in pp̄ Collisions at √s = 1.96 TeV
We present an observation of ZZ → ℓ⁺ℓ⁻ℓ′⁺ℓ′⁻ (ℓ, ℓ′ = e or μ) production in pp̄ collisions at a center-of-mass energy of √s = 1.96 TeV. Using 1.7 fb⁻¹ of data collected by the D0 experiment at the Fermilab Tevatron Collider, we observe three candidate events with an expected background of 0.14 +0.03 −0.02 events. The significance of this observation is 5.3 standard deviations. The combination of D0 results in this channel, as well as in ZZ → ℓ⁺ℓ⁻νν̄, yields a significance of 5.7 standard deviations and a combined cross section of σ(ZZ) = 1.60 ± 0.63 (stat) +0.16 −0.17 (syst) pb.
DOI: 10.1007/s41781-018-0020-1
2019
Cited 11 times
Detector Monitoring with Artificial Neural Networks at the CMS Experiment at the CERN Large Hadron Collider
Reliable data quality monitoring is a key asset in delivering collision data suitable for physics analysis in any modern large-scale high energy physics experiment. This paper focuses on the use of artificial neural networks for supervised and semi-supervised problems related to the identification of anomalies in the data collected by the CMS muon detectors. We use deep neural networks to analyze LHC collision data, represented as images organized geographically. We train a classifier capable of detecting the known anomalous behaviors with unprecedented efficiency and explore the usage of convolutional autoencoders to extend anomaly detection capabilities to unforeseen failure modes. A generalization of this strategy could pave the way to the automation of the data quality assessment process for present and future high energy physics experiments.
DOI: 10.1051/epjconf/201921406008
2019
Cited 10 times
Anomaly detection using Deep Autoencoders for the assessment of the quality of the data acquired by the CMS experiment
The certification of the CMS experiment data as usable for physics analysis is a crucial task to ensure the quality of all physics results published by the collaboration. Currently, the certification conducted by human experts is labor intensive and based on the scrutiny of distributions integrated on several hours of data taking. This contribution focuses on the design and prototype of an automated certification system assessing data quality on a per-luminosity section (i.e. 23 seconds of data taking) basis. Anomalies caused by detector malfunctioning or sub-optimal reconstruction are difficult to enumerate a priori and occur rarely, making it difficult to use classical supervised classification methods such as feedforward neural networks. We base our prototype on a semi-supervised approach which employs deep autoencoders. This approach has been qualified successfully on CMS data collected during the 2016 LHC run: we demonstrate its ability to detect anomalies with high accuracy and low false positive rate, when compared against the outcome of the manual certification by experts. A key advantage of this approach over other machine learning technologies is the great interpretability of the results, which can be further used to ascribe the origin of the problems in the data to a specific sub-detector or physics objects.
DOI: 10.1103/physrevd.78.072002
2008
Cited 12 times
ZZ → ℓ⁺ℓ⁻νν̄ production in pp̄ collisions at √s = 1.96 TeV
We describe a search for Z boson pair production in pp̄ collisions at √s = 1.96 TeV with the D0 detector at the Fermilab Tevatron Collider, using a data sample corresponding to an integrated luminosity of 2.7 fb⁻¹. Using the final-state decay ZZ → ℓℓνν (where ℓ = e or μ), we measure a signal with a significance of 2.6 standard deviations (2.0 expected) and a cross section of σ(pp̄ → ZZ + X) = 2.01 ± 0.93 (stat.) ± 0.29 (sys.) pb.
DOI: 10.1140/epjc/s10052-008-0674-7
2008
Cited 11 times
Reconstruction of cosmic and beam-halo muons with the CMS detector
The powerful muon and tracker systems of the CMS detector together with dedicated reconstruction software allow precise and efficient measurement of muon tracks originating from proton-proton collisions. The standard muon reconstruction algorithms, however, are inadequate to deal with muons that do not originate from collisions. This note discusses the design, implementation, and performance results of a dedicated cosmic muon track reconstruction algorithm, which features pattern recognition optimized for muons that are not coming from the interaction point, i.e., cosmic muons and beam-halo muons. To evaluate the performance of the new algorithm, data taken during Cosmic Challenge phases I and II were studied and compared with simulated cosmic data. In addition, a variety of more general topologies of cosmic muons and beam-halo muons were studied using simulated data to demonstrate some key features of the new algorithm.
DOI: 10.48550/arxiv.hep-ph/0601012
2006
Cited 10 times
HERA and the LHC - A workshop on the implications of HERA for LHC physics: Proceedings - Part A
The HERA electron--proton collider has collected 100 pb$^{-1}$ of data since its start-up in 1992, and recently moved into a high-luminosity operation mode, with upgraded detectors, aiming to increase the total integrated luminosity per experiment to more than 500 pb$^{-1}$. HERA has been a machine of excellence for the study of QCD and the structure of the proton. The Large Hadron Collider (LHC), which will collide protons with a centre-of-mass energy of 14 TeV, will be completed at CERN in 2007. The main mission of the LHC is to discover and study the mechanisms of electroweak symmetry breaking, possibly via the discovery of the Higgs particle, and search for new physics in the TeV energy scale, such as supersymmetry or extra dimensions. Besides these goals, the LHC will also make a substantial number of precision measurements and will offer a new regime to study the strong force via perturbative QCD processes and diffraction. For the full LHC physics programme a good understanding of QCD phenomena and the structure function of the proton is essential. Therefore, in March 2004, a one-year-long workshop started to study the implications of HERA on LHC physics. This included proposing new measurements to be made at HERA, extracting the maximum information from the available data, and developing/improving the theoretical and experimental tools. This report summarizes the results achieved during this workshop.
2019
Cited 6 times
A roadmap for HEP software and computing R&D for the 2020s
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volume of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
DOI: 10.1016/j.nima.2007.06.007
2007
Cited 8 times
Results of the first integration test of the CMS drift tubes muon trigger
Two drift tube (DT) chambers of the CMS muon barrel system were exposed to a 40 MHz bunched muon beam at the CERN SPS, and for the first time the whole CMS Level-1 DT-based trigger system chain was tested. Data at different energies and inclination angles of the incident muon beam were collected, as well as data with and without an iron absorber placed between the two chambers, to simulate the electromagnetic shower development in CMS. Special data-taking runs were dedicated to testing for the first time the Track Finder system, which reconstructs track trigger candidates by matching the muon segments delivered by the two chambers. The present paper describes the results of these measurements.
DOI: 10.1051/epjconf/201921401007
2019
Cited 4 times
Improving data quality monitoring via a partnership of technologies and resources between the CMS experiment at CERN and industry
The Compact Muon Solenoid (CMS) experiment dedicates significant effort to assessing the quality of its data, both online and offline. A real-time data quality monitoring system is in place to spot and diagnose problems as promptly as possible to avoid data loss, while the a posteriori evaluation of processed data is designed to categorize the data in terms of their usability for physics analysis. These activities produce data quality metadata. The data quality evaluation relies on a visual inspection of the monitoring features. This practice has a cost in terms of human resources and is naturally subject to human arbitration. Potential limitations are linked to the ability to spot a problem within the overwhelming number of quantities to monitor, or to the lack of understanding of evolving detector conditions. In view of Run 3, CMS aims at integrating deep learning techniques into the online workflow to promptly recognize and identify anomalies and to improve the precision of data quality metadata. The CMS experiment has engaged in a partnership with IBM with the objective of supporting online operations through automation and of generating benchmarking technological results. We present the research goals, agreed within the CERN Openlab framework, how they matured into a demonstration application, and how they were achieved through a collaborative contribution of technologies and resources.
DOI: 10.1142/9789811234033_0005
2022
Data Quality Monitoring Anomaly Detection
DOI: 10.1016/j.nima.2006.04.046
2006
Cited 6 times
Fine synchronization of the CMS muon drift tubes local trigger
The CMS barrel muon trigger, based on self-triggering arrays of drift tubes, is able to identify the muon parent bunch crossing using a rather sophisticated algorithm. The identification is unique only if the trigger chain is correctly synchronized. Some beam test time was devoted to taking data to investigate the synchronization of the trigger electronics with the machine clock. Possible alternatives were verified and the dependence on muon track properties was studied.
DOI: 10.1088/1748-0221/4/05/p05002
2009
Cited 4 times
Offline calibration procedure of the CMS Drift Tube detectors
The barrel region of the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider is instrumented with Drift Tube (DT) detectors. This paper describes in full detail the calibration of the DT hit reconstruction algorithm. After inter-channel synchronization has been verified through the appropriate hardware procedure, the time pedestals are extracted directly from the distribution of the recorded times. Further corrections for time-of-flight and time of signal propagation are applied as soon as the three-dimensional hit position within the DT chamber is known. The different effects of time pedestal miscalibration on the two main hit reconstruction algorithms are shown. The drift velocity calibration algorithm is based on the meantimer technique. Different meantimer relations for different track angles and patterns of hit cells are used. This algorithm can also be used to determine the uncertainty on the reconstructed hit position.
2009
Cited 4 times
Local Muon Reconstruction in the Drift Tube Detectors
This note describes the local reconstruction in the Drift Tube subdetector of the CMS muon subsystem. The local reconstruction is the sequence of steps leading from the TDC measurements to reconstructed three-dimensional segments inside each DT chamber. These segments are the input to the muon track reconstruction. This note updates and supersedes CMS NOTE 2002/043
DOI: 10.1088/1742-6596/664/7/072009
2015
Automated workflows for critical time-dependent calibrations at the CMS experiment
Fast and efficient methods for the calibration and the alignment of the detector are a key asset to exploit the physics potential of the Compact Muon Solenoid (CMS) detector and to ensure timely preparation of results for conferences and publications. To achieve this goal, the CMS experiment has set up a powerful framework. This includes automated workflows in the context of a prompt calibration concept, which allows for a quick turnaround of the calibration process following as fast as possible any change in running conditions. The presentation will review the design and operational experience of these workflows and the related monitoring system during the LHC Run I and focus on the development, deployment and commissioning in preparation of Run II.
DOI: 10.1088/1748-0221/18/08/p08024
2023
Neutron irradiation and electrical characterisation of the first 8” silicon pad sensor prototypes for the CMS calorimeter endcap upgrade
As part of its HL-LHC upgrade program, the CMS collaboration is replacing its existing endcap calorimeters with a high-granularity calorimeter (CE). The new calorimeter is a sampling calorimeter with unprecedented transverse and longitudinal readout for both electromagnetic and hadronic compartments. Due to its compactness, intrinsic time resolution, and radiation hardness, silicon has been chosen as active material for the regions exposed to higher radiation levels. The silicon sensors are fabricated as 20 cm (8") wide hexagonal wafers and are segmented into several hundred pads which are read out individually. As part of the sensor qualification strategy, irradiation of 8" sensors with neutrons was conducted at the Rhode Island Nuclear Science Center (RINSC), followed by their electrical characterisation, in 2020-21. This paper documents the completion of this important milestone in the CE's R&D program and provides a detailed account of the associated infrastructure and procedures. The results on the electrical properties of the irradiated CE silicon sensors are presented.
2007
Measurement of drift velocity in the CMS barrel muon chambers at the CMS magnet test cosmic challenge
DOI: 10.1016/j.nuclphysbps.2007.07.016
2007
The Drift Tube System of the CMS Experiment
The Compact Muon Solenoid (CMS) is a multi-purpose detector at the Large Hadron Collider (LHC) being built at CERN. The muon spectrometer, designed to identify, reconstruct and measure muons with high efficiency and accuracy, plays a key role in the trigger of the experiment. The barrel region is instrumented with Drift Tube chambers (DT) which are used both for tracking and trigger purpose. The design and the performance of the DT system are presented, focusing in particular on the status of its installation in the experimental area and on the analysis of data from the commissioning phase.
DOI: 10.22323/1.120.0485
2011
Operation of the CMS detector with first collisions at 7 TeV at the LHC
2011
Alignment and calibration of the CMS detector
DOI: 10.48550/arxiv.1711.07051
2017
Deep learning for inferring cause of data anomalies
Daily operation of a large-scale experiment is a resource-consuming task, particularly from the perspective of routine data quality monitoring. Typically, data come from different sub-detectors, and the global quality of the data depends on the combined performance of each of them. In this paper, the problem of identifying the channels in which anomalies occurred is considered. We introduce a generic deep learning model and prove that, under reasonable assumptions, the model learns to identify the 'channels' affected by an anomaly. Such a model could be used for data quality manager cross-checks and assistance, and for identifying good channels in anomalous data samples. The main novelty of the method is that the model does not require ground-truth labels for each channel; only a global flag is used. This effectively distinguishes the model from classical classification methods. Applied to CMS data collected in 2010, this approach proves its ability to decompose anomalies by separate channels.
DOI: 10.1088/1742-6596/898/3/032034
2017
A Web-based application for the collection, management and release of Alignment and Calibration configurations used in data processing at the Compact Muon Solenoid experiment
The Compact Muon Solenoid (CMS) experiment makes extensive use of alignment and calibration measurements in several data processing workflows: in the High Level Trigger, in the processing of the recorded collisions, and in the production of simulated events for data analysis and studies of detector upgrades. A complete alignment and calibration scenario is factored into approximately three hundred records, which are updated independently and can have time-dependent content, to reflect the evolution of the detector and data-taking conditions. Given the complexity of the CMS condition scenarios and the large number (50) of experts who actively measure and release calibration data, in 2015 a novel web-based service was developed to structure and streamline their management. The cmsDbBrowser provides an intuitive and easily accessible entry point for the navigation of existing conditions by any CMS member, for the bookkeeping of record updates, and for the actual composition of complete calibration scenarios. This paper describes the design, the choice of technologies, and the first year of production usage of the cmsDbBrowser.
2010
Operation of the CMS detector with first collisions at 7 TeV at the LHC
DOI: 10.1016/j.nima.2008.08.100
2009
The CMS muon barrel drift tubes system commissioning
The CMS muon barrel drift tubes system has been recently fully installed and commissioned in the experiment. The performance and the current status of the detector are briefly presented and discussed.
DOI: 10.1088/1742-6596/119/3/032031
2008
CMS event display and data quality monitoring at LHC start-up
The event display and data quality monitoring visualisation systems are especially crucial for commissioning CMS in the imminent CMS physics run at the LHC. They have already proved invaluable for the CMS magnet test and cosmic challenge. We describe how these systems are used to navigate and filter the immense amounts of complex event data from the CMS detector and prepare clear and flexible views of the salient features to the shift crews and offline users. These allow shift staff and experts to navigate from a top-level general view to very specific monitoring elements in real time to help validate data quality and ascertain causes of problems. We describe how events may be accessed in the higher level trigger filter farm, at the CERN Tier-0 centre, and in offsite centres to help ensure good data quality at all points in the data processing workflow. Emphasis has been placed on deployment issues in order to ensure that experts and general users may use the visualization systems at CERN, in remote operations and monitoring centres offsite, and from their own desktops.
2007
Offline calibration procedure of the drift tube detectors. CERN-CMS-NOTE-2007-034
2019
Trigger Rate Anomaly Detection with Conditional Variational Autoencoders at the CMS Experiment
Exploiting the rapid advances in probabilistic inference, in particular variational autoencoders (VAEs), for machine learning (ML) anomaly detection (AD) tasks remains an open research question. In this work, we use deep conditional variational autoencoders (CVAE), and we define an original loss function together with a metric that targets AD for hierarchically structured data. Our target application is a real-world problem: monitoring the trigger system, a component of many particle physics experiments at the CERN Large Hadron Collider (LHC). Experiments show the superior performance of this method over vanilla VAEs.
2006
Comparison of DT testbeam results on local track reconstruction with the OSCAR + ORCA simulation
2003
A Study of the WW-Fusion Process at CMS as a Probe of Symmetry Breaking (Laurea thesis in Physics, Facoltà di Scienze Matematiche, Fisiche e Naturali)
DOI: 10.1088/1742-6596/664/4/042017
2015
User and group storage management at the CMS CERN T2 centre
A wide range of detector commissioning, calibration and data analysis tasks is carried out by CMS using dedicated storage resources available at the CMS CERN Tier-2 centre. Relying on the functionalities of the EOS disk-only storage technology, the optimal exploitation of the CMS user/group resources has required the introduction of policies for data access management, data protection, cleanup campaigns based on access pattern, and long-term tape archival. The resource management has been organised around the definition of working groups and the delegation of each group's composition to an identified responsible person. In this paper we illustrate the user/group storage management, and the development and operational experience at the CMS CERN Tier-2 centre in the 2012-2015 period.
2016
CMS operations for Run II: preparation and commissioning of the offline infrastructure
2016
Physics performance and fast turn around: the challenge of calibration and alignment at the CMS experiment during the LHC Run-II
DOI: 10.22323/1.134.0186
2012
Alignment and calibration of the CMS detector
DOI: 10.1016/j.nima.2009.06.079
2010
Commissioning, operation and performance of the CMS drift tube chambers
The CMS muon spectrometer, designed to trigger, identify, reconstruct and measure muons with high efficiency and accuracy, is equipped with Drift Tube (DT) chambers in the barrel region. The DT system has been fully commissioned using cosmic muons with and without magnetic field, and during months of cosmic data taking has provided millions of triggers to the rest of the CMS detector. This contribution describes the challenges in the operation of the DT system, including reconstruction performance, and the result of the analysis of the collected cosmic data.
DOI: 10.1088/1742-6596/898/3/032041
2017
Continuous and fast calibration of the CMS experiment: design of the automated workflows and operational experience
The exploitation of the full physics potential of the LHC experiments requires fast and efficient processing of the largest possible dataset with the most refined understanding of the detector conditions. To face this challenge, the CMS collaboration has set up an infrastructure for the continuous unattended computation of the alignment and calibration constants, allowing for a refined knowledge of the most time-critical parameters already a few hours after the data have been saved to disk. This is the prompt calibration framework which, since the beginning of the LHC Run-I, has enabled the analysis and the High Level Trigger of the experiment to consume the most up-to-date conditions, optimizing the performance of the physics objects. In Run-II this setup has been further expanded to include even more complex calibration algorithms requiring higher statistics to reach the needed precision. This imposed a new paradigm in the creation of the calibration datasets for unattended workflows and opened the door to a further step in performance. The paper reviews the design of these automated calibration workflows, the operational experience in Run-II and the monitoring infrastructure developed to ensure the reliability of the service.
DOI: 10.22323/1.282.0988
2017
Physics performance and fast turn around: the challenge of calibration and alignment at the CMS experiment during the LHC Run-II
The CMS detector at the Large Hadron Collider (LHC) is a very complex apparatus with more than 70 million acquisition channels. To exploit its full physics potential, a very careful calibration of the various components, together with an optimal knowledge of their position in space, is essential. The CMS Collaboration has set up a powerful infrastructure to allow for the best knowledge of these conditions at any given moment. The quick turnaround of these workflows was proven crucial both for the algorithms performing the online event selection and for the ultimate resolution of the offline reconstruction of the physics objects. The contribution reports on the design and performance of these workflows during the operations of the 13 TeV LHC Run-II.
DOI: 10.22323/1.282.0169
2017
CMS operations for Run II: preparation and commissioning of the offline infrastructure
The restart of the LHC coincided with an intense period of activity for the CMS experiment. Both at the beginning of Run II in 2015 and at the restart of operations in 2016, the collaboration was engaged in an extensive re-commissioning of the CMS data-taking operations. After the long stop, the detector was fully aligned and calibrated. Data streams were redesigned to fit the priorities dictated by the physics programme for 2015 and 2016. New reconstruction software (both online and offline) was commissioned with early collisions and further developed during the year. A massive campaign of Monte Carlo production was launched to assist physics analyses. This presentation reviews the main events of this commissioning journey and describes the status of CMS physics performance for 2016.
DOI: 10.5281/zenodo.1034149
2017
Anomaly detection using machine learning for data quality monitoring in the CMS experiment
DOI: 10.1142/9789811234026_0005
2022
Data Quality Monitoring Anomaly Detection
2018
Detector monitoring with artificial neural networks at the CMS experiment at the CERN Large Hadron Collider
Reliable data quality monitoring is a key asset in delivering collision data suitable for physics analysis in any modern large-scale High Energy Physics experiment. This paper focuses on the use of artificial neural networks for supervised and semi-supervised problems related to the identification of anomalies in the data collected by the CMS muon detectors. We use deep neural networks to analyze LHC collision data, represented as images organized geographically. We train a classifier capable of detecting the known anomalous behaviors with unprecedented efficiency and explore the usage of convolutional autoencoders to extend anomaly detection capabilities to unforeseen failure modes. A generalization of this strategy could pave the way to the automation of the data quality assessment process for present and future high-energy physics experiments.
DOI: 10.1088/1742-6596/1085/4/042015
2018
Deep learning for inferring cause of data anomalies
Daily operation of a large-scale experiment is a resource-consuming task, particularly from the perspective of routine data quality monitoring. Typically, data come from different sub-detectors, and the global quality of the data depends on the combined performance of each of them. In this paper, the problem of identifying the channels in which anomalies occurred is considered. We introduce a generic deep learning model and prove that, under reasonable assumptions, the model learns to identify 'channels' which are affected by an anomaly. Such a model could be used to cross-check and assist the data quality manager, and to identify good channels in anomalous data samples. The main novelty of the method is that the model does not require ground-truth labels for each channel; only a global flag is used. This effectively distinguishes the model from classical classification methods. Applied to CMS data collected in 2010, this approach proves its ability to decompose anomalies by separate channels.
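The core idea above (a global quality flag decomposed into per-channel anomaly scores, with only the global flag supervised) can be illustrated with a toy statistical sketch. This is not the deep learning model from the paper; the channel names, readings and threshold below are invented for illustration:

```python
import statistics

# Hypothetical per-channel readings from a known-good reference run.
reference = {
    "ch0": [1.0, 1.1, 0.9, 1.0, 1.05],
    "ch1": [2.0, 2.1, 1.9, 2.0, 1.95],
    "ch2": [0.5, 0.55, 0.45, 0.5, 0.52],
}
# A new run in which one channel (ch1) misbehaves.
new_run = {"ch0": 1.02, "ch1": 5.0, "ch2": 0.49}

def channel_scores(reference, run):
    """Anomaly score per channel: |z-score| against the reference run."""
    scores = {}
    for ch, xs in reference.items():
        mu = statistics.mean(xs)
        sigma = statistics.stdev(xs) or 1.0  # guard against zero spread
        scores[ch] = abs(run[ch] - mu) / sigma
    return scores

scores = channel_scores(reference, new_run)
# The global flag aggregates the per-channel scores; conversely, an
# anomalous run can be decomposed by ranking the channels.
global_flag = max(scores.values()) > 3.0
culprit = max(scores, key=scores.get)
```

Here the global flag is the maximum per-channel score against a threshold, so flagging a run automatically points back to the channel responsible, mirroring the decomposition-by-channel goal of the paper.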
DOI: 10.1088/1742-6596/1525/1/012045
2020
Deep learning for certification of the quality of the data acquired by the CMS Experiment
Certifying the data recorded by the Compact Muon Solenoid (CMS) experiment at CERN is a crucial and demanding task, as the data are used for the publication of physics results. Anomalies caused by detector malfunctioning or sub-optimal data processing are difficult to enumerate a priori and occur rarely, making it difficult to use classical supervised classification. We base our prototype for the automation of such a procedure on a semi-supervised approach using deep autoencoders. We demonstrate the ability of the model to detect anomalies with high accuracy, when compared against the outcome of fully supervised methods. We show that the results of the model are highly interpretable, ascribing the origin of the problems in the data to a specific sub-detector or physics object. Finally, we address the issue of feature dependency on the LHC beam intensity.
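As a rough illustration of the reconstruction-error principle behind autoencoder-based anomaly detection (not the paper's deep model), a minimal linear autoencoder trained only on "good" data assigns large reconstruction errors to points off the learned manifold. All data and parameters below are synthetic:

```python
import random

random.seed(0)

# Toy linear autoencoder: encode 2-D input to a 1-D code and decode back.
# Normal "good" data lies near the line y = 2x; anomalies do not.
def make_normal(n):
    return [(x, 2.0 * x + random.gauss(0, 0.05))
            for x in [random.uniform(-1, 1) for _ in range(n)]]

# Encoder weights (w1, w2) and decoder weights (v1, v2).
w1, w2, v1, v2 = 0.1, 0.2, 0.1, 0.2

def rec_error(x, y):
    """Squared reconstruction error of a point under the current weights."""
    h = w1 * x + w2 * y          # 1-D code
    rx, ry = v1 * h, v2 * h      # decoded point
    return (rx - x) ** 2 + (ry - y) ** 2

# Train by plain SGD on the reconstruction error of normal data only.
lr = 0.05
for epoch in range(200):
    for x, y in make_normal(50):
        h = w1 * x + w2 * y
        dx, dy = v1 * h - x, v2 * h - y
        # Gradients of (dx^2 + dy^2) with respect to each weight.
        g_v1, g_v2 = 2 * dx * h, 2 * dy * h
        g_h = 2 * dx * v1 + 2 * dy * v2
        v1 -= lr * g_v1
        v2 -= lr * g_v2
        w1 -= lr * g_h * x
        w2 -= lr * g_h * y

# Score new points: a high reconstruction error flags an anomaly.
normal_err = sum(rec_error(x, y) for x, y in make_normal(100)) / 100
anomaly_err = rec_error(1.0, -2.0)   # far off the learned manifold
```

Since the autoencoder never sees anomalous examples, the method is semi-supervised in the sense used in the abstract: only a sample of certified-good data is needed, and anything the model cannot reconstruct well is a candidate anomaly.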
DOI: 10.48550/arxiv.2012.06336
2020
Construction and commissioning of CMS CE prototype silicon modules
As part of its HL-LHC upgrade program, the CMS Collaboration is developing a High Granularity Calorimeter (CE) to replace the existing endcap calorimeters. The CE is a sampling calorimeter with unprecedented transverse and longitudinal readout for both electromagnetic (CE-E) and hadronic (CE-H) compartments. The calorimeter will be built with $\sim$30,000 hexagonal silicon modules. Prototype modules have been constructed with 6-inch hexagonal silicon sensors with cell areas of $1.1~\mathrm{cm}^2$, and the SKIROC2-CMS readout ASIC. Beam tests of different sampling configurations were conducted with the prototype modules at DESY and CERN in 2017 and 2018. This paper describes the construction and commissioning of the CE calorimeter prototype, the silicon modules used in the construction, their basic performance, and the methods used for their calibration.
DOI: 10.48550/arxiv.1808.00911
2018
Detector monitoring with artificial neural networks at the CMS experiment at the CERN Large Hadron Collider
Reliable data quality monitoring is a key asset in delivering collision data suitable for physics analysis in any modern large-scale High Energy Physics experiment. This paper focuses on the use of artificial neural networks for supervised and semi-supervised problems related to the identification of anomalies in the data collected by the CMS muon detectors. We use deep neural networks to analyze LHC collision data, represented as images organized geographically. We train a classifier capable of detecting the known anomalous behaviors with unprecedented efficiency and explore the usage of convolutional autoencoders to extend anomaly detection capabilities to unforeseen failure modes. A generalization of this strategy could pave the way to the automation of the data quality assessment process for present and future high-energy physics experiments.
2006
VV-fusion in CMS: a model-independent way to investigate EWSB
2006
Further Tests of the CMS Drift Tubes Muon Trigger
2006
CP Studies and Non-Standard Higgs Physics
There are many possibilities for new physics beyond the Standard Model that feature non-standard Higgs sectors. These may introduce new sources of CP violation, and there may be mixing between multiple Higgs bosons or other new scalar bosons. Alternatively, the Higgs may be a composite state, or there may even be no Higgs at all. These non-standard Higgs scenarios have important implications for collider physics as well as for cosmology, and understanding their phenomenology is essential for a full comprehension of electroweak symmetry breaking. This report discusses the most relevant theories which go beyond the Standard Model and its minimal, CP-conserving supersymmetric extension: two-Higgs-doublet models and minimal supersymmetric models with CP violation, supersymmetric models with an extra singlet, models with extra gauge groups or Higgs triplets, Little Higgs models, models in extra dimensions, and models with technicolour or other new strong dynamics. For each of these scenarios, this report presents an introduction to the phenomenology, followed by contributions on more detailed theoretical aspects and studies of possible experimental signatures at the LHC and other colliders.
2006
Simulation, Performance and Local Track Reconstruction of the Drift Tube Detectors of the CMS Experiment.
DOI: 10.1063/1.2125627
2005
VV-Fusion Process to Investigate EWSB; Interplay LC-LHC
G. Cerminara; VV-Fusion Process to Investigate EWSB; Interplay LC-LHC. AIP Conf. Proc. 794 (1): 97-100, 12 October 2005. https://doi.org/10.1063/1.2125627
DOI: 10.5170/cern-2005-014.561
2005
Vector boson fusion at CMS