
K. Lassila-Perini

DOI: 10.1038/s41567-018-0342-2
2018
Cited 94 times
Open is not enough
The solutions adopted by the high-energy physics community to foster reproducible research are examples of best practices that could be embraced more widely. This first experience suggests that reproducibility requires going beyond openness.
DOI: 10.1140/epjc/s10052-022-10541-4
2022
Cited 27 times
Unveiling hidden physics at the LHC
The field of particle physics is at a crossroads. The discovery of a Higgs-like boson completed the Standard Model (SM), but the lack of observation of convincing resonances Beyond the SM (BSM) offers no guidance for the future of particle physics. On the other hand, the motivation for New Physics has not diminished and is, in fact, reinforced by several striking anomalous results in many experiments. Here we summarise the status of the most significant anomalies, including the most recent results for the flavour anomalies, the multi-lepton anomalies at the LHC, the Higgs-like excess at around 96 GeV, and anomalies in neutrino physics, astrophysics, cosmology, and cosmic rays. While the LHC promises up to 4 ab⁻¹ of integrated luminosity and far-reaching physics programmes to unveil BSM physics, we consider the possibility that the latter could be tested with present data, but that systemic shortcomings of the experiments and their search strategies may preclude their discovery for several reasons, including: final states consisting of soft particles only, associated production processes, QCD-like final states, close-by SM resonances, and SUSY scenarios where no missing energy is produced. New search strategies could help to unveil the hidden BSM signatures, devised by making use of the CERN open data as a new testing ground. We discuss the CERN open data with its policies, challenges, and potential usefulness for the community. We showcase the example of the CMS collaboration, which is the only collaboration regularly releasing some of its data.
We find it important to stress that individuals using public data for their own research do not compete with the experimental efforts, but rather provide unique opportunities to guide further BSM searches by the collaborations. Wide access to open data is paramount to fully exploit the LHC's potential.
DOI: 10.21468/scipostphys.9.2.022
2020
Cited 47 times
Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.1038/s41597-023-02298-6
2023
Cited 8 times
FAIR for AI: An interdisciplinary and international community building perspective
A foundational set of findable, accessible, interoperable, and reusable (FAIR) principles were proposed in 2016 as prerequisites for proper data management and stewardship, with the goal of enabling the reusability of scholarly data. The principles were also meant to apply to other digital assets, at a high level, and over time, the FAIR guiding principles have been re-interpreted or extended to include the software, tools, algorithms, and workflows that produce data. FAIR principles are now being adapted in the context of AI models and datasets. Here, we present the perspectives, vision, and experiences of researchers from different countries, disciplines, and backgrounds who are leading the definition and adoption of FAIR principles in their communities of practice, and discuss outcomes that may result from pursuing and incentivizing FAIR AI research. The material for this report builds on the FAIR for AI Workshop held at Argonne National Laboratory on June 7, 2022.
DOI: 10.1140/epjcd/s2004-02-003-9
2005
Cited 60 times
Summary of the CMS potential for the Higgs boson discovery
This work summarizes the studies for the Higgs boson searches in CMS at the LHC collider. The main discovery channels are presented and the potential is given for the discovery of the SM Higgs boson and the Higgs bosons of the MSSM. The phenomenology, detector, trigger and reconstruction issues are briefly discussed.
DOI: 10.1016/j.nima.2003.11.018
2004
Cited 56 times
Radiation hardness of Czochralski silicon, Float Zone silicon and oxygenated Float Zone silicon studied by low energy protons
We processed pin-diodes on Czochralski silicon (Cz-Si), standard Float Zone silicon (Fz-Si) and oxygenated Fz-Si. The diodes were irradiated with 10, 20, and 30 MeV protons. Depletion voltages and leakage currents were measured as a function of the irradiation dose. Additionally, the samples were characterized by TCT and DLTS methods. The high-resistivity Cz-Si was found to be more radiation hard than the other studied materials.
DOI: 10.1016/j.nima.2003.08.102
2003
Cited 50 times
Processing of microstrip detectors on Czochralski grown high resistivity silicon substrates
We have processed large-area strip sensors on silicon wafers grown by the magnetic Czochralski (MCZ) method. The n-type MCZ silicon wafers manufactured by Okmetic Oyj have a nominal resistivity of 900 Ω cm and an oxygen concentration of less than 10 ppma. Photoconductive Decay (PCD), current–voltage and capacitance–voltage measurements were made to characterise the samples. A leakage current of 3 μA at 900 V bias voltage was measured on the 32.5 cm² detector. Detector depletion took place at about 420 V. According to the PCD measurements, process-induced contamination was effectively bound and neutralised by the oxygen present in Czochralski silicon. During sample processing, the silicon resistivity increased despite the absence of a specific donor-killing heat treatment.
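Since the abstract quotes both a nominal resistivity (900 Ω cm) and a measured depletion voltage (~420 V), a quick consistency check is possible with the standard abrupt-junction relation V_dep ≈ d²/(2εμρ). A minimal sketch, assuming a 300 µm wafer thickness and a textbook electron mobility (neither figure is taken from the paper):

```python
Q_E = 1.602e-19            # elementary charge [C]
EPS_SI = 11.9 * 8.854e-14  # permittivity of silicon [F/cm]
MU_E = 1350.0              # electron mobility in n-type Si [cm^2/(V s)] (assumed)

def depletion_voltage(thickness_cm: float, resistivity_ohm_cm: float) -> float:
    """Full-depletion voltage of an n-type silicon diode: V = d^2 / (2 eps mu rho)."""
    return thickness_cm ** 2 / (2.0 * EPS_SI * MU_E * resistivity_ohm_cm)

# Nominal resistivity from the abstract (900 Ohm cm); 300 um thickness ASSUMED.
v_dep = depletion_voltage(300e-4, 900.0)
print(f"estimated depletion voltage: {v_dep:.0f} V")
```

With these assumptions the estimate lands near 350 V, the same order as the ~420 V reported; the gap is plausible given that the effective resistivity and thickness of the processed wafers differ from the nominal values.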
DOI: 10.48550/arxiv.1205.4667
2012
Cited 19 times
Status Report of the DPHEP Study Group: Towards a Global Effort for Sustainable Data Preservation in High Energy Physics
Data from high-energy physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organisational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP. This paper includes and extends the intermediate report. It provides an analysis of the research case for data preservation and a detailed description of the various projects at experiment, laboratory and international levels. In addition, the paper provides a concrete proposal for an international organisation in charge of the data management and policies in high-energy physics.
DOI: 10.1016/0168-9002(95)00344-4
1995
Cited 30 times
Energy loss in thin layers in GEANT
A method for the simulation of the energy loss distribution in thin gaseous layers has been implemented in GEANT and tested. Comparisons are made between the new code and the standard method in GEANT. Improvements are made to the standard method to enable a fast and reliable simulation of energy losses in thin layers.
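For intuition about why thin layers need a dedicated treatment: the energy-loss straggling distribution is strongly asymmetric, so the mean loss lies well above the most probable loss. A minimal sketch using the Moyal approximation to the Landau distribution, f(λ) = exp(−(λ + e^(−λ))/2)/√(2π); this is an illustrative stand-in, not the algorithm implemented in GEANT:

```python
import math

def moyal_pdf(lam: float) -> float:
    """Moyal density, a common closed-form approximation to the Landau distribution."""
    return math.exp(-0.5 * (lam + math.exp(-lam))) / math.sqrt(2.0 * math.pi)

# Evaluate on a grid over lambda in [-5, 25] to show the characteristic asymmetry:
# the mean of the distribution sits well above its peak (the most probable value).
grid = [-5.0 + 0.001 * i for i in range(30001)]
weights = [moyal_pdf(x) for x in grid]
norm = sum(weights)
mean = sum(x * w for x, w in zip(grid, weights)) / norm
mode = grid[max(range(len(grid)), key=lambda i: weights[i])]

print(f"most probable lambda ~ {mode:.2f}, mean lambda ~ {mean:.2f}")
```

The long high-loss tail pulls the mean about 1.3 width units above the peak, which is why sampling the full fluctuation distribution, rather than depositing the mean, matters in thin gaseous layers.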
DOI: 10.1016/s0168-9002(02)00548-x
2002
Cited 25 times
Processing and recombination lifetime characterization of silicon microstrip detectors
Three sets of silicon microstrip detectors have been processed and characterized. The recombination lifetime of each set has been measured by the Microwave Photoconductivity Decay (μPCD) method. In this method, the silicon is illuminated by a laser pulse that generates electron–hole pairs, and the transient of the decaying carrier concentration is monitored using a microwave signal. The recombination lifetime is a measure of the material quality, i.e., the defect/impurity concentration, which affects the detectors' electrical properties. A correlation between the recombination lifetime and the leakage current has been observed and is discussed. The leakage current density in the best devices was about 6 nA cm⁻² at 40 V; the average lifetime in the monitor wafer of this set was about 6500 μs. In comparison, average lifetimes of less than 1000 μs resulted in leakage currents of more than 100 nA cm⁻².
DOI: 10.1007/s10723-010-9152-1
2010
Cited 12 times
Distributed Analysis in CMS
The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centres for analysis. CMS has identified the distributed sites as the primary location for physics analysis, supporting a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience of current analysis activities.
DOI: 10.48550/arxiv.2203.10057
2022
Cited 4 times
Data and Analysis Preservation, Recasting, and Reinterpretation
We make the case for the systematic, reliable preservation of event-wise data, derived data products, and executable analysis code. This preservation enables the analyses' long-term future reuse, in order to maximise the scientific impact of publicly funded particle-physics experiments. We cover the needs of both the experimental and theoretical particle physics communities, and outline the goals and benefits that are uniquely enabled by analysis recasting and reinterpretation. We also discuss technical challenges and infrastructure needs, as well as sociological challenges and changes, and give summary recommendations to the particle-physics community.
DOI: 10.1038/s42254-022-00546-z
2023
Opportunities and challenges in data sharing at multi-user facilities
DOI: 10.1109/tns.2003.821405
2003
Cited 14 times
Radiation hardness of Czochralski silicon studied by 10-MeV and 20-MeV protons
We have processed pin-diodes on Czochralski silicon (Cz-Si), standard float zone silicon (Fz-Si), and diffusion oxygenated float zone silicon (DOF) and irradiated them with 10- and 20-MeV protons. The evolution of depletion voltage and leakage current as a function of irradiation dose was measured. Space charge sign inversion (SCSI) was investigated by an annealing study and verified by the transient current technique (TCT). Czochralski silicon was found to be significantly more radiation hard than the other materials.
DOI: 10.1051/epjconf/202125101004
2021
Cited 6 times
Using CMS Open Data in research – challenges and directions
The CMS experiment at CERN has released research-quality data from particle collisions at the LHC since 2014. Almost all data from the first LHC run in 2010–2012 with the corresponding simulated samples are now in the public domain, and several scientific studies have been performed using these data. This paper summarizes the available data and tools, reviews the challenges in using them in research, and discusses measures to improve their usability.
DOI: 10.48550/arxiv.1512.02019
2015
Cited 5 times
Status Report of the DPHEP Collaboration: A Global Effort for Sustainable Data Preservation in High Energy Physics
Data from High Energy Physics (HEP) experiments are collected with significant financial and human effort and are mostly unique. An inter-experimental study group on HEP data preservation and long-term analysis was convened as a panel of the International Committee for Future Accelerators (ICFA). The group was formed by large collider-based experiments and investigated the technical and organizational aspects of HEP data preservation. An intermediate report was released in November 2009 addressing the general issues of data preservation in HEP and an extended blueprint paper was published in 2012. In July 2014 the DPHEP collaboration was formed as a result of the signature of the Collaboration Agreement by seven large funding agencies (others have since joined or are in the process of acquisition) and in June 2015 the first DPHEP Collaboration Workshop and Collaboration Board meeting took place. This status report of the DPHEP collaboration details the progress during the period from 2013 to 2015 inclusive.
DOI: 10.48550/arxiv.hep-ph/0112045
2001
Cited 11 times
Summary of the CMS Discovery Potential for the MSSM SUSY Higgses
This work summarises the present understanding of the expected MSSM SUSY Higgs reach for CMS. Many of the studies presented here result from detailed detector simulations incorporating the final CMS detector design and response. With 30 fb⁻¹, the h → γγ and h → bb channels allow coverage of most of the MSSM parameter space. For the massive A, H, H⁺ MSSM Higgs states, the channels A,H → ττ and H⁺ → τν turn out to be the most profitable in terms of mass reach and parameter-space coverage. Consequently, CMS has made a big effort to trigger efficiently on taus. Provided neutralinos and sleptons are not too heavy, there is an interesting complementarity between the reaches for A,H → ττ and A,H → χχ.
2015
Cited 4 times
An ontology design pattern for particle physics analysis
The detector final state is the core element of particle physics analysis as it defines the physical characteristics that form the basis of the measurement presented in a published paper. Although they are a crucial part of the research process, detector final states are not yet formally described, published in papers or searchable in a convenient way. This paper aims at providing an ontology pattern for the detector final state that can be used as a building block for an ontology covering the whole particle physics analysis life cycle.
DOI: 10.1109/nssmic.2004.1462661
2005
Cited 7 times
An object-oriented simulation program for CMS
The CMS detector simulation package, OSCAR, is based on the Geant4 simulation toolkit and the CMS object-oriented framework for simulation and reconstruction. Geant4 provides a rich set of physics processes describing in detail electromagnetic and hadronic interactions. It also provides the tools for the implementation of the full CMS detector geometry and the interfaces required for recovering information from the particle tracking in the detectors. This functionality is interfaced to the CMS framework, which, via its "action on demand" mechanisms, allows the user to selectively load desired modules and to configure and tune the final application. The complete CMS detector is rather complex with more than 1 million geometrical volumes. OSCAR has been validated by comparing its results with test beam data and with results from simulation with a GEANT3-based program. It has been successfully deployed in the 2004 data challenge for CMS, where more than 35 million events for various LHC physics channels were simulated and analysed.
DOI: 10.1088/0031-8949/2004/t114/021
2004
Cited 6 times
Particle Detectors made of High Resistivity Czochralski Grown Silicon
We describe the fabrication process of full-size silicon microstrip detectors processed on silicon wafers grown by the magnetic Czochralski method. Defect analysis by DLTS spectroscopy as well as minority-carrier lifetime measurements by the µPCD method are presented. The electrical and detection properties of the Czochralski silicon detectors are comparable to those of leading commercial detector manufacturers. The radiation hardness of the Czochralski silicon detectors proved to be superior to that of devices made of traditional Float Zone silicon material.
DOI: 10.1016/j.nima.2004.05.058
2004
Cited 6 times
Results of proton irradiations of large area strip detectors made on high-resistivity Czochralski silicon
We have processed full-size strip detectors on Czochralski-grown silicon wafers with a resistivity of about 1.2 kΩ cm. Wafers grown with the Czochralski method intrinsically contain high concentrations of oxygen, and thus have potential for high radiation tolerance. Detectors and test diodes were irradiated with 10 MeV protons. The 1-MeV neutron equivalent fluences were 1.6×10¹⁴ and 8.5×10¹³ cm⁻² for detectors, and up to 5.0×10¹⁴ cm⁻² for test diodes. After the irradiations, depletion voltages and leakage currents were measured. Czochralski silicon devices proved to be significantly more radiation hard than reference devices made on traditional detector materials.
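The leakage-current results above can be put in context with the standard NIEL parametrisation, in which the bulk current increase grows linearly with the 1-MeV-neutron-equivalent fluence: ΔI = α·Φ_eq·V. A rough sketch with an assumed textbook damage constant and an assumed sensor volume (neither value comes from the paper):

```python
ALPHA = 4e-17  # damage constant alpha [A/cm], room temp. after standard annealing (ASSUMED)

def leakage_increase_a(fluence_neq_cm2: float, volume_cm3: float) -> float:
    """Expected increase in bulk leakage current [A] at full depletion: alpha * Phi * V."""
    return ALPHA * fluence_neq_cm2 * volume_cm3

# ASSUMED geometry for illustration: ~30 cm^2 active area, 300 um thick sensor.
volume = 30.0 * 300e-4  # ~0.9 cm^3

for phi in (8.5e13, 1.6e14):  # detector fluences quoted in the abstract
    d_i = leakage_increase_a(phi, volume)
    print(f"Phi_eq = {phi:.1e} cm^-2  ->  delta I ~ {d_i * 1e3:.1f} mA")
```

Under these assumptions the expected current increase is of order a few milliamperes per sensor at the quoted fluences, which is why irradiated sensors are normally operated cold; the α scaling itself is material-independent, so the radiation-hardness gain of Cz-Si shows up in the depletion voltage rather than the current.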
DOI: 10.1088/1742-6596/664/3/032027
2015
Open access to high-level data and analysis tools in the CMS experiment at the LHC
The CMS experiment, in recognition of its commitment to data preservation and open access as well as to education and outreach, has made its first public release of high-level data under the CC0 waiver: up to half of the proton–proton collision data (by volume) at 7 TeV from 2010, in the CMS Analysis Object Data format. CMS has prepared, in collaboration with CERN and the other LHC experiments, an open-data web portal based on Invenio. The portal provides access to CMS public data as well as to analysis tools and documentation for the public. The tools include an event display and a histogram application that run in the browser. In addition, a virtual machine containing a CMS software environment, along with XRootD access to the data, is available. Within the virtual machine the public can analyse CMS data; example code is provided. We describe the accompanying tools and documentation and discuss the first experiences of data use.
DOI: 10.1109/tns.2002.805345
2002
Cited 6 times
The effect of oxygenation on the radiation hardness of silicon studied by surface photovoltage method
The effect of oxygenation on the radiation hardness of silicon detectors was studied. Oxygen-enriched and standard float-zone silicon pin-diodes and oxidized samples were processed and irradiated with 15-MeV protons. After the irradiations, the surface photovoltage (SPV) method was applied to extract the minority-carrier diffusion lengths of the silicon samples. Adding oxygen to silicon was found to improve its radiation hardness; the effect was visible in the minority-carrier diffusion lengths as well as in the reverse-bias leakage currents. The suitability of the SPV method for characterizing irradiated silicon samples was demonstrated.
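The SPV observable connects back to the lifetimes measured by µPCD through the textbook relation L = √(Dτ), with the diffusion coefficient given by the Einstein relation D = (kT/q)·μ. A minimal sketch, with assumed values for the temperature and the minority-carrier (hole) mobility in n-type silicon:

```python
import math

KT_Q = 0.0259  # thermal voltage kT/q at ~300 K [V] (assumed room temperature)
MU_H = 480.0   # hole (minority-carrier) mobility in n-type Si [cm^2/(V s)] (assumed)

def diffusion_length_cm(lifetime_s: float) -> float:
    """Minority-carrier diffusion length L = sqrt(D * tau), D from the Einstein relation."""
    d = KT_Q * MU_H  # diffusion coefficient [cm^2/s]
    return math.sqrt(d * lifetime_s)

# e.g. the ~6500 us lifetime quoted for the best monitor wafers in the uPCD study above
L = diffusion_length_cm(6500e-6)
print(f"diffusion length ~ {L * 10:.1f} mm")
```

A millisecond-scale lifetime thus corresponds to a diffusion length of a few millimetres, far larger than the sensor thickness, which is the regime needed for low leakage currents.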
DOI: 10.1088/1742-6596/331/8/082006
2011
A Perspective of User Support for the CMS Experiment
The CMS (Compact Muon Solenoid) experiment is one of two large general-purpose particle physics detectors at the LHC (Large Hadron Collider). An international collaboration of nearly 3500 people operates this complex detector, whose main goal is to answer the most fundamental questions about our universe. The size and globally diversified nature of the collaboration, and the petabytes of data collected each year, make bringing users up to speed to contribute to physics analysis a challenging task. CMS User Support performs this task by helping users quickly learn about CMS computing and the physics analysis tools they need. In this presentation we give an overview of its goals, its organization, and its use of collaborative tools to maintain the software and computing documentation and to conduct year-round tutorials on the physics tools needed as a prerequisite for analysis. We also discuss user feedback evaluating this work.
DOI: 10.1088/1742-6596/513/4/042029
2014
Implementing the data preservation and open access policy in CMS
Implementation of the CMS policy on long-term data preservation, re-use and open access has started. Current practices in providing data additional to published papers and distributing simplified data samples for outreach are promoted and consolidated. The first measures have been taken for analysis and data preservation for the internal use of the collaboration and for open access to part of the data. Two complementary approaches are followed. First, a virtual machine environment will pack all the ingredients needed to compile and run a software release with which the legacy data were reconstructed. Second, a validation framework will maintain the capability not only to read the old raw data, but also to reprocess them with an updated release or into another format, to help ensure long-term reusability of the legacy data.
DOI: 10.1088/1742-6596/396/6/062013
2012
Maintaining and improving of the training program on the analysis software in CMS
Since 2009, the CMS experiment at LHC has provided intensive training on the use of Physics Analysis Tools (PAT), a collection of common analysis tools designed to share expertise and maximize productivity in the physics analysis. More than ten one-week courses preceded by prerequisite studies have been organized and the feedback from the participants has been carefully analyzed. This note describes how the training team designs, maintains and improves the course contents based on the feedback, the evolving analysis practices and the software development.
DOI: 10.48550/arxiv.1110.0355
2011
An outlook of the user support model to educate the users community at the CMS Experiment
The CMS (Compact Muon Solenoid) experiment is one of the two large general-purpose particle physics detectors built at the LHC (Large Hadron Collider) at CERN in Geneva, Switzerland. The diverse collaboration, combined with a highly distributed computing environment and petabytes of data collected per year, makes CMS unlike any previous High Energy Physics collaboration. This presents new challenges in educating and bringing users, coming from different cultural, linguistic and social backgrounds, up to speed to contribute to the physics analysis. CMS has been able to deal with this new paradigm by deploying a user support structure that uses collaborative tools to educate about software, computing and physics tools specific to CMS. To carry out the user support mission worldwide, an LHC Physics Centre (LPC) was created a few years ago at Fermilab as a hub for US physicists. The LPC serves as a "brick and mortar" location for physics excellence for CMS physicists, where graduate and postgraduate scientists can find experts in all aspects of data analysis and learn via tutorials, workshops, conferences and gatherings. Following the success of the LPC, a centre at CERN itself, called the LHC Physics Centre at CERN (LPCC), and the Terascale Analysis Centre at DESY have been created with similar goals. The CMS user support model also helps the non-CMS scientific community learn about CMS physics. A good example of this is the effort by HEP experiments, including CMS, to focus on data preservation. To facilitate use of its data by the future scientific community, who may want to revisit and re-analyze it, CMS is evaluating the resources required. Detailed, good-quality and well-maintained documentation of CMS computing and software by the user support group may go a long way towards this goal.
DOI: 10.1088/1742-6596/396/3/032065
2012
Preparing for long-term data preservation and access in CMS
The data collected by the LHC experiments are unique and present an opportunity and a challenge for a long-term preservation and re-use. The CMS experiment has defined a policy for the data preservation and access to its data and is starting its implementation. This note describes the driving principles of the policy and summarises the actions and activities which are planned in the starting phase of the project.
DOI: 10.1088/1742-6596/219/8/082011
2010
Improving collaborative documentation in CMS
Complete and up-to-date documentation is essential for efficient data analysis in a large and complex collaboration like CMS. Good documentation reduces the time spent in problem solving for users and software developers. The scientists in our research environment do not necessarily have the interests or skills of professional technical writers. This results in inconsistencies in the documentation. To improve the quality, we have started a multidisciplinary project involving CMS user support and expertise in technical communication from the University of Turku, Finland. In this paper, we present possible approaches to study the usability of the documentation, for instance, usability tests conducted recently for the CMS software and computing user documentation.
DOI: 10.1051/epjconf/202024508014
2020
Open data provenance and reproducibility: a case study from publishing CMS open data
In this paper we present the latest CMS open data release, published on the CERN Open Data portal. Samples of collision and simulated datasets were released together with detailed information about the data provenance. The associated data production chains cover the necessary computing environments, the configuration files and the computational procedures used in each data production step. We describe the data curation techniques used to obtain and publish the data provenance information, and we study the possibility of reproducing parts of the released data using the publicly available information. The present work demonstrates the usefulness of releasing selected samples of raw and primary data in order to fully ensure the completeness of information about the data production chain, for the benefit of general data scientists and other non-specialists interested in using particle physics data for education or research purposes.
DOI: 10.22323/1.390.0963
2020
In Pursuit of Authenticity – CMS Open Data in Education
There are some universally acknowledged problems in school science. In developed countries worldwide, young people are not interested in studying STEM subjects. Whether that is because of a perceived lack of personal relevance, disconnect from the actual fields of study, "sanitized" school practices or other factors is a matter of debate, but it is eminently clear that, as educators, we must do our best to combat this trend. In this paper, we discuss how open data from the CMS experiment has been used in education and present feedback from Finnish teachers who have received training in using these freely available programming resources to bring modern physics into their teaching. The main focus here is on the teachers' perception of authenticity in the use of "real world" research data, although there are additional benefits in learning general scientific methods and cross-disciplinary data handling skills as well.
DOI: 10.48550/arxiv.hep-ex/0605042
2006
Early LHC physics studies: what can be obtained before the discoveries?
The Large Hadron Collider will provide an unprecedented quantity of collision data right from start-up. The challenge for the LHC experiments is to use these data quickly for the final commissioning of the detectors, including calibration, alignment, and measurement of detector and trigger efficiencies. A new energy frontier will open up, and measurement of basic Standard Model processes will build a solid foundation for any discovery studies.
DOI: 10.1109/nssmic.2006.354216
2006
The CMS Simulation Software
In this paper we present the features and the expected performance of the redesigned CMS simulation software, as well as experience from the migration process. Today, the CMS simulation suite is based on two principal components: the Geant4 detector simulation toolkit and the new CMS offline Framework and Event Data Model. The simulation chain includes event generation, detector simulation, and digitization steps. With Geant4, we employ the full set of electromagnetic and hadronic physics processes and detailed particle tracking in the 4 Tesla magnetic field. The Framework provides "action on demand" mechanisms to allow users to load the desired modules dynamically and to configure and tune the final application at run time. The simulation suite is used to model the complete central CMS detector (over 1 million geometrical volumes) and the forward systems, such as the Castor calorimeter and Zero Degree Calorimeter, the Totem telescopes, Roman Pots, and the Luminosity Monitor. The design also previews the use of electromagnetic and hadronic shower parametrization, instead of full modelling of the passage of high-energy particles through a complex hierarchy of volumes and materials, allowing a significant gain in speed while tuning the simulation to test beam and collider data. The physics simulation has been extensively validated by comparison with test beam data and previous simulation results. The redesigned and upgraded simulation software was exercised in performance and robustness tests. It went into production in July 2006, running on the US and EU grids, and has since delivered about 60 million events.
DOI: 10.1016/j.nuclphysbps.2015.09.195
2016
Preparations for the public release of high-level CMS data
The CMS Collaboration, in accordance with its commitment to open access and data preservation, is preparing for the public release of up to half of the reconstructed collision data collected in 2010. Efforts at present are focused on the usability of the data in education. The data will be accompanied by example applications tailored for different levels of access, including ready-to-use web-based applications for histogramming or visualising individual collision events and a virtual machine image of the CMS software environment that is compatible with these data. The virtual machine image will contain instructions for using the data with the online applications as well as examples of simple analyses. The novelty of this initiative is two-fold: in terms of open science, it lies in releasing the data in a format that is good for analysis; from an outreach perspective, it is to provide the possibility for people outside CMS to build educational applications using our public data. CMS will rely on services for data preservation and open access being prototyped at CERN with input from CMS and the other LHC experiments.
DOI: 10.1088/1742-6596/396/6/062020
2012
Developing CMS software documentation system
CMSSW (CMS SoftWare) is the overall collection of software and services needed by the simulation, calibration and alignment, and reconstruction modules that process data so that physicists can perform their analyses. It is a long-term project with a large amount of source code. In large-scale, complex projects it is important to keep software documentation as up to date and automated as possible. The core of the documentation should be version-based and available online with the source code. CMS uses Doxygen and Twiki as the main tools to provide automated and non-automated documentation. Both are heavily cross-linked to prevent duplication of information. Doxygen is used to generate functional documentation and dependency graphs from the source code. The Twiki is divided into two parts: the WorkBook and the Software Guide. The WorkBook contains tutorial-type instructions on accessing computing resources and using the software to perform analysis within the CMS collaboration, and the Software Guide gives further details. This note describes the design principles, the basic functionality and the technical implementation of the CMSSW documentation.
DOI: 10.1007/s1010502cs116
2002
Higgs physics at the LHC
DOI: 10.48550/arxiv.2210.08973
2022
FAIR for AI: An interdisciplinary and international community building perspective
A foundational set of findable, accessible, interoperable, and reusable (FAIR) principles were proposed in 2016 as prerequisites for proper data management and stewardship, with the goal of enabling the reusability of scholarly data. The principles were also meant to apply to other digital assets, at a high level, and over time, the FAIR guiding principles have been re-interpreted or extended to include the software, tools, algorithms, and workflows that produce data. FAIR principles are now being adapted in the context of AI models and datasets. Here, we present the perspectives, vision, and experiences of researchers from different countries, disciplines, and backgrounds who are leading the definition and adoption of FAIR principles in their communities of practice, and discuss outcomes that may result from pursuing and incentivizing FAIR AI research. The material for this report builds on the FAIR for AI Workshop held at Argonne National Laboratory on June 7, 2022.
DOI: 10.1038/s41567-018-0382-7
2018
Publisher Correction: Open is not enough
In the version of this Perspective originally published, one of the authors’ names was incorrectly given as Kati Lassili-Perini; it should have been Kati Lassila-Perini. This has been corrected in all versions of the Perspective.
DOI: 10.1142/9789813271647_0008
2019
Early Experience with Open Data from CERN’s Large Hadron Collider
This chapter covers perspectives from the various partners who have worked on the release of large volumes of open research data from the Large Hadron Collider via the CERN Open Data Portal. The early experiences mentioned in the title refer to the launch of the Portal in November 2014 with the release of the first batch of high-level research data collected in 2010 by the CMS Collaboration. This chapter covers the motivation for releasing particle physics data openly as well as the challenges faced in doing so and solutions developed to facilitate these efforts. The authors also touch upon the use cases of the open datasets and the impact the first release has had. Caveat lector: The experiences and figures described in this chapter correspond to the year 2016 when this piece was originally written. The reader may want to consult Nature Physics 15 (2019) 113–119 to learn about later developments.
2019
Jet trigger prescale analyzer for CMS Open Data
DOI: 10.21468/scipost.report.1656
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.21468/scipost.report.1652
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.21468/scipost.report.1710
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.1109/nssmic.2005.1596421
2006
The CMS Object-Oriented Simulation
The CMS object-oriented Geant4-based program is used to simulate the complete central CMS detector (over 1 million geometrical volumes) and the forward systems such as the Totem telescopes, Castor calorimeter, zero-degree calorimeter, Roman pots, and the luminosity monitor. The simulation utilizes the full set of electromagnetic and hadronic physics processes provided by Geant4 and detailed particle tracking in the 4 tesla magnetic field. Electromagnetic shower parameterization can be used instead of full tracking of high-energy electrons and positrons, allowing significant gains in speed without detrimental precision losses. The simulation physics has been validated by comparisons with test beam data and previous simulation results. The system has been in production for almost two years and has delivered over 100 million events for various LHC physics channels. Productions are run on the US and EU grids at a rate of 3-5 million events per month. At the same time, the simulation has evolved to fulfill emerging requirements for new physics simulations, including very large heavy-ion events and a variety of SUSY scenarios. The software has also undergone major technical upgrades. The framework and core services have been ported to the new CMS offline software architecture and event data model. In parallel, the program is subjected to ever more stringent quality assurance procedures, including a recently commissioned automated physics validation suite.
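The speed gain from shower parameterization comes from depositing energy according to an analytic longitudinal profile instead of tracking every secondary particle. A toy sketch of the idea: the gamma-function form of the profile is standard, but the parameter values `a` and `b` below are invented for illustration (real parameterizations make them depend on the incident energy and the material):

```python
import math

def longitudinal_profile(t, a, b):
    """Gamma-distribution shower profile: fraction of energy deposited
    per radiation length at depth t, normalised to integrate to 1."""
    return b * (b * t) ** (a - 1) * math.exp(-b * t) / math.gamma(a)

def parameterized_deposits(e0, a=4.0, b=0.5, t_max=25.0, step=0.5):
    """Energy deposited per depth slice (midpoint rule), replacing
    full tracking of the electromagnetic cascade."""
    n = int(t_max / step)
    return [e0 * longitudinal_profile((i + 0.5) * step, a, b) * step
            for i in range(n)]

# Toy 100 GeV electron: one analytic evaluation per depth slice
deposits = parameterized_deposits(100.0)
```

Because the profile is evaluated once per depth slice rather than once per secondary particle, the cost is essentially independent of the shower energy, which is where the quoted speed gains come from.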
1992
CMS : letter of intent by the CMS Collaboration for a general purpose detector at LHC
DOI: 10.1016/s0168-9002(03)01880-1
2003
Annealing study of oxygenated and non-oxygenated float zone silicon irradiated with protons
Introducing oxygen into the silicon material is believed to improve the radiation hardness of silicon detectors. In this study, oxygenated and non-oxygenated silicon samples were processed and irradiated with 15 MeV protons. In order to speed up the defect reactions after the exposure to particle radiation, the samples were heat-treated at elevated temperatures. In this way, the long-term stability of silicon detectors in a hostile radiation environment could be estimated. Current–voltage measurements and the Surface Photovoltage (SPV) method were used to characterize the samples.
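For scale, the radiation-induced bulk leakage current probed by such current–voltage measurements is conventionally quantified through the current-related damage rate alpha, via Delta I = alpha * Phi_eq * V. A sketch with illustrative numbers (not taken from this paper; the sensor geometry and alpha value below are typical orders of magnitude, not measured results):

```python
def leakage_current_increase(alpha, fluence, volume_cm3):
    """Radiation-induced bulk leakage current: Delta I = alpha * Phi_eq * V."""
    return alpha * fluence * volume_cm3

# Illustrative numbers only: a 300 um thick, 1 cm^2 pad sensor
alpha = 4e-17       # A/cm, typical damage rate after a standard annealing step
fluence = 1e14      # 1 MeV neutron-equivalent particles per cm^2
volume = 0.03       # cm^3 (1 cm^2 x 300 um)
delta_i = leakage_current_increase(alpha, fluence, volume)  # amperes
```

Because alpha itself decreases with annealing time and temperature, heat-treating the samples at elevated temperatures, as done in the study, lets the long-term current evolution be measured on laboratory timescales.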
2021
arXiv : Unveiling Hidden Physics at the LHC
DOI: 10.48550/arxiv.2109.06065
2021
Unveiling Hidden Physics at the LHC
The field of particle physics is at the crossroads. The discovery of a Higgs-like boson completed the Standard Model (SM), but the lacking observation of convincing resonances Beyond the SM (BSM) offers no guidance for the future of particle physics. On the other hand, the motivation for New Physics has not diminished and is, in fact, reinforced by several striking anomalous results in many experiments. Here we summarise the status of the most significant anomalies, including the most recent results for the flavour anomalies, the multi-lepton anomalies at the LHC, the Higgs-like excess at around 96 GeV, and anomalies in neutrino physics, astrophysics, cosmology, and cosmic rays. While the LHC promises up to 4/ab of integrated luminosity and far-reaching physics programmes to unveil BSM physics, we consider the possibility that the latter could be tested with present data, but that systemic shortcomings of the experiments and their search strategies may preclude their discovery for several reasons, including: final states consisting in soft particles only, associated production processes, QCD-like final states, close-by SM resonances, and SUSY scenarios where no missing energy is produced. New search strategies could help to unveil the hidden BSM signatures, devised by making use of the CERN open data as a new testing ground. We discuss the CERN open data with its policies, challenges, and potential usefulness for the community. We showcase the example of the CMS collaboration, which is the only collaboration regularly releasing some of its data. We find it important to stress that the use of public data by individuals for their own research does not imply competition with experimental efforts, but rather provides unique opportunities to give guidance for further BSM searches by the collaborations. Wide access to open data is paramount to fully exploit the LHC's potential.
2002
Particle Detectors Manufactured by using the Multichamber processing Equipment
2002
Processing and recombination lifetime characterization of silicon microstrip detectors