
C. Lange

Here are all the papers by C. Lange that you can download and read on OA.mg.

DOI: 10.1140/epjc/s10052-022-10541-4
2022
Cited 27 times
Unveiling hidden physics at the LHC
Abstract The field of particle physics is at the crossroads. The discovery of a Higgs-like boson completed the Standard Model (SM), but the lack of convincing resonances Beyond the SM (BSM) offers no guidance for the future of particle physics. On the other hand, the motivation for New Physics has not diminished and is, in fact, reinforced by several striking anomalous results in many experiments. Here we summarise the status of the most significant anomalies, including the most recent results for the flavour anomalies, the multi-lepton anomalies at the LHC, the Higgs-like excess at around 96 GeV, and anomalies in neutrino physics, astrophysics, cosmology, and cosmic rays. While the LHC promises up to 4 $$\hbox{ab}^{-1}$$ of integrated luminosity and far-reaching physics programmes to unveil BSM physics, we consider the possibility that the latter could be tested with present data, but that systemic shortcomings of the experiments and their search strategies may preclude their discovery for several reasons, including: final states consisting of soft particles only, associated production processes, QCD-like final states, close-by SM resonances, and SUSY scenarios where no missing energy is produced. New search strategies, devised using the CERN open data as a new testing ground, could help to unveil the hidden BSM signatures. We discuss the CERN open data with its policies, challenges, and potential usefulness for the community. We showcase the example of the CMS collaboration, which is the only collaboration regularly releasing some of its data. We find it important to stress that the use of public data by individuals for their own research does not imply competition with experimental efforts, but rather provides unique opportunities to give guidance for further BSM searches by the collaborations. Wide access to open data is paramount to fully exploit the LHC's potential.
DOI: 10.21468/scipostphys.9.2.022
2020
Cited 47 times
Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.48550/arxiv.2404.02100
2024
Analysis Facilities White Paper
This white paper presents the current status of the R&D for Analysis Facilities (AFs) and attempts to summarize the views on the future direction of these facilities. These views have been collected through the High Energy Physics (HEP) Software Foundation's (HSF) Analysis Facilities forum, established in March 2022; the Analysis Ecosystems II workshop, which took place in May 2022; and the WLCG/HSF pre-CHEP workshop, which took place in May 2023. The paper attempts to cover all the aspects of an analysis facility.
DOI: 10.48550/arxiv.2203.10057
2022
Cited 4 times
Data and Analysis Preservation, Recasting, and Reinterpretation
We make the case for the systematic, reliable preservation of event-wise data, derived data products, and executable analysis code. This preservation enables the analyses' long-term future reuse, in order to maximise the scientific impact of publicly funded particle-physics experiments. We cover the needs of both the experimental and theoretical particle physics communities, and outline the goals and benefits that are uniquely enabled by analysis recasting and reinterpretation. We also discuss technical challenges and infrastructure needs, as well as sociological challenges and changes, and give summary recommendations to the particle-physics community.
DOI: 10.1051/epjconf/202125101004
2021
Cited 6 times
Using CMS Open Data in research – challenges and directions
The CMS experiment at CERN has released research-quality data from particle collisions at the LHC since 2014. Almost all data from the first LHC run in 2010–2012 with the corresponding simulated samples are now in the public domain, and several scientific studies have been performed using these data. This paper summarizes the available data and tools, reviews the challenges in using them in research, and discusses measures to improve their usability.
DOI: 10.3389/fphy.2022.897719
2022
Cited 3 times
Jets and Jet Substructure at Future Colliders
Even though jet substructure was not an original design consideration for the Large Hadron Collider (LHC) experiments, it has emerged as an essential tool for the current physics program. We examine the role of jet substructure on the motivation for and design of future energy frontier colliders. In particular, we discuss the need for a vibrant theory and experimental research and development program to extend jet substructure physics into the new regimes probed by future colliders. Jet substructure has organically evolved with a close connection between theorists and experimentalists and has catalyzed exciting innovations in both communities. We expect such developments will play an important role in the future energy frontier physics program.
DOI: 10.1093/rpd/nci682
2006
Cited 9 times
Application of advanced Monte Carlo Methods in numerical dosimetry
Many tasks in different sectors of dosimetry are very complex and highly sensitive to changes in the radiation field. Often, only the simulation of radiation transport is capable of describing the radiation field completely. Down to sub-cellular dimensions the energy deposition by cascades of secondary electrons is the main pathway for damage induction in matter. A large number of interactions take place until such electrons are slowed down to thermal energies. Also for some problems of photon transport a large number of photon histories need to be processed. Thus the efficient non-analogue Monte Carlo program, AMOS, has been developed for photon and electron transport. Various applications and benchmarks are presented showing its ability. For radiotherapy purposes the radiation field of a brachytherapy source is calculated according to the American Association of Physicists in Medicine Task Group Report 43 (AAPM/TG43). As additional examples, results for the detector efficiency of a high-purity germanium (HPGe) detector and a dose estimation for an X-ray shielding for radiation protection are shown.
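The AMOS code itself is not part of this summary. Purely as a hedged illustration of the analogue Monte Carlo transport technique such programs build on, the following short Python sketch tracks photons through a single homogeneous slab; the attenuation coefficient, absorption probability and slab thickness are assumed toy values, not taken from AMOS or from any real material data.

```python
# Minimal analogue Monte Carlo sketch of photon transport through a slab.
# Illustration of the general technique only; NOT AMOS. All material
# parameters below are assumed toy values.
import math
import random

MU_TOTAL = 0.2        # total attenuation coefficient [1/cm] (assumed)
P_ABSORB = 0.3        # probability that an interaction is absorption (assumed)
SLAB_THICKNESS = 5.0  # slab thickness [cm] (assumed)

def simulate_photon(rng):
    """Track one photon; return 'transmitted', 'absorbed' or 'reflected'."""
    x, mu = 0.0, 1.0  # position along slab axis and direction cosine
    while True:
        # Sample the distance to the next interaction from an exponential law.
        step = -math.log(1.0 - rng.random()) / MU_TOTAL
        x += mu * step
        if x >= SLAB_THICKNESS:
            return "transmitted"
        if x <= 0.0:
            return "reflected"
        if rng.random() < P_ABSORB:
            return "absorbed"
        # Isotropic scattering: sample a new direction cosine in [-1, 1].
        mu = 2.0 * rng.random() - 1.0

def run(n_photons=100_000, seed=42):
    rng = random.Random(seed)
    tally = {"transmitted": 0, "absorbed": 0, "reflected": 0}
    for _ in range(n_photons):
        tally[simulate_photon(rng)] += 1
    return {k: v / n_photons for k, v in tally.items()}

if __name__ == "__main__":
    print(run())
```

A non-analogue code such as AMOS replaces this kind of direct history sampling with weighted histories and variance-reduction techniques to gain efficiency, but the sampling of exponential path lengths and interaction types shown here is the common starting point.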
DOI: 10.1016/j.nima.2019.03.029
2019
Cited 4 times
Reconstruction of τ lepton pair invariant mass using an artificial neural network
The reconstruction of the invariant mass of τ lepton pairs is important for analyses containing Higgs and Z bosons decaying to τ+τ−, but highly challenging due to the neutrinos from the τ lepton decays, which cannot be measured in the detector. In this paper, we demonstrate how artificial neural networks can be used to reconstruct the mass of a di-τ system and compare this procedure to an algorithm used by the CMS Collaboration for this purpose. We find that the neural network output shows a smaller bias and better resolution of the di-τ mass reconstruction and an improved discrimination between a Higgs boson signal and the Drell–Yan background with a much shorter computation time.
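The network and inputs used in the paper are not reproduced here. As a minimal, self-contained sketch of the general idea (regressing a mass from kinematic inputs with a small feed-forward network), the example below uses toy features and a toy target in place of the real visible-decay-product and missing-momentum inputs; the architecture and all numbers are assumptions for illustration only.

```python
# Minimal sketch of mass regression with a neural network. Toy inputs, toy
# target and a small arbitrary architecture; NOT the analysis of the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000

# Toy "visible" kinematic features standing in for the visible tau decay
# products and the missing transverse momentum (assumed, illustrative only).
X = rng.normal(size=(n, 6))
# Toy target: a smooth nonlinear function of the features plus noise,
# standing in for the true di-tau mass.
y = (90.0 + 20.0 * np.tanh(X[:, 0] + 0.5 * X[:, 1])
     + 5.0 * X[:, 2] ** 2 + rng.normal(0.0, 2.0, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=200, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"bias = {np.mean(pred - y_test):.2f}, "
      f"resolution = {np.std(pred - y_test):.2f}")
```

In the actual analysis the relevant figures of merit are the bias and resolution of the reconstructed di-τ mass and the resulting signal/background discrimination; the toy metrics printed here only mirror that structure.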
DOI: 10.3389/fdata.2021.661501
2021
Cited 3 times
Scalable Declarative HEP Analysis Workflows for Containerised Compute Clouds
We describe a novel approach for experimental High-Energy Physics (HEP) data analyses that is centred around the declarative rather than imperative paradigm when describing analysis computational tasks. The analysis process can be structured in the form of a Directed Acyclic Graph (DAG), where each graph vertex represents a unit of computation with its inputs and outputs, and the graph edges describe the interconnection of various computational steps. We have developed REANA, a platform for reproducible data analyses, that supports several such DAG workflow specifications. The REANA platform parses the analysis workflow and dispatches its computational steps to various supported computing backends (Kubernetes, HTCondor, Slurm). The focus on declarative rather than imperative programming enables researchers to concentrate on the problem domain at hand without having to think about implementation details such as scalable job orchestration. The declarative programming approach is further exemplified by a multi-level job cascading paradigm that was implemented in the Yadage workflow specification language. We present two recent LHC particle physics analyses, ATLAS searches for dark matter and CMS jet energy correction pipelines, where the declarative approach was successfully applied. We argue that the declarative approach to data analyses, combined with recent advancements in container technology, facilitates the portability of computational data analyses to various compute backends, enhancing the reproducibility and the knowledge preservation behind particle physics data analyses.
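As a minimal illustration of the declarative idea (not the REANA, Yadage or CWL specification formats themselves), the sketch below describes an analysis as named steps with explicit dependencies and lets a topological sort determine the execution order; the step names and commands are invented placeholders.

```python
# Minimal sketch of a declarative analysis workflow: a DAG of named steps,
# each declaring what it needs; execution order follows from the graph.
# Illustration only; NOT the REANA/Yadage specification format.
import subprocess
from graphlib import TopologicalSorter  # Python 3.9+

# Each step declares its dependencies ("needs") and how to run ("cmd").
workflow = {
    "skim":   {"needs": [],                 "cmd": "echo skim raw data"},
    "select": {"needs": ["skim"],           "cmd": "echo apply selection"},
    "fit":    {"needs": ["select"],         "cmd": "echo run final fit"},
    "plots":  {"needs": ["select", "fit"],  "cmd": "echo make plots"},
}

def run(workflow):
    # Build the dependency graph and execute steps in topological order.
    graph = {name: set(step["needs"]) for name, step in workflow.items()}
    for name in TopologicalSorter(graph).static_order():
        print(f"running step: {name}")
        subprocess.run(workflow[name]["cmd"], shell=True, check=True)

if __name__ == "__main__":
    run(workflow)
```

A real workflow engine such as REANA additionally dispatches each step to a compute backend and tracks its inputs and outputs, but the separation of "what depends on what" from "how it runs" is the same.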
DOI: 10.48550/arxiv.1411.7279
2014
Beyond Standard Model Higgs
Recent LHC highlights of searches for Higgs bosons beyond the Standard Model are presented. The results by the ATLAS and CMS collaborations are based on 2011 and 2012 proton-proton collision data at centre-of-mass energies of 7 and 8 TeV, respectively. They test a wide range of theoretical models.
DOI: 10.5506/aphyspolbsupp.6.873
2013
QCD with Two Light Dynamical Chirally Improved Quarks
Results for the excited meson and baryon spectrum with two flavors of Chirally Improved sea quarks are presented. We simulate several ensembles with pion masses ranging from 250 to 600 MeV and extrapolate to the physical pion mass. Strange quarks are treated within the partially quenched approximation. Using the variational method, we investigate the content of the states. Among others, we discuss the flavor singlet/octet content of Lambda states. In general, our results compare well with experiment; in particular, we get very good agreement with the Λ(1405) and confirm its flavor singlet nature.
DOI: 10.17433/978-3-17-043821-7
2023
Das Feuerwehr-Lehrbuch
DOI: 10.48550/arxiv.2010.05102
2020
Software Sustainability & High Energy Physics
New facilities of the 2020s, such as the High Luminosity Large Hadron Collider (HL-LHC), will be relevant through at least the 2030s. This means that their software efforts and those that are used to analyze their data need to consider sustainability to enable their adaptability to new challenges, longevity, and efficiency, over at least this period. This will help ensure that this software will be easier to develop and maintain, that it remains available in the future on new platforms, that it meets new needs, and that it is as reusable as possible. This report discusses a virtual half-day workshop on "Software Sustainability and High Energy Physics" that aimed 1) to bring together experts from HEP as well as those from outside to share their experiences and practices, and 2) to articulate a vision that helps the Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP) to create a work plan to implement elements of software sustainability. Software sustainability practices could lead to new collaborations, including elements of HEP software being directly used outside the field, and, as has happened more frequently in recent years, to HEP developers contributing to software developed outside the field rather than reinventing it. A focus on and skills related to sustainable software will give HEP software developers an important skill that is essential to careers in the realm of software, inside or outside HEP. The report closes with recommendations to improve software sustainability in HEP, aimed at the HEP community via IRIS-HEP and the HEP Software Foundation (HSF).
DOI: 10.3389/fdata.2021.673163
2021
Increasing the Execution Speed of Containerized Analysis Workflows Using an Image Snapshotter in Combination With CVMFS
The past years have shown a revolution in the way scientific workloads are being executed thanks to the wide adoption of software containers. These containers run largely isolated from the host system, ensuring that the development and execution environments are the same everywhere. This enables full reproducibility of the workloads and therefore also of the associated scientific analyses. However, as the research software used becomes increasingly complex, the software images easily grow to sizes of multiple gigabytes. Downloading the full image onto every single compute node on which the containers are executed becomes impractical. In this paper, we describe a novel way of distributing software images on the Kubernetes platform, with which the container can start before the entire image contents become available locally (so-called “lazy pulling”). Each file required for the execution is fetched individually and subsequently cached on-demand using the CernVM file system (CVMFS), enabling the execution of very large software images on potentially thousands of Kubernetes nodes with very little overhead. We present several performance benchmarks making use of typical high-energy physics analysis workloads.
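The snapshotter and CVMFS integration described above are system-level components. Purely to illustrate the underlying cache-on-first-access principle, here is a small hypothetical Python sketch in which files are fetched only when first requested and then served from a local cache; the remote_fetch stand-in and the cache path are assumptions, not part of the paper.

```python
# Minimal sketch of "lazy pulling": fetch a file only on first access,
# then serve it from a local cache. Generic illustration only; NOT the
# CVMFS/containerd snapshotter implementation described in the paper.
import pathlib
import tempfile

CACHE_DIR = pathlib.Path(tempfile.gettempdir()) / "lazy-cache"

def remote_fetch(name: str) -> bytes:
    """Stand-in for fetching a file from a remote image/repository (assumed)."""
    return f"contents of {name}\n".encode()

def open_lazily(name: str) -> bytes:
    """Return file contents, fetching from the remote only on a cache miss."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    local = CACHE_DIR / name
    if not local.exists():                   # cache miss: fetch on demand
        local.write_bytes(remote_fetch(name))
    return local.read_bytes()                # cache hit: served locally

if __name__ == "__main__":
    print(open_lazily("libAnalysis.so").decode())  # fetched, then cached
    print(open_lazily("libAnalysis.so").decode())  # served from the cache
```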
DOI: 10.1007/s41781-021-00069-9
2021
Software Training in HEP
Abstract The long-term sustainability of the high-energy physics (HEP) research software ecosystem is essential to the field. With new facilities and upgrades coming online throughout the 2020s, this will only become increasingly important. Meeting the sustainability challenge requires a workforce with a combination of HEP domain knowledge and advanced software skills. The required software skills fall into three broad groups. The first is fundamental and generic software engineering (e.g., Unix, version control, C++, and continuous integration). The second is knowledge of domain-specific HEP packages and practices (e.g., the ROOT data format and analysis framework). The third is more advanced knowledge involving specialized techniques, including parallel programming, machine learning and data science tools, and techniques to maintain software projects at all scales. This paper discusses the collective software training program in HEP led by the HEP Software Foundation (HSF) and the Institute for Research and Innovation in Software in HEP (IRIS-HEP). The program equips participants with an array of software skills that serve as ingredients for the solution of HEP computing challenges. Beyond serving the community by ensuring that members are able to pursue research goals, the program serves individuals by providing intellectual capital and transferable skills important to careers in the realm of software and computing, inside or outside HEP.
DOI: 10.22323/1.398.0846
2022
Status and plans for the CMS High Granularity Calorimeter upgrade project
The CMS Collaboration is preparing to build replacement endcap calorimeters for the high-luminosity LHC (HL-LHC) era. The new high-granularity calorimeter (HGCAL) is, as the name implies, a highly granular sampling calorimeter with approximately six million silicon sensor channels (~1.1 cm² or 0.5 cm² cells) and about four hundred thousand channels of scintillator tiles read out with on-tile silicon photomultipliers. The calorimeter is designed to operate in the harsh radiation environment at the HL-LHC, where the average number of interactions per bunch crossing is expected to exceed 140. Besides measuring the energy and position of the energy deposits, the electronics is also designed to measure the time of the particles' arrival with a precision on the order of 50 ps. In this talk, the reasoning and ideas behind the HGCAL, the current status of the project, the many lessons learnt so far, in particular from beam tests, and the challenges ahead will be presented.
DOI: 10.48550/arxiv.2203.07462
2022
Jets and Jet Substructure at Future Colliders
Even though jet substructure was not an original design consideration for the Large Hadron Collider (LHC) experiments, it has emerged as an essential tool for the current physics program. We examine the role of jet substructure on the motivation for and design of future energy frontier colliders. In particular, we discuss the need for a vibrant theory and experimental research and development program to extend jet substructure physics into the new regimes probed by future colliders. Jet substructure has organically evolved with a close connection between theorists and experimentalists and has catalyzed exciting innovations in both communities. We expect such developments will play an important role in the future energy frontier physics program.
DOI: 10.22323/1.191.0284
2013
Measurement of V+heavy flavour production at ATLAS
Measurement of vector boson + heavy quark production is an important test of QCD, providing probes of higher order QCD processes and measurements of the heavy flavour content of the proton. Cross-sections are measured for Z+b and W+b production. For W+b they are also presented differentially as a function of jet multiplicity and transverse momentum of the leading b-jet. The results are compared to the QCD predictions at NLO.
2009
Triggering top quark events
DOI: 10.1051/epjconf/202024508014
2020
Open data provenance and reproducibility: a case study from publishing CMS open data
In this paper we present the latest CMS open data release published on the CERN Open Data portal. Samples of collision and simulated datasets were released together with detailed information about the data provenance. The associated data production chains cover the necessary computing environments, the configuration files and the computational procedures used in each data production step. We describe data curation techniques used to obtain and publish the data provenance information and we study the possibility of reproducing parts of the released data using the publicly available information. The present work demonstrates the usefulness of releasing selected samples of raw and primary data in order to fully ensure the completeness of information about the data production chain for the attention of general data scientists and other non-specialists interested in using particle physics data for education or research purposes.
DOI: 10.22323/1.254.0002
2015
CMS inner detector: the Run 1 to Run 2 transition and first experience of Run 2
DOI: 10.1055/s-0036-1581284
2016
Reduzierung von Artefakten für Stammzellmonitoring in Magnetic Particle Imaging
Improving suitable biomedical imaging methods for stem cell tracking and SPION (superparamagnetic iron oxide nanoparticles) monitoring is of fundamental interest for a better understanding of the migration and distribution of cells and functionalised SPIONs. Three different calibration methods in Magnetic Particle Imaging (MPI) were investigated in order to decisively improve the image quality.
2016
High-$p_T$ multi-jet final states at ATLAS and CMS
The increase of the centre-of-mass energy of the Large Hadron Collider (LHC) to 13 TeV has opened up a new energy regime. Final states with high-momentum multi-jet signatures are often dominant for beyond-standard-model phenomena, in particular among the decay products of new heavy particles. While the potential di-photon resonance currently receives a lot of attention, multi-jet final states pose strong constraints on which physics model an observation could actually be described with. In this presentation, the latest results of the ATLAS and CMS collaborations in high transverse momentum multi-jet final states are summarised. This includes searches for heavy resonances and new phenomena in the di-jet mass spectrum, di-jet angular distributions, and the sum of transverse momenta in different event topologies. Furthermore, results on leptoquark pair production will be shown. A particular focus is laid on the different background estimation methods.
DOI: 10.48550/arxiv.1606.08283
2016
High-$p_T$ multi-jet final states at ATLAS and CMS
The increase of the centre-of-mass energy of the Large Hadron Collider (LHC) to 13 TeV has opened up a new energy regime. Final states with high-momentum multi-jet signatures are often dominant for beyond-standard-model phenomena, in particular among the decay products of new heavy particles. While the potential di-photon resonance currently receives a lot of attention, multi-jet final states pose strong constraints on which physics model an observation could actually be described with. In this presentation, the latest results of the ATLAS and CMS collaborations in high transverse momentum multi-jet final states are summarised. This includes searches for heavy resonances and new phenomena in the di-jet mass spectrum, di-jet angular distributions, and the sum of transverse momenta in different event topologies. Furthermore, results on leptoquark pair production will be shown. A particular focus is laid on the different background estimation methods.
DOI: 10.18452/16786
2013
A novel approach to precision measurements of the top quark-antiquark pair production cross section with the ATLAS experiment
This doctoral thesis presents three measurements of the top quark-antiquark pair production cross section in proton-proton collisions at a centre-of-mass energy of 7 TeV, recorded in 2010 and 2011 with the ATLAS Experiment at the Large Hadron Collider. Events are selected in the single-lepton topology by requiring an electron or muon, large missing transverse momentum and at least three jets. While one analysis relies on kinematic information only to discriminate the top quark-antiquark pair signal from the background processes, the other two also make use of b-tagging information. With the help of multivariate methods, the most precise measurements in this topology are obtained. For two of the measurements this is possible in particular due to the use of a profile likelihood method, which is studied in detail. For the first time, a fiducial inclusive cross section measurement for top quark events is performed, allowing a measurement almost independent of theoretical uncertainties. All measurements are in agreement with theory predictions performed in perturbation theory at approximate NNLO.
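The fits performed in the thesis are not reproduced here. As a hedged illustration of the profile likelihood method it refers to, the sketch below scans the profile likelihood ratio for a single-bin counting experiment with one background nuisance parameter; all event counts and uncertainties are assumed toy values.

```python
# Minimal sketch of a profile-likelihood scan for a single-bin counting
# experiment (signal strength mu, one background nuisance parameter b).
# Toy numbers only; NOT the analysis code of the thesis.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm, poisson

n_obs = 120      # observed events (toy)
s = 25.0         # expected signal yield for mu = 1 (toy)
b_nom = 100.0    # nominal background estimate (toy)
sigma_b = 10.0   # uncertainty on the background estimate (toy)

def nll(mu, b):
    """Negative log-likelihood: Poisson count times Gaussian constraint on b."""
    return -(poisson.logpmf(n_obs, mu * s + b) + norm.logpdf(b, b_nom, sigma_b))

def profiled_nll(mu):
    """Minimise the NLL over the nuisance parameter b for fixed mu."""
    res = minimize_scalar(lambda b: nll(mu, b), bounds=(1e-3, 300.0),
                          method="bounded")
    return res.fun

# Scan -2 ln lambda(mu) relative to the global minimum.
mus = np.linspace(0.0, 2.0, 41)
scan = np.array([profiled_nll(mu) for mu in mus])
q = 2.0 * (scan - scan.min())
print(f"best-fit mu ~ {mus[np.argmin(q)]:.2f}")
```

The width of the resulting -2 ln λ(μ) curve around its minimum reflects the statistical uncertainty combined with the profiled background uncertainty, which is how the method folds systematic effects into the cross-section measurement.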
DOI: 10.1109/nssmic.2011.6154404
2011
Operational experience with the ATLAS Pixel Detector
The ATLAS Pixel Detector is the innermost part of the ATLAS Detector which is operated at the Large Hadron Collider (LHC) at CERN. It is a three-hit tracking system that provides high-resolution measurements of charged particle tracks and secondary vertices that are essential to the physics goals of ATLAS. The detector consists of 1744 silicon sensors totalling approximately 80 million electronic channels. The system operated smoothly with very high efficiency. In this paper operational experience with the Pixel Detector will be presented, including monitoring and detector calibration procedures. In addition, the current status of the Pixel Detector and its response to LHC high energy proton-proton collisions are presented.
2013
Measurement of $W/Z$ boson + heavy quark production at ATLAS
DOI: 10.22323/1.134.0468
2012
Measurement of the top quark cross-section in the single-lepton channel in pp collisions at $\sqrt{s}$ = 7 TeV using kinematic fits and b-tagging information
2011
Measurement of the top quark cross-section in the single-lepton channel in pp collisions at $\sqrt{s}$ = 7 TeV using kinematic fits and b-tagging information
DOI: 10.4016/40700.01
2012
2995 Leucine-rich Alpha-2 Glycoprotein 1 (Lrg1) Contributes To The Development Of Ocular Neovascularisation
DOI: 10.22323/1.314.0302
2017
Search for diboson resonances decaying into W, Z and H bosons at CMS
Beyond-the-standard-model theories such as extra dimensions and composite Higgs scenarios predict the existence of very heavy resonances compatible with a spin-0 (radion), spin-1 (W', Z') or spin-2 (graviton) particle, with large branching fractions to pairs of standard model bosons and negligible branching fractions to light fermions. We present an overview of searches for new physics containing W, Z or H bosons in the final state, using proton-proton collision data collected with the CMS detector at the CERN LHC. Many results use novel analysis techniques to identify and reconstruct highly boosted final states that are created in these topologies. These techniques provide increased sensitivity to new high-mass particles over traditional search methods.
DOI: 10.1109/nssmic.2017.8532605
2017
Offline Reconstruction Algorithms for the CMS High Granularity Calorimeter for HL-LHC
The upgraded High Luminosity LHC, after the third Long Shutdown (LS3), will provide an instantaneous luminosity of 7.5 × 10³⁴ cm⁻² s⁻¹ (levelled), at the price of extreme pileup of up to 200 interactions per crossing. Such extreme pileup poses significant challenges, in particular for forward calorimetry. As part of its HL-LHC upgrade program, the CMS collaboration is designing a High Granularity Calorimeter to replace the existing endcap calorimeters. It features unprecedented transverse and longitudinal segmentation for both electromagnetic and hadronic compartments. The electromagnetic and a large fraction of the hadronic portions will be based on hexagonal silicon sensors of 0.5–1 cm² cell size, with the remainder of the hadronic portion based on highly segmented scintillators with SiPM readout. Offline clustering algorithms that make use of this extreme granularity require novel approaches to preserve the fine structure of showers and to be stable against pileup, while supporting the particle flow approach by enhancing pileup rejection and particle identification. We discuss the principle and performance of a set of clustering algorithms for the HGCAL based on techniques borrowed from machine learning and computer vision. These algorithms lend themselves particularly well to being deployed on GPUs. The features of the algorithm, as well as an analysis of the CPU requirements in the presence of large pileup, are discussed.
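The CMS reconstruction algorithms themselves are not reproduced here. Purely as a generic, hedged illustration of clustering calorimeter-like hits with an off-the-shelf machine-learning tool, the example below runs DBSCAN (a density-based method, chosen only for illustration) over toy two-dimensional hit positions; the shower shapes, noise level and clustering parameters are all invented.

```python
# Minimal sketch of density-based clustering of calorimeter-like hits.
# Generic DBSCAN on toy 2D hit positions; NOT the production HGCAL algorithm.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)

# Toy hits: two "showers" (dense blobs) plus uniform pile-up-like noise.
shower_a = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(200, 2))
shower_b = rng.normal(loc=(5.0, 3.0), scale=0.5, size=(150, 2))
noise = rng.uniform(low=-5.0, high=10.0, size=(100, 2))
hits = np.vstack([shower_a, shower_b, noise])

# eps (neighbourhood radius) and min_samples are assumed toy parameters.
labels = DBSCAN(eps=0.6, min_samples=10).fit_predict(hits)

n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
n_noise = int(np.sum(labels == -1))
print(f"found {n_clusters} clusters, {n_noise} hits flagged as noise")
```

The actual reconstruction additionally exploits the layer structure and per-cell energies of the detector and is designed to run efficiently on GPUs, which a toy like this does not attempt to show.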
DOI: 10.48550/arxiv.2209.08054
2022
Reinterpretation and Long-Term Preservation of Data and Code
Careful preservation of experimental data, simulations, analysis products, and theoretical work maximizes their long-term scientific return on investment by enabling new analyses and reinterpretation of the results in the future. Key infrastructure and technical developments needed for some high-value science targets are not in scope for the operations program of the large experiments and are often not effectively funded. Increasingly, the science goals of our projects require contributions that span the boundaries between individual experiments and surveys, and between the theoretical and experimental communities. Furthermore, the computational requirements and technical sophistication of this work is increasing. As a result, it is imperative that the funding agencies create programs that can devote significant resources to these efforts outside of the context of the operations of individual major experiments, including smaller experiments and theory/simulation work. In this Snowmass 2021 Computational Frontier topical group report (CompF7: Reinterpretation and long-term preservation of data and code), we summarize the current state of the field and make recommendations for the future.
DOI: 10.21468/scipost.report.1656
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.21468/scipost.report.1652
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
DOI: 10.21468/scipost.report.1710
2020
Report on 2003.07868v2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
2021
Software Training in HEP
Long-term sustainability of the high energy physics (HEP) research software ecosystem is essential for the field. With upgrades and new facilities coming online throughout the 2020s, this will only become increasingly relevant throughout this decade. Meeting this sustainability challenge requires a workforce with a combination of HEP domain knowledge and advanced software skills. The required software skills fall into three broad groups. The first is fundamental and generic software engineering (e.g., Unix, version control, C++, continuous integration). The second is knowledge of domain-specific HEP packages and practices (e.g., the ROOT data format and analysis framework). The third is more advanced knowledge involving more specialized techniques, including parallel programming, machine learning and data science tools, and techniques to preserve software projects at all scales. This paper discusses the collective software training program in HEP and its activities led by the HEP Software Foundation (HSF) and the Institute for Research and Innovation in Software in HEP (IRIS-HEP). The program equips participants with an array of software skills that serve as ingredients from which solutions to the computing challenges of HEP can be formed. Beyond serving the community by ensuring that members are able to pursue research goals, this program serves individuals by providing intellectual capital and transferable skills that are becoming increasingly important to careers in the realm of software and computing, whether inside or outside HEP.
2021
arXiv: Unveiling Hidden Physics at the LHC
DOI: 10.48550/arxiv.2109.06065
2021
Unveiling Hidden Physics at the LHC
The field of particle physics is at the crossroads. The discovery of a Higgs-like boson completed the Standard Model (SM), but the lack of convincing resonances Beyond the SM (BSM) offers no guidance for the future of particle physics. On the other hand, the motivation for New Physics has not diminished and is, in fact, reinforced by several striking anomalous results in many experiments. Here we summarise the status of the most significant anomalies, including the most recent results for the flavour anomalies, the multi-lepton anomalies at the LHC, the Higgs-like excess at around 96 GeV, and anomalies in neutrino physics, astrophysics, cosmology, and cosmic rays. While the LHC promises up to 4/ab of integrated luminosity and far-reaching physics programmes to unveil BSM physics, we consider the possibility that the latter could be tested with present data, but that systemic shortcomings of the experiments and their search strategies may preclude their discovery for several reasons, including: final states consisting of soft particles only, associated production processes, QCD-like final states, close-by SM resonances, and SUSY scenarios where no missing energy is produced. New search strategies, devised using the CERN open data as a new testing ground, could help to unveil the hidden BSM signatures. We discuss the CERN open data with its policies, challenges, and potential usefulness for the community. We showcase the example of the CMS collaboration, which is the only collaboration regularly releasing some of its data. We find it important to stress that the use of public data by individuals for their own research does not imply competition with experimental efforts, but rather provides unique opportunities to give guidance for further BSM searches by the collaborations. Wide access to open data is paramount to fully exploit the LHC's potential.
DOI: 10.17433/978-3-17-040622-3
2021
Das Feuerwehr-Lehrbuch