
Lukasz Kreczko

Here are all the papers by Lukasz Kreczko that you can download and read on OA.mg.

DOI: 10.1140/epjc/s10052-011-1661-y
2011
Cited 292 times
Boosted objects: a probe of beyond the standard model physics
We present the report of the hadronic working group of the BOOST2010 workshop held at the University of Oxford in June 2010. The first part contains a review of the potential of hadronic decays of highly boosted particles as an aid for discovery at the LHC and a discussion of the status of tools developed to meet the challenge of reconstructing and isolating these topologies. In the second part, we present new results comparing the performance of jet grooming techniques and top tagging algorithms on a common set of benchmark channels. We also study the sensitivity of jet substructure observables to the uncertainties in Monte Carlo predictions.
DOI: 10.1103/physrevd.101.052002
2020
Cited 171 times
Projected WIMP sensitivity of the LUX-ZEPLIN dark matter experiment
LUX-ZEPLIN (LZ) is a next generation dark matter direct detection experiment that will operate 4850 feet underground at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA. Using a two-phase xenon detector with an active mass of 7 tonnes, LZ will search primarily for low-energy interactions with Weakly Interacting Massive Particles (WIMPs), which are hypothesized to make up the dark matter in our galactic halo. In this paper, the projected WIMP sensitivity of LZ is presented based on the latest background estimates and simulations of the detector. For a 1000 live day run using a 5.6 tonne fiducial mass, LZ is projected to exclude at 90% confidence level spin-independent WIMP-nucleon cross sections above $1.4 \times 10^{-48}$ cm$^{2}$ for a 40 $\mathrm{GeV}/c^{2}$ mass WIMP. Additionally, a $5\sigma$ discovery potential is projected, reaching cross sections below the exclusion limits of recent experiments. For spin-dependent WIMP-neutron (-proton) scattering, a sensitivity of $2.3 \times 10^{-43}$ cm$^{2}$ ($7.1 \times 10^{-42}$ cm$^{2}$) for a 40 $\mathrm{GeV}/c^{2}$ mass WIMP is expected. With underground installation well underway, LZ is on track for commissioning at SURF in 2020.
DOI: 10.1103/physrevd.108.072006
2023
Cited 5 times
Search for new physics in low-energy electron recoils from the first LZ exposure
The LUX-ZEPLIN (LZ) experiment is a dark matter detector centered on a dual-phase xenon time projection chamber. We report searches for new physics appearing through few-keV-scale electron recoils, using the experiment's first exposure of 60 live days and a fiducial mass of 5.5 t. The data are found to be consistent with a background-only hypothesis, and limits are set on models for new physics including solar axion electron coupling, solar neutrino magnetic moment and millicharge, and electron couplings to galactic axionlike particles and hidden photons. Similar limits are set on weakly interacting massive particle (WIMP) dark matter producing signals through ionized atomic states from the Migdal effect.
DOI: 10.1103/physrevc.102.014602
2020
Cited 25 times
Projected sensitivity of the LUX-ZEPLIN experiment to the $0\nu\beta\beta$ decay of $^{136}$Xe
The LUX-ZEPLIN (LZ) experiment will enable a neutrinoless double β decay search in parallel to the main science goal of discovering dark matter particle interactions. We report the expected LZ sensitivity to 136Xe neutrinoless double β decay, taking advantage of the significant (>600 kg) 136Xe mass contained within the active volume of LZ without isotopic enrichment. After 1000 live-days, the median exclusion sensitivity to the half-life of 136Xe is projected to be 1.06×10²⁶ years (90% confidence level), similar to existing constraints. We also report the expected sensitivity of a possible subsequent dedicated exposure using 90% enrichment with 136Xe at 1.06×10²⁷ years.
DOI: 10.1103/physrevd.104.092009
2021
Cited 19 times
Projected sensitivities of the LUX-ZEPLIN experiment to new physics via low-energy electron recoils
LUX-ZEPLIN is a dark matter detector expected to obtain world-leading sensitivity to weakly interacting massive particles interacting via nuclear recoils with a ∼7-tonne xenon target mass. This paper presents sensitivity projections to several low-energy signals of the complementary electron recoil signal type: 1) an effective neutrino magnetic moment, and 2) an effective neutrino millicharge, both for pp-chain solar neutrinos, 3) an axion flux generated by the Sun, 4) axionlike particles forming the Galactic dark matter, 5) hidden photons, 6) mirror dark matter, and 7) leptophilic dark matter. World-leading sensitivities are expected in each case, a result of the large 5.6 t 1000 d exposure and low expected rate of electron-recoil backgrounds in the < 100 keV energy regime. A consistent signal generation, background model and profile-likelihood analysis framework is used throughout.
DOI: 10.1016/j.astropartphys.2020.102480
2021
Cited 18 times
Simulations of events for the LUX-ZEPLIN (LZ) dark matter experiment
The LUX-ZEPLIN dark matter search aims to achieve a sensitivity to the WIMP-nucleon spin-independent cross-section down to (1–2)$\times10^{-12}$ pb at a WIMP mass of 40 GeV/$c^2$. This paper describes the simulations framework that, along with radioactivity measurements, was used to support this projection, and also to provide mock data for validating reconstruction and analysis software. Of particular note are the event generators, which allow us to model the background radiation, and the detector response physics used in the production of raw signals, which can be converted into digitized waveforms similar to data from the operational detector. Inclusion of the detector response allows us to process simulated data using the same analysis routines as developed to process the experimental data.
DOI: 10.1016/j.astropartphys.2019.102391
2020
Cited 14 times
Measurement of the gamma ray background in the Davis cavern at the Sanford Underground Research Facility
Deep underground environments are ideal for low background searches due to the attenuation of cosmic rays by passage through the earth. However, they are affected by backgrounds from $\gamma$-rays emitted by $^{40}$K and the $^{238}$U and $^{232}$Th decay chains in the surrounding rock. The LUX-ZEPLIN (LZ) experiment will search for dark matter particle interactions with a liquid xenon TPC located within the Davis campus at the Sanford Underground Research Facility, Lead, South Dakota, at the 4,850-foot level. In order to characterise the cavern background, in-situ $\gamma$-ray measurements were taken with a sodium iodide detector in various locations and with lead shielding. The integral count rates (0–3300 keV) varied from 596 Hz to 1355 Hz for unshielded measurements, corresponding to a total flux in the cavern of $1.9\pm0.4$ $\gamma$ cm$^{-2}$s$^{-1}$. The resulting activity in the walls of the cavern can be characterised as $220\pm60$ Bq/kg of $^{40}$K, $29\pm15$ Bq/kg of $^{238}$U, and $13\pm3$ Bq/kg of $^{232}$Th.
DOI: 10.48550/arxiv.2402.08865
2024
New constraints on ultraheavy dark matter from the LZ experiment
Searches for dark matter with liquid xenon time projection chamber experiments have traditionally focused on the region of the parameter space that is characteristic of weakly interacting massive particles, ranging from a few GeV/$c^2$ to a few TeV/$c^2$. Models of dark matter with a mass much heavier than this are well motivated by early production mechanisms different from the standard thermal freeze-out, but they have generally been less explored experimentally. In this work, we present a re-analysis of the first science run (SR1) of the LZ experiment, with an exposure of $0.9$ tonne$\times$year, to search for ultraheavy particle dark matter. The signal topology consists of multiple energy deposits in the active region of the detector forming a straight line, from which the velocity of the incoming particle can be reconstructed on an event-by-event basis. Zero events with this topology were observed after applying the data selection calibrated on a simulated sample of signal-like events. New experimental constraints are derived, which rule out previously unexplored regions of the dark matter parameter space of spin-independent interactions beyond a mass of 10$^{17}$ GeV/$c^2$.
DOI: 10.48550/arxiv.2404.02100
2024
Analysis Facilities White Paper
This white paper presents the current status of the R&D for Analysis Facilities (AFs) and attempts to summarize the views on the future direction of these facilities. These views have been collected through the High Energy Physics (HEP) Software Foundation's (HSF) Analysis Facilities forum, established in March 2022, the Analysis Ecosystems II workshop, which took place in May 2022, and the WLCG/HSF pre-CHEP workshop, which took place in May 2023. The paper attempts to cover all aspects of an analysis facility.
DOI: 10.1103/physrevd.105.082004
2022
Cited 6 times
Cosmogenic production of $^{37}$Ar in the context of the LUX-ZEPLIN experiment
We estimate the amount of $^{37}$Ar produced in natural xenon via cosmic ray-induced spallation, an inevitable consequence of the transportation and storage of xenon on the Earth's surface. We then calculate the resulting $^{37}$Ar concentration in a 10-tonne payload (similar to that of the LUX-ZEPLIN experiment) assuming a representative schedule of xenon purification, storage and delivery to the underground facility. Using the spallation model by Silberberg and Tsao, the sea level production rate of $^{37}$Ar in natural xenon is estimated to be 0.024 atoms/kg/day. Assuming the xenon is successively purified to remove radioactive contaminants in 1-tonne batches at a rate of 1 tonne/month, the average $^{37}$Ar activity after 10 tonnes are purified and transported underground is 0.058–0.090 $\mu$Bq/kg, depending on the degree of argon removal during above-ground purification. Such cosmogenic $^{37}$Ar will appear as a noticeable background in the early science data, while decaying with a 35-day half-life. This newly noticed production mechanism of $^{37}$Ar should be considered when planning for future liquid xenon-based experiments.
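The decay timeline quoted above lends itself to a quick back-of-the-envelope check. Below is a minimal sketch assuming only the 35-day half-life and the upper activity estimate of 0.090 μBq/kg from the abstract; the function name is illustrative, not from the paper:

```python
import math

def ar37_activity(a0_uBq_per_kg: float, days: float, half_life_days: float = 35.0) -> float:
    """Residual 37Ar specific activity after `days` of underground decay."""
    return a0_uBq_per_kg * math.exp(-math.log(2) * days / half_life_days)

# Three half-lives (105 days) underground reduce the initial activity
# by a factor of 8, e.g. from 0.090 to about 0.011 uBq/kg.
print(ar37_activity(0.090, 105.0))
```

This simple exponential is why the abstract can describe cosmogenic 37Ar as a background only in the early science data.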
DOI: 10.1103/physrevc.104.065501
2021
Cited 6 times
Projected sensitivity of the LUX-ZEPLIN experiment to the two-neutrino and neutrinoless double $\beta$ decays of $^{134}$Xe
The projected sensitivity of the LUX-ZEPLIN (LZ) experiment to two-neutrino and neutrinoless double β decay of 134Xe is presented. LZ is a 10-tonne xenon time-projection chamber optimized for the detection of dark matter particles and is expected to start operating in 2021 at the Sanford Underground Research Facility, USA. Its large mass of natural xenon provides an exceptional opportunity to search for the double β decay of 134Xe, for which xenon detectors enriched in 136Xe are less effective. For the two-neutrino decay mode, LZ is predicted to exclude values of the half-life up to 1.7×10²⁴ years at 90% confidence level (CL) and has a three-sigma observation potential of 8.7×10²³ years, approaching the predictions of nuclear models. For the neutrinoless decay mode, LZ is projected to exclude values of the half-life up to 7.3×10²⁴ years at 90% CL.
DOI: 10.1109/rtc.2016.7543077
2016
Cited 3 times
SWATCH: Common software for controlling and monitoring the upgraded level-1 trigger of the CMS experiment
The Large Hadron Collider at CERN restarted in 2015 with a higher centre-of-mass energy of 13 TeV. The instantaneous luminosity is expected to increase significantly in the coming years. An upgraded Level-1 trigger system has been deployed in the Compact Muon Solenoid experiment, in order to maintain the same efficiencies for searches and precision measurements as those achieved in the previous run. This system consists of the order of 100 electronics boards connected by around 3000 optical links, which must be controlled and monitored coherently through software, with high operational efficiency. In this paper, we present the design of the software framework that is used to control and monitor the upgraded Level-1 trigger system, and experiences from using this software to commission the upgraded system.
2021
Cited 3 times
Projected sensitivities of the LUX-ZEPLIN (LZ) experiment to new physics via low-energy electron recoils
LUX-ZEPLIN (LZ) is a dark matter detector expected to obtain world-leading sensitivity to weakly interacting massive particles (WIMPs) interacting via nuclear recoils with a ~7-tonne xenon target mass. This manuscript presents sensitivity projections to several low-energy signals of the complementary electron recoil signal type: 1) an effective neutrino magnetic moment and 2) an effective neutrino millicharge, both for pp-chain solar neutrinos, 3) an axion flux generated by the Sun, 4) axion-like particles forming the galactic dark matter, 5) hidden photons, 6) mirror dark matter, and 7) leptophilic dark matter. World-leading sensitivities are expected in each case, a result of the large 5.6t 1000d exposure and low expected rate of electron recoil backgrounds in the <100keV energy regime. A consistent signal generation, background model and profile-likelihood analysis framework is used throughout.
DOI: 10.48550/arxiv.2312.02030
2023
First Constraints on WIMP-Nucleon Effective Field Theory Couplings in an Extended Energy Region From LUX-ZEPLIN
Following the first science results of the LUX-ZEPLIN (LZ) experiment, a dual-phase xenon time projection chamber operating from the Sanford Underground Research Facility in Lead, South Dakota, USA, we report the initial limits on a model-independent non-relativistic effective field theory describing the complete set of possible interactions of a weakly interacting massive particle (WIMP) with a nucleon. These results utilize the same 5.5 t fiducial mass and 60 live days of exposure collected for the LZ spin-independent and spin-dependent analyses while extending the upper limit of the energy region of interest by a factor of 7.5 to 270 keVnr. No significant excess in this high energy region is observed. Using a profile-likelihood ratio analysis, we report 90% confidence level exclusion limits on the coupling of each individual non-relativistic WIMP-nucleon operator for both elastic and inelastic interactions in the isoscalar and isovector bases.
DOI: 10.2172/1436702
2018
HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation
At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.
DOI: 10.5281/zenodo.18897
2015
rootpy: 0.8.0
DOI: 10.1051/epjconf/201921406035
2019
Pandas DataFrames for a FAST binned analysis at CMS
Binned data frames are a generalisation of multi-dimensional histograms, represented in a tabular format with one category per row containing the labels, bin contents, uncertainties and so on. Pandas is an industry-standard tool, which provides a data frame implementation complete with routines for data frame manipulation, persistency, visualisation, and easy access to “big data” scientific libraries and machine learning tools. FAST (the Faster Analysis Software Taskforce) has developed a generic approach for typical binned HEP analyses, driving the summary of ROOT Trees to multiple binned DataFrames with a YAML-based analysis description. Using Continuous Integration to run subsets of the analysis, we can monitor and test changes to the analysis itself, and deploy documentation automatically. This report describes this approach using examples from a public CMS tutorial and details the benefits over traditional methods.
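As a hedged illustration of the binned-data-frame idea described above (this is plain Pandas, not the actual FAST package API; the column names and toy data are invented), a multi-dimensional histogram can be held as a table with one row per (label, bin) pair:

```python
import numpy as np
import pandas as pd

# Toy "tree": event-level data with a region label and one observable.
rng = np.random.default_rng(42)
events = pd.DataFrame({
    "region": rng.choice(["signal", "control"], size=1000),
    "mass": rng.normal(91.0, 5.0, size=1000),
})

# Bin the observable, then summarise per (region, bin):
# bin contents plus a Poisson uncertainty, one category per row.
events["mass_bin"] = pd.cut(events["mass"], bins=np.linspace(75, 105, 7))
binned = (
    events.groupby(["region", "mass_bin"], observed=True)
    .size()
    .rename("n")
    .reset_index()
)
binned["err"] = np.sqrt(binned["n"])
print(binned)
```

The resulting frame can be persisted, re-binned, or passed straight to plotting and fitting code, which is the workflow advantage the abstract describes.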
DOI: 10.48550/arxiv.2101.08753
2021
Enhancing the sensitivity of the LUX-ZEPLIN (LZ) dark matter experiment to low energy signals
Two-phase xenon detectors, such as that at the core of the forthcoming LZ dark matter experiment, use photomultiplier tubes to sense the primary (S1) and secondary (S2) scintillation signals resulting from particle interactions in their liquid xenon target. This paper describes a simulation study exploring two techniques to lower the energy threshold of LZ to gain sensitivity to low-mass dark matter and astrophysical neutrinos, which will be applicable to other liquid xenon detectors. The energy threshold is determined by the number of detected S1 photons; typically, these must be recorded in three or more photomultiplier channels to avoid dark count coincidences that mimic real signals. To lower this threshold: a) we take advantage of the double photoelectron emission effect, whereby a single vacuum ultraviolet photon has a $\sim20\%$ probability of ejecting two photoelectrons from a photomultiplier tube photocathode; and b) we drop the requirement of an S1 signal altogether, and use only the ionization signal, which can be detected more efficiently. For both techniques we develop signal and background models for the nominal exposure, and explore accompanying systematic effects, including the dependence on the free electron lifetime in the liquid xenon. When incorporating double photoelectron signals, we predict a factor of $\sim 4$ sensitivity improvement to the dark matter-nucleon scattering cross-section at $2.5$ GeV/c$^2$, and a factor of $\sim1.6$ increase in the solar $^8$B neutrino detection rate. Dropping the S1 requirement may allow sensitivity gains of two orders of magnitude in both cases. Finally, we apply these techniques to even lower masses by taking into account the atomic Migdal effect; this could lower the dark matter particle mass threshold to $80$ MeV/c$^2$.
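The threshold gain from double photoelectron emission can be sketched with a toy binomial estimate. The sketch below is a deliberate simplification of the analysis in the abstract (it counts photoelectrons rather than distinct photomultiplier channels); the ~20% probability is taken from the abstract, and the function name is invented:

```python
from math import comb

def prob_at_least_3_pe(n_photons: int, p_dpe: float = 0.20) -> float:
    """Probability that n detected VUV photons yield >= 3 photoelectrons,
    when each photon independently ejects a second photoelectron with
    probability p_dpe (double photoelectron emission)."""
    needed = max(0, 3 - n_photons)  # extra (double) emissions required
    if needed == 0:
        return 1.0
    # Binomial probability of at least `needed` double emissions.
    return sum(
        comb(n_photons, k) * p_dpe**k * (1 - p_dpe) ** (n_photons - k)
        for k in range(needed, n_photons + 1)
    )

# Two detected photons never pass a 3-photoelectron threshold without DPE,
# but do so with probability 1 - 0.8**2 = 0.36 once DPE is allowed for.
print(prob_at_least_3_pe(2))
```

This is the qualitative mechanism behind the sensitivity improvement at low masses: events with too few photons for a conventional 3-fold coincidence can still be recovered.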
DOI: 10.5281/zenodo.18815
2015
rootpy: 0.7.0
DOI: 10.5281/zenodo.32493
2015
NTupleProduction: Release for 2015 50ns data
DOI: 10.5281/zenodo.18005
2015
DailyPythonScripts: Run2 version before merging with Run1
DOI: 10.5281/zenodo.31439
2015
NTupleProduction: For full 50ns data set
DOI: 10.5281/zenodo.17828
2015
DailyPythonScripts: DailyPythonScript version for TOP-12-042 paper
DOI: 10.5281/zenodo.13818
2015
AnalysisSoftware: Release used for AN-14-071 V0.4
DOI: 10.5281/zenodo.13792
2015
DailyPythonScripts: Release used for AN-14-071 V0.4
DOI: 10.5281/zenodo.32491
2015
NTupleProduction RunII-EA
DOI: 10.5281/zenodo.17410
2015
NTupleProduction: Run1 legacy release
DOI: 10.5281/zenodo.17801
2015
AnalysisSoftware: AnalysisSoftware version for TOP-12-042 paper
DOI: 10.5281/zenodo.17795
2015
NTupleProduction: NTupleProduction version for TOP-12-042 paper
DOI: 10.5281/zenodo.17408
2015
AnalysisSoftware: Release for TOP-12-042 version 4
DOI: 10.5281/zenodo.17411
2015
DailyPythonScripts: Release for TOP-12-042 version 4
DOI: 10.5281/zenodo.49340
2016
puppet-dmlite: puppet-dmlite v0.4.1
DOI: 10.5281/zenodo.15102
2015
DailyPythonScripts: Release used for AN-14-071 V0.5
DOI: 10.5281/zenodo.18816
2015
rootpy: 0.6.0
DOI: 10.1088/1742-6596/898/3/032040
2017
SWATCH: Common software for controlling and monitoring the upgraded CMS Level-1 trigger
The Large Hadron Collider at CERN restarted in 2015 with a 13 TeV centre-of-mass energy. In addition, the instantaneous luminosity is expected to increase significantly in the coming years. In order to maintain the same efficiencies for searches and precision measurements as those achieved in the previous run, the CMS experiment upgraded the Level-1 trigger system. The new system consists of the order of 100 electronics boards connected by approximately 3000 optical links, which must be controlled and monitored coherently through software, with high operational efficiency. These proceedings present the design of the control software for the upgraded Level-1 Trigger, and the experience from using this software to commission and operate the upgraded system.
2009
Estimation of QCD Multijet Background for Top Antitop Events from Data
DOI: 10.1051/epjconf/202024506016
2020
The FAST-HEP toolset: Using YAML to make tables out of trees
The Faster Analysis Software Taskforce (FAST) is a small, European group of HEP researchers that have been investigating and developing modern software approaches to improve HEP analyses. We present here an overview of the key product of this effort: a set of packages that allows a complete implementation of an analysis using almost exclusively YAML files. Serving as an analysis description language (ADL), this toolset builds on top of the evolving technologies from the Scikit-HEP and IRIS-HEP projects as well as industry-standard libraries such as Pandas and Matplotlib. Data processing starts with event-level data (the trees) and can proceed by adding variables, selecting events, performing complex user-defined operations and binning data, as defined in the YAML description. The resulting outputs (the tables) are stored as Pandas dataframes which can be programmatically manipulated and converted to plots or inputs for fitting frameworks. No longer just a proof-of-principle, these tools are now being used in CMS analyses, the LUX-ZEPLIN experiment, and by students on several other experiments. In this talk we will showcase these tools through examples, highlighting how they address the different experiments’ needs, and compare them to other similar approaches.
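To make the trees-to-tables flow concrete, here is a minimal sketch in plain Pandas. The stage dictionary stands in for the YAML description; it is not the real FAST-HEP schema, and all names and numbers are invented for illustration:

```python
import pandas as pd

# Hypothetical stage description mimicking a YAML-driven analysis:
# define a new variable, select events, then bin into a table.
stages = {
    "define": {"pt_sum": "pt1 + pt2"},
    "select": "pt_sum > 50",
    "binning": {"variable": "pt_sum", "edges": [50, 100, 150, 200]},
}

# Toy event-level data standing in for a ROOT tree.
tree = pd.DataFrame({"pt1": [30, 60, 90, 20], "pt2": [40, 70, 80, 10]})

# Apply the stages in order: define, select, bin.
for name, expr in stages["define"].items():
    tree[name] = tree.eval(expr)
selected = tree.query(stages["select"])
table = (
    pd.cut(selected[stages["binning"]["variable"]], bins=stages["binning"]["edges"])
    .value_counts()
    .sort_index()
    .rename("n")
)
print(table)  # one row per bin: the binned "table" output
```

Because the whole analysis is declared as data rather than code, it can be version-controlled, diffed, and re-run in Continuous Integration, which is the point the abstract makes.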
DOI: 10.5281/zenodo.2565840
2019
delphes/delphes: Delphes-3.4.2pre17
DOI: 10.5281/zenodo.3599661
2019
The F.A.S.T. toolset: Using YAML to make tables out of trees
DOI: 10.48550/arxiv.1804.03983
2018
HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation