
Richard Cavanaugh

Here are all the papers by Richard Cavanaugh that you can download and read on OA.mg.

DOI: 10.1023/a:1024000426962
2003
Cited 461 times
DOI: 10.1140/epjc/s10052-012-2243-3
2012
Cited 149 times
The CMSSM and NUHM1 in light of 7 TeV LHC, B_s → μ⁺μ⁻ and XENON100 data
We make a frequentist analysis of the parameter space of the CMSSM and NUHM1, using a Markov Chain Monte Carlo (MCMC) with 95 (221) million points to sample the CMSSM (NUHM1) parameter spaces. Our analysis includes the ATLAS search for supersymmetric jets + MET signals using ~ 5/fb of LHC data at 7 TeV, which we apply using PYTHIA and a Delphes implementation that we validate in the relevant parameter regions of the CMSSM and NUHM1. Our analysis also includes the constraint imposed by searches for B_s to mu+mu- by LHCb, CMS, ATLAS and CDF, and the limit on spin-independent dark matter scattering from 225 live days of XENON100 data. We assume M_h ~ 125 GeV, and use a full set of electroweak precision and other flavour-physics observables, as well as the cold dark matter density constraint. The ATLAS 5/fb constraint has relatively limited effects on the 68 and 95% CL regions in the (m_0, m_1/2) planes of the CMSSM and NUHM1. The new B_s to mu+mu- constraint has greater impacts on these CL regions, and also impacts significantly the 68 and 95% CL regions in the (M_A, tan beta) planes of both models, reducing the best-fit values of tan beta. The recent XENON100 data eliminate the focus-point region in the CMSSM and affect the 68 and 95% CL regions in the NUHM1. In combination, these new constraints reduce the best-fit values of m_0, m_1/2 in the CMSSM, and increase the global chi^2 from 31.0 to 32.8, reducing the p-value from 12% to 8.5%. In the case of the NUHM1, they have little effect on the best-fit values of m_0, m_1/2, but increase the global chi^2 from 28.9 to 31.3, thereby reducing the p-value from 15% to 9.1%.
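As an illustration of the sampling technique this abstract describes, the following sketch runs a minimal Metropolis-Hastings MCMC over a hypothetical two-parameter (m_0, m_1/2) plane with an invented χ² surface; it is not the MasterCode machinery used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def chi2(theta):
    # Hypothetical chi^2 surface over (m_0, m_1/2) in GeV. A real fit
    # would sum contributions from the LHC, B_s -> mu+mu-, XENON100,
    # (g-2)_mu, etc.; a single quadratic well stands in for all of that.
    m0, m12 = theta
    return ((m0 - 300.0) / 150.0) ** 2 + ((m12 - 600.0) / 200.0) ** 2

def metropolis(n_steps, start, step=(30.0, 30.0)):
    # Metropolis-Hastings with likelihood L ~ exp(-chi^2 / 2).
    chain, theta, c2 = [], np.array(start, dtype=float), chi2(start)
    for _ in range(n_steps):
        proposal = theta + rng.normal(0.0, step)
        c2_new = chi2(proposal)
        # Accept with probability min(1, exp(-(chi2_new - chi2_old)/2)).
        if rng.random() < np.exp(-0.5 * (c2_new - c2)):
            theta, c2 = proposal, c2_new
        chain.append(theta.copy())
    return np.array(chain)

chain = metropolis(50_000, start=(1000.0, 1000.0))
print("posterior mean (m_0, m_1/2):", chain[10_000:].mean(axis=0))
```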
DOI: 10.1140/epjc/s10052-008-0716-1
2008
Cited 146 times
B, D and K decays
The present report documents the results of Working Group 2: B, D and K decays, of the workshop on Flavor in the Era of the LHC, held at CERN from November 2005 through March 2007. With the advent of the LHC, we will be able to probe New Physics (NP) up to energy scales almost one order of magnitude larger than it has been possible with present accelerator facilities. While direct detection of new particles will be the main avenue to establish the presence of NP at the LHC, indirect searches will provide precious complementary information, since most probably it will not be possible to measure the full spectrum of new particles and their couplings through direct production. In particular, precision measurements and computations in the realm of flavor physics are expected to play a key role in constraining the unknown parameters of the Lagrangian of any NP model emerging from direct searches at the LHC. The aim of Working Group 2 was twofold: on the one hand, to provide a coherent up-to-date picture of the status of flavor physics before the start of the LHC; on the other hand, to initiate activities on the path towards integrating information on NP from high-p_T and flavor data. This report is organized as follows: in Sect. 1, we give an overview of NP models, focusing on a few examples that have been discussed in some detail during the workshop, with a short description of the available computational tools for flavor observables in NP models. Section 2 contains a concise discussion of the main theoretical problem in flavor physics: the evaluation of the relevant hadronic matrix elements for weak decays. Section 3 contains a detailed discussion of NP effects in a set of flavor observables that we identified as "benchmark channels" for NP searches. The experimental prospects for flavor physics at future facilities are discussed in Sect. 4. Finally, Sect. 5 contains some assessments on the work done at the workshop and the prospects for future developments.
DOI: 10.1140/epjc/s10052-012-2020-3
2012
Cited 121 times
Higgs and supersymmetry
Global frequentist fits to the CMSSM and NUHM1 using the MasterCode framework predicted M_h ≃ 119 GeV in fits incorporating the (g−2)_μ constraint and ≃ 126 GeV without it. Recent results by ATLAS and CMS could be compatible with a Standard Model-like Higgs boson around M_h ≃ 125 GeV. We use the previous MasterCode analysis to calculate the likelihood for a measurement of any nominal Higgs mass within the range of 115 to 130 GeV. Assuming a Higgs mass measurement at M_h ≃ 125 GeV, we display updated global likelihood contours in the (m_0, m_1/2) and other parameter planes of the CMSSM and NUHM1, and present updated likelihood functions for $m_{\tilde{g}}$, $m_{\tilde{q}_R}$, BR(B_s → μ⁺μ⁻) and the spin-independent dark matter cross section $\sigma^{\mathrm{SI}}_{p}$. The implications of dropping (g−2)_μ from the fits are also discussed. We furthermore comment on a hypothetical measurement of M_h ≃ 119 GeV.
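The 68% and 95% CL contours referred to here are defined by Δχ² thresholds relative to the best-fit point. A minimal sketch, with an invented χ² map standing in for the real profiled likelihood:

```python
import numpy as np
from scipy.stats import chi2

# Delta-chi^2 levels for joint 68% / 95% CL regions of two parameters.
levels = chi2.ppf([0.68, 0.95], df=2)   # roughly [2.28, 5.99]

# Invented chi^2 map over a (m_0, m_1/2) grid in GeV; a real analysis
# profiles the full likelihood over all remaining model parameters.
m0, m12 = np.meshgrid(np.linspace(0, 2000, 200), np.linspace(0, 2000, 200))
chi2_map = ((m0 - 300) / 150) ** 2 + ((m12 - 600) / 200) ** 2

delta = chi2_map - chi2_map.min()       # Delta-chi^2 relative to best fit
print("grid fraction inside 68% CL region:", (delta < levels[0]).mean())
print("grid fraction inside 95% CL region:", (delta < levels[1]).mean())
```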
DOI: 10.1016/j.nuclphysbps.2010.03.001
2010
Cited 120 times
The Hunt for New Physics at the Large Hadron Collider
The Large Hadron Collider presents an unprecedented opportunity to probe the realm of new physics in the TeV region and shed light on some of the core unresolved issues of particle physics. These include the nature of electroweak symmetry breaking, the origin of mass, the possible constituent of cold dark matter, new sources of CP violation needed to explain the baryon excess in the universe, the possible existence of extra gauge groups and extra matter, and importantly the path Nature chooses to resolve the hierarchy problem – is it supersymmetry or extra dimensions. Many models of new physics beyond the standard model contain a hidden sector which can be probed at the LHC. Additionally, the LHC will be a top factory and accurate measurements of the properties of the top and its rare decays will provide a window to new physics. Further, the LHC could shed light on the origin of neutrino masses if the new physics associated with their generation lies in the TeV region. Finally, the LHC is also a laboratory to test the hypothesis of TeV scale strings and D brane models. An overview of these possibilities is presented in the spirit that it will serve as a companion to the Technical Design Reports (TDRs) by the particle detector groups ATLAS and CMS to facilitate the test of the new theoretical ideas at the LHC. Which of these ideas stands the test of the LHC data will govern the course of particle physics in the subsequent decades.
DOI: 10.1140/epjc/s10052-014-2922-3
2014
Cited 99 times
The CMSSM and NUHM1 after LHC Run 1
We analyze the impact of data from the full Run 1 of the LHC at 7 and 8 TeV on the CMSSM with μ > 0 and μ < 0 and the NUHM1 with μ > 0, incorporating the constraints imposed by other experiments such as precision electroweak measurements, flavour measurements, the cosmological density of cold dark matter and the direct search for the scattering of dark matter particles in the LUX experiment. We use the following results from the LHC experiments: ATLAS searches for events with missing transverse energy (MET) accompanied by jets with the full 7 and 8 TeV data, the ATLAS and CMS measurements of the mass of the Higgs boson, the CMS searches for heavy neutral Higgs bosons and a combination of the LHCb and CMS measurements of BR(B_s → μ⁺μ⁻) and BR(B_d → μ⁺μ⁻). Our results are based on samplings of the parameter spaces of the CMSSM for both μ > 0 and μ < 0 and of the NUHM1 for μ > 0 with 6.8×10⁸, 6.2×10⁸ and 1.6×10⁹ points, respectively, obtained using the MultiNest tool. The impact of the Higgs-mass constraint is assessed using FeynHiggs 2.10.0, which provides an improved prediction for the masses of the MSSM Higgs bosons in the region of heavy squark masses. It yields in general larger values of M_h than previous versions of FeynHiggs, reducing the pressure on the CMSSM and NUHM1. We find that the global χ² functions for the supersymmetric models vary slowly over most of the parameter spaces allowed by the Higgs-mass and the MET searches, with best-fit values that are comparable to the χ²/dof for the best Standard Model fit. We provide 95% CL lower limits on the masses of various sparticles and assess the prospects for observing them during Run 2 of the LHC.
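MultiNest, used above for the sampling, implements nested sampling. The toy below shows the bare algorithm (shrink the prior volume by repeatedly replacing the worst live point) with an assumed Gaussian likelihood; it is not MultiNest itself, which replaces the brute-force rejection step with ellipsoidal sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglike(theta):
    # Assumed Gaussian log-likelihood (-chi^2/2) on unit-cube parameters.
    return -0.5 * np.sum(((theta - 0.5) / 0.1) ** 2)

def nested_sampling(n_live=400, n_iter=3000, ndim=2):
    # Replace the worst live point with a fresh prior draw at higher
    # likelihood, shrinking the prior volume X by ~exp(-1/n_live) per step.
    live = rng.random((n_live, ndim))
    logl = np.array([loglike(p) for p in live])
    log_z, x_prev = -np.inf, 1.0
    for i in range(1, n_iter + 1):
        worst = np.argmin(logl)
        x_i = np.exp(-i / n_live)                 # expected prior volume
        log_z = np.logaddexp(log_z, logl[worst] + np.log(x_prev - x_i))
        x_prev = x_i
        while True:                               # brute-force rejection
            cand = rng.random(ndim)
            if loglike(cand) > logl[worst]:
                break
        live[worst], logl[worst] = cand, loglike(cand)
    # Add the contribution of the remaining live points.
    return np.logaddexp(log_z, np.logaddexp.reduce(logl) + np.log(x_prev / n_live))

print("log-evidence:", nested_sampling())   # analytic value ~ log(2*pi*0.01)
```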
DOI: 10.1140/epjc/s10052-015-3718-9
2015
Cited 88 times
Supersymmetric dark matter after LHC run 1
Different mechanisms operate in various regions of the MSSM parameter space to bring the relic density of the lightest neutralino, $$\tilde{\chi}^0_{1}$$, assumed here to be the lightest SUSY particle (LSP) and thus the dark matter (DM) particle, into the range allowed by astrophysics and cosmology. These mechanisms include coannihilation with some nearly degenerate next-to-lightest supersymmetric particle such as the lighter stau $$\tilde{\tau}_{1}$$, stop $$\tilde{t}_{1}$$ or chargino $$\tilde{\chi}^\pm_{1}$$, resonant annihilation via direct-channel heavy Higgs bosons H/A, the light Higgs boson h or the Z boson, and enhanced annihilation via a larger Higgsino component of the LSP in the focus-point region. These mechanisms typically select lower-dimensional subspaces in MSSM scenarios such as the CMSSM, NUHM1, NUHM2, and pMSSM10. We analyze how future LHC and direct DM searches can complement each other in the exploration of the different DM mechanisms within these scenarios. We find that the $$\tilde{\tau}_1$$ coannihilation regions of the CMSSM, NUHM1, NUHM2 can largely be explored at the LHC via searches for $${\not}E_T$$ events and long-lived charged particles, whereas their H/A funnel, focus-point and $$\tilde{\chi}^\pm_{1}$$ coannihilation regions can largely be explored by the LZ and Darwin DM direct detection experiments. We find that the dominant DM mechanism in our pMSSM10 analysis is $$\tilde{\chi}^\pm_{1}$$ coannihilation: parts of its parameter space can be explored by the LHC, and a larger portion by future direct DM searches.
DOI: 10.1140/epjc/s10052-009-1159-z
2009
Cited 89 times
Likelihood functions for supersymmetric observables in frequentist analyses of the CMSSM and NUHM1
On the basis of frequentist analyses of experimental constraints from electroweak precision data, (g−2)_μ, B-physics and cosmological data, we investigate the parameters of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking mass parameters, and a model with common non-universal Higgs masses (NUHM1). We present χ² likelihood functions for the masses of supersymmetric particles and Higgs bosons, as well as BR(b → sγ), BR(B_s → μ⁺μ⁻) and the spin-independent dark-matter scattering cross section, σ_SI. In the CMSSM we find preferences for sparticle masses that are relatively light. In the NUHM1 the best-fit values for many sparticle masses are even slightly smaller, but with greater uncertainties. The likelihood functions for most sparticle masses are cut off sharply at small masses, in particular by the LEP Higgs mass constraint. Both in the CMSSM and the NUHM1, the coannihilation region is favored over the focus-point region at about the 3σ level, largely but not exclusively because of (g−2)_μ. Many sparticle masses are highly correlated in both the CMSSM and NUHM1, and most of the regions preferred at the 95% C.L. are accessible to early LHC running, though high-luminosity running would be needed to cover the regions allowed at the 3σ level. Some slepton and chargino/neutralino masses should be in reach at the ILC. The masses of the heavier Higgs bosons should be accessible at the LHC and the ILC in portions of the preferred regions in the (M_A, tan β) plane. In the CMSSM, the likelihood function for BR(B_s → μ⁺μ⁻) is peaked close to the Standard Model value, but much larger values are possible in the NUHM1. We find that values of σ_SI > 10⁻¹⁰ pb are preferred in both the CMSSM and the NUHM1. We study the effects of dropping the (g−2)_μ, BR(b → sγ), Ω_χh² and M_h constraints, demonstrating that they are not in tension with the other constraints.
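The one-dimensional χ² likelihood functions quoted here are profile likelihoods: for each value of an observable, the χ² is minimized over all other parameters. A sketch over an invented chain, not the paper's actual samples:

```python
import numpy as np

def profile_chi2(values, chi2_points, bins=40):
    # In each bin of the observable (e.g. a sparticle mass), keep the
    # minimum chi^2 over all sampled points, i.e. profile out the
    # remaining model parameters.
    edges = np.linspace(values.min(), values.max(), bins + 1)
    idx = np.digitize(values, edges[1:-1])
    prof = np.full(bins, np.inf)
    for i, c2 in zip(idx, chi2_points):
        prof[i] = min(prof[i], c2)
    return 0.5 * (edges[:-1] + edges[1:]), prof

# Invented chain: gluino masses (GeV) with a global chi^2 per point.
rng = np.random.default_rng(1)
m_gluino = rng.uniform(500.0, 4000.0, 100_000)
chi2_pts = 30.0 + ((m_gluino - 900.0) / 600.0) ** 2 + rng.exponential(2.0, m_gluino.size)

centers, prof = profile_chi2(m_gluino, chi2_pts)
print("best-fit gluino mass ~", round(centers[np.argmin(prof)]), "GeV")
```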
DOI: 10.1140/epjc/s10052-012-1878-4
2012
Cited 74 times
Supersymmetry in light of 1/fb of LHC data
We update previous frequentist analyses of the CMSSM and NUHM1 parameter spaces to include the public results of searches for supersymmetric signals using ∼1/fb of LHC data recorded by ATLAS and CMS and ∼0.3/fb of data recorded by LHCb, in addition to electroweak precision and B-physics observables. We also include the constraints imposed by the cosmological dark matter density and the XENON100 search for spin-independent dark matter scattering. The LHC data set includes ATLAS and CMS searches for jets + MET events and for the heavier MSSM Higgs bosons, and the upper limits on BR(B_s → μ⁺μ⁻) from LHCb and CMS. The absence of jets + MET signals in the LHC data favours heavier mass spectra than in our previous analyses of the CMSSM and NUHM1, which may be reconciled with (g−2)_μ if tan β ∼ 40, a possibility that is, however, under pressure from heavy Higgs searches and the upper limits on BR(B_s → μ⁺μ⁻). As a result, the p-value for the CMSSM fit is reduced to ∼15(38)%, and that for the NUHM1 to ∼16(38)%, to be compared with ∼9(49)% for the Standard Model limit of the CMSSM for the same set of observables (dropping (g−2)_μ), ignoring the dark matter relic density. We discuss the sensitivities of the fits to the (g−2)_μ and BR(b → sγ) constraints, contrasting fits with and without the (g−2)_μ constraint, and combining the theoretical and experimental errors for BR(b → sγ) linearly or in quadrature. We present predictions for $m_{\tilde{g}}$, BR(B_s → μ⁺μ⁻), M_h and M_A, and update predictions for spin-independent dark matter scattering, incorporating the uncertainty in the π-nucleon σ term Σ_πN. Finally, we present predictions based on our fits for the likely thresholds for sparticle pair production in e⁺e⁻ collisions in the CMSSM and NUHM1.
DOI: 10.1140/epjc/s10052-015-3599-y
2015
Cited 60 times
The pMSSM10 after LHC run 1
We present a frequentist analysis of the parameter space of the pMSSM10, in which the following 10 soft SUSY-breaking parameters are specified independently at the mean scalar top mass scale Msusy = Sqrt[M_stop1 M_stop2]: the gaugino masses M_{1,2,3}, the 1st- and 2nd-generation squark masses M_squ1 = M_squ2, the third-generation squark mass M_squ3, a common slepton mass M_slep and a common trilinear mixing parameter A, the Higgs mixing parameter mu, the pseudoscalar Higgs mass M_A and tan beta. We use the MultiNest sampling algorithm with 1.2 x 10^9 points to sample the pMSSM10 parameter space. A dedicated study shows that the sensitivities to strongly-interacting SUSY masses of ATLAS and CMS searches for jets, leptons + MET signals depend only weakly on many of the other pMSSM10 parameters. With the aid of the Atom and Scorpion codes, we also implement the LHC searches for EW-interacting sparticles and light stops, so as to confront the pMSSM10 parameter space with all relevant SUSY searches. In addition, our analysis includes Higgs mass and rate measurements using the HiggsSignals code, SUSY Higgs exclusion bounds, measurements of B-physics observables, EW precision observables, the CDM density and searches for spin-independent DM scattering. We show that the pMSSM10 is able to provide a SUSY interpretation of (g-2)_mu, unlike the CMSSM, NUHM1 and NUHM2. As a result, we find (omitting Higgs rates) that the minimum chi^2/dof = 20.5/18 in the pMSSM10, corresponding to a chi^2 probability of 30.8%, to be compared with chi^2/dof = 32.8/24 (31.1/23) (30.3/22) in the CMSSM (NUHM1) (NUHM2). We display 1-dimensional likelihood functions for SUSY masses, and show that they may be significantly lighter in the pMSSM10 than in the CMSSM, NUHM1 and NUHM2. We discuss the discovery potential of future LHC runs, e+e- colliders and direct detection experiments.
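The quoted χ² probabilities follow directly from the χ² survival function; the snippet below reproduces them (approximately) from the χ²/dof values in the abstract.

```python
from scipy.stats import chi2

# chi^2/dof values quoted in the abstract above.
fits = {"pMSSM10": (20.5, 18), "CMSSM": (32.8, 24),
        "NUHM1": (31.1, 23), "NUHM2": (30.3, 22)}
for model, (c2, dof) in fits.items():
    # Survival function = P(chi^2 >= observed) for the given dof.
    print(f"{model}: p = {chi2.sf(c2, dof):.1%}")
# The pMSSM10 line comes out near the 30.8% probability quoted above.
```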
DOI: 10.1088/1126-6708/2008/09/117
2008
Cited 77 times
Predictions for supersymmetric particle masses using indirect experimental and cosmological constraints
In view of the imminent start of the LHC experimental programme, we use the available indirect experimental and cosmological information to estimate the likely range of parameters of the constrained minimal supersymmetric extension of the Standard Model (CMSSM), using a Markov-chain Monte Carlo (MCMC) technique to sample the parameter space. The 95% confidence-level area in the (m_0, m_1/2) plane of the CMSSM lies largely within the region that could be explored with 1 fb⁻¹ of integrated luminosity at 14 TeV, and much of the 68% confidence-level area lies within the region that could be explored with 50 pb⁻¹ of integrated luminosity at 10 TeV. A same-sign dilepton signal could well be visible in most of the 68% confidence-level area with 1 fb⁻¹ of integrated luminosity at 14 TeV. We discuss the sensitivities of the preferred ranges to variations in the most relevant indirect experimental and cosmological constraints and also to deviations from the universality of the supersymmetry-breaking contributions to the masses of the Higgs bosons.
DOI: 10.1016/j.physletb.2007.09.058
2007
Cited 76 times
Prediction for the lightest Higgs boson mass in the CMSSM using indirect experimental constraints
Measurements at low energies provide interesting indirect information about masses of particles that are (so far) too heavy to be produced directly. Motivated by recent progress in consistently and rigorously calculating electroweak precision observables and flavour-related observables, we derive the preferred value for m_h in the Constrained Minimal Supersymmetric Standard Model (CMSSM), obtained from a fit taking into account electroweak precision data, flavour physics observables and the abundance of cold dark matter. No restriction is imposed on m_h itself: the experimental bound from the direct Higgs boson search at LEP is not included in the fit. A multi-parameter χ² is minimized with respect to the free parameters of the CMSSM: M_0, M_1/2, A_0 and tan β. A statistical comparison with the Standard Model fit to the electroweak precision data is made. The preferred value for the lightest Higgs boson mass in the CMSSM is found to be m_h^CMSSM = 110^{+8}_{−10} (exp.) ± 3 (theo.) GeV/c², where the first uncertainty is experimental and the second is theoretical. This value is compatible with the limit from the direct Higgs boson search at LEP.
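The multi-parameter χ² minimization described here can be sketched with a generic optimizer; the χ² surface below is invented, standing in for the real comparison of CMSSM predictions with data.

```python
import numpy as np
from scipy.optimize import minimize

def chi2(theta):
    # Invented chi^2 over CMSSM-like parameters (M_0, M_1/2, A_0, tan beta);
    # a real fit compares predicted electroweak, flavour and dark-matter
    # observables with their measured values and uncertainties.
    m0, m12, a0, tanb = theta
    return (((m0 - 100.0) / 80.0) ** 2 + ((m12 - 300.0) / 120.0) ** 2
            + (a0 / 500.0) ** 2 + ((tanb - 10.0) / 8.0) ** 2)

res = minimize(chi2, x0=[500.0, 500.0, 0.0, 30.0], method="Nelder-Mead")
print("best fit (M_0, M_1/2, A_0, tan beta):", np.round(res.x, 1))
print("chi^2 at minimum:", round(res.fun, 4))
```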
DOI: 10.1140/epjc/s10052-011-1722-2
2011
Cited 68 times
Supersymmetry and dark matter in light of LHC 2010 and XENON100 data
We make frequentist analyses of the CMSSM, NUHM1, VCMSSM and mSUGRA parameter spaces taking into account all the public results of searches for supersymmetry using data from the 2010 LHC run and the XENON100 direct search for dark matter scattering. The LHC data set includes ATLAS and CMS searches for $\mathrm{jets} + {\not}E_{T}$ events (with or without leptons) and for the heavier MSSM Higgs bosons, and the upper limit on BR(B_s → μ⁺μ⁻) including data from LHCb as well as CDF and DØ. The absence of signals in the LHC data favours somewhat heavier mass spectra than in our previous analyses of the CMSSM, NUHM1 and VCMSSM, and somewhat smaller dark matter scattering cross sections, all close to or within the pre-LHC 68% CL ranges, but does not impact significantly the favoured regions of the mSUGRA parameter space. We also discuss the impact of the XENON100 constraint on spin-independent dark matter scattering, stressing the importance of taking into account the uncertainty in the π-nucleon σ term Σ_πN, which affects the spin-independent scattering matrix element, and we make predictions for spin-dependent dark matter scattering. Finally, we discuss briefly the potential impact of the updated predictions for sparticle masses in the CMSSM, NUHM1, VCMSSM and mSUGRA on future e⁺e⁻ colliders.
DOI: 10.1140/epjc/s10052-011-1634-1
2011
Cited 52 times
Implications of initial LHC searches for supersymmetry
The CMS and ATLAS Collaborations have recently published the results of initial direct LHC searches for supersymmetry analyzing ∼35/pb of data taken at 7 TeV in the centre of mass. We incorporate these results into a frequentist analysis of the probable ranges of parameters of simple versions of the minimal supersymmetric extension of the Standard Model (MSSM), namely the constrained MSSM (CMSSM), a model with common non-universal Higgs masses (NUHM1), the very constrained MSSM (VCMSSM) and minimal supergravity (mSUGRA). We present updated predictions for the gluino mass, $m_{\tilde{g}}$, the light Higgs boson mass, M_h, BR(B_s → μ⁺μ⁻) and the spin-independent dark matter scattering cross section, $\sigma^{\mathrm{SI}}_{p}$. The CMS and ATLAS data make inroads into the CMSSM, NUHM1 and VCMSSM (but not mSUGRA) parameter spaces, thereby strengthening previous lower limits on sparticle masses and upper limits on $\sigma^{\mathrm{SI}}_{p}$ in the CMSSM and VCMSSM. The favoured ranges of BR(B_s → μ⁺μ⁻) in the CMSSM, VCMSSM and mSUGRA are close to the Standard Model, but considerably larger values of BR(B_s → μ⁺μ⁻) are possible in the NUHM1. Applying the CMS and ATLAS constraints improves the consistency of the model predictions for M_h with the LEP exclusion limits.
DOI: 10.1140/epjc/s10052-011-1583-8
2011
Cited 44 times
Frequentist analysis of the parameter space of minimal supergravity
We make a frequentist analysis of the parameter space of minimal supergravity (mSUGRA), in which, as well as the gaugino and scalar soft supersymmetry-breaking parameters being universal, there is a specific relation between the trilinear, bilinear and scalar supersymmetry-breaking parameters, A_0 = B_0 + m_0, and the gravitino mass is fixed by m_3/2 = m_0. We also consider a more general model in which the gravitino mass constraint is relaxed (the VCMSSM). We combine in the global likelihood function the experimental constraints from low-energy electroweak precision data, the anomalous magnetic moment of the muon, the lightest Higgs boson mass M_h, B physics and the astrophysical cold dark matter density, assuming that the lightest supersymmetric particle (LSP) is a neutralino. In the VCMSSM, we find a preference for values of m_1/2 and m_0 similar to those found previously in frequentist analyses of the constrained MSSM (CMSSM) and a model with common non-universal Higgs masses (NUHM1). On the other hand, in mSUGRA we find two preferred regions: one with larger values of both m_1/2 and m_0 than in the VCMSSM, and one with large m_0 but small m_1/2. We compare the probabilities of the frequentist fits in mSUGRA, the VCMSSM, the CMSSM and the NUHM1: the probability that mSUGRA is consistent with the present data is significantly less than in the other models. We also discuss the mSUGRA and VCMSSM predictions for sparticle masses and other observables, identifying potential signatures at the LHC and elsewhere.
DOI: 10.1140/epjc/s10052-014-3212-9
2014
Cited 39 times
The NUHM2 after LHC Run 1
We make a frequentist analysis of the parameter space of the NUHM2, in which the soft supersymmetry (SUSY)-breaking contributions to the masses of the two Higgs multiplets, $m^2_{H_{u,d}}$, vary independently from the universal soft SUSY-breaking contributions $m^2_0$ to the masses of squarks and sleptons. Our analysis uses the MultiNest sampling algorithm with over $4 \times 10^8$ points to sample the NUHM2 parameter space. It includes the ATLAS and CMS Higgs mass measurements as well as the ATLAS search for supersymmetric jets + MET signals using the full LHC Run 1 data, the measurements of $B_s \to \mu^+ \mu^-$ by LHCb and CMS together with other B-physics observables, electroweak precision observables and the XENON100 and LUX searches for spin-independent dark-matter scattering. We find that the preferred regions of the NUHM2 parameter space have negative SUSY-breaking scalar masses squared at the GUT scale for squarks and sleptons, $m_0^2 < 0$, as well as $m^2_{H_u} < m^2_{H_d} < 0$. The tension present in the CMSSM and NUHM1 between the supersymmetric interpretation of $(g-2)_\mu$ and the absence to date of SUSY at the LHC is not significantly alleviated in the NUHM2. We find that the minimum $\chi^2 = 32.5$ with 21 degrees of freedom (dof) in the NUHM2, to be compared with $\chi^2/{\rm dof} = 35.0/23$ in the CMSSM and $\chi^2/{\rm dof} = 32.7/22$ in the NUHM1. We find that the one-dimensional likelihood functions for sparticle masses and other observables are similar to those found previously in the CMSSM and NUHM1.
DOI: 10.1109/hpdc.2004.36
2004
Cited 49 times
The Grid2003 production grid: principles and practice
The Grid2003 Project has deployed a multi-virtual-organization, application-driven grid laboratory (Grid3) that has sustained for several months the production-level services required by physics experiments of the Large Hadron Collider at CERN (ATLAS and CMS), the Sloan Digital Sky Survey project, the gravitational wave search experiment LIGO, the BTeV experiment at Fermilab, as well as applications in molecular structure analysis and genome analysis, and computer science research projects in such areas as job and data scheduling. The deployed infrastructure has been operating since November 2003 with 27 sites, a peak of 2800 processors, workloads from 10 different applications exceeding 1300 simultaneous jobs, and data transfers among sites of greater than 2 TB/day. We describe the principles that have guided the development of this unique infrastructure and the practical experiences that have resulted from its creation and use. We discuss application requirements for grid services deployment and configuration, monitoring infrastructure, application performance, metrics, and operational experiences. We also summarize lessons learned.
DOI: 10.1140/epjc/s10052-017-4810-0
2017
Cited 24 times
Likelihood analysis of the minimal AMSB model
We perform a likelihood analysis of the minimal anomaly-mediated supersymmetry-breaking (mAMSB) model using constraints from cosmology and accelerator experiments. We find that either a wino-like or a Higgsino-like neutralino LSP, $$\tilde{\chi}^0_{1}$$, may provide the cold dark matter (DM), both with similar likelihoods. The upper limit on the DM density from Planck and other experiments enforces $$m_{\tilde{\chi}^0_{1}} \lesssim 3 \,\, \mathrm{TeV}$$ after the inclusion of Sommerfeld enhancement in its annihilations. If most of the cold DM density is provided by the $$\tilde{\chi}^0_{1}$$, the measured value of the Higgs mass favours a limited range of $$\tan \beta \sim 5$$ (and also $$\tan \beta \sim 45$$ if $$\mu > 0$$), but the scalar mass $$m_0$$ is poorly constrained. In the wino-LSP case, $$m_{3/2}$$ is constrained to about $$900\,\, \mathrm{TeV}$$ and $$m_{\tilde{\chi}^0_{1}}$$ to $$2.9\pm 0.1\,\, \mathrm{TeV}$$, whereas in the Higgsino-LSP case $$m_{3/2}$$ has just a lower limit $$\gtrsim 650\,\, \mathrm{TeV}$$ ($$\gtrsim 480\,\, \mathrm{TeV}$$) and $$m_{\tilde{\chi}^0_{1}}$$ is constrained to $$1.12~(1.13) \pm 0.02\,\, \mathrm{TeV}$$ in the $$\mu >0$$ ($$\mu <0$$) scenario. In neither case can the anomalous magnetic moment of the muon, $$(g-2)_\mu$$, be improved significantly relative to its Standard Model (SM) value, nor do flavour measurements constrain the model significantly, and there are poor prospects for discovering supersymmetric particles at the LHC, though there are some prospects for direct DM detection. On the other hand, if the $$\tilde{\chi}^0_{1}$$ contributes only a fraction of the cold DM density, future LHC-based searches for gluinos, squarks and heavier chargino and neutralino states as well as disappearing track searches in the wino-like LSP region will be relevant, and interference effects enable $$\mathrm{BR}(B_{s, d} \rightarrow \mu ^+\mu ^-)$$ to agree with the data better than in the SM in the case of wino-like DM with $$\mu > 0$$.
DOI: 10.1140/epjc/s10052-017-4639-6
2017
Cited 21 times
Likelihood analysis of supersymmetric SU(5) GUTs
We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan \beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel ${\tilde u_R}/{\tilde c_R} - \tilde{\chi}^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of ${\tilde \nu_\tau}$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.
DOI: 10.1109/ipdps.2004.1302932
2004
Cited 37 times
Policy based scheduling for simple quality of service in grid computing
We discuss a novel framework for policy-based scheduling in resource allocation for grid computing. The framework has several features. First, the scheduling strategy can control the assignment of requests to grid resources by adjusting usage accounts or request priorities. Second, efficient resource management is achieved by assigning usage quotas to intended users. Third, the scheduling method supports reservation-based grid resource allocation. Fourth, a quality-of-service feature allows special privileges for various classes of requests. Experimental results demonstrate the usefulness of the framework.
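A minimal sketch of two of the listed features, per-user usage quotas and adjustable request priorities; the class and policy values below are invented for illustration, not the paper's framework.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Request:
    priority: float                     # lower value = dispatched first
    user: str = field(compare=False)
    job_id: str = field(compare=False)

class PolicyScheduler:
    def __init__(self, quotas):
        self.quotas = dict(quotas)      # user -> remaining usage quota
        self.queue = []                 # priority queue of requests

    def submit(self, user, job_id, priority):
        heapq.heappush(self.queue, Request(priority, user, job_id))

    def dispatch(self, cost=1.0):
        # Pop the highest-priority request whose user still has quota;
        # requests from over-quota users are deferred, not dropped.
        deferred, chosen = [], None
        while self.queue:
            req = heapq.heappop(self.queue)
            if self.quotas.get(req.user, 0.0) >= cost:
                self.quotas[req.user] -= cost
                chosen = req
                break
            deferred.append(req)
        for req in deferred:
            heapq.heappush(self.queue, req)
        return chosen

sched = PolicyScheduler({"cms": 2.0, "atlas": 1.0})
sched.submit("atlas", "a1", priority=0.5)
sched.submit("cms", "c1", priority=1.0)
print(sched.dispatch().job_id)          # "a1": best priority, within quota
```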
DOI: 10.1117/1.oe.63.1.015101
2024
Cell tower contrast in the visible, short-wave infrared, and long-wave infrared bands
In a GPS-denied environment, distinct structures, such as cell towers and transmission towers, are useful as an aid to vision-based navigation. Cell towers are surveyed such that their locations are well known, and the imagery of these towers can be compared to imagery databases to assist in navigation. In this research, imagery of the cell towers was taken in the visible (VIS), short-wave infrared (SWIR), and long-wave infrared (LWIR) bands with both clear sky and portions of the ground in the background. The contrast of the cell towers in the two reflective bands (VIS and SWIR) was determined against the sky and the ground in terms of equivalent reflectivity. The contrast of the cell towers was also determined in the LWIR in terms of equivalent blackbody temperature. The analysis of contrast, the results, and recommendations on band use are provided for use in three-dimensional map image comparisons.
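The "equivalent blackbody temperature" used for the LWIR contrast can be illustrated by inverting band-integrated Planck radiance; the tower and sky radiances below are assumed values, not measurements from the paper.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23        # SI constants

def planck(lam, T):
    # Spectral radiance (W m^-2 sr^-1 m^-1) at wavelength lam (m), temp T (K).
    return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)

def band_radiance(T, band=(8e-6, 14e-6)):
    # Planck radiance integrated over the LWIR band.
    return quad(planck, band[0], band[1], args=(T,))[0]

def equivalent_temperature(L_band):
    # Invert band-integrated radiance to an equivalent blackbody temperature.
    return brentq(lambda T: band_radiance(T) - L_band, 150.0, 400.0)

# Assumed radiances: a 290 K grey-body tower (emissivity 0.9) against a
# 270 K effective sky background; reflected sky terms are neglected here.
L_tower = 0.9 * band_radiance(290.0)
L_sky = band_radiance(270.0)
dT = equivalent_temperature(L_tower) - equivalent_temperature(L_sky)
print(f"equivalent blackbody temperature contrast: {dT:.1f} K")
```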
DOI: 10.1364/opticaopen.25285402
2024
Comparison of plane-to-sky contrast and detection range performance in the visible, short-wave infrared, mid-wave infrared, and long-wave infrared bands
Recent advances in uncrewed aerial vehicle (UAV) technology have the potential to benefit diverse civil, commercial, scientific, and defense projects. Many applications would benefit from beyond visual line of sight (BVLOS) operations. Such operations require automated safety systems that allow uncrewed aircraft to avoid collisions with both crewed and uncrewed aircraft. Electro-optical and infrared sensors are commonly employed by aircraft collision avoidance systems. This study compares plane-to-sky contrast in the VIS (0.4 to 0.7 µm), SWIR (1 to 1.7 µm), MWIR (3 to 5 µm), and LWIR (8 to 14 µm) to determine which band is most sensitive to aircraft signal against a clear sky background. Contrast in the two reflective bands (VIS and SWIR) is determined in terms of equivalent reflectivity, and contrast in the two emissive bands (MWIR and LWIR) is determined in terms of equivalent blackbody temperature. Sensitivity data is then used alongside resolution specs to estimate detection performance at-range using the Night Vision Integrated Performance Model (NVIPM). Results are extrapolated to a maritime atmosphere using MODTRAN. The analysis of contrast, range-performance, and recommendations on band selection are provided for reference in the design of EO/IR systems for aircraft collision avoidance. Future research may study band performance at night and against other backgrounds (e.g. clouds, ocean, ground terrain).
DOI: 10.48550/arxiv.hep-ph/0604120
2006
Cited 21 times
Les Houches Physics at TeV Colliders 2005, Standard Model and Higgs working group: Summary report
This Report summarises the activities of the "SM and Higgs" working group for the Workshop "Physics at TeV Colliders", Les Houches, France, 2-20 May, 2005. On the one hand, we performed a variety of experimental and theoretical studies on standard candles (such as W, Z, and ttbar production), treating them either as proper signals of known physics, or as backgrounds to unknown physics; we also addressed issues relevant to those non-perturbative or semi-perturbative ingredients, such as Parton Density Functions and Underlying Events, whose understanding will be crucial for a proper simulation of the actual events taking place in the detectors. On the other hand, several channels for the production of the Higgs, or involving the Higgs, have been considered in some detail. The report is structured into four main parts. The first one deals with Standard Model physics, except the Higgs. A variety of arguments are treated here, from full simulation of processes constituting a background to Higgs production, to studies of uncertainties due to PDFs and to extrapolations of models for underlying events, from small-$x$ issues to electroweak corrections which may play a role in vector boson physics. The second part of the report treats Higgs physics from the point of view of the signal. In the third part, reviews are presented on the current status of multi-leg, next-to-leading order and of next-to-next-to-leading order QCD computations. Finally, the fourth part deals with the use of Monte Carlos for simulation of LHC physics.
DOI: 10.1103/physrevd.81.035009
2010
Cited 14 times
Predictions for m_t and M_W in minimal supersymmetric models
Using a frequentist analysis of experimental constraints within two versions of the minimal supersymmetric extension of the standard model, we derive the predictions for the top quark mass, ${m}_{t}$, and the $W$ boson mass, ${M}_{W}$. We find that the supersymmetric predictions for both ${m}_{t}$ and ${M}_{W}$, obtained by incorporating all the relevant experimental information and state-of-the-art theoretical predictions, are highly compatible with the experimental values with small remaining uncertainties, yielding an improvement compared to the case of the standard model.
DOI: 10.5170/cern-2005-002.750
2004
Cited 19 times
Predicting the Resource Requirements of a Job Submission
Grid computing aims to provide an infrastructure for distributed problem solving in dynamic virtual organizations. It is gaining interest among many scientific disciplines as well as the industrial community. However, current grid solutions still require highly trained programmers with expertise in networking, high-performance computing, and operating systems. One of the big issues in full-scale usage of a grid is matching the resource requirements of a job submission to the resources available on the grid. Resource brokers and job schedulers must estimate the resource usage of job submissions in order to ensure efficient use of grid resources. We propose a prediction engine that will operate as part of a grid scheduler. This prediction engine will provide estimates of the resources required by a job submission based upon historical information. This paper presents the need for such a prediction engine and discusses two approaches to history-based estimation.
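One simple form of the history-based estimation discussed here is to predict a new job's usage from a robust statistic over past runs with the same characteristics; the grouping key, application names and fallback value below are illustrative assumptions, not the paper's design.

```python
import statistics
from collections import defaultdict

class HistoryPredictor:
    def __init__(self):
        self.history = defaultdict(list)    # (app, dataset) -> runtimes (s)

    def record(self, app, dataset, runtime_s):
        self.history[(app, dataset)].append(runtime_s)

    def predict(self, app, dataset, default=3600.0):
        runs = self.history.get((app, dataset))
        if not runs:
            return default                  # no history: fall back
        # The median is robust against the occasional stuck or failed job.
        return statistics.median(runs)

pred = HistoryPredictor()
for t in (1200, 1350, 1280, 9000):          # one outlier among past runs
    pred.record("cmssim", "mu_events", t)
print("estimated runtime:", pred.predict("cmssim", "mu_events"), "s")
```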
DOI: 10.5170/cern-2005-002.1119
2005
Cited 19 times
Grid Enabled Analysis: Architecture, prototype and status
The Grid Analysis Environment (GAE), a continuation of the CAIGEE project [5], is an effort to develop, integrate and deploy a system for distributed analysis. The current focus within the GAE is on the CMS experiment [1]; however, the GAE design abstracts from any specific scientific experiment and focuses on scientific analysis in general. The GAE project does not intend to reinvent services, but rather to integrate existing services into a collaborative system of web services.
DOI: 10.1109/ipdps.2005.409
2005
Cited 18 times
SPHINX: A Fault-Tolerant System for Scheduling in Dynamic Grid Environments
A grid consists of high-end computational, storage, and network resources that, while known a priori, are dynamic with respect to activity and availability. Efficient scheduling of requests to use grid resources must adapt to this dynamic environment while meeting administrative policies. In this paper, we describe a framework called SPHINX that can administer grid policies and schedule complex and data-intensive scientific applications. We present experimental results for several scheduling strategies that effectively utilize the monitoring and job-tracking information provided by SPHINX. These results demonstrate that SPHINX can effectively schedule work across a large number of distributed clusters owned by multiple units in a virtual organization, in a fault-tolerant way, in spite of the highly dynamic nature of the grid and complex policy issues. The novelty lies in the use of effective monitoring of resources and job-execution tracking in making scheduling decisions and providing fault tolerance, something that is missing in today's grid environments.
DOI: 10.48550/arxiv.cs/0306009
2003
Cited 9 times
Virtual Data in CMS Production
Initial applications of the GriPhyN Chimera Virtual Data System have been performed within the context of CMS Production of Monte Carlo Simulated Data. The GriPhyN Chimera system consists of four primary components: 1) a Virtual Data Language, which is used to describe virtual data products, 2) a Virtual Data Catalog, which is used to store virtual data entries, 3) an Abstract Planner, which resolves all dependencies of a particular virtual data product and forms a location and existence independent plan, 4) a Concrete Planner, which maps an abstract, logical plan onto concrete, physical grid resources accounting for staging in/out files and publishing results to a replica location service. A CMS Workflow Planner, MCRunJob, is used to generate virtual data products using the Virtual Data Language. Subsequently, a prototype workflow manager, known as WorkRunner, is used to schedule the instantiation of virtual data products across a grid.
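The abstract-planning step described above, resolving all dependencies of a virtual data product into a location- and existence-independent plan, amounts to a depth-first walk of the derivation records. A toy sketch with an invented catalog (not CMS production data, and not the Virtual Data Language itself):

```python
# Invented catalog: derived product -> (transformation, input products);
# None marks a raw input that is not derived from anything.
catalog = {
    "ntuple.root": ("analyze", ["reco.root"]),
    "reco.root": ("reconstruct", ["simhits.root"]),
    "simhits.root": ("simulate", ["gen_events.dat"]),
    "gen_events.dat": None,
}

def abstract_plan(product, catalog, plan=None, seen=None):
    # Depth-first dependency resolution into an ordered plan; a concrete
    # planner would then bind each step to physical grid resources and
    # handle staging files in and out.
    plan = [] if plan is None else plan
    seen = set() if seen is None else seen
    if product in seen:
        return plan
    seen.add(product)
    entry = catalog[product]
    if entry is not None:
        transformation, inputs = entry
        for dep in inputs:
            abstract_plan(dep, catalog, plan, seen)
        plan.append((transformation, inputs, product))
    return plan

for step in abstract_plan("ntuple.root", catalog):
    print(step)   # simulate, then reconstruct, then analyze
```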
2006
Cited 5 times
Tevatron-for-LHC report: preparations for discoveries
This is the TeV4LHC report of the Physics Landscapes Working Group, focused on facilitating the start-up of physics explorations at the LHC by using the experience gained at the Tevatron. We present experimental and theoretical results that can be employed to probe various scenarios for physics beyond the Standard Model.
DOI: 10.48550/arxiv.hep-ph/0605143
2006
Cited 4 times
GARCON: Genetic Algorithm for Rectangular Cuts OptimizatioN. User's manual for version 2.0
This paper presents the GARCON program, illustrating its functionality with a simple HEP analysis example. The program automatically performs rectangular cut optimization and verification for stability in a multi-dimensional phase space. The program has been successfully used by a number of very different analyses presented in the CMS Physics Technical Design Report. The current version, GARCON 2.0, incorporates the feedback the authors have received. The User's Manual is included as part of the note.
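Rectangular-cut optimization with a genetic algorithm can be sketched in a few lines; the toy below evolves two lower cuts to maximize S/sqrt(S+B) on invented Gaussian samples, and is only a caricature of what GARCON does on real ntuples.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented signal/background samples in two discriminating variables;
# GARCON itself operates on user-supplied HEP analysis ntuples.
sig = rng.normal([2.0, 3.0], 0.8, size=(2000, 2))
bkg = rng.normal([0.0, 0.0], 1.5, size=(20000, 2))

def significance(cuts):
    # Figure of merit S/sqrt(S+B) for the rectangular cuts x > c0, y > c1.
    s = np.sum((sig > cuts).all(axis=1))
    b = np.sum((bkg > cuts).all(axis=1))
    return s / np.sqrt(s + b) if s + b > 0 else 0.0

def genetic_cut_search(pop_size=60, generations=40, sigma=0.3):
    # Keep the fittest quarter of the cut vectors, refill the population
    # with mutated copies, and iterate.
    pop = rng.uniform(-2.0, 4.0, size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([significance(p) for p in pop])
        elite = pop[np.argsort(fitness)[-pop_size // 4:]]          # selection
        kids = elite[rng.integers(len(elite), size=pop_size - len(elite))]
        kids = kids + rng.normal(0.0, sigma, kids.shape)           # mutation
        pop = np.vstack([elite, kids])
    best = max(pop, key=significance)
    return best, significance(best)

cuts, fom = genetic_cut_search()
print(f"best cuts: x > {cuts[0]:.2f}, y > {cuts[1]:.2f}; S/sqrt(S+B) = {fom:.1f}")
```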
DOI: 10.1117/12.2663402
2023
Comparison of reflective band (Vis, NIR, SWIR, eSWIR) performance in daytime reduced illumination conditions
Daytime low light conditions such as overcast, dawn, and dusk pose a challenge for object discrimination in the reflective bands, where the majority of illumination comes from reflected solar light. In reduced illumination conditions, sensor signal-to-noise ratio can suffer, inhibiting range performance for recognizing and identifying objects of interest. This performance reduction is more apparent in the longer wavelengths where there is less solar light. Range performance models show a strong dependence on cloud type, thickness, and time of day across all wavebands. Through an experimental and theoretical analysis of a passive sensitivity and resolution matched testbed, we compare Vis (0.4-0.7μm), NIR (0.7-1μm), SWIR (1-1.7μm), and eSWIR (2-2.5μm) to assess the limiting cases in which reduced illumination inhibits range performance.
DOI: 10.1117/12.2663672
2023
Cell tower detection in the VIS, SWIR, and LWIR bands
In a GPS-denied environment, distinct structures such as cell towers and transmission towers are useful as an aid to vision-based navigation. Cell towers are surveyed such that their locations are well known, and the imagery of these towers can be compared to imagery databases to assist in navigation. In this research, imagery of the cell towers was taken in the VIS, SWIR, and LWIR bands with both clear sky and portions of the ground in the background. The contrast of the cell towers in the two reflective bands was determined against the sky and the ground in terms of equivalent reflectivity. The contrast of the cell towers was also determined in the LWIR in terms of equivalent blackbody temperature. The analysis of contrast is provided, the results are discussed, and recommendations on band use are provided for use in 3D map image comparisons.
DOI: 10.1364/ao.495832
2023
Comparison of reflective band (Vis, NIR, SWIR, eSWIR) performance in daytime reduced illumination conditions
Daytime low-light conditions such as overcast, dawn, and dusk pose a challenge for object discrimination in the reflective bands, where the majority of illumination comes from reflected solar light. In reduced-illumination conditions, the sensor signal-to-noise ratio can suffer, inhibiting range performance for detecting, recognizing, and identifying objects of interest. This performance reduction is more apparent in the longer wavelengths where there is less solar light. Range performance models show a strong dependence on cloud type and thickness, as well as time of day across the reflective wavebands. Through an experimental and theoretical analysis of a passive sensitivity- and resolution-matched testbed, we compare Vis (0.4-0.7 µm), NIR (0.7-1 µm), SWIR (1-1.7 µm), and eSWIR (2-2.5 µm) to assess the limiting cases in which reduced illumination inhibits range performance. The time during dawn and dusk is brief yet does show significant range performance reduction for SWIR and eSWIR. Under heavy cloud cover, eSWIR suffers the most at range due to a low signal-to-noise ratio. In cases of severe reduction in illumination, we propose utilizing active illumination or the emissive component of eSWIR to improve the signal-to-noise ratio for various discrimination tasks.
DOI: 10.1021/ci9500748
1996
Cited 7 times
Periodic Systems of Molecular States from the Boson Group Dynamics of SO(3) × SU(2)_s
An overview of the principles of group-dynamic periodic systems is given. The process by which atomic and molecular multiplets are obtained is then described, and reference is made to the support provided by the data. Atomic and molecular periodic systems consist of these multiplets situated in coordinates with axes which are the chemical angular-momentum quantum number l (or L) and the principal quantum number n (or n summed over all atoms). The periodic system for atoms serves as a template for the systems for molecules, and only one additional quantum number is needed for the molecular systems. Equivalence classes of substitutable multiplets are then defined and their numbers given. The structures of the resulting molecular periodic systems are shown as plots of the number of substitutable multiplets located at given quantum-number coordinates. Wall charts are available from R.A.H. or G.V.Zh.
DOI: 10.5170/cern-2005-002.746
2004
Cited 4 times
Job Monitoring in an Interactive Grid Analysis Environment
The grid is emerging as a great computational resource, but its dynamic behavior makes the grid environment unpredictable. Systems and networks can fail, and the introduction of more users can result in resource starvation. Once a job has been submitted for execution on the grid, monitoring becomes essential for a user to see that the job is completed in an efficient way, and to detect any problems that occur while the job is running. In current environments, once a user submits a job he loses direct control over it, and the system behaves like a batch system: the user submits the job and later gets a result back. The only information a user can obtain about a job is whether it is scheduled, running, cancelled or finished. Today users are becoming increasingly interested in analysis grid environments in which they can check the progress of a job, obtain intermediate results, terminate the job based on its progress or intermediate results, steer the job to other nodes to achieve better performance, and check the resources consumed by the job. To fulfill these interactivity requirements, a mechanism is needed that can provide the user with real-time access to information about the different attributes of a job. In this paper we present the design of a Job Monitoring Service, a web service that will provide interactive remote job monitoring by allowing users to access different attributes of a job once it has been submitted to the interactive Grid Analysis Environment.
DOI: 10.1016/b978-155860933-4/50014-6
2004
Cited 4 times
Distributed Data Analysis
This chapter presents a detailed analysis of the technologies and effort required to complete a challenging data generation and analysis task for a high-energy physics experiment. The Compact Muon Solenoid (CMS) is a high-energy physics detector planned for the Large Hadron Collider (LHC) at the European Center for Nuclear Research (CERN) near Geneva, Switzerland. CMS records data from the highest-energy proton-proton collisions ("events"). Data from these collisions may shed light on many fundamental scientific issues, including a definitive search for the Higgs particle and the possible origin of mass in the universe, the existence of a new fundamental symmetry of nature called "supersymmetry," and even the possible discovery of new spatial dimensions. CMS also enables direct comparison of the output of simulation studies against actual data. Such comparisons provide improved detector calibrations, measurements of physical processes, and indications of possible scientific discoveries. The physics simulation software used by CMS is complicated and has evolved over years, in some cases decades, to embody a great deal of accumulated knowledge and problem-solving experience.
DOI: 10.1145/1188455.1188708
2006
Cited 3 times
Bandwidth challenge---High speed data gathering, distribution and analysis for physics discoveries at the Large Hadron Collider
The latest WAN infrastructure and Grid-based Web Services will be used to demonstrate movement and analysis of TeraByte-scale event datasets for particle physics. Optimized transfers of data over 10 Gbit/sec networks linking servers and disk systems will be shown using the latest processors, PCI-Express NICs, RAID arrays and firmware. All available lambdas arriving at the SC06 show floor will be saturated, in full duplex mode. The WAN performance will be monitored using the MonALISA distributed agent-based system. A suite of Grid-enabled analysis tools developed at Caltech, the University of Florida and Michigan will be used, as will EGEE, OSG, ATLAS and CMS data management software such as SRM, dCache, FTS, and PhEDEx. Prototypes of the latest version of parallel NFS will demonstrate excellent utilization of 10 Gbit/sec connections during execution of the analysis tasks. We intend to exceed the aggregate rate of 150 Gbit/sec we achieved in our SC2005 Bandwidth Challenge winning entry.
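For scale, the quoted figures are easy to sanity-check. A minimal helper, assuming only the 10 Gbit/sec link rate mentioned above and an invented sustained-efficiency factor:

# Back-of-the-envelope transfer-time check for the rates quoted above.
def transfer_hours(dataset_tb, link_gbps, efficiency=0.9):
    """Hours to move dataset_tb terabytes over a link_gbps Gbit/s link.
    efficiency is an assumed fraction of the nominal rate actually
    sustained end-to-end (protocol overhead, disk limits, etc.)."""
    bits = dataset_tb * 1e12 * 8          # dataset size in bits
    rate = link_gbps * 1e9 * efficiency   # sustained rate in bits/s
    return bits / rate / 3600

# One terabyte over a single well-tuned 10 Gbit/s path: about 15 minutes.
print(f"{transfer_hours(1, 10):.2f} h")   # ~0.25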
DOI: 10.1088/1748-0221/11/01/c01051
2016
Run 2 upgrades to the CMS Level-1 calorimeter trigger
The CMS Level-1 calorimeter trigger is being upgraded in two stages to maintain performance as pile-up and instantaneous luminosity increase during the LHC's second run. In the first stage, improved algorithms including event-by-event pile-up corrections are used. New algorithms for heavy ion running have also been developed. In the second stage, higher granularity inputs and a time-multiplexed approach allow for improved position and energy resolution. Data processing in both stages of the upgrade is performed with new, Xilinx Virtex-7 based AMC cards.
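The abstract does not spell out the pile-up correction; as a generic illustration only, an event-by-event correction can estimate the diffuse pile-up level from the trigger regions themselves and subtract it, clamping at zero. The median-based estimator below is a common generic choice, not necessarily the algorithm deployed in the upgraded trigger.

# Illustrative event-by-event pile-up subtraction over trigger regions.
# The median-based estimate is a generic choice, NOT necessarily the
# algorithm used in the upgraded Level-1 trigger.
from statistics import median

def pileup_corrected(region_et):
    """Subtract a per-event pile-up estimate from each region's E_T."""
    baseline = median(region_et)   # diffuse-energy estimate for this event
    return [max(et - baseline, 0.0) for et in region_et]

print(pileup_corrected([2.0, 1.5, 30.0, 1.0, 2.5]))  # hard deposit survives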
DOI: 10.48550/arxiv.physics/0306008
2003
Cited 3 times
Virtual Data in CMS Analysis
The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways:
- by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis;
- by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data;
- by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal to background ratio, varying the estimation of parameters, etc.;
- by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context.
We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several steps in the analysis process including the Monte Carlo generation of data, the simulation of detector response, the reconstruction of physics objects and their subsequent analysis, histogramming and visualization using the ROOT framework.
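As a rough illustration of the provenance idea (plain Python, not Chimera's actual Virtual Data Language), the sketch below records each analysis step as a derivation whose inputs are earlier products, so a result can be traced back through generation, simulation and reconstruction; all product and transformation names are invented.

# Minimal provenance-tracking sketch in the spirit of virtual data.
# This models derivations in plain Python; it is not Chimera VDL.
derivations = {}  # product name -> (transformation, list of input products)

def declare(product, transformation, inputs=()):
    """Register how a product is derived from earlier products."""
    derivations[product] = (transformation, list(inputs))

def lineage(product, depth=0):
    """Print the full derivation chain behind a product."""
    transformation, inputs = derivations[product]
    print("  " * depth + f"{product} <- {transformation}({', '.join(inputs)})")
    for parent in inputs:
        lineage(parent, depth + 1)

declare("generated.events", "mc_generation")
declare("simulated.hits", "detector_simulation", ["generated.events"])
declare("reco.objects", "reconstruction", ["simulated.hits"])
declare("mass.histogram", "root_analysis", ["reco.objects"])
lineage("mass.histogram")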
DOI: 10.1088/1742-6596/513/6/062029
2014
CMS Analysis School Model
To impart hands-on training in physics analysis, the CMS experiment initiated the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized by CMSDAS, has proven key for new and young physicists to jump-start their work and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
2007
SUSY Survey with Inclusive Muon and Same-Sign Dimuon Accompanied by Jets and MET with CMS
2005
The UltraLight Project: The Network as an Integrated and Managed Resource in Grid Systems for High Energy Physics and Data Intensive Science
We describe the NSF-funded UltraLight project. The project's goal is to meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused agenda. In particular, we argue that instead of treating the network traditionally, as a static, unchanging and unmanaged set of inter-computer links, we will use it as a dynamic, configurable, and closely monitored resource, managed end-to-end, to construct a next-generation global system able to meet the data processing, distribution, access and analysis needs of the high energy physics (HEP) community. While the initial UltraLight implementation and services architecture is being developed to serve HEP, we expect many of UltraLight's developments in the areas of networking, monitoring, management, and collaborative research to be applicable to many fields of data-intensive e-science. In this paper we give an overview of, and motivation for, the UltraLight project, and provide early results within the different working areas of the project.
DOI: 10.5170/cern-2005-002.1022
2004
Job Interactivity Using a Steering Service in an Interactive Grid Analysis Environment
Grid computing has been dominated by the execution of batch jobs. Interactive data analysis is a new domain in the area of grid job execution. The Grid-Enabled Analysis Environment (GAE) attempts to address this in HEP grids through the use of a Steering Service. This service will provide physicists with continuous feedback on their jobs and with the ability to control and steer the execution of their submitted jobs. It will enable them to move their jobs to different grid nodes when desired. The Steering Service will also act autonomously to make steering decisions on behalf of the user, attempting to optimize the execution of the job. This service will also ensure the optimal consumption of the Grid user's resource quota. The Steering Service will provide a web service interface defined by standard WSDL. In this paper we discuss how the Steering Service will facilitate interactive remote analysis of data generated in the Interactive Grid Analysis Environment.
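A caricature of the autonomous steering described above: poll each job's progress and react when it falls behind an acceptable rate or exhausts its quota. The threshold, method names and decision rule are invented for illustration; the paper defines the service's role, not this logic.

# Illustrative autonomous steering loop (invented decision rule; the
# paper specifies the Steering Service's role, not this logic).
MIN_EVENTS_PER_MIN = 100.0   # assumed acceptable processing rate

def steer(jobs, monitor, service):
    """Check each running job once and react to laggards.

    Assumed interfaces: monitor.progress(job) -> (events_done,
    minutes_elapsed); service.migrate, service.terminate and
    service.quota_exhausted are operations on the steering web service.
    """
    for job in jobs:
        events, minutes = monitor.progress(job)
        rate = events / minutes if minutes > 0 else 0.0
        if service.quota_exhausted(job):
            service.terminate(job)   # protect the user's resource quota
        elif rate < MIN_EVENTS_PER_MIN:
            service.migrate(job)     # move to a less loaded grid node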
2003
The CMS Integration Grid Testbed
The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain-specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two-month span in Fall of 2002, over 1 million official CMS GEANT-based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.
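The DAGMan/Condor-G front end mentioned above consumes a plain-text DAG of jobs. As a hedged illustration, the snippet below emits a minimal two-stage workflow; the submit-file names are hypothetical, while JOB and PARENT/CHILD are standard DAGMan directives.

# Emit a minimal DAGMan workflow: generate events, then return them.
# Submit-file names are hypothetical; JOB and PARENT/CHILD are
# standard DAGMan syntax.
dag = """\
JOB generate generate.sub
JOB transfer transfer.sub
PARENT generate CHILD transfer
"""

with open("cms_production.dag", "w") as f:
    f.write(dag)
# Submitted with: condor_submit_dag cms_production.dag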
2016
Boosted Top Jet Tagging at 100 TeV
2015
Run 2 Upgrades to the CMS Level-1 Calorimeter Trigger
DOI: 10.22323/1.134.0261
2012
Global SUSY Fits with the MasterCode Framework
We present the latest results of the MasterCode collaboration on global SUSY fits. Currently available experimental data are used to determine the preferred SUSY and Higgs boson mass scales. The data comprise a combination of high-energy SUSY searches, low-energy precision measurements and astrophysical data. We include all relevant LHC searches for SUSY, electroweak precision observables such as the W boson mass and the anomalous magnetic moment of the muon, B physics observables such as BR(b → sγ), as well as the cold dark matter density in the Universe. The preferred masses for SUSY particles as well as for the MSSM Higgs bosons are derived in the context of four GUT-based realizations of the MSSM. We find a preference for relatively light SUSY masses, which the direct searches at the LHC shift to slightly higher mass scales. The preferred mass values can directly be compared to the reach of the LHC and future e+e− colliders as well as to current and future direct detection searches for dark matter.
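Schematically, a frequentist fit of this kind minimizes a global chi-square built from the observables listed above,

$$\chi^2(\theta) = \sum_i \frac{\left(O_i^{\mathrm{exp}} - O_i^{\mathrm{th}}(\theta)\right)^2}{\sigma_i^2},$$

where θ denotes the model parameters (for example m_0, m_1/2, A_0 and tan β in the GUT-based realizations), O_i^exp are the measured observables, O_i^th their model predictions, and σ_i the combined uncertainties. The actual MasterCode treatment of search limits and correlations is more involved; this is only the generic form.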
2011
Global SUSY Fits with the MasterCode Framework
2012
The CMSSM and NUHM1 in Light of 7 TeV LHC, Bs → μ+μ− and XENON100 Data
2008
B, D and K decays
DOI: 10.1007/978-3-540-95942-7_3
2009
B, D and K decays
DOI: 10.1063/1.2735159
2007
Potential to Discover Inclusive Supersymmetry using the CMS Detector
Generic signatures of supersymmetry with R-parity conservation include energetic jets and missing transverse energy accompanied by leptons. The ability of CMS to discover supersymmetry with these signals is estimated for 1 fb−1 of collected data. The selection criteria are optimized and the corresponding systematic effects studied for a single low-mass benchmark point of the Constrained MSSM with m0 = 60 GeV/c2, m1/2 = 250 GeV/c2, tanβ = 10, A0 = 0 and μ > 0.
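The abstract does not state the figure of merit used in the cut optimization; a common, representative choice for this kind of expected-discovery optimization at fixed luminosity is

$$S = \frac{N_S}{\sqrt{N_S + N_B}},$$

where N_S and N_B are the expected signal and background yields after cuts for the assumed 1 fb−1. This is shown only as an illustrative convention, not necessarily the exact criterion used in the study.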
DOI: 10.1016/j.nuclphysbps.2007.07.011
2007
Reconstructing Jets and Missing Transverse Energy using the CMS Detector
In 2007, the Large Hadron Collider (LHC) will circulate and collide proton-proton beams for the first time. The Compact Muon Solenoid (CMS) is one of four experiments at the LHC and is entering the final phases of construction and initial phases of commissioning. This report discusses the expected performance of reconstructing jets and missing transverse energy using the CMS Detector. In addition, strategies for calibrating the energy scale using real data are presented.
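For reference, the missing transverse energy reconstructed from the calorimeters is conventionally the negative vector sum of the measured transverse energy deposits,

$$\vec{E}_T^{\,\mathrm{miss}} = -\sum_i E_{T,i}\,\hat{n}_{T,i},$$

where the sum runs over calorimeter towers (or reconstructed objects) and \hat{n}_{T,i} is the unit vector in the transverse plane pointing to deposit i. The energy-scale calibration strategies discussed in the report propagate directly into this sum through the individual deposits.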
2007
Search Strategy for the Standard Model Higgs Boson in the H → ZZ(*) → 4μ Decay Channel Using M(4μ)-Dependent Cuts
2006
The Motivation, Architecture and Demonstration of the UltraLight Network Testbed
In this paper we describe progress in the NSF-funded UltraLight project and a recent demonstration of UltraLight technologies at SuperComputing 2005 (SC|05). The goal of the UltraLight project is to help meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused approach. UltraLight adopts a new approach to networking: instead of treating it traditionally, as a static, unchanging and unmanaged set of inter-computer links, we are developing and using it as a dynamic, configurable, and closely monitored resource that is managed from end to end. Thus we are constructing a next-generation global system that is able to meet the data processing, distribution, access and analysis needs of the particle physics community. In this paper we present the motivation for, and an overview of, the UltraLight project. We then cover early results in the various working areas of the project. The remainder of the paper describes our experiences with the UltraLight network architecture, kernel setup, application tuning and configuration used during the bandwidth challenge event at SC|05. During this challenge, we achieved a record-breaking aggregate data rate in excess of 150 Gbps while moving physics datasets between many sites interconnected by the UltraLight backbone network. The exercise highlighted the benefits of UltraLight's research and development efforts that are enabling new and advanced methods of distributed scientific data analysis.
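The kernel tuning referred to above is dominated by one number: a TCP flow can only fill a long fat pipe if its socket buffers cover the bandwidth-delay product. A quick sizing helper; the 120 ms transatlantic round-trip time is an assumed illustrative value, not a figure from the paper.

# TCP buffer sizing for long fat networks: buffers must hold at least
# one bandwidth-delay product (BDP) of data in flight.
def bdp_bytes(rate_gbps, rtt_ms):
    """Bandwidth-delay product in bytes for a given rate and round trip."""
    return rate_gbps * 1e9 / 8 * (rtt_ms / 1e3)

# A 10 Gbit/s path at an assumed 120 ms transatlantic RTT needs roughly
# 150 MB of socket buffer to run at full rate.
print(f"{bdp_bytes(10, 120) / 1e6:.0f} MB")   # ~150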
DOI: 10.1109/broadnets.2006.4374312
2006
The Design and Demonstration of the Ultralight Testbed
In this paper we present the motivation, the design, and a recent demonstration of the UltraLight testbed at SC|05. The goal of the UltraLight testbed is to help meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused approach. UltraLight adopts a new approach to networking: instead of treating it traditionally, as a static, unchanging and unmanaged set of inter-computer links, we are developing and using it as a dynamic, configurable, and closely monitored resource that is managed from end to end. To achieve this goal we are constructing a next-generation global system that is able to meet the data processing, distribution, access and analysis needs of the particle physics community. In this paper we first present early results in the various working areas of the project. We then describe our experiences with the network architecture, kernel setup, application tuning and configuration used during the bandwidth challenge event at SC|05. During this challenge, we achieved a record-breaking aggregate data rate in excess of 150 Gbps while moving physics datasets between many Grid computing sites.
DOI: 10.1109/e-science.2006.136
2006
The Design and Implementation of the Transatlantic Mission-Oriented Production and Experimental Networks
In this paper we present the design and implementation of the mission-oriented USLHCNet for the HEP research community and the UltraLight network testbed. The design philosophy for these networks is to help meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused approach. Instead of treating the network as a static, unchanging and unmanaged set of inter-computer links, we are developing and using it as a dynamic, configurable, and closely monitored resource that is managed from end to end. We present our work in the various areas of the project, including infrastructure construction, protocol research and application development. Our goal is to construct a next-generation global system that is able to meet the data processing, distribution, access and analysis needs of the particle physics community.
2006
Relative contributions of t- and s-channels to the ZZ → 4μ process
2006
CMS Detector Sensitivity to the Standard Model Higgs Boson in the H → ZZ → 4μ Decay Channel
2006
Study of PDF and QCD scale uncertainties in pp → ZZ → 4μ events at the LHC
1964
AUTOMATIC QUENCH DETERMINATION IN LIQUID SCINTILLATION COUNTING BY EXTERNAL STANDARDIZATION
2006
CMS Detector Sensitivity to the Standard Model Higgs Boson in the H → ZZ(*) → 4 leptons Decay Channel
2005
A Tier2 Center at the University of Florida
2005
The Emerging Global Cyberinfrastructure: Data Intensive Science in the 21st Century
DOI: 10.1142/s0217751x05026662
2005
CMS Preparations for New Physics Searches Involving Lepton Final States
In 2007, the Large Hadron Collider (LHC) will circulate and collide proton-proton beams at an expected center-of-mass energy of 14 TeV. The Compact Muon Solenoid (CMS) is one of four experiments at the LHC and has been designed with particular attention to selecting and reconstructing muons with high redundancy. This paper briefly describes the CMS Muon System and provides an overview of CMS preparations for new physics searches involving lepton final states during the early phases of running at the LHC.
DOI: 10.5170/cern-2005-002.1161
2005
A Regional Analysis Center at the University of Florida
2003
Virtual Data in CMS Production
Initial applications of the GriPhyN Chimera Virtual Data System have been performed within the context of CMS Production of Monte Carlo Simulated Data. The GriPhyN Chimera system consists of four primary components: 1) a Virtual Data Language, which is used to describe virtual data products, 2) a Virtual Data Catalog, which is used to store virtual data entries, 3) an Abstract Planner, which resolves all dependencies of a particular virtual data product and forms a location and existence independent plan, 4) a Concrete Planner, which maps an abstract, logical plan onto concrete, physical grid resources accounting for staging in/out files and publishing results to a replica location service. A CMS Workflow Planner, MCRunJob, is used to generate virtual data products using the Virtual Data Language. Subsequently, a prototype workflow manager, known as WorkRunner, is used to schedule the instantiation of virtual data products across a grid.
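To make the planner pipeline concrete, here is a toy version of the abstract-planning step: given the derivation catalog, recursively resolve every dependency of a requested product into an ordered, location-independent plan. The data structures stand in for the Virtual Data Catalog and Abstract Planner and are invented for illustration, not their real interfaces.

# Toy abstract planner: resolve a virtual data product's dependencies
# into an ordered plan. Stands in for the Chimera Abstract Planner;
# catalog contents are invented.
catalog = {
    "reco.objects":     ("reconstruct", ["simulated.hits"]),
    "simulated.hits":   ("simulate",    ["generated.events"]),
    "generated.events": ("generate",    []),
}

def abstract_plan(product, plan=None, done=None):
    """Return derivation steps in dependency order for `product`."""
    plan = [] if plan is None else plan
    done = set() if done is None else done
    transformation, inputs = catalog[product]
    for parent in inputs:
        if parent not in done:
            abstract_plan(parent, plan, done)
    plan.append((transformation, product))
    done.add(product)
    return plan

print(abstract_plan("reco.objects"))
# [('generate', 'generated.events'), ('simulate', 'simulated.hits'),
#  ('reconstruct', 'reco.objects')]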
DOI: 10.1142/9789812811752_0010
2001
MEASUREMENTS OF THE W MASS AT LEP II
1972
STATUS OF NAL SAMM.
1976
The Fermilab SAMM Device: Hardware
1990
PERIODIC BEHAVIOUR OF DATA FOR MAIN-GROUP TRIATOMIC MOLECULES