
José M Hernández

Here are the papers by José M Hernández available to download and read on OA.mg.

DOI: 10.1051/0004-6361/202243283
2023
Cited 14 times
Gaia Data Release 3
Diffuse interstellar bands (DIBs) are common interstellar absorption features in spectroscopic observations, but their origins remain unclear. DIBs play an important role in the life cycle of the interstellar medium (ISM) and can also be used to trace Galactic structure. Here, we demonstrate the capacity of the Gaia Radial Velocity Spectrometer (RVS) in Gaia DR3 to reveal the spatial distribution of the unknown molecular species responsible for the most prominent DIB at 862 nm in the RVS passband, exploring the Galactic ISM within a few kiloparsecs of the Sun. The DIBs are measured within the GSP-Spec module using a Gaussian profile fit for cool stars and a Gaussian process for hot stars. In addition to the equivalent widths and their uncertainties, Gaia DR3 provides their characteristic central wavelength, width, and quality flags. We present an extensive sample of 476,117 individual DIB measurements obtained in a homogeneous way covering the entire sky. We compare spatial distributions of the DIB carrier with interstellar reddening and find evidence that DIB carriers are present in a local bubble around the Sun which contains nearly no dust. We characterised the DIB equivalent width with a local density of 0.19 ± 0.04 Å/kpc and a scale height of 98.60 (+11.10/−8.46) pc. The latter is smaller than the dust scale height, indicating that DIBs are more concentrated towards the Galactic plane. We determine the rest-frame wavelength with unprecedented precision (λ₀ = 8620.86 ± 0.019 Å in air) and reveal a remarkable correspondence between the DIB velocities and the CO gas velocities, suggesting that the 862 nm DIB carrier is related to macro-molecules.
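The quoted local density and scale height imply an exponential vertical profile for the DIB carrier, which can be sketched numerically. This is a minimal illustration assuming a simple exp(−|z|/h) law; the function name and the profile assumption are ours, not the paper's fitting code:

```python
import math

def dib_density(z_pc, rho0=0.19, h_pc=98.60):
    """DIB equivalent-width density (Angstrom/kpc) at height z_pc (in pc)
    above the Galactic plane, assuming an exponential vertical profile
    with the midplane density rho0 = 0.19 A/kpc and scale height
    h = 98.60 pc quoted in the abstract."""
    return rho0 * math.exp(-abs(z_pc) / h_pc)

# At the midplane the density equals rho0; one scale height up it has
# dropped by a factor e.
print(round(dib_density(0.0), 3))                 # 0.19
print(round(dib_density(98.60) * math.e, 3))      # 0.19 (recovered)
```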
DOI: 10.1108/ijchm-05-2014-0255
2015
Cited 69 times
How online search behavior is influenced by user-generated content on review websites and hotel interactive websites
Purpose – The purpose of this paper is to advance research on the consumer psychology of hospitality, since it investigates how the online search behavior of users (particularly, information search and choice) is influenced by the opinions of other people in a new context characterized by the generalized use of Web 2.0 applications. Design/methodology/approach – Empirical research was carried out in the hotel sector in the Iberian Peninsula, where two Web 2.0 applications are especially relevant for users: review Web sites and hotel interactive Web sites. A qualitative method (in-depth interviews with hotel managers) and a quantitative technique (personal surveys of a sample of 830 users) were used to conduct this research. Findings – The results indicate that the perceived influence on behavior of the user-generated content on these Web 2.0 applications is determined, in both cases, by the value of the information, the credibility of the sources and the degree of similarity between the user and the creators of content. Practical implications – Firms should have an active presence in review Web sites and hotel interactive Web sites, and use these platforms for market research and communication. Firms should engage users to post content, support their credibility and facilitate the evaluation of the content generators’ similarity. Originality/value – This paper is the first study in the hospitality literature that develops and empirically tests an integrative model explaining the perceived influence on behavior of user-generated content on Web 2.0 applications.
DOI: 10.1111/1467-6451.00144
2001
Cited 84 times
Informative Advertising and Optimal Targeting in a Monopoly
This paper analyzes how the transition from mass to specialized advertising can affect the market outcomes. To that end, we consider a particular technology of information transmission which allows a monopolist to decide the optimal targeting strategy. From this starting point, we show that the use of targeted advertising is likely to increase the market price and reduce the level of advertising, and that the degree of media specialization chosen by the monopolist tends to exceed the socially optimal. Furthermore, our model indicates that the social loss resulting from the greater monopoly power might exceed the gain due to the lower wasting of ads, in such a way that targeting could reduce consumer surplus and, what is more important, the level of social welfare.
DOI: 10.1016/s0300-8932(02)76781-x
2002
Cited 69 times
Registro Español de Hemodinámica y Cardiología Intervencionista. XI Informe Oficial de la Sección de Hemodinámica y Cardiología Intervencionista de la Sociedad Española de Cardiología (años 1990–2001)
The results of the Spanish Registry of the Working Group on Cardiac Catheterization and Interventional Cardiology of the Spanish Society of Cardiology (years 1990-2001) are presented. One hundred and three centers, comprising all the cardiac catheterization laboratories in Spain, contributed data; 97 centers performed mainly adult catheterization and 6 carried out only pediatric procedures. In 2001, 95,430 diagnostic catheterization procedures were performed, with 79,607 coronary angiograms, representing a total increase of 8.4% over 2000. The population-adjusted incidence was 1,947 coronary angiograms per 10⁶ inhabitants. Coronary interventions increased by 15.4% compared with 2000, with a total of 31,290 procedures and an incidence of coronary interventions of 761 per 10⁶ inhabitants. Coronary stents were the most frequently used devices, with 39,356 implanted in 2001, an increase of 33.4% over 2000. Stenting accounted for 88.2% of procedures. Direct stenting was done in 11,280 procedures (40.9%). IIb-IIIa glycoprotein inhibitors were given in 7,012 procedures (22.4%). Multivessel percutaneous coronary interventions were performed in 8,445 cases (27%) and interventions were performed ad hoc during the diagnostic study in 23,144 cases (74%). A total of 3,845 percutaneous coronary interventions were carried out in patients with acute myocardial infarction, an increase of 22.9% over 2000 and 12.3% of all interventional procedures. Among non-coronary interventions, atrial septal defect closure was performed most often (161 cases, a 60% increase over 2000). Pediatric interventions increased by 15.4% (from 817 to 943 cases). Lastly, we would like to underline the high rate of reporting by laboratories, which allowed the Registry to compile data that are highly representative of hemodynamic interventions in Spain.
DOI: 10.1140/epjc/s10052-006-0139-9
2006
Cited 42 times
A Measurement of the ψ′ to J/ψ production ratio in 920 GeV proton-nucleus interactions
Ratios of the ψ′ over the J/ψ production cross sections in the dilepton channel for C, Ti and W targets have been measured in 920 GeV proton-nucleus interactions with the HERA-B detector at the HERA storage ring. The ψ′ and J/ψ states were reconstructed in both the μ⁺μ⁻ and the e⁺e⁻ decay modes. The measurements covered the kinematic range −0.35 ≤ xF ≤ 0.1 with transverse momentum pT ≤ 4.5 GeV/c. The angular dependence of the ratio has been used to measure the difference of the ψ′ and J/ψ polarization. All results for the muon and electron decay channels are in good agreement: their ratio, averaged over all events, is Rψ′(μ)/Rψ′(e) = 1.00 ± 0.08 ± 0.04. This result constitutes a new, direct experimental constraint on the double ratio of branching fractions, (B′(μ)B(e))/(B(μ)B′(e)), of ψ′ and J/ψ in the two channels. The ψ′ to J/ψ production ratio is almost constant in the covered xF range and shows a slow increase with pT.
DOI: 10.1007/s10336-011-0646-9
2011
Cited 29 times
Endangered subspecies of the Reed Bunting (Emberiza schoeniclus witherbyi and E. s. lusitanica) in Iberian Peninsula have different genetic structures
In the Iberian Peninsula, populations of two subspecies of the Reed Bunting Emberiza schoeniclus have become increasingly fragmented during the last decades when suitable habitats have been lost and/or the populations have gone extinct. Presently, both subspecies are endangered. We estimated the amount of genetic variation and population structure in order to define conservation units and management practices for these populations. We found that the subspecies lusitanica has clearly reduced genetic variation in nuclear and mitochondrial markers, has a drastically small effective population size and no genetic differentiation between populations. In contrast, the subspecies witherbyi is significantly structured, but the populations still hold large amounts of variation even though the effective population sizes are smaller than in the non-endangered subspecies schoeniclus. We suggest several management units for the Iberian populations. One unit includes subspecies lusitanica as a whole; the other three units are based on genetically differentiated populations of witherbyi. The most important genetic conservation measure in the case of lusitanica is to preserve the remaining habitats in order to at least maintain the present levels of gene flow. In the case of the three management units within witherbyi, the most urgent conservation measure is to improve the habitat quality to increase the population sizes.
DOI: 10.1016/s0300-8932(01)76526-8
2001
Cited 49 times
Registro de Actividad de la Sección de Hemodinámica y Cardiología Intervencionista de la Sociedad Española de Cardiología del año 2000
The results of the Registry of the Working Group on Hemodynamics and Interventional Cardiology of the Spanish Society of Cardiology for 2000 are presented. Data came from 100 centers representing all the cardiac catheterization laboratories in Spain; 93 centers performed mainly adult catheterization and 7 carried out only pediatric procedures. In 2000, 88,339 diagnostic catheterization procedures were performed (73,382 coronary angiograms), representing a total increase of 12.5% over 1999. The population-adjusted rate was 1,825 coronary angiograms per 10⁶ inhabitants. With a total of 26,993 procedures and a rate of coronary interventions per 10⁶ inhabitants of 671, coronary intervention increased by 17% over figures for 1999. Coronary stents were the devices used most often, with 29,504 implanted in 2000; stenting accounted for 77.2% of procedures, a 30.5% increase over 1999. The increase in direct stenting without predilatation was noteworthy. Direct stenting was done in 8,778 procedures (38.9% of the total), an increase of 131% compared to 1999. IIb-IIIa glycoprotein inhibitors were used in 4,700 coronary interventions (17%). Angioplasty, performed in 3,128 cases of acute myocardial infarction, accounted for 11.6% of coronary interventions, 33.5% more than in 1999. A decrease of 6.5% in valvuloplasties occurred, attributable to the performance of fewer mitral valve repairs (493 vs 525 in 2000 and 1999, respectively). Pediatric procedures increased by 20.5%, from 678 to 817 cases. In conclusion, we would like to underline the high rate of reporting by laboratories, through which the Registry has been able to compile data that are highly representative of the hemodynamic activity in Spain.
DOI: 10.1016/s0140-9883(02)00013-0
2002
Cited 45 times
Global warming and the energy efficiency of Spanish industry
This paper uses a stochastic frontier production function model to analyze the energy efficiency of Spanish industry. We used minimum cost input demand equations as the reference in order to calculate the demand for electricity, gas and other fuels. On this basis, we found that there is no inherent conflict between the objectives of achieving productive efficiency and reducing energy consumption. Indeed, it is possible to reduce the industrial emissions of CO2 by up to 29.4% by means of a bottom-up energy efficiency policy. However, if the government wants firms to reduce their emissions even further, then it would be necessary to implement some form of energy regulatory policy. In this respect, we estimate the cost of reducing CO2 emissions by 20%.
2006
Cited 36 times
PhEDEx High-Throughput Data Transfer Management System
DOI: 10.1109/tns.2011.2146276
2011
Cited 22 times
CMS Workflow Execution Using Intelligent Job Scheduling and Data Access Strategies
Complex scientific workflows can process large amounts of data using thousands of tasks. The turnaround times of these workflows are often affected by various latencies such as the resource discovery, scheduling and data access latencies for the individual workflow processes or actors. Minimizing these latencies will improve the overall execution time of a workflow and thus lead to a more efficient and robust processing environment. In this paper, we propose a pilot job based infrastructure that has intelligent data reuse and job execution strategies to minimize the scheduling, queuing, execution and data access latencies. The results have shown that significant improvements in the overall turnaround time of a workflow can be achieved with this approach. The proposed approach has been evaluated, first using the CMS Tier0 data processing workflow, and then simulating the workflows to evaluate its effectiveness in a controlled environment.
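The data-reuse idea behind this pilot-job approach can be sketched as a toy scheduler that prefers sending a job to a pilot already holding its input data, falling back to staging only when no cache hit exists. Names and data structures below are illustrative, not the actual CMS Tier0 implementation:

```python
# Toy data-affinity scheduler: jobs are (job_id, dataset) pairs; pilots
# is a dict mapping pilot_id -> set of datasets cached at that pilot.
def schedule(jobs, pilots):
    """Assign each job to a pilot, reusing cached data when possible.
    Returns a list of (job_id, pilot_id) assignments."""
    assignments = []
    for job_id, dataset in jobs:
        # First choice: a pilot with the dataset already cached (no transfer).
        hit = next((p for p, cache in pilots.items() if dataset in cache), None)
        if hit is None:
            # Fall back: pick the pilot caching the fewest datasets and
            # stage the data there, so it can be reused by later jobs.
            hit = min(pilots, key=lambda p: len(pilots[p]))
            pilots[hit].add(dataset)
        assignments.append((job_id, hit))
    return assignments

jobs = [("j1", "run42"), ("j2", "run42"), ("j3", "run77")]
pilots = {"pA": {"run42"}, "pB": set()}
print(schedule(jobs, pilots))  # j1, j2 reuse pA's cache; j3 stages on pB
```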
DOI: 10.1088/1742-6596/2438/1/012039
2023
Extending the distributed computing infrastructure of the CMS experiment with HPC resources
Abstract Particle accelerators are an important tool to study the fundamental properties of elementary particles. Currently the highest energy accelerator is the LHC at CERN, in Geneva, Switzerland. Each of its four major detectors, such as the CMS detector, produces dozens of Petabytes of data per year to be analyzed by a large international collaboration. The processing is carried out on the Worldwide LHC Computing Grid, which spans more than 170 computing centers around the world and is used by a number of particle physics experiments. Recently the LHC experiments were encouraged to make increasing use of HPC resources. While Grid resources are homogeneous with respect to the Grid middleware used, HPC installations can be very different in their setup. In order to integrate HPC resources into the highly automated processing setups of the CMS experiment, a number of challenges need to be addressed. For processing, access to primary data and metadata as well as access to the software is required. At Grid sites all this is achieved via a number of services that are provided by each center. However, at HPC sites many of these capabilities cannot be easily provided and have to be set up in user space or by other means. HPC centers also often restrict network access to remote services, which is a further severe limitation. The paper discusses a number of solutions and recent experiences by the CMS experiment to include HPC resources in processing campaigns.
DOI: 10.3390/electronics12163395
2023
Smart Cities and Citizen Adoption: Exploring Tourist Digital Maturity for Personalizing Recommendations
With the emergence in cities of new technologies such as mobile applications, geographic information systems, the internet of things (IoT), Big Data, and artificial intelligence (AI), new approaches to citizen management are being developed. The primary goal is to adapt citizen services to this evolving technological environment, thereby enhancing the overall urban experience. These new services can enable city governments and businesses to offer their citizens a truly immersive experience that facilitates their day-to-day lives and ultimately improves their standard of living. In this arena, it is important to emphasize that all investments in infrastructure and technological developments in Smart Cities will be wasted if the citizens for whom they have been created eventually do not use them for whatever reason. To avoid these kinds of problems, the citizens’ level of adaptation to the technologies should be evaluated. However, although much has been studied about new technological developments, studies validating the actual impact and user acceptance of these technological models are much more limited. This work endeavors to address this deficiency by presenting a new model of personalized recommendations based on the technology acceptance model (TAM). To achieve this goal, this research introduces an assessment system for tourists’ digital maturity level (DMT) that combines a fuzzy 2-tuple linguistic model and the analytic hierarchy process (AHP). This approach aims to prioritize and personalize the connection and communication between tourists and Smart Cities based on the digital maturity level of the tourist. The results have shown a significant correlation between technology usage and the potential for personalized experiences in the context of tourism and Smart Cities.
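The AHP step used to weight criteria when scoring digital maturity can be sketched as follows. The criteria names and the pairwise comparison matrix are invented for illustration (the paper's actual criteria and judgments differ), and the column-normalization average is only the standard approximation of the principal eigenvector:

```python
def ahp_weights(matrix):
    """Approximate AHP priority vector from a pairwise comparison matrix:
    normalize each column to sum to 1, then average across each row."""
    n = len(matrix)
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]
    return [sum(matrix[r][c] / col_sums[c] for c in range(n)) / n
            for r in range(n)]

# Hypothetical judgments: app usage vs IoT familiarity vs social-media use,
# on Saaty's 1-9 scale (matrix[i][j] = importance of i relative to j).
pairwise = [[1,     3,     5],
            [1 / 3, 1,     3],
            [1 / 5, 1 / 3, 1]]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])  # weights sum to 1; first criterion dominates
```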
DOI: 10.1007/978-981-99-7210-4_13
2024
The Growing Scientific Interest in Artificial Intelligence for Addressing Climate Change: A Bibliometric Analysis
Climate change is a reality that can be felt. There are more and more symptoms: droughts, floods, global temperature change… This is causing public opinion to react and worry. The scientific community is no stranger to this feeling and is looking to science and technology, specifically artificial intelligence, for the means and mechanisms to help reduce this impact. This study demonstrates that the scientific community’s interest in artificial intelligence and climate change is a constant and growing reality. To achieve this objective, a bibliometric study is used with the following methodology: first, scientific papers related to artificial intelligence and climate change are obtained from the Scopus database, then they are processed through VOSviewer and analyzed by the dimensions of time, topics, and countries, and finally a network map is visualized where it can be seen how climate change is surrounded by areas related to artificial intelligence.
DOI: 10.1007/978-3-031-54235-0_25
2024
Artificial Intelligence Applied to Human Resources Management: A Bibliometric Analysis
Artificial intelligence has positively influenced several fields, including human resources. It has simplified the automated search for candidates in social networks, making it possible to filter a large amount of information to find the ideal candidate. It has also made it possible to predict drop-out rates, automate selection processes and allow applicants to participate remotely in training, optimizing human resources management and increasing business productivity. In particular, scientific production in this area has grown remarkably since 2019, through the COVID-19 pandemic and subsequent years, with India being the most involved country and Computer Science and Business, Management and Accounting being the main research areas of interest. This bibliometric study, which covers the period 2013–2022 and presents a bibliometric map produced with VOSviewer, reveals four clusters with emerging areas and disciplines that are emerging as research trends: “ML for resource management”, “AI for recruitment management”, “DSS for information management”, and “AI for training”.
DOI: 10.1051/epjconf/202429507027
2024
Integration of the Barcelona Supercomputing Center for CMS computing: Towards large scale production
The CMS experiment is working to integrate an increasing number of High Performance Computing (HPC) resources into its distributed computing infrastructure. The case of the Barcelona Supercomputing Center (BSC) is particularly challenging as severe network restrictions prevent the use of CMS standard computing solutions. The CIEMAT CMS group has performed significant work in order to overcome these constraints and make BSC resources available to CMS. The developments include adapting the workload management tools, replicating the CMS software repository to BSC storage, providing an alternative access to detector conditions data, and setting up a service to transfer produced output data to a nearby storage facility. In this work, we discuss the current status of this integration activity and present recent developments, such as a front-end service to improve slot usage efficiency and an enhanced transfer service that supports the staging of input data for workflows at BSC. Moreover, significant efforts have been devoted to improving the scalability of the deployed solution, automating its operation, and simplifying the matchmaking of CMS workflows that are suitable for execution at BSC.
DOI: 10.1051/epjconf/202429501006
2024
A case study of content delivery networks for the CMS experiment
In 2029 the LHC will start the high-luminosity LHC program, with a boost in the integrated luminosity resulting in an unprecedented amount of experimental and simulated data samples to be transferred, processed and stored in disk and tape systems across the Worldwide LHC Computing Grid. Content delivery network solutions are being explored with the purposes of improving the performance of the compute tasks reading input data via the wide area network, and also to provide a mechanism for cost-effective deployment of lightweight storage systems supporting traditional or opportunistic compute resources. In this contribution we study the benefits of applying cache solutions for the CMS experiment, in particular the configuration and deployment of XCache serving data to two Spanish WLCG sites supporting CMS: the Tier-1 site at PIC and the Tier-2 site at CIEMAT. The deployment and configuration of the system and the developed monitoring tools will be shown, as well as data popularity studies in relation to the optimization of the cache configuration, the effects on CPU efficiency improvements for analysis tasks, and the cost benefits and impact of including this solution in the region.
DOI: 10.1051/epjconf/202429507045
2024
The Spanish CMS Analysis Facility at CIEMAT
The increasingly larger data volumes that the LHC experiments will accumulate in the coming years, especially in the High-Luminosity LHC era, call for a paradigm shift in the way experimental datasets are accessed and analyzed. The current model, based on data reduction on the Grid infrastructure, followed by interactive data analysis of manageable size samples on the physicists’ individual computers, will be superseded by the adoption of Analysis Facilities. This rapidly evolving concept is converging to include dedicated hardware infrastructures and computing services optimized for the effective analysis of large HEP data samples. This paper describes the actual implementation of this new analysis facility model at the CIEMAT institute, in Spain, to support the local CMS experiment community. Our work details the deployment of dedicated highly performant hardware, the operation of data staging and caching services ensuring prompt and efficient access to CMS physics analysis datasets, and the integration and optimization of a custom analysis framework based on ROOT’s RDataFrame and CMS NanoAOD format. Finally, performance results obtained by benchmarking the deployed infrastructure and software against a CMS analysis workflow are summarized.
DOI: 10.1016/s0300-8932(03)77021-3
2003
Cited 31 times
Registro Español de Hemodinámica y Cardiología Intervencionista. XII Informe Oficial de la Sección de Hemodinámica y Cardiología Intervencionista de la Sociedad Española de Cardiología (1990-2002)
The results of the Registry of the Working Group on Cardiac Catheterization and Interventional Cardiology of the Spanish Society of Cardiology for 2002 are presented. Data were obtained from 101 centers representing all cardiac catheterization laboratories in Spain; 95 centers performed mainly adult catheterization and 6 carried out only pediatric procedures. In 2002, 97,609 diagnostic catheterization procedures were performed, including 83,667 coronary angiograms, representing a total increase of 5.1% in comparison to 2001. The population-adjusted rate was 2,053 coronary angiograms per 10⁶ inhabitants. Coronary interventions increased by 11% in comparison to 2001, with a total of 34,723 procedures and a rate of coronary interventions of 850 per 10⁶ inhabitants. Coronary stents were the devices used most frequently, with 47,249 implanted in 2002, for a total increase of 20% in comparison to 2001. Stenting accounted for 91.7% of all procedures. Direct stenting was done in 13,768 procedures (43.2%). IIb-IIIa glycoprotein inhibitors were used in 9,966 procedures (28.7%). Multivessel percutaneous coronary interventions were performed in 9,830 patients (28%), and ad hoc interventions were done in the course of diagnostic coronary angiography in 26,341 patients (76%). A total of 4,766 percutaneous coronary interventions were done in patients with acute myocardial infarction, representing an increase of 23.9% in comparison to 2001, and accounting for 13.7% of all interventional procedures. Of the noncoronary interventions recorded, we note the decrease in percutaneous mitral valvuloplasties (21.2%) and atrial septal defect closures (11.1%), and the slight increase in pediatric interventions (3.7%).
In conclusion, we emphasize the high rate of reporting by laboratories, which allows the Registry to compile data that are highly representative of the activity at cardiac catheterization laboratories in Spain.
DOI: 10.1016/0370-2693(85)90350-8
1985
Cited 27 times
D correlations in 360 GeV/c π−p interactions
Charm-charm correlation properties are studied in detail for the first time using a sample of DD̄ pairs produced in 360 GeV/c π−p interactions. The data are compared with various models of charm production.
DOI: 10.1140/epjc/s10052-007-0237-3
2007
Cited 26 times
K*0 and φ meson production in proton–nucleus interactions at $\sqrt{s}=41.6\text{GeV}$
The inclusive production cross sections of the strange vector mesons K*0, K̄*0, and φ have been measured in interactions of 920 GeV protons with C, Ti, and W targets with the HERA-B detector at the HERA storage ring. Differential cross sections as a function of rapidity and transverse momentum have been measured in the central rapidity region and for transverse momenta up to pT = 3.5 GeV/c. The atomic number dependence is parametrised as σ(pA) = σ(pN)·A^α, where σ(pN) is the proton-nucleon cross section. Within the phase space accessible, α(K*0) = 0.86 ± 0.03, α(K̄*0) = 0.87 ± 0.03, and α(φ) = 0.96 ± 0.02. The total proton-nucleon cross sections, determined by extrapolating the differential measurements to full phase space, are σ(pN→K*0) = 5.06 ± 0.54 mb, σ(pN→K̄*0) = 4.02 ± 0.45 mb, and σ(pN→φ) = 1.17 ± 0.11 mb. The Cronin effect is observed for the first time for vector mesons containing strange quarks; compared to the measurements of Cronin et al. for K± mesons, the measured values of α for φ mesons coincide with those of K− mesons for all transverse momenta, while the enhancement for K*0/K̄*0 mesons is smaller.
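The parametrisation σ(pA) = σ(pN)·A^α quoted above can be inverted from measurements on two targets. In the sketch below the per-target cross sections are synthesized from the quoted σ(pN→φ) = 1.17 mb and α = 0.96 (they are not measured values from the paper); the usual mass numbers A(C) = 12 and A(W) = 184 are assumed:

```python
import math

def alpha_from_two_targets(sigma1, A1, sigma2, A2):
    """Solve sigma(pA) = sigma(pN) * A**alpha for alpha, given the
    cross sections measured on two targets with mass numbers A1, A2."""
    return math.log(sigma2 / sigma1) / math.log(A2 / A1)

sigma_pN, alpha = 1.17, 0.96          # phi meson values from the abstract
sigma_C = sigma_pN * 12 ** alpha      # synthetic pC cross section (mb)
sigma_W = sigma_pN * 184 ** alpha     # synthetic pW cross section (mb)
print(round(alpha_from_two_targets(sigma_C, 12, sigma_W, 184), 2))  # 0.96
```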
DOI: 10.1140/epjc/s10052-007-0427-z
2007
Cited 25 times
Measurement of D0, D+, Ds + and D*+ production in fixed target 920 GeV proton–nucleus collisions
The inclusive production cross sections of the charmed mesons D0, D+, Ds+ and D*+ have been measured in interactions of 920 GeV protons on C, Ti, and W targets with the HERA-B detector at the HERA storage ring. Differential cross sections as a function of transverse momentum and Feynman’s x variable are given for the central rapidity region and for transverse momenta up to pT = 3.5 GeV/c. The atomic mass number dependence and the leading-to-non-leading particle production asymmetries are presented as well.
DOI: 10.1103/physrevd.79.012001
2009
Cited 22 times
Production of the charmonium states χc1 and χc2 in proton nucleus interactions at …
… xF(J/ψ) < 0.15 is presented. Both μ⁺μ⁻ and e⁺e⁻ J/ψ decay channels are observed, with an overall statistics of about 15,000 χc events, which is by far the largest available sample in pA collisions. The result is Rχc = 0.188 ± 0.013 (stat) +0.024/−0.022 (sys), averaged over the different materials, when no J/ψ and χc polarisations are considered. The χc1 to χc2 production ratio R12 = Rχc1/Rχc2 is measured to be 1.02 ± 0.40, leading to a cross-section ratio σ(χc1)/σ(χc2) = 0.57 ± 0.23. The dependence of Rχc on the Feynman-x of the J/ψ, xF(J/ψ), and on its transverse momentum, pT(J/ψ), is studied, as well as its dependence on the atomic number, A, of the target. For the first time, an extensive study of possible biases on Rχc and R12 due to the dependence of acceptance on the polarization states of J/ψ and χc is performed. By varying the polarisation parameter, λobs, of all produced J/ψ's by two sigma around the value measured by HERA-B, and considering the maximum variation due to the possible χc1 and χc2 polarisations, it is shown that Rχc could change by a factor between 1.02 and 1.21, and R12 by a factor between 0.89 and 1.16.
DOI: 10.1016/s0370-2693(03)00407-6
2003
Cited 28 times
J/ψ production via χc decays in 920 GeV pA interactions
Using data collected by the HERA-B experiment, we have measured the fraction of J/ψ's produced via radiative χc decays in interactions of 920 GeV protons with carbon and titanium targets. We obtained Rχc = 0.32 ± 0.06 (stat) ± 0.04 (sys) for the fraction of J/ψ from χc decays, averaged over proton–carbon and proton–titanium collisions. This result is in agreement with previous measurements and is compared with theoretical predictions.
DOI: 10.1157/13054037
2003
Cited 27 times
Registro Español de Hemodinámica y Cardiología Intervencionista. XII Informe Oficial de la Sección de Hemodinámica y Cardiología Intervencionista de la Sociedad Española de Cardiología (1990-2002)
The results of the Registry of the Working Group on Cardiac Catheterization and Interventional Cardiology of the Spanish Society of Cardiology for 2002 are presented. Data were obtained from 101 centers representing all cardiac catheterization laboratories in Spain; 95 centers performed mainly adult catheterization and 6 carried out only pediatric procedures. In 2002, 97,609 diagnostic catheterization procedures were performed, including 83,667 coronary angiograms, representing a total increase of 5.1% in comparison to 2001. The population-adjusted rate was 2,053 coronary angiograms per million inhabitants. Coronary interventions increased by 11% in comparison to 2001, with a total of 34,723 procedures and a rate of 850 coronary interventions per million inhabitants. Coronary stents were the devices used most frequently, with 47,249 implanted in 2002, for a total increase of 20% in comparison to 2001. Stenting accounted for 91.7% of all procedures. Direct stenting, without predilation, was done in 13,768 procedures (43.2%). Glycoprotein IIb/IIIa inhibitors were used in 9,966 procedures (28.7%). Multivessel percutaneous coronary interventions were performed in 9,830 patients (28%), and ad hoc interventions were done in the course of diagnostic coronary angiography in 26,341 patients (76%). A total of 4,766 percutaneous coronary interventions were done in patients with acute myocardial infarction, representing an increase of 23.9% in comparison to 2001 and accounting for 13.7% of all interventional procedures. Of the noncoronary interventions recorded, we note the decrease in percutaneous mitral valvuloplasties (21.2%) and in atrial septal defect closures in adult patients (11.1%), and the slight increase in pediatric interventions (3.7%). In conclusion, we emphasize the high rate of reporting by laboratories, which allows the Registry to compile data that are highly representative of the activity at cardiac catheterization laboratories in Spain.
DOI: 10.1111/j.1530-9134.2006.00123.x
2006
Cited 24 times
<scp>Customer Directed Advertising and Product Quality</scp>
This paper studies the relationship between three key elements of the marketing mix, namely, price, product, and promotion, in a model where a seller employs informative advertising to launch a new product. We propose a fairly general advertising technology for the study of three promotional strategies—mass, imperfectly targeted, and customer directed advertising (CDA). We find that both the private and the social incentives to use distinct advertising strategies are aligned, and that sales are likely to be promoted through CDA. Compared to mass advertising, with CDA the social planner reduces quantity and downgrades quality whereas the seller sometimes upgrades it. Our model of targeting with endogenous product quality provides some new insights into the way the transition from mass to specialized advertising can affect market outcomes. Quality distortions imply that (i) even if CDA increases the market price, the degree of market power need not increase and (ii) CDA may yield a welfare loss even if it leads to a lower market price.
DOI: 10.1088/1741-4326/aa7691
2017
Cited 14 times
3D effects on transport and plasma control in the TJ-II stellarator
The effects of 3D geometry are explored in TJ-II from two relevant points of view: neoclassical transport and the modification of the stability and dispersion relation of waves. Particle fuelling and impurity transport are studied in light of the 3D transport properties, paying attention both to neoclassical transport and to other possible mechanisms. The effects of the 3D magnetic topology on stability, confinement and Alfvén eigenmode properties are also explored, showing the possibility of controlling Alfvén modes by modifying the configuration; the onset of modes similar to geodesic acoustic modes, driven by fast electrons or fast ions; and the weak effect of magnetic well on confinement. Finally, we show innovative power-exhaust scenarios using liquid metals.
DOI: 10.1088/0741-3335/58/8/084004
2016
Cited 13 times
Particle transport after pellet injection in the TJ-II stellarator
We study radial particle transport in stellarator plasmas using cryogenic pellet injection. By means of perturbative experiments, we estimate the experimental particle flux and compare it with neoclassical simulations. Experimental evidence is obtained that core depletion in helical devices can be slowed down even by pellets that do not reach the core region. This phenomenon is well captured by neoclassical predictions with DKES and FORTEC-3D.
DOI: 10.1016/j.physletb.2006.03.064
2006
Cited 20 times
Measurement of the <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" altimg="si1.gif" overflow="scroll"><mml:mi>J</mml:mi><mml:mo stretchy="false">/</mml:mo><mml:mi>ψ</mml:mi></mml:math> production cross section in <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" altimg="si2.gif" overflow="scroll"><mml:mn>920</mml:mn><mml:mtext> GeV</mml:mtext><mml:mo stretchy="false">/</mml:mo><mml:mi>c</mml:mi></mml:math> fixed-target proton–nucleus interactions
The mid-rapidity (dσpN/dy at y = 0) and total (σpN) production cross sections of J/ψ mesons are measured in proton–nucleus interactions. Data collected by the HERA-B experiment in interactions of 920 GeV/c protons with carbon, titanium and tungsten targets are used for this analysis. The J/ψ mesons are reconstructed by their decay into lepton pairs. The total production cross section obtained is σpN(J/ψ) = 663 ± 74 ± 46 nb/nucleon. In addition, our result is compared with previous measurements.
DOI: 10.1016/j.physletb.2006.05.040
2006
Cited 20 times
Polarization of Λ and <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" altimg="si1.gif" overflow="scroll"><mml:mover accent="true"><mml:mi>Λ</mml:mi><mml:mo>¯</mml:mo></mml:mover></mml:math> in 920 GeV fixed-target proton–nucleus collisions
A measurement of the polarization of Λ and Λ̄ baryons produced in pC and pW collisions at √s = 41.6 GeV has been performed with the HERA-B spectrometer. The measurements cover the kinematic range 0.6 GeV/c < p⊥ < 1.2 GeV/c in transverse momentum and −0.15 < xF < 0.01 in Feynman-x. The polarization results from the two different targets agree within the statistical error. In the combined data set, the largest deviation from zero, +0.054 ± 0.029, is measured for xF ≲ −0.07. Zero polarization is expected at xF = 0 in the absence of nuclear effects. The polarization results for the Λ agree with a parametrization of previous measurements, which were performed at positive xF values, where the Λ polarization is negative. Results of the Λ̄ polarization measurements are consistent with zero.
2007
Cited 17 times
Strategic Targeted Advertising and Market Fragmentation
This paper proves that oligopolistic price competition with both targeted advertising and targeted prices can lead to a permanent fragmentation of the market into local monopolies. However, compared to mass advertising, targeting increases social welfare and turns out to be more beneficial for consumers than for firms.
DOI: 10.1016/0370-2693(86)91481-4
1986
Cited 19 times
Measurement of D-meson branching ratios
Charm data from 360 GeV/c π−p interactions are used to give results on D-meson branching ratios. The analysis is based on 114 charm events containing 183 observed charm-particle decays. We present topological branching ratios and decay multiplicities, as well as the following inclusive branching ratios of D-mesons: B(D± → K∓ + anything) = 0.16 −0.07/+0.08, B(D0 → K± + anything) = 0.44 −0.10/+0.11, B(D± → e± + 2,4 charged hadrons) = 0.07 −0.05/+0.08, B(D0 → e± + anything) = 0.17 −0.06/+0.08.
DOI: 10.1007/s10723-010-9152-1
2010
Cited 12 times
Distributed Analysis in CMS
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
2016
Cited 9 times
El Gaucho Martin Fierro
2023
Neutropenic Enterocolitis: Case report and literature review.
Typhlitis, also known as neutropenic enterocolitis, affects the cecum and distal ileum. It was frequently encountered in pediatric patients undergoing treatment for leukemia; nonetheless, it can affect adult patients regardless of the cause of the immunosuppression. We report the case of a 20-year-old patient receiving chemotherapy for osteosarcoma, who presented with a 6-day history of nausea and vomiting, subjective fever, diarrhea, and diffuse abdominal pain. Physical examination was relevant for hemodynamic instability and a distended, tender abdomen, predominantly in the right iliac fossa. The laboratory workup showed severe neutropenia, thrombocytopenia, and electrolyte disturbances. Imaging studies showed edema of the ascending colon and cecum. Treatment was started with vasopressor support, correction of electrolyte alterations, red blood cell and platelet transfusion, G-CSF, hydration, and broad-spectrum antibiotic therapy, initially with an adequate clinical and laboratory response. After a few days, he presented lower gastrointestinal bleeding, which was managed conservatively. In conclusion, typhlitis must be suspected in every patient who develops neutropenia as a reaction to chemotherapy and presents gastrointestinal symptoms such as nausea, vomiting, diarrhea, and intense abdominal pain.
DOI: 10.1103/physrevd.73.052005
2006
Cited 14 times
Improved measurement of the<mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" display="inline"><mml:mi>b</mml:mi><mml:mover accent="true"><mml:mi>b</mml:mi><mml:mo>¯</mml:mo></mml:mover></mml:math>production cross section in 920 GeV fixed-target proton-nucleus collisions
A new measurement of the bb̄ production cross section in 920 GeV proton-nucleus collisions is presented by the HERA-B Collaboration. The bb̄ production is tagged via inclusive bottom-quark decays into J/ψ mesons, by exploiting the longitudinal separation of J/ψ → l+l− decay vertices from the primary proton-nucleus interaction point. Both e+e− and µ+µ− channels are reconstructed, for a total of 83 ± 12 inclusive b → J/ψ X events found. The combined analysis yields a bb̄ to prompt J/ψ cross-section ratio of Δσ(bb̄)/Δσ(J/ψ) = 0.032 ± 0.005 (stat) ± 0.004 (sys), measured in the xF acceptance (−0.35 < xF < 0.15) and extrapolated to σ(bb̄) = 14.9 ± 2.2 (stat) ± 2.4 (sys) nb/nucleon in the total phase space.
DOI: 10.1080/00379271.2006.10697454
2006
Cited 14 times
Morphological study of the Stridulatory Organ in two species of the <i>Crematogaster</i> genus: <i>Crematogaster scutellaris</i> (Olivier 1792) and <i>Crematogaster auberti</i> (Emery 1869) (Hymenoptera: Formicidae)
Abstract The stridulatory organ of Crematogaster scutellaris (Olivier 1792) workers is described, comparing the pars stridens found in six nests of this species with that of one nest of Crematogaster auberti Emery 1869 and with the bibliographical data available for other closely related species. Significant differences were found both between the two species and among some Crematogaster scutellaris nests. We propose several hypotheses that could explain these differences.
DOI: 10.1016/j.nuclphysbps.2008.02.001
2008
Cited 12 times
The CMS Monte Carlo Production System: Development and Design
The CMS production system has undergone a major architectural upgrade from its predecessor, with the goal of reducing the operational manpower needed and preparing for the large scale production required by the CMS physics plan. The new production system is a tiered architecture that facilitates robust and distributed production request processing and takes advantage of the multiple Grid and farm resources available to the CMS experiment.
DOI: 10.9774/gleaf.978-1-907643-26-2_16
2013
Cited 9 times
Towards a sustainable innovation model for small enterprises
DOI: 10.1007/s12083-015-0338-y
2015
Cited 8 times
Evaluation of alternatives for the broadcast operation in Kademlia under churn
DOI: 10.1002/ctpp.201400067
2015
Cited 8 times
A Spectrally Resolved Motional Stark Effect Diagnostic for the TJ‐II Stellarator
Abstract A spectrally resolved motional Stark effect (MSE) diagnostic has been implemented for the TJ‐II stellarator to quantify the magnitude and pitch of the components of the magnetic field created in this magnetic confinement device. The system includes a compact diagnostic neutral beam injector (DNBI) that provides a short pulse of accelerated neutral hydrogen atoms, with an e⁻¹ beam radius of 2.1 cm, to stimulate the Doppler-displaced Balmer Hα emission which is the basis for this diagnostic. Measurement of the wavelength separation of the Stark splitting of the Hα spectrum, as well as of the relative line intensities of its components, allows the local magnitude and direction of the internal magnetic field components to be measured at 10 positions across the plasma. The use of a DNBI extends such measurements to the electron cyclotron resonance (ECR) heated phases of plasmas, while also overcoming the need for the complicated inversion techniques that are required when such measurements are performed with a heating neutral beam injector (NBI). Moreover, the use of the shot-to-shot technique with reproducible discharges further simplifies fits to the MSE spectra, as nearby impurity spectral emission lines can be eliminated or significantly reduced. After outlining the principles of this technique and the diagnostic set-up, magnetic field measurements made during ECR or NBI heating phases are reported for a range of magnetic configurations and are compared with vacuum magnetic field estimates in order to evaluate the capabilities and limitations of this diagnostic for the TJ-II.
DOI: 10.1088/0029-5515/55/10/104014
2015
Cited 8 times
Transport, stability and plasma control studies in the TJ-II stellarator
The main TJ-II results since 2012 are presented in this overview. Impurity confinement is studied showing an isotopic dependence of impurity confinement time, asymmetries in parallel impurity flows in TJ-II ion-root plasmas and impurity density asymmetries within a flux surface. In addition, first observations of electrostatic potential variations within the same magnetic flux surface are presented. Evidence of the impact of three-dimensional magnetic structures on plasma confinement and L–H transitions is also presented. The leading role of the plasma turbulence is emphasized by the observed temporal ordering of the limit cycle oscillations at the L–I–H transition. Comparative studies between tokamaks and stellarators have provided direct experimental evidence for the importance of multi-scale physics to unravel the impact of the isotope effect on transport. Novel solutions for plasma facing components based on the recently installed Li-liquid limiters (LLLs) have been developed on TJ-II, showing the self-screening effect of evaporating liquid lithium, protecting plasma-facing components against heat loads, and tritium inventory control. Regarding plasma stability, magnetic well scan experiments show that traditional stability criteria, on which the optimization of stellarator configurations is based, may miss some stabilization mechanisms. Further effects of ECRH on Alfvénic instabilities are investigated, showing that moderate off-axis ECH power deposition modifies the continuous nature of the Alfvén eigenmodes, and frequency chirping sets in. This result shows that ECH can be a tool for AE control that might be ITER and reactor-relevant.
DOI: 10.2307/40082091
1940
Cited 3 times
La vida literaria en Cuba (1836-1840)
DOI: 10.1007/s00712-011-0243-7
2011
Cited 8 times
Specialized advertising media and product market competition
DOI: 10.1088/1742-6596/664/6/062038
2015
Cited 7 times
CMS distributed data analysis with CRAB3
The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day.
DOI: 10.1109/hpcsim.2014.6903678
2014
Cited 6 times
Distributed scheduling and data sharing in late-binding overlays
Pull-based late-binding overlays are used in some of today's largest computational grids. Job agents are submitted to resources with the duty of retrieving real workload from a central queue at runtime. This helps overcome the problems of these complex environments: heterogeneity, imprecise status information and relatively high failure rates. In addition, the late job assignment allows dynamic adaptation to changes in grid conditions or user priorities. However, as the scale grows, the central assignment queue may become a bottleneck for the whole system. This article presents a distributed scheduling architecture for late-binding overlays, which addresses this issue by letting execution nodes build a distributed hash table and delegating job matching and assignment to them. This reduces the load on the central server and makes the system much more scalable and robust. Scalability makes fine-grained scheduling possible and enables new functionalities, like the implementation of a distributed data cache on the execution nodes, which helps alleviate the commonly congested grid storage services.
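The abstract above describes delegating job matching and assignment to the execution nodes themselves, organised in a distributed hash table. As an illustration only, and not the paper's actual protocol or code, a consistent-hash ring is one standard way such delegation can be keyed so that each node owns a stable slice of the job-key space; all node and job names here are invented:

```python
import hashlib
from bisect import bisect_right

class HashRing:
    """Minimal consistent-hash ring: each job key maps deterministically
    to one execution node, so no central queue has to mediate every
    assignment, and adding/removing a node only remaps nearby keys."""

    def __init__(self, nodes, replicas=64):
        # Each node appears `replicas` times on the ring for balance.
        self._ring = sorted(
            (self._h(f"{node}#{i}"), node)
            for node in nodes for i in range(replicas)
        )
        self._keys = [k for k, _ in self._ring]

    @staticmethod
    def _h(key):
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    def node_for(self, job_id):
        """Return the node responsible for this job's key (its clockwise
        successor on the ring, wrapping around at the end)."""
        i = bisect_right(self._keys, self._h(job_id)) % len(self._ring)
        return self._ring[i][1]

ring = HashRing(["wn01", "wn02", "wn03"])   # hypothetical worker nodes
owner = ring.node_for("task-1234")          # deterministic owner in the pool
```

The design choice this sketches is the one the abstract argues for: lookups are local and deterministic, so the central server only has to publish work, not match it.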
DOI: 10.1088/1742-6596/396/3/032055
2012
Cited 6 times
Multi-core processing and scheduling performance in CMS
Commodity hardware is going many-core. We might soon be unable to satisfy the per-core job memory needs of the current single-core processing model in High Energy Physics. In addition, an ever-increasing number of independent and incoherent jobs running on the same physical hardware without sharing resources might significantly affect processing performance. It will be essential to use the multi-core architecture effectively.
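To make the memory argument concrete, here is a hypothetical back-of-the-envelope model (the function and all figures are illustrative, not taken from the paper): independent single-core jobs each duplicate the data they could share (e.g. detector geometry and conditions), while one multi-core job loads it once:

```python
def host_memory_gb(cores, shared_gb, private_gb, multicore):
    """Total RAM needed on a host with `cores` cores.

    Single-core model: every job carries its own copy of the shared data.
    Multi-core model: one job loads the shared data once and adds only
    per-core private state.
    """
    if multicore:
        return shared_gb + cores * private_gb
    return cores * (shared_gb + private_gb)

# Illustrative numbers for an 8-core host:
single = host_memory_gb(8, shared_gb=1.0, private_gb=0.5, multicore=False)  # 12.0
multi = host_memory_gb(8, shared_gb=1.0, private_gb=0.5, multicore=True)    # 5.0
```

Under these assumed figures the multi-core model needs less than half the RAM, which is the kind of gap the abstract warns about as core counts grow.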
DOI: 10.1007/s00712-013-0357-1
2013
Cited 6 times
Endogenous direct advertising and price competition
DOI: 10.5170/cern-2005-002.838
2004
Cited 10 times
Software Agents in Data and Workflow Management
DOI: 10.1055/s-2008-1074875
1980
Cited 10 times
Flavonoids from Digitalis thapsi Leaves
DOI: 10.1109/tns.2005.852755
2005
Cited 10 times
Distributed computing grid experiences in CMS
The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at 25 Hz input rate; to distribute the data to several regional centers; and enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
DOI: 10.1088/1742-6596/219/6/062043
2010
Cited 6 times
Data location-aware job scheduling in the grid. Application to the GridWay metascheduler
Grid infrastructures constitute nowadays the core of the computing facilities of the biggest LHC experiments. These experiments produce and manage petabytes of data per year and run thousands of computing jobs every day to process that data. It is the duty of metaschedulers to allocate the tasks to the most appropriate resources at the proper time.
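A data location-aware metascheduler, as described above, ranks candidate resources by how much of a job's input data they already host before dispatching the task. The following toy scoring function is only a sketch of that idea; the site names, weights, and interface are invented and are not GridWay's API:

```python
def rank_sites(job_datasets, site_replicas, queue_depth, queue_penalty=0.1):
    """Pick the site hosting the largest fraction of the job's input
    datasets, penalising sites with long queues (illustrative weights)."""
    def score(site):
        hosted = site_replicas[site]
        locality = len(job_datasets & hosted) / len(job_datasets)
        return locality - queue_penalty * queue_depth.get(site, 0)
    return max(site_replicas, key=score)

# Hypothetical sites and datasets:
best = rank_sites(
    job_datasets={"dsA", "dsB"},
    site_replicas={"T2_ES": {"dsA", "dsB"}, "T2_US": {"dsA"}},
    queue_depth={"T2_ES": 1, "T2_US": 0},
)
# → "T2_ES" (full data locality outweighs its slightly longer queue)
```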
DOI: 10.1088/1742-6596/219/6/062055
2010
Cited 6 times
Debugging data transfers in CMS
The CMS experiment at CERN is preparing for LHC data taking in several computing preparation activities. In early 2007 a traffic load generator infrastructure for distributed data transfer tests was designed and deployed, to equip the WLCG tiers which support the CMS virtual organization with a means for debugging, load-testing and commissioning data transfer routes among CMS computing centres. The LoadTest is based upon PhEDEx as a reliable, scalable data set replication system. The Debugging Data Transfers (DDT) task force was created to coordinate the debugging of the data transfer links. The task force aimed to commission the most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure for debugging problematic links, moving a link from a debugging phase in a separate, independent environment to the production environment once a set of agreed conditions was achieved for that link. The goal was to deliver working transfer routes, one by one, to the CMS data operations team. The preparation, activities and experience of the DDT task force within the CMS experiment are discussed. Common technical problems and challenges encountered during the lifetime of the task force in debugging data transfer links in CMS are explained and summarized.
DOI: 10.1088/0029-5515/53/10/104016
2013
Cited 5 times
Dynamics of flows and confinement in the TJ-II stellarator
This work deals with the results on flow dynamics in TJ-II plasmas under Li-coated wall conditions, which produces low recycling and facilitates the density control and access to improved confinement transitions. The low-density transition, characterized by the emergence of the shear flow layer, is described from first principles and within the framework of neoclassical theory. The vanishing of the neoclassical viscosity when approaching the transition from below explains the observation of a number of turbulent phenomena reported in TJ-II in recent years; a unifying picture is provided in which zonal, i.e. large scale, radially structured, perturbations are observable when the neoclassical damping is sufficiently small. Preliminary linear, collisionless gyrokinetic simulations are carried out to assess that the measured time scale of relaxation of such perturbations is reasonably understood theoretically. In higher density regimes, the physical mechanisms behind the L–H transition have been experimentally studied. The spatial, temporal and spectral structure of the interaction between turbulence and flows has been studied close to the L–H transition threshold conditions. The temporal dynamics of the turbulence-flow interaction displays a predator–prey relationship and both radial outward and inward propagation velocities of the turbulence-flow front have been measured. Finally, a non-linear relation between turbulent fluxes and gradients is observed.
DOI: 10.1016/j.nima.2004.02.006
2004
Cited 8 times
HERA-B data acquisition system
The HERA-B Data Acquisition System implements a 50 kHz dead-timeless readout of 500 kB events, requiring unprecedented speed of data storage and processing. The system is based on Digital Signal Processors (DSPs), minimizing the number of components. A high-bandwidth, low-latency DSP switching network provides full connectivity between the readout buffers and a PC farm which runs the higher-level trigger. The design of the system and the achieved performance are described in this paper.
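The headline figures can be checked with simple arithmetic: a 50 kHz event rate on 500 kB events implies the aggregate readout bandwidth computed below (a worked check only, not code from the DAQ system):

```python
# Figures quoted in the abstract above:
event_rate_hz = 50_000        # 50 kHz dead-timeless readout
event_size_bytes = 500_000    # ~500 kB per event

# Aggregate input bandwidth the readout must sustain:
throughput_bytes_per_s = event_rate_hz * event_size_bytes
# 50 kHz × 500 kB = 2.5e10 bytes/s, i.e. 25 GB/s
```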
DOI: 10.1088/1742-6596/219/6/062007
2010
Cited 5 times
Use of the gLite-WMS in CMS for production and analysis
The CMS experiment at the LHC started using the Resource Broker (by the EDG and LCG projects) to submit Monte Carlo production and analysis jobs to distributed computing resources of the WLCG infrastructure over 6 years ago. Since 2006 the gLite Workload Management System (WMS) and Logging & Bookkeeping (LB) have been used. The interaction with the gLite-WMS/LB happens through the CMS production and analysis frameworks, respectively ProdAgent and CRAB, through a common component, BOSSLite. The important improvements recently made in the gLite-WMS/LB, as well as in the CMS tools, and the intrinsic independence of different WMS/LB instances allow CMS to reach the stability and scalability needed for LHC operations. In particular, the use of a multi-threaded approach in BOSSLite allowed the scalability of the system to be increased significantly. In this work we present the operational set-up of CMS production and analysis based on the gLite-WMS, and the performance obtained in the past data challenges and in the daily Monte Carlo productions and user analysis activities in the experiment.
DOI: 10.1088/1742-6596/396/3/032053
2012
Cited 5 times
Evolution of the Distributed Computing Model of the CMS experiment at the LHC
The Computing Model of the CMS experiment was prepared in 2005 and described in detail in the CMS Computing Technical Design Report. With the experience of the first years of LHC data taking and with the evolution of the available technologies, the CMS Collaboration identified areas where improvements were desirable. In this work we describe the most important modifications that have been, or are being, implemented in the Distributed Computing Model of CMS. The Worldwide LHC Computing Grid (WLCG) project acknowledged that the whole distributed computing infrastructure is impacted by the kinds of changes that are happening in most LHC experiments, and decided to create several Technical Evolution Groups (TEG) aiming at assessing the situation and at developing a strategy for the future. In this work we describe the CMS view on the TEG activities as well.
DOI: 10.1111/jems.12173
2016
Cited 4 times
Advertising Media Planning, Optimal Pricing, and Welfare
This paper analyzes optimal media planning strategies in a pricing‐advertising competition model where firms can use mass and specialized advertising. We find that although targeted advertising avoids the wasting of ads, firms might find it optimal to mix specialized advertising with the mass media. We also show that the characteristics of the specialized media available crucially affect the outcome of price competition between firms, which can range from a full fragmentation of the market into local monopolies to lower average prices (compared to the case where firms had only mass advertising available). Regarding welfare, we prove that although the use of specialized advertising can lower consumer surplus and drive a fragment of consumers out of the market, this advertising technology is welfare‐improving, and can be Pareto superior.
DOI: 10.1016/j.physletb.2006.04.042
2006
Cited 7 times
Measurement of the ϒ production cross section in 920 GeV fixed-target proton–nucleus collisions
The cross-section ratio RJ/ψ = Br(ϒ → l+l−) · dσ(ϒ)/dy|y=0 / σ(J/ψ) has been measured with the HERA-B spectrometer in fixed-target proton–nucleus collisions at 920 GeV proton beam energy, corresponding to a proton–nucleon c.m.s. energy of √s = 41.6 GeV. The combined results for the decay channels ϒ → e+e− and ϒ → μ+μ− yield a ratio RJ/ψ = (9.0 ± 2.1) × 10−6. The corresponding ϒ production cross section per nucleon at mid-rapidity (y = 0) has been determined to be Br(ϒ → l+l−) · dσ(ϒ)/dy|y=0 = 4.5 ± 1.1 pb/nucleon.
DOI: 10.1088/1742-6596/219/6/062047
2010
Cited 4 times
The commissioning of CMS sites: Improving the site reliability
The computing system of the CMS experiment works using distributed resources from more than 60 computing centres worldwide. These centres, located in Europe, America and Asia are interconnected by the Worldwide LHC Computing Grid. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established a procedure to extensively test all relevant aspects of a Grid site, such as the ability to efficiently use their network to transfer data, the functionality of all the site services relevant for CMS and the capability to sustain the various CMS computing workflows at the required scale. This contribution describes in detail the procedure to rate CMS sites depending on their performance, including the complete automation of the program, the description of monitoring tools, and its impact in improving the overall reliability of the Grid from the point of view of the CMS computing system.
DOI: 10.1007/s13209-010-0033-4
2010
Cited 4 times
Specialized advertising and price competition in vertically differentiated markets
This paper studies how the emergence of specialized communication media focused on both high quality contents and high quality advertised products, affects the functioning of a vertically differentiated market. To that end, we formulate a simultaneous game of pricing and targeted advertising with two firms producing different levels of quality. We find that the transition from uniform advertising to targeted advertising can turn a pure vertically differentiated market into a hybrid market which incorporates some features of a monopoly, thus changing the pattern of price competition between the firms. In particular, we show that (1) compared to uniform advertising, targeting leads both firms to always charge higher prices, (2) the increase in prices is more intense in highly competitive (low differentiated) markets, (3) the expected price of the low-quality firm is non-monotonic with the degree of product differentiation, and (4) the low-quality product may be sold at a higher expected price than the high-quality product. We also show that a progressive growth of specialized advertising vehicles leads to a further increase in prices. In addition, more specialized targeting may raise price competition, so firms may find it optimal to use low specialized targeting.
DOI: 10.1016/j.infoecopol.2017.01.001
2017
Cited 4 times
Direct advertising and opt-in provisions: Policy and market implications
This paper formulates a game of pricing and informative advertising with horizontally-differentiated products in which two firms, first, compete with mass advertising and, later, build a database using their historical sales records and compete by targeting the ads to their potential customers. We study market interaction under two types of direct advertising: opt-in advertising, where firms ask consumers for their consent to send them ads with information about new products, and direct advertising without permission, where sellers use consumer contact information without their explicit consent. We show that, compared to the case where firms only use mass media, the use of direct ads (with or without permission) results in an intertemporal reallocation of market power from the first to the second period and that, compared to opt-in advertising, direct advertising without permission results in lower or equal prices. We also evaluate the impact of a regulatory policy aimed at protecting consumer privacy by banning the use of direct advertising without permission in favor of opt-in advertising. We find that this policy lowers social welfare and, if the degree of product differentiation is sufficiently high (vs. low), it does not affect (vs. lowers) firm profits and lowers (vs. increases) consumer surplus.
DOI: 10.1016/j.future.2017.04.021
2018
Cited 4 times
Integration of end-user Cloud storage for CMS analysis
End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented such a storage model, named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit end-user Cloud storage for distributed data analysis, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and the infrastructure changes needed to transparently integrate end-user Cloud storage with the CMS distributed computing model. We describe the new challenges faced in data management between Grid and Cloud and how they were addressed, along with details of the support for Cloud storage recently introduced into the WLCG data movement middleware, FTS3. The commissioning experience of CERNBox for the distributed data analysis activity is also presented.
DOI: 10.1163/2468-1733_shafr_sim090130026
2017
Cited 4 times
Race to Revolution: The United States and Cuba During Slavery and Jim Crow
The histories of Cuba and the United States are tightly intertwined and have been for at least two centuries. In Race to Revolution, historian Gerald Horne examines a critical relationship between the two countries by tracing out the typically overlooked interconnections among slavery, Jim Crow, and revolution. Slavery was central to the economic and political trajectories of Cuba and the United States, both in terms of each nation's internal political and economic development and in the interactions between the small Caribbean island and the Colossus of the North. Horne draws a direct link between the black experiences in two very different countries and follows that connection through changing periods of resistance and revolutionary upheaval. Black Cubans were crucial to Cuba's initial independence, and the relative freedom they achieved helped bring down Jim Crow in the United States, reinforcing radical politics within the black communities of both nations. This in turn helped to create the conditions that gave rise to the Cuban Revolution which, on New Year's Day in 1959, shook the United States to its core. Based on extensive research in Havana, Madrid, London, and throughout the U.S., Race to Revolution delves deep into the historical record, bringing to life the experiences of slaves and slave traders, abolitionists and sailors, politicians and poor farmers. It illuminates the complex web of interaction and influence that shaped the lives of many generations as they struggled over questions of race, property, and political power in both Cuba and the United States.
2005
Cited 6 times
Distributed computing grid experiences in CMS DC04
DOI: 10.1016/j.physletb.2004.06.097
2004
Cited 6 times
Search for the flavor-changing neutral current decay <mml:math xmlns:mml="http://www.w3.org/1998/Math/MathML" altimg="si1.gif" overflow="scroll"><mml:msup><mml:mi mathvariant="normal">D</mml:mi><mml:mn>0</mml:mn></mml:msup><mml:mo>→</mml:mo><mml:msup><mml:mi>μ</mml:mi><mml:mo>+</mml:mo></mml:msup><mml:msup><mml:mi>μ</mml:mi><mml:mo>−</mml:mo></mml:msup></mml:math> with the HERA-B detector
We report on a search for the flavor-changing neutral current decay $D^0 \to \mu^+\mu^-$ using $50 \times 10^6$ events recorded with a dimuon trigger in interactions of 920 GeV protons with nuclei by the HERA-B experiment. We find no evidence for such decays and set a 90% confidence level upper limit on the branching fraction $Br(D^0 \to \mu^+\mu^-) <2.0 \times 10^{-6}$.
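The quoted 90% confidence-level limit follows the standard background-free Poisson construction: with zero observed signal events, the upper limit on the expected signal is the mean μ for which the probability of seeing zero events drops to 10%, i.e. μ = ln(10) ≈ 2.3 events, which is then divided by the single-event sensitivity to obtain a branching-fraction limit. The sketch below illustrates that generic construction only; the helper name and the sensitivity value are hypothetical, not taken from the HERA-B analysis.

```python
import math

def poisson_upper_limit(n_obs: int, cl: float = 0.90) -> float:
    """Classical background-free Poisson upper limit: find the signal mean mu
    such that P(N <= n_obs | mu) = 1 - cl, solved here by bisection."""
    target = 1.0 - cl
    lo, hi = 0.0, 100.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        # Cumulative Poisson probability of observing at most n_obs events.
        p = sum(math.exp(-mid) * mid**k / math.factorial(k)
                for k in range(n_obs + 1))
        if p > target:
            lo = mid  # mu still too small: P(N <= n_obs) above 1 - cl
        else:
            hi = mid
    return (lo + hi) / 2.0

# Zero observed events gives mu_UL = ln(10) ~ 2.303 signal events at 90% CL.
n_ul = poisson_upper_limit(0)
# Dividing by a (hypothetical) single-event sensitivity yields a BR limit:
br_ul = n_ul / 1.15e6
```

For nonzero observed counts the same bisection applies; the limit grows with `n_obs`, as expected.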
DOI: 10.1109/nssmic.2008.4774771
2008
Cited 4 times
The commissioning of CMS computing centres in the worldwide LHC computing Grid
The computing system of the CMS experiment uses distributed resources from more than 60 computing centres worldwide. Located in Europe, America and Asia, these centres are interconnected by the Worldwide LHC Computing Grid. The operation of the system requires a stable and reliable behavior of the underlying infrastructure. CMS has established a procedure to extensively test all relevant aspects of a Grid site, such as the ability to efficiently use the network to transfer data, the functionality of the site services relevant for CMS, and the capability to sustain the various CMS computing workflows (Monte Carlo simulation, event reprocessing and skimming, data analysis) at the required scale. This contribution describes in detail the procedure to rate CMS sites depending on their performance, including the complete automation of the program, a description of the monitoring tools, and its impact on improving the overall reliability of the Grid from the point of view of the CMS computing system.
DOI: 10.1109/nssmic.2008.4775085
2008
Cited 4 times
The CMS data transfer test environment in preparation for LHC data taking
The CMS experiment is preparing for LHC data taking in several computing preparation activities. For distributed data transfer tests, a traffic load generator infrastructure was designed and deployed in early 2007 to equip the WLCG Tiers which support the CMS Virtual Organization with a means for debugging, load-testing and commissioning data transfer routes among CMS Computing Centres. The LoadTest is based upon PhEDEx as a reliable, scalable dataset replication system. In addition, a Debugging Data Transfers (DDT) Task Force was created to coordinate the debugging of data transfer links in the preparation period and during the Computing Software and Analysis challenge in 2007 (CSA07). The task force aimed to commission the most crucial transfer routes among CMS tiers by designing and enforcing a clear procedure to debug problematic links. This procedure moved a link from a debugging phase, in a separate and independent environment, to the production environment once a set of agreed conditions was achieved for that link. The goal was to deliver working transfer routes, one by one, to Data Operations. The experiences with the overall test transfer infrastructure within computing challenges - as in the WLCG Common-VO Computing Readiness Challenge (CCRC'08) - as well as in daily testing and debugging activities are reviewed and discussed, and plans for the future are presented.
DOI: 10.1088/1742-6596/331/7/072020
2011
Cited 3 times
Monitoring the Readiness and Utilization of the Distributed CMS Computing Facilities
The CMS experiment has adopted a computing system where resources are distributed worldwide across more than 100 sites. The operation of the system requires a stable and reliable behavior of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to resolve problems as they arise. This paper reviews the complete automation of the Site Readiness program, with a description of the monitoring tools, their impact on improving the overall reliability of the Grid from the point of view of the CMS computing system, as well as the resource utilization and performance seen at the sites during the first year of LHC running.
DOI: 10.1088/1742-6596/396/3/032103
2012
Cited 3 times
The benefits and challenges of sharing glidein factory operations across nine time zones between OSG and CMS
OSG has for a few years been operating a glideinWMS factory at UCSD for several scientific communities, including CMS analysis, HCC and GLOW. This setup worked fine, but it had become a single point of failure. OSG thus recently added another instance at Indiana University, serving the same user communities. Similarly, CMS has been operating a glidein factory dedicated to reprocessing activities at Fermilab, with similar results. Recently, CMS decided to host another glidein factory at CERN to increase the availability of the system for analysis, MC production, and reprocessing jobs. Given the large overlap between this new factory and the three factories in the US, and given that CMS represents a significant fraction of glideins going through the OSG factories, CMS and OSG formed a common operations team that operates all of the above factories. The reasoning behind this arrangement is that most operational issues stem from Grid-related problems and are very similar for all the factory instances; solving a problem in one instance thus very often solves it for all of them. This paper presents the operational experience of addressing both the social and technical issues of running multiple instances of a glideinWMS factory with operations staff spanning multiple time zones on two continents.
DOI: 10.1088/1742-6596/331/6/062032
2011
Cited 3 times
CMS Distributed Computing Integration in the LHC sustained operations era
After many years of preparation, the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is the same need for stability and smooth operations that requires the introduction of features that were considered non-strategic in the previous phases. Examples are: adequate authorization to control and prioritize access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks in the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular, we describe the introduction of new middleware features during the last 18 months, as well as the requirements to Grid and Cloud software developers for the future.
DOI: 10.2307/40081716
1940
La novela histórica en España, 1828-1850
DOI: 10.1051/epjconf/201921403006
2019
Cited 3 times
Improving efficiency of analysis jobs in CMS
Hundreds of physicists analyze data collected by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider using the CMS Remote Analysis Builder and the CMS global pool to exploit the resources of the Worldwide LHC Computing Grid. Efficient use of such an extensive and expensive resource is crucial. At the same time, the CMS collaboration is committed to minimizing time to insight for every scientist, imposing as few access restrictions as possible on the full data sample and supporting the free choice of applications to run on the computing resources. Supporting such a variety of workflows while preserving efficient resource usage poses special challenges. In this paper we report on three complementary approaches adopted in CMS to improve the scheduling efficiency of user analysis jobs: automatic job splitting, automated run-time estimates, and automated site selection for jobs.
DOI: 10.1007/s10723-016-9374-y
2016
Distributed Late-binding Scheduling and Cooperative Data Caching
DOI: 10.2307/40070691
1932
Ensayo de psicología de Sor Juana Inés de la Cruz
1974
Cited 5 times
The Gaucho Martin Fierro
DOI: 10.1109/tns.2003.814537
2003
Cited 5 times
Architecture of the HERA-B data acquisition system
The HERA-B experiment was dedicated to the measurement of charge-parity (CP) violation in decays of neutral B-mesons and to investigating the physics of charmed particles. One of the experimental requirements is highly selective on-line filtering of data, due to high interaction rates and a low signal-to-background ratio. This demands a hierarchical trigger and a high-bandwidth data acquisition system. The challenge for the data acquisition (DAQ) system is deadtime-free readout, which requires an unprecedented speed of storing and processing the data. In this paper, we outline the general architecture and hardware implementation of the HERA-B DAQ.
DOI: 10.1109/nssmic.2004.1462662
2005
Cited 4 times
Use of grid tools to support CMS distributed analysis
In order to prepare the Physics Technical Design Report, due by the end of 2005, the CMS experiment needs to simulate, reconstruct and analyse about 100 million events, corresponding to more than 200 TB of data. The data will be distributed to several Computing Centres. In order to provide access to the whole data sample to all the world-wide dispersed physicists, CMS is developing a layer of software that uses the Grid tools provided by the LCG project to gain access to data and resources, and that aims to provide a user-friendly interface to the physicists submitting analysis jobs. To achieve these aims CMS will use Grid tools from both the LCG-2 release and those being developed in the framework of the ARDA project. This work describes the current status and the future developments of the CMS analysis system.
2008
Cited 3 times
Portable NDA Equipment for Enrichment Measurements in the HEU Transparency Program
In October 1996, the Department of Energy (DOE) and MINATOM agreed to use portable non-destructive assay (NDA) equipment to measure the ²³⁵U enrichment of material subject to the HEU Transparency agreement. A system based on the "enrichment meter" method and high-purity germanium (HPGe) detectors had been previously developed for this application. Instead, sodium iodide (NaI) detectors were chosen to measure ²³⁵U enrichment, because HPGe systems might reveal sensitive information. Although the accuracy of the NaI systems is lower than that of an HPGe system, it still satisfies the transparency requirements. The equipment consists of a collimated NaI detector, a Canberra Inspector multi-channel analyzer, and a laptop computer. The units have been used to confirm the enrichment of material at Russian facilities since January 1997. This paper compares the performance of the NaI systems with the HPGe system and discusses some significant differences.
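The "enrichment meter" principle named above rests on a simple proportionality: for a sample thicker than the detector's "infinite thickness" for 185.7 keV gammas, the net count rate in that line is proportional to the ²³⁵U enrichment, with the constant fixed by calibration standards of known enrichment. A minimal sketch of that relation follows; the function name and the numbers are illustrative, not taken from the paper.

```python
def enrichment_from_rate(net_rate_cps: float, k_cal: float) -> float:
    """Enrichment-meter principle: for an 'infinitely thick' uranium sample,
    the net 185.7 keV gamma count rate is proportional to 235U enrichment.
    k_cal (percent enrichment per cps) comes from calibration standards."""
    return k_cal * net_rate_cps

# Hypothetical calibration: a 4.4%-enriched standard yields 120 cps net,
# so an unknown sample's enrichment scales linearly with its net rate.
k = 4.4 / 120.0
print(enrichment_from_rate(60.0, k))  # 2.2 (percent enrichment)
```

In practice the net rate requires background subtraction and geometry-matched collimation, which is where the NaI-versus-HPGe accuracy difference discussed in the paper enters.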
DOI: 10.1016/j.nima.2007.09.011
2007
Cited 3 times
Luminosity determination at HERA-B
A detailed description of an original method used to measure the luminosity accumulated by the HERA-B experiment for a data sample taken during the 2002-2003 HERA running period is reported. We show that, with this method, a total luminosity measurement can be achieved with a typical precision, including overall systematic uncertainties, at a level of 5% or better. We also report evidence for the detection of delta-rays generated in the target and comment on the possible use of such delta rays to measure luminosity.
DOI: 10.1088/1742-6596/513/3/032074
2014
CMS multicore scheduling strategy
In the coming years, processor architectures based on much larger numbers of cores will most likely be the model to continue "Moore's Law"-style throughput gains. This not only results in many more LHC Run 1 era monolithic applications running in parallel, but the memory requirements of these processes also push worker-node architectures to the limit. One solution is parallelizing the application itself, through forking and memory sharing or through threaded frameworks. CMS is following all of these approaches and has a comprehensive strategy to schedule multicore jobs on the Grid based on the glideinWMS submission infrastructure. The main component of the scheduling strategy is described: a pilot-based model with dynamic partitioning of resources that allows the transition to multicore or whole-node scheduling without disallowing the use of single-core jobs. This contribution also presents the experience gained with the proposed multicore scheduling schema and gives an outlook of further developments working towards the restart of the LHC in 2015.
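The dynamic-partitioning idea described above can be reduced to a toy model: a multi-core pilot advertises a pool of free cores, and payloads with different core counts are packed into it, so single-core and multicore jobs share one resource without static slot splitting. The sketch below is an illustration of that packing logic under stated assumptions, not the actual glideinWMS implementation; the `Pilot` class and `schedule` function are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Pilot:
    """Toy stand-in for a multi-core pilot slot advertising free cores."""
    total_cores: int
    free_cores: int

def schedule(pilot: Pilot, queue: list[int]) -> list[int]:
    """First-fit packing of queued jobs (each entry = cores requested) into
    the pilot's free cores; jobs that do not fit simply wait for the next
    negotiation cycle, so single-core and multicore payloads coexist."""
    started = []
    for job_cores in queue:
        if job_cores <= pilot.free_cores:
            pilot.free_cores -= job_cores
            started.append(job_cores)
    return started

p = Pilot(total_cores=8, free_cores=8)
print(schedule(p, [4, 1, 1, 8, 2]))  # [4, 1, 1, 2]; the 8-core job waits
```

Real schedulers add priorities, defragmentation, and draining policies on top of this packing step; the sketch only shows why dynamic partitioning avoids choosing between single-core and whole-node slots up front.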
DOI: 10.2307/40077208
1935
Introducción y programa de literatura española
1872
El Martin Fierro
DOI: 10.1016/s0735-1097(23)02621-9
2023
MULTIDISCIPLINARY TEAM-BASED TELEHEALTH WITH REMOTE AUTOMATED BLOOD PRESSURE MONITORING AMONG HYPERTENSIVE PATIENTS: AN 18-MONTH RETROSPECTIVE EVALUATION STUDY
DOI: 10.1158/1535-7163.c.6538641.v1
2023
Data from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Abstract: Angiogenesis-related gene expression is associated with the efficacy of anti-VEGF therapy. We tested whether intratumoral mRNA expression levels of genes involved in vascular morphogenesis and early vessel maturation predict response, recurrence-free survival (RFS), and overall survival (OS) in a unique cohort of patients with colorectal liver metastases (CLM) treated with bevacizumab-based chemotherapy followed by curative liver resection. Intratumoral mRNA was isolated from resected bevacizumab-pretreated CLM from 125 patients. In 42 patients, a matching primary tumor sample collected before bevacizumab treatment was available. Relative mRNA levels of 9 genes (ACVRL1, EGFL7, EPHB4, HIF1A, VEGFA, VEGFB, VEGFC, FLT1, and KDR) were analyzed by RT-PCR and evaluated for associations with response, RFS, and OS. P values for the associations between the individual dichotomized expression level and RFS were adjusted for choosing the optimal cut-off value. In CLM, high expression of VEGFB, VEGFC, HIF1A, and KDR and low expression of EGFL7 were associated with favorable RFS in multivariable analysis (P < 0.05). High ACVRL1 levels predicted favorable 3-year OS (P = 0.041) and radiologic response (PR = 1.093, SD = 0.539, P = 0.002). In primary tumors, low VEGFA and high EGFL7 were associated with radiologic and histologic response (P < 0.05). High VEGFA expression predicted shorter RFS (10.1 vs. 22.6 months; HR = 2.83, P = 0.038). High VEGFB (46% vs. 85%; HR = 5.75, P = 0.009) and low FLT1 (55% vs. 100%; P = 0.031) predicted lower 3-year OS rates.
Our data suggest that intratumoral mRNA expression of genes involved in vascular morphogenesis and early vessel maturation may be promising predictive and/or prognostic biomarkers. Mol Cancer Ther; 15(11); 2814–21. ©2016 AACR.
DOI: 10.1158/1535-7163.22507302
2023
Supplementary Figure S6 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for EPHB4 mRNA levels in CLM for RFS (complete response, CR)
DOI: 10.1158/1535-7163.22507308
2023
Supplementary Figure S4 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for KDR mRNA levels in CLM for RFS (complete response, CR)
DOI: 10.1158/1535-7163.22507317
2023
Supplementary Figure S1 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFB mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507314
2023
Supplementary Figure S2 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFC mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507305
2023
Supplementary Figure S5 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for EGFL7 mRNA levels in CLM for RFS (complete response, CR)
DOI: 10.1158/1535-7163.22507296
2023
Supplementary Figure S8 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFB mRNA levels in CLM for OS (complete response, CR; standard error, SE)
DOI: 10.1158/1535-7163.22507311
2023
Supplementary Figure S3 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for HIF1A mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507299
2023
Supplementary Figure S7 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for ACVRL1 mRNA levels in CLM for OS (complete response, CR; standard error, SE)
DOI: 10.1158/1535-7163.22507317.v1
2023
Supplementary Figure S1 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFB mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507302.v1
2023
Supplementary Figure S6 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for EPHB4 mRNA levels in CLM for RFS (complete response, CR)
DOI: 10.1158/1535-7163.22507311.v1
2023
Supplementary Figure S3 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for HIF1A mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507305.v1
2023
Supplementary Figure S5 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for EGFL7 mRNA levels in CLM for RFS (complete response, CR)
DOI: 10.1158/1535-7163.22507314.v1
2023
Supplementary Figure S2 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFC mRNA levels in CLM for RFS (complete response, CR; not reached, NR)
DOI: 10.1158/1535-7163.22507296.v1
2023
Supplementary Figure S8 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for VEGFB mRNA levels in CLM for OS (complete response, CR; standard error, SE)
DOI: 10.1158/1535-7163.22507299.v1
2023
Supplementary Figure S7 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for ACVRL1 mRNA levels in CLM for OS (complete response, CR; standard error, SE)
DOI: 10.1158/1535-7163.22507308.v1
2023
Supplementary Figure S4 from Expression of Genes Involved in Vascular Morphogenesis and Maturation Predicts Efficacy of Bevacizumab-Based Chemotherapy in Patients Undergoing Liver Resection
Kaplan-Meier curves for KDR mRNA levels in CLM for RFS (complete response, CR)