
D. Yu

Here are the papers by D. Yu that you can download and read on OA.mg.

DOI: 10.1016/s0302-2838(24)01038-8
2024
The role of biopsy in small renal masses <4 cm: A European modified Delphi consensus statement
DOI: 10.48550/arxiv.2403.11170
2024
Common substring with shifts in b-ary expansions
Denote by $S_n(x,y)$ the length of the longest common substring of $x$ and $y$ with shifts in their first $n$ digits of $b$-ary expansions. We show that the sets of pairs $(x,y)$, for which the growth rate of $S_n(x,y)$ is $\alpha \log n$ with $0\le \alpha \le \infty$, have full Hausdorff dimension.
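For orientation, the quantity studied can be formalized as follows (our paraphrase of the standard definition; the paper's exact statement may differ in details):

$$S_n(x,y)=\max\{k\ge 0:\ x_{i+1}\cdots x_{i+k}=y_{j+1}\cdots y_{j+k}\ \text{for some}\ 0\le i,j\le n-k\},$$

where $x_1x_2\cdots$ and $y_1y_2\cdots$ are the digit strings of the $b$-ary expansions of $x$ and $y$. The result then gives the Hausdorff dimension of the level sets $\{(x,y):\lim_{n\to\infty}S_n(x,y)/\log n=\alpha\}$ for each $0\le\alpha\le\infty$.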
DOI: 10.22251/jlcci.2024.24.7.129
2024
The Effect of Self-Discrepancy on Materialism: The Mediating Effects of Fear of Negative Evaluation and Face Sensitivity
Objectives This study explored the mediating effects of fear of negative evaluation and face sensitivity in the relationship between college students' self-discrepancy and materialism.
 Methods From December 4 to December 15, 2023, survey data on self-discrepancy, fear of negative evaluation, face sensitivity, and materialism were collected from 369 male and female college students who voluntarily participated in the study. Descriptive statistics, reliability analysis, correlation analysis, and mediation analysis were then performed using SPSS 28.0 and AMOS 28.0, and bootstrapping was used to verify the significance of the indirect effects.
 Results The main findings were as follows: First, self-discrepancy, fear of negative evaluation, face sensitivity, and materialism showed positive correlations. Second, the partial mediating effect of face sensitivity was confirmed in the relationship between self-discrepancy and materialism. Third, the double mediating effect of fear of negative evaluation and face sensitivity was confirmed in the relationship between self-discrepancy and materialism.
 Conclusions This study helps explain the process by which college students who perceive high self-discrepancy come to experience materialism. By revealing the internal mechanisms of fear of negative evaluation and face sensitivity that mediate this relationship, it provides evidence for organizing counseling goals and programs for clients who suffer distress due to high materialism.
DOI: 10.48550/arxiv.2404.08453
2024
Lightweight Multi-System Multivariate Interconnection and Divergence Discovery
Identifying outlier behavior among sensors and subsystems is essential for discovering faults and facilitating diagnostics in large systems. At the same time, exploring large systems with numerous multivariate data sets is challenging. This study presents a lightweight interconnection and divergence discovery mechanism (LIDD) to identify abnormal behavior in multi-system environments. The approach employs a multivariate analysis technique that first estimates the similarity heatmaps among the sensors for each system and then applies information retrieval algorithms to provide relevant multi-level interconnection and discrepancy details. Our experiment on the readout systems of the Hadron Calorimeter of the Compact Muon Solenoid (CMS) experiment at CERN demonstrates the effectiveness of the proposed method. Our approach clusters readout systems and their sensors consistent with the expected calorimeter interconnection configurations, while capturing unusual behavior in divergent clusters and estimating their root causes.
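To make the first step concrete, here is a minimal sketch of the similarity-heatmap idea (our illustration, assuming sensors are columns of a time-by-sensor array; LIDD's actual similarity measure and information-retrieval ranking are not reproduced here):

```python
# Sketch: pairwise sensor similarity, then rank potential outlier sensors.
import numpy as np

def similarity_heatmap(readings: np.ndarray) -> np.ndarray:
    """Pairwise |Pearson correlation| between sensor channels (columns)."""
    return np.abs(np.corrcoef(readings, rowvar=False))

def rank_divergent(heatmap: np.ndarray, top_k: int = 5) -> np.ndarray:
    """Rank sensors by low mean similarity to the rest (potential outliers)."""
    n = heatmap.shape[0]
    mean_sim = (heatmap.sum(axis=1) - 1.0) / (n - 1)  # exclude self-similarity
    return np.argsort(mean_sim)[:top_k]

rng = np.random.default_rng(0)
shared = rng.normal(size=(1000, 1))                # common system behavior
data = shared + 0.3 * rng.normal(size=(1000, 20))  # 20 correlated sensors
data[:, 3] = rng.normal(size=1000)                 # one decoupled sensor
print(rank_divergent(similarity_heatmap(data)))    # sensor 3 ranks first
```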
DOI: 10.4271/2022-01-0107
2022
Cited 5 times
The Digital Foundation Platform - A Multi-Layered SOA Architecture for Intelligent Connected Vehicle Operating System
Legacy AD/ADAS development at OEMs centers on developing functions on ECUs using services provided by the AUTOSAR Classic Platform (CP) to meet automotive-grade and mass-production requirements. The AUTOSAR CP couples hardware and software components statically and struggles to provide sufficient capacity for processing high-level intelligent driving functions, whereas the newer AUTOSAR Adaptive Platform (AP) is designed to support dynamic communication and to provide richer services and function abstractions for resource-intensive (memory, CPU) applications. Yet on both platforms, application development and the supporting system software remain closely coupled, which makes application development and enhancement less scalable and flexible, resulting in longer development cycles and slower time-to-market. This paper presents a multi-layered, service-oriented intelligent driving operating system foundation (which we name the Digital Foundation Platform) that provides abstractions for easier adoption of heterogeneous computing hardware. It features a multi-layer SOA software architecture, with each layer providing an adaptive service API at its north-bound interface for application developers. The proposed Digital Foundation Platform (DFP) has the significant advantage of decoupling hardware, operating-system core, middleware, functional software, and application software development. It provides SOA at multiple layers and enables application developers from OEMs to customize and develop new applications, or enhance existing applications with new features, in either the autonomous domain or the intelligent-cockpit domain, with great agility and less code through reusability, thereby reducing time-to-market.
DOI: 10.21037/tcr-22-2270
2023
Unusual presentation of a neuroendocrine tumor in the ileostomy specimen after rectal cancer treatment: a case report
Background: Neuroendocrine tumors of the small intestine are uncommon, yet they are the most frequent subtype of neuroendocrine tumor in the gastrointestinal system. They originate from enterochromaffin cells, which are involved in the production of serotonin. Because the initial presentation is typically asymptomatic, these tumors are usually discovered at a late stage, sometimes in association with symptomatic metastatic disease. Case Description: We present a case report of a 52-year-old gentleman with a family history suggestive of hereditary cancer syndrome (mother with lung cancer and a maternal uncle with colon cancer at the age of 40). The patient was diagnosed with rectal cancer and received neoadjuvant chemotherapy with short-course radiotherapy, followed by a robotic low anterior resection with diverting loop ileostomy. Following closure of his ileostomy, the pathology report of the ileostomy resection specimen showed a 1.1 cm neuroendocrine tumor with negative margins. Conclusions: This extraordinarily unusual presentation may prove very fortunate for the patient, whose neuroendocrine tumor might otherwise have been found only after it had become advanced or metastatic.
2001
Cited 17 times
An Evaluation of Silica Gel for Humidity Control in Display Cases
Conservation research has shown that RH levels above 65% will promote microbial growth (primarily fungi), while RH levels below 25% can lead to brittleness and cracking. In addition, large fluctuations in RH can lead to dimensional changes, deformation, and mechanical stress in organic materials. Though there is still debate about the appropriate RH requirements for museum environments, a set point of 50% (or the historic building average) with allowable fluctuations of ±5-10% is a generally accepted guideline.
DOI: 10.1260/2047-4970.3.3.579
2014
Cited 7 times
X3D Fragment Identifiers — Extending the Open Annotation Model to Support Semantic Annotation of 3D Cultural Heritage Objects over the Web
This paper describes extensions made to the Open Annotation (OA) data model to enable the semantic annotation of points, surface regions and volumetric segments on 3D cultural heritage artefacts. More specifically, it describes how the X3D (Extensible 3D Graphics) standard has been adapted to support concise, machine-processable and persistent 3D fragment identifiers. Methods to improve the efficiency of capturing, compressing and processing the X3D fragment identifiers are also described. The proposed X3D extensions are implemented and evaluated using a test-bed of 3D representations of ancient Greek vases. The advanced annotation and search services that are achievable as a result of combining OA and X3D for the underlying annotation data model are also described. The benefits of this approach are demonstrated by measuring the improvements in file size and response times that are achieved. We also present evaluation results indicating that the X3D approach, combined with run-length encoding (RLE) and novel processing methods, improves the speed and efficiency of searching, retrieving and rendering annotations on 3D digital objects over the Web.
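As an illustration of the compression idea (generic run-length encoding over sorted index sets; the paper's exact X3D fragment-identifier scheme is not reproduced here):

```python
# Sketch: collapse a sorted list of selected vertex/face indices into
# (start, run_length) pairs, the generic idea behind RLE fragment identifiers.
def rle_encode(indices):
    runs = []
    for i in sorted(indices):
        if runs and i == runs[-1][0] + runs[-1][1]:
            runs[-1] = (runs[-1][0], runs[-1][1] + 1)  # extend current run
        else:
            runs.append((i, 1))                        # start a new run
    return runs

def rle_decode(runs):
    return [start + k for start, length in runs for k in range(length)]

sel = [4, 5, 6, 7, 20, 21, 22, 99]
runs = rle_encode(sel)
print(runs)                     # [(4, 4), (20, 3), (99, 1)]
assert rle_decode(runs) == sel
```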
DOI: 10.1109/aero.2019.8741887
2019
Cited 5 times
Application of Pneumatics in Delivering Samples to Instruments on Planetary Missions
Traditional sample acquisition, transfer, and capture approaches rely on mechanical methods (e.g., a drill or a scoop) to acquire a sample, mechanical methods (e.g., a robotic arm) to transfer it, and gravity to capture it inside an instrument or a sample return container. This approach has some limitations: because of the reliance on gravity, it is only suited to materials with little or no cohesion, and because sample transfer requires a mechanical system, the instrument or sample return container needs to be easily accessible. Pneumatic systems solve these problems because the pneumatic force can exceed the gravitational force and the sample delivery tubing can be routed around other spacecraft elements, making instrument or sample return container placement irrelevant to the sampling system. This paper presents background on pneumatic systems applied to planetary missions and provides examples of how this could be accomplished on planetary bodies with significant atmospheres (Venus and Titan) and on airless bodies (the Moon, Europa, Ceres).
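A back-of-the-envelope check (our illustration; the numbers are not from the paper) of the claim that pneumatic force can exceed gravity: a pressure difference $\Delta P$ acting over a tube cross-section $A$ lifts a sample of mass $m$ whenever

$$F_{\text{pneu}} = \Delta P \cdot A > mg.$$

For example, a modest $\Delta P = 10\ \text{kPa}$ over a tube of radius $1\ \text{cm}$ ($A = \pi r^2 \approx 3.1\times10^{-4}\ \text{m}^2$) gives $F_{\text{pneu}} \approx 3.1\ \text{N}$, enough to lift roughly $0.3\ \text{kg}$ on Earth and nearly $2\ \text{kg}$ on the Moon ($g \approx 1.62\ \text{m/s}^2$).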
2012
Cited 4 times
Ranking the Economic Freedom of North America using dominetrics
The Economic Freedom of North America is a widely used political economy indicator related to outcomes such as entrepreneurship, equity prices, housing prices, and migration. As a result, relative rankings are often mentioned in policy discussions. The ranking of regions based on economic freedom, however, involves many layers of subjectivity. We employ a ranking methodology called 'dominetrics' to remove one layer of subjectivity. Doing so creates six rankings reflecting different importance orderings of the underlying spheres of economic freedom. Our results show that preferences regarding which components of economic freedom are most important influence the final rankings.
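Since the abstract does not define 'dominetrics', the following toy sketch (region names and scores invented) only illustrates the generic idea of producing one ranking per importance ordering of the component scores, here via lexicographic comparison:

```python
# Illustration only: one ranking per importance ordering of components.
regions = {
    "A": {"gov_spending": 7.1, "taxes": 6.4, "labor": 8.0},
    "B": {"gov_spending": 7.1, "taxes": 7.0, "labor": 6.5},
    "C": {"gov_spending": 6.0, "taxes": 8.2, "labor": 7.9},
}

def lexicographic_ranking(scores, importance_order):
    """Sort regions by component scores, most important component first."""
    key = lambda name: tuple(scores[name][c] for c in importance_order)
    return sorted(scores, key=key, reverse=True)

for order in [("gov_spending", "taxes", "labor"),
              ("labor", "taxes", "gov_spending")]:
    print(order, "->", lexicographic_ranking(regions, order))
```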
DOI: 10.1088/1742-6596/331/4/042045
2011
Cited 3 times
Tape Storage Optimization at BNL
BNL's RHIC and ATLAS Computing Facility (RACF) supports the RHIC experiments as their Tier-0 center and ATLAS/LHC as a Tier-1 center. The RACF has had to address the issue of efficient access to data stored on disk and tape. Randomly restoring files from tape destroys tape access performance by causing overly frequent, high-latency, and time-consuming tape mounts and dismounts. BNL's mass storage system currently holds more than 16 PB of data on tapes, managed by HPSS. To restore files from HPSS, we make use of a scheduling software called ERADAT. This scheduler was originally based on code from Oak Ridge National Lab and was renamed BNL Batch in 2005 after some major modifications and enhancements. The new BNL Batch, ERADAT, provides dynamic HPSS resource management, schedules jobs efficiently, and offers enhanced visibility of real-time staging activities and advanced error handling, to maximize tape staging performance. ERADAT is the interface between HPSS and other applications such as the DataCarousel, our home-developed production system, and dCache. Scalla/Xrootd MSS can also be interfaced with HPSS via the DataCarousel. ERADAT has demonstrated great performance at BNL and other institutions.
DOI: 10.4208/cmaa.2023-0001
2023
The Euler Limit of the Relativistic Boltzmann Equation
In this work we prove existence and uniqueness theorems for solutions to the relativistic Boltzmann equation for analytic initial fluctuations on a time interval independent of the Knudsen number $\varepsilon > 0$. As $\varepsilon \to 0$, we prove that the solution of the relativistic Boltzmann equation tends to the local relativistic Maxwellian, whose fluid-dynamical parameters solve the relativistic Euler equations, and the convergence rate is also obtained. Due to this convergence rate, the Hilbert expansion is verified on a short time interval for the relativistic Boltzmann equation. We also consider the physically important initial layer problem. As a by-product, an existence theorem for the relativistic Euler equations without the assumption of non-vacuum fluid states is obtained.
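For context, the scaled equation being studied takes the standard form used in hydrodynamic-limit analyses (written here for orientation; notation may differ from the paper):

$$p^\mu \partial_{x^\mu} F_\varepsilon = \frac{1}{\varepsilon}\, Q(F_\varepsilon, F_\varepsilon),$$

where $F_\varepsilon(t,x,p)$ is the particle density on phase space, $Q$ is the relativistic collision operator, and $\varepsilon$ is the Knudsen number; as $\varepsilon \to 0$, $F_\varepsilon$ is driven toward the local relativistic Maxwellian (Jüttner distribution), whose parameters evolve by the relativistic Euler equations.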
DOI: 10.48550/arxiv.2311.04190
2023
Spatio-Temporal Anomaly Detection with Graph Networks for Data Quality Monitoring of the Hadron Calorimeter
The Compact Muon Solenoid (CMS) experiment is a general-purpose detector for high-energy collisions at the Large Hadron Collider (LHC) at CERN. It employs an online data quality monitoring (DQM) system to promptly spot and diagnose particle data acquisition problems to avoid data quality loss. In this study, we present semi-supervised spatio-temporal anomaly detection (AD) monitoring for the physics particle reading channels of the Hadron Calorimeter (HCAL) of the CMS using three-dimensional digi-occupancy map data of the DQM. We propose the GraphSTAD system, which employs convolutional and graph neural networks to learn local spatial characteristics induced by particles traversing the detector, and global behavior owing to shared backend circuit connections and housing boxes of the channels, respectively. Recurrent neural networks capture the temporal evolution of the extracted spatial features. We have validated the accuracy of the proposed AD system in capturing diverse channel fault types using the LHC Run-2 collision data sets. The GraphSTAD system has achieved production-level accuracy and is being integrated into the CMS core production system for real-time monitoring of the HCAL. We have also provided a quantitative performance comparison with alternative benchmark models to demonstrate the promising leverage of the presented system.
DOI: 10.3390/s23249679
2023
Spatio-Temporal Anomaly Detection with Graph Networks for Data Quality Monitoring of the Hadron Calorimeter
The Compact Muon Solenoid (CMS) experiment is a general-purpose detector for high-energy collisions at the Large Hadron Collider (LHC) at CERN. It employs an online data quality monitoring (DQM) system to promptly spot and diagnose particle data acquisition problems to avoid data quality loss. In this study, we present a semi-supervised spatio-temporal anomaly detection (AD) monitoring system for the physics particle reading channels of the Hadron Calorimeter (HCAL) of the CMS using three-dimensional digi-occupancy map data of the DQM. We propose the GraphSTAD system, which employs convolutional and graph neural networks to learn local spatial characteristics induced by particles traversing the detector and the global behavior owing to shared backend circuit connections and housing boxes of the channels, respectively. Recurrent neural networks capture the temporal evolution of the extracted spatial features. We validate the accuracy of the proposed AD system in capturing diverse channel fault types using the LHC collision data sets. The GraphSTAD system achieves production-level accuracy and is being integrated into the CMS core production system for real-time monitoring of the HCAL. We provide a quantitative performance comparison with alternative benchmark models to demonstrate the promising leverage of the presented system.
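A loose sketch of the spatio-temporal pattern described (a spatial encoder feeding a recurrent network over time), in PyTorch. This is not the published GraphSTAD architecture: the graph-network branch, input dimensions, and hyperparameters are all placeholders:

```python
# Sketch: CNN spatial features per time step, LSTM over the sequence.
import torch
import torch.nn as nn

class SpatioTemporalAD(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.spatial = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((4, 4, 4)), nn.Flatten(),  # -> 8*4*4*4 = 512
        )
        self.temporal = nn.LSTM(512, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)        # anomaly score per sequence

    def forward(self, x):                        # x: (B, T, D, H, W)
        b, t = x.shape[:2]
        feats = self.spatial(x.reshape(b * t, 1, *x.shape[2:]))
        out, _ = self.temporal(feats.reshape(b, t, -1))
        return self.score(out[:, -1])

model = SpatioTemporalAD()
maps = torch.randn(2, 5, 10, 16, 16)             # toy digi-occupancy sequence
print(model(maps).shape)                          # torch.Size([2, 1])
```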
DOI: 10.1088/1742-6596/1828/1/012133
2021
Thermal Detection for Free Flight
Abstract Thermals are regions of rising hot air formed at the ground through the warming of the surface by the sun. Thermals are commonly used by birds and glider pilots to extend flight duration, increase cross-country distance, or simply to conserve energy. This kind of powerless flight using natural sources of lift is called soaring. Once a thermal is encountered, the pilot flies in circles to stay within the thermal, gaining altitude before flying off to the next thermal and towards the destination. A single thermal can net a pilot thousands of meters of elevation gain. Estimating thermal locations is not an easy task; pilots look for different indicators, such as color variation on the ground (because the amount of heat absorbed by the ground varies with its color and composition), birds circling in an area, and certain types of cloud formations (cumulus clouds). These methods are not always reliable, so pilots also study the conditions for thermals by estimating the solar heating of the ground (cloud cover and time of year/date) as well as the lapse rate and dew point of the air. In this paper, we present a machine learning based solution to forecast thermals. Since pilots generally record many of their flights locally and sometimes upload them to databases, we use the uploaded flight data to determine where pilots encountered thermals and, together with other information (weather and satellite images corresponding to the location and time of the flight), train an algorithm to automatically predict the location of thermals given the current weather conditions and terrain information (obtained from Google Earth Engine) as input. Results show that our model is able to converge on the training and validation sets with a loss below 1%.
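A toy sketch of the supervised setup described (tabular weather/terrain features labeled by whether a flight log shows lift there). The feature names and synthetic labels below are invented for illustration; the paper's actual inputs include satellite imagery, omitted here:

```python
# Sketch: binary "thermal encountered" classifier on tabular features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(0, 1, n),      # normalized solar irradiance (illustrative)
    rng.uniform(0, 1, n),      # terrain slope (illustrative)
    rng.uniform(0, 1, n),      # surface albedo proxy (illustrative)
])
y = ((X[:, 0] > 0.6) & (X[:, 2] < 0.4)).astype(int)  # synthetic labeling rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```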
DOI: 10.1007/978-981-15-6743-8
2020
Aircraft Valuation
DOI: 10.48550/arxiv.2203.12035
2022
Displaying dark matter constraints from colliders with varying simplified model parameters
The search for dark matter is one of the main science drivers of the particle and astroparticle physics communities. Determining the nature of dark matter will require a broad approach, with a range of experiments pursuing different experimental hypotheses. Within this search program, collider experiments provide insights on dark matter which are complementary to direct/indirect detection experiments and to astrophysical evidence. To compare results from a wide variety of experiments, a common theoretical framework is required. The ATLAS and CMS experiments have adopted a set of simplified models which introduce two new particles, a dark matter particle and a mediator, and whose interaction strengths are set by the couplings of the mediator. So far, the presentation of LHC and future hadron collider results has focused on four benchmark scenarios with specific coupling values within these simplified models. In this work, we describe ways to extend those four benchmark scenarios to arbitrary couplings, and release the corresponding code for use in further studies. This will allow for more straightforward comparison of collider searches to accelerator experiments that are sensitive to smaller couplings, such as those for the US Community Study on the Future of Particle Physics (Snowmass 2021), and will give a more complete picture of the coupling dependence of dark matter collider searches when compared to direct and indirect detection searches. By using semi-analytical methods to rescale collider limits, we drastically reduce the computing resources needed relative to traditional approaches based on the generation of additional simulated signal samples.
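As a rough guide to how such rescaling works (our paraphrase of the standard narrow-width argument, not a formula quoted from the paper): for an s-channel mediator produced from quarks with coupling $g_q$ and decaying to dark matter with coupling $g_\chi$, the signal rate scales approximately as

$$\sigma \times \mathrm{BR} \;\propto\; \frac{g_q^2\, g_\chi^2}{\Gamma_{\text{med}}},$$

so a limit derived at one benchmark coupling pair can be mapped to other coupling values analytically, without generating new simulated samples, provided the narrow-width approximation remains valid.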
DOI: 10.36001/phme.2022.v7i1.3367
2022
Long Horizon Anomaly Prediction in Multivariate Time Series with Causal Autoencoders
Predictive maintenance is essential for complex industrial systems to foresee anomalies before major system faults or ultimate breakdown. However, the existing efforts on Industry 4.0 predictive monitoring are directed at semi-supervised anomaly detection with limited robustness for large systems, which are often accompanied by uncleaned and unlabeled data. We address the challenge of predicting anomalies through data-driven end-to-end deep learning models using early warning symptoms on multivariate time series sensor data. We introduce AnoP, a long multi-timestep anomaly prediction system based on unsupervised attention-based causal residual networks, to raise alerts for anomaly prevention. The experimental evaluation on large data sets from detector health monitoring of the Hadron Calorimeter of the CMS Experiment at LHC CERN demonstrates the promising efficacy of the proposed approach. AnoP predicted around 60% of the anomalies up to seven days ahead, and the majority of the missed anomalies are abnormalities with unpredictable noisy-like behavior. Moreover, it has discovered previously unknown anomalies in the calorimeter’s sensors.
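As a sketch of the generic mechanism behind causal residual networks for multivariate series (dilated convolutions that only look backward in time), in PyTorch; this is an illustration, not the published AnoP architecture:

```python
# Sketch: causal dilated conv blocks with residual connections, predicting
# a future window of the multivariate series.
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    def __init__(self, ch, dilation):
        super().__init__()
        self.pad = (3 - 1) * dilation           # left-pad only => causal
        self.conv = nn.Conv1d(ch, ch, kernel_size=3, dilation=dilation)

    def forward(self, x):                        # x: (B, C, T)
        out = self.conv(nn.functional.pad(x, (self.pad, 0)))
        return torch.relu(out) + x               # residual connection

class AnomalyPredictSketch(nn.Module):
    def __init__(self, channels=16, horizon=24):
        super().__init__()
        self.blocks = nn.Sequential(*[CausalConvBlock(channels, d)
                                      for d in (1, 2, 4, 8)])
        self.head = nn.Conv1d(channels, channels, kernel_size=1)
        self.horizon = horizon

    def forward(self, x):                        # predict a future window
        return self.head(self.blocks(x))[..., -self.horizon:]

model = AnomalyPredictSketch()
x = torch.randn(4, 16, 128)                      # (batch, sensors, time)
print(model(x).shape)                            # torch.Size([4, 16, 24])
```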
DOI: 10.3389/fped.2022.953122
2022
Extracorporeal membrane oxygenation in the care of a preterm infant with COVID-19 infection: Case report
Coronavirus disease 2019 (COVID-19) was first reported to the World Health Organization (WHO) in December 2019 and has since unleashed a global pandemic, with over 518 million cases as of May 10, 2022. Neonates represent a very small proportion of those patients. Among reported cases of neonates with symptomatic COVID-19 infection, the rates of hospitalization remain low. Most reported cases in infants and neonates are community acquired with mild symptoms, most commonly fever, rhinorrhea and cough. Very few require intensive care or invasive support for acute infection. We present a case of a 2-month-old former 26-week gestation infant with a birthweight of 915 grams and diagnoses of mild bronchopulmonary dysplasia and a small ventricular septal defect who developed acute respiratory decompensation due to COVID-19 infection. He required veno-arterial extracorporeal membrane oxygenation support for 23 days. Complications included liver and renal dysfunction and a head ultrasound notable for lenticulostriate vasculopathy, extra-axial space enlargement and patchy periventricular echogenicity. The patient was successfully decannulated to conventional mechanical ventilation with subsequent extubation to non-invasive respiratory support. He was discharged home at 6 months of age with supplemental oxygen via nasal cannula and gastrostomy tube feedings. He continues to receive outpatient developmental follow-up. To our knowledge, this is the first case report of a preterm infant surviving ECMO for COVID-19 during the initial hospitalization.
DOI: 10.1088/1742-6596/898/8/082024
2017
Efficient Access to Massive Amounts of Tape-Resident Data
Randomly restoring files from tapes degrades read performance, primarily due to frequent tape mounts. The high-latency, time-consuming tape mounts and dismounts are a major issue when accessing massive amounts of data from tape storage. BNL's mass storage system currently holds more than 80 PB of data on tapes, managed by HPSS. To restore files from HPSS, we make use of a scheduling software called ERADAT. This scheduler was originally based on code from Oak Ridge National Lab, developed in the early 2000s. After major modifications and enhancements, ERADAT now provides advanced HPSS resource management, priority queuing, resource sharing, web-browser visibility of real-time staging activities, and advanced real-time statistics and graphs. ERADAT is integrated with ACSLS and HPSS for near real-time mount statistics and resource control in HPSS. ERADAT is also the interface between HPSS and other applications such as the locally developed Data Carousel, providing fair resource-sharing policies and related capabilities. ERADAT has demonstrated great performance at BNL.
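The scheduling idea is simple to state in code. A minimal sketch (field names invented; ERADAT itself is far more elaborate, with fair-share policies and HPSS integration):

```python
# Sketch: batch recall requests per tape and read each tape in logical
# position order, so each tape is mounted once and read near-sequentially.
from collections import defaultdict

def order_recalls(requests):
    """requests: list of dicts with 'path', 'tape', 'position' keys."""
    by_tape = defaultdict(list)
    for req in requests:
        by_tape[req["tape"]].append(req)
    plan = []
    for tape in sorted(by_tape):                       # one mount per tape
        plan.extend(sorted(by_tape[tape], key=lambda r: r["position"]))
    return plan

reqs = [
    {"path": "/a", "tape": "T2", "position": 900},
    {"path": "/b", "tape": "T1", "position": 510},
    {"path": "/c", "tape": "T2", "position": 12},
    {"path": "/d", "tape": "T1", "position": 44},
]
for r in order_recalls(reqs):
    print(r["tape"], r["position"], r["path"])
```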
2019
Playing Fast Not Loose: Evaluating team-level pace of play in ice hockey using spatio-temporal possession data
Pace of play is an important characteristic in hockey as well as other team sports. We provide the first comprehensive study of pace within the sport of hockey, focusing on how teams and players impact pace in different regions of the ice, and the resultant effect on other aspects of the game. First we examined how pace of play varies across the surface of the rink, across different periods, at different manpower situations, between different professional leagues, and through time between seasons. Our analysis of pace by zone helps to explain some of the counter-intuitive results reported in prior studies. For instance, we show that the negative correlation between attacking speed and shots/goals is likely due to a large decline in attacking speed in the OZ. We also studied how pace impacts the outcomes of various events. We found that pace is positively-correlated with both high-danger zone entries (e.g. odd-man rushes) and higher shot quality. However, we find that passes with failed receptions occur at higher speeds than successful receptions. These findings suggest that increased pace is beneficial, but perhaps only up to a certain extent. Higher pace can create breakdowns in defensive structure and lead to better scoring chances but can also lead to more turnovers. Finally, we analyzed team and player-level pace in the NHL, highlighting the considerable variability in how teams and players attack and defend against pace. Taken together, our results demonstrate that measures of team-level pace derived from spatio-temporal data are informative metrics in hockey and should prove useful in other team sports.
DOI: 10.1016/s0889-857x(03)00053-x
2003
Spondyloarthropathies
2015
Searches for new phenomena using events with three or more charged leptons in pp collisions at $\sqrt{s}=8$ TeV with the ATLAS detector at the LHC
Author(s): Yu, David Ren-Hwa | Advisor(s): Heinemann, Beate E | Abstract: This dissertation presents two searches for phenomena beyond the Standard Model using events with three or more charged leptons. The searches are based on 20.3 fb⁻¹ of proton-proton collision data with a center-of-mass energy of √s = 8 TeV collected by the ATLAS detector at the CERN Large Hadron Collider in 2012. The first is a model-independent search for excesses beyond Standard Model expectations in many signal regions. The events are required to have at least three charged leptons, of which at least two are electrons or muons, and at most one is a hadronically decaying τ lepton. The selected events are categorized based on the flavor and charge of the leptons, and the signal regions are defined using several kinematic variables sensitive to beyond-the-Standard-Model phenomena. The second search looks for new heavy leptons decaying resonantly to three electrons or muons, two of which are produced through an intermediate Z boson. The resonant decay produces a narrowly peaked excess in the trilepton mass spectrum. In both cases, no significant excess beyond Standard Model expectations is observed, and the data are used to set limits on models of new physics. The model-independent trilepton search is used to confront a model of doubly charged scalar particles decaying to eτ or μτ, excluding masses below 400 GeV at 95% confidence level. The trilepton resonance search is used to test models of vector-like leptons and the type III neutrino seesaw mechanism. The vector-like lepton model is excluded for most of the mass range 114 GeV − 176 GeV, while the type III seesaw model is excluded for most of the mass range 100 GeV − 468 GeV. Both searches also present tools to facilitate reinterpretations in the context of other models predicting the production of three or more charged leptons.
DOI: 10.4049/jimmunol.196.supp.145.8
2016
The immunotherapeutic GEN-003 as a prophylactic vaccine candidate for genital herpes in guinea pigs
Abstract GEN-003 is a subunit vaccine candidate containing adjuvanted gD and ICP4 antigens that has shown durable therapeutic efficacy against genital herpes in a phase 2 clinical trial. Here we describe the potential benefit of GEN-003 as a prophylactic vaccine in a guinea pig model of genital herpes. Animals were immunized three times every two weeks with GEN-003 or control and intravaginally challenged with herpes simplex virus 2 (HSV-2, MS strain) 21 days later. GEN-003 significantly reduced the frequency and severity of genital lesions in both acute and early stages of recurrent infection by over 85%, and 60% of vaccinated guinea pigs did not develop any lesions during the 33-day follow-up period. Furthermore, viral genome copy numbers recovered in vaginal swabs from GEN-003-immunized animals were 100-fold lower than from controls. Recurrent shedding frequencies were reduced by 27%. GEN-003 vaccination induced antigen-specific antibody responses at least as strong as those elicited by viral infection. Driven by gD-specific antibodies, HSV-2 neutralizing antibody titers followed the same pattern. After viral challenge, peripheral antigen-specific T cell responses in spleens of GEN-003-immunized animals were lower than those in controls, as measured by interferon-γ ELISpot. This might have resulted from increased antigen-specific T cell recruitment to the genital mucosa, or decreased exposure to the virus due to fewer recurrences. Additional methods to better understand mucosal mechanisms of protection are in development. In summary, GEN-003 shows promise as a prophylactic vaccine candidate for genital herpes. Ongoing studies will determine if protection can be further enhanced by addition or exchange of antigens and/or adjuvants.
DOI: 10.22323/1.191.0121
2013
Searches for new Physics in events with multiple leptons with the ATLAS detector
2013
Luminosity Determination in 7 TeV pp Collisions with the ATLAS Detector
DOI: 10.22323/1.093.0023
2011
BNL Batch and DataCarousel systems at BNL: A tool and UI for efficient access to data on tape with fair-share policies capabilities
DOI: 10.1167/10.7.1340
2010
Semantic and identical word priming reduces peripheral word crowding and reaction times
Visual crowding dramatically reduces the perception of word form and meaning in peripheral vision. Previous research demonstrates that crowded letters and shapes can be “demasked” when the same form is simultaneously presented at the fovea, provided the central and peripheral figures are similar in orientation and contrast (Geiger, Lettvin, Perception, 1985). The present study explored whether a briefly presented, but visible, identical or semantically related foveal prime word would “uncrowd” flanked peripheral words compared to an unrelated prime condition. We tested eighteen observers across three target eccentricities (fovea, 4°, and 8°) and three prime-target relationships (unrelated, identical, or semantically related). Using a dual-task design, participants first discriminated target words from non-words in a 2 AFC peripheral lexical decision task (LDT). In the second task, observers gave a letter-by-letter account of the target, providing a more detailed report of the deleterious effects of crowding. Across all eccentricities, LDT percent correct and discriminability (d′) improved modestly, but not significantly, on identical and semantically related trials compared to the unrelated prime condition. However, the letter-by-letter analysis revealed that identical and semantically related word trials reduced crowding and reaction times. Subjects correctly reported all letters of the crowded word more often in both the identical and semantically related conditions (F(2,34)= 15.6, p<0.01). Further, on correct LDT trials, reaction times were reduced for both identical and related word pairs across all eccentricities (F(2,34)= 12.2, p<0.01). Taken together, these results suggest that LDT does not provide the most sensitive measure of semantic influences on word crowding. More importantly, these results demonstrate that semantic priming at the fovea can reduce peripheral word crowding and facilitate reaction times. This adds to a growing body of research demonstrating that the resolution of vision in crowded scenes can be modified by learning and prior experience.
DOI: 10.1088/1742-6596/119/5/052031
2008
Integrated RT-Nagios system at the BNL US ATLAS Tier 1 computing center
We present an integrated system for monitoring a heterogeneous computing facility, and for managing user problem reports.
DOI: 10.2139/ssrn.4034713
2022
Advancing Platooning with ADAS Control Integration and Assessment in Real-World Driving Scenarios
DOI: 10.48550/arxiv.2209.03505
2022
Background Monte Carlo Samples for a Future Hadron Collider
A description of Standard Model background Monte Carlo samples produced for studies related to future hadron colliders.
DOI: 10.2172/1887258
2022
Background Monte Carlo Samples for a Future Hadron Collider
An investigation of the radiation levels in the SSC detectors was undertaken by D. Groom and colleagues, in the context of the "task force on radiation levels in the SSC interaction regions." The method consisted essentially of an analytic approach, using standard descriptions of average events in conjunction with simulations of secondary processes. Following Groom's work, extensive Monte Carlo simulations were performed to address the issues of backgrounds and radiation environments for the GEM and SDC experiments proposed at the SSC, and for the ATLAS and CMS experiments planned for the LHC. The purpose of the present article is to give a brief summary of some aspects of the methods, assumptions, and calculations performed to date (principally for the SSC detectors), and to stress the relevance of such calculations to the detectors proposed for the study of B-physics in particular.
DOI: 10.1109/nysds.2018.8538947
2018
Performance Evaluation for Tape Storage Data Recall with T10KD Drive
Tape library storage is the best solution for archiving big data due to its high capacity, low cost, and long retention. However, because tape is a sequential medium, its main drawback versus disk storage is slow performance for data recall in random access order. With the technology emerging in modern enterprise tape drives, Recommended Access Order (RAO) should improve data recall performance considerably. The Scientific Data and Computing Center (SDCC) at Brookhaven National Lab (BNL) operates a tape library storage system which contains more than 100 PB of data, managed by the High Performance Storage System (HPSS) software. Starting from the HPSS 7.5.1 release, a new feature named Tape Ordered Recall (TOR) has offered a great advantage for recalling large amounts of data. It is designed to reduce the file access time from tape media. In our tests with an Oracle T10K enterprise drive, TOR's tape recall rate for small files (1 GB) is almost 4 times that of unordered recall. TOR supports both RAO and non-RAO drives: it uses the RAO feature if available; otherwise, it orders recalls by linear offset. Additionally, Quaid, a smart recall tool introduced in HPSS 7.5.1, uses the new batch stage API to stage files in the background, improving tape data recall performance significantly. Efficient Retrieval and Access to Data Archived on Tape, aka ERADAT, is a tape storage data recall tool developed mainly at the SDCC. It sorts staging requests by the files' logical position order before submitting jobs to HPSS. Over a decade of use and improvement, ERADAT has proven a great performer serving BNL's worldwide scientific community. In this paper we present test results for each individual tool under specific test settings and configurations, analyze and evaluate their performance advantages and disadvantages, and find the best solution for BNL's tape storage production system for different data characteristics.
DOI: 10.1594/essr2018/p-0157
2018
Dual energy CT for gout: Important concepts and pitfalls
Poster: ESSR 2018 / P-0157 / Dual energy CT for gout: Important concepts and pitfalls, by J. S. B. Kho and D. Yu; Brighton/UK.
DOI: 10.1051/epjconf/201921404022
2019
Best Practices in Accessing Tape-Resident Data in HPSS*
Tape is an excellent choice for archival storage because of its capacity, cost per GB, and long retention intervals, but its main drawback is slow access time due to the sequential nature of the medium. Modern enterprise tape drives now support Recommended Access Ordering (RAO), which is designed to reduce data recall/retrieval times. BNL SDCC's mass storage system currently holds more than 100 PB of data on tapes, managed by HPSS. Starting with HPSS version 7.5.1, a new feature called "Tape Order Recall" (TOR) has been introduced. It supports both RAO and non-RAO drives. File access performance can be increased by 30% to 60% over random file access. Prior to HPSS 7.5.1, we had been using an in-house developed scheduling software, aka ERADAT, which accesses files based on their logical position order and has demonstrated great performance over the past decade of use at BNL. In this paper we present a series of test results and compare TOR's and ERADAT's performance under different configurations, showing how effectively TOR (RAO) and ERADAT perform and which is the best solution for data recall from SDCC's tape storage.
DOI: 10.22323/1.350.0184
2019
Searches for hadronic resonances in CMS
New particles decaying to jet pairs, predicted by many theories of physics beyond the Standard Model, could be discovered at the Large Hadron Collider as resonances in the dijet invariant mass spectrum. Three searches for dijet resonances by the CMS Collaboration are presented, which employ several novel techniques to probe a wide range of resonance masses. The first search targets resonance masses above 1800 GeV; a new background method using jet pairs with large pseudorapidity separation is described. The latter two searches target resonances below 450 GeV, where trigger bandwidth limitations preclude the methods used at high masses. The resonances are required to be produced with high transverse momentum due to significant initial state radiation (ISR), so the resonance decay products are collimated into a single jet. The searches, which require either ISR jets or photons, probe resonance masses from 50 GeV to 450 GeV and from 10 GeV to 125 GeV, respectively.
DOI: 10.4236/gep.2020.85012
2020
Developing a Novel Approach for Sludge Treatment Using Microwaves Technology
The purpose of this research is to find a method that can improve the cost and efficiency of sludge treatment. Currently, large amounts of sludge are produced every day, but sludge treatment is neither efficient nor profitable. To improve the sludge treatment process, we proposed using microwave technology to treat sludge. We hypothesized that with microwave technology we could reduce the volume of the sludge by up to 90% and save more energy and time compared to the traditional methods currently used to treat sludge. To test our hypothesis, we designed an experiment comparing the solid-liquid boundary height and the solid-liquid mass ratio of sludge treated by the conventional method and by microwave technology. The optimal temperature and time found for dewatering sludge were 70 degrees Celsius and five minutes. The results were rather surprising, as microwave heating demonstrated no significant advantage over conventional heating. The solid-liquid boundary heights of sludge heated by the conventional and microwave methods were 22.34 mL and 22.56 mL; the solid-liquid mass ratios at 70 degrees Celsius were 14.28% and 14.55% (by separation with a filter press), or 9.82% and 9.89% (by centrifugation). In conclusion, the difference is negligible.
DOI: 10.46855/2020.05.02.09.17.141758
2020
Multi-stage transport and logistic optimization for the mobilized and distributed battery
DOI: 10.48550/arxiv.1902.02397
2019
Winning Is Not Everything: A contextual analysis of hockey face-offs
This paper takes a different approach to evaluating face-offs in ice hockey. Instead of looking at win percentages, the de facto measure of successful face-off takers for decades, we focus on the game events following the face-off and how directionality, clean wins, and player handedness play a significant role in creating value. We demonstrate that not all face-off wins are created equal: some players consistently create post-face-off value through clean wins and by directing the puck to high-value areas of the ice. As a result, we propose an expected-events face-off model as well as a wins-above-expected model that take into account the value added on a face-off by targeting the puck to specific areas of the ice in various contexts, as well as the impact this has on subsequent game events.
DOI: 10.48550/arxiv.1902.02020
2019
Playing Fast Not Loose: Evaluating team-level pace of play in ice hockey using spatio-temporal possession data
Pace of play is an important characteristic in hockey as well as other team sports. We provide the first comprehensive study of pace within the sport of hockey, focusing on how teams and players impact pace in different regions of the ice, and the resultant effect on other aspects of the game. First we examined how pace of play varies across the surface of the rink, across different periods, at different manpower situations, between different professional leagues, and through time between seasons. Our analysis of pace by zone helps to explain some of the counter-intuitive results reported in prior studies. For instance, we show that the negative correlation between attacking speed and shots/goals is likely due to a large decline in attacking speed in the OZ. We also studied how pace impacts the outcomes of various events. We found that pace is positively-correlated with both high-danger zone entries (e.g. odd-man rushes) and higher shot quality. However, we find that passes with failed receptions occur at higher speeds than successful receptions. These findings suggest that increased pace is beneficial, but perhaps only up to a certain extent. Higher pace can create breakdowns in defensive structure and lead to better scoring chances but can also lead to more turnovers. Finally, we analyzed team and player-level pace in the NHL, highlighting the considerable variability in how teams and players attack and defend against pace. Taken together, our results demonstrate that measures of team-level pace derived from spatio-temporal data are informative metrics in hockey and should prove useful in other team sports.
DOI: 10.1109/iccsai53272.2021.9609741
2021
Utilization Big Data and GPS to Help E-TLE System in The Cities of Indonesia
The Indonesian government is currently advancing Electronic Traffic Law Enforcement (E-TLE). Over the last three years, since E-TLE was launched by the Ditlantas Polda Metro Jaya, the government has continued developing a system to ensure the smooth and safe handling of traffic violations in Indonesian cities. Big data analytics helps record and manage the large amount of data on traffic violations, while GPS captures movement activity and can be used for identification and analysis. We used a literature review to find out how to develop the E-TLE concept and system and implement it in Indonesia. The main problems in Indonesia are the lack of equipment to support all traffic systems and the fact that the accuracy of the inference algorithm varies greatly depending on the size of the collected data sample. In this study, we identify the problems that can be caused by the system and determine how to solve them, using the literature review method to identify, understand, and transmit information that can help the E-TLE system. We conclude that big data is a must for implementing the E-TLE system; furthermore, GPS and radar sensors are critical data sources alongside CCTV for electronic traffic law enforcement.
DOI: 10.1109/tip.2021.3129405
2021
IEEE Signal Processing Society Information
DOI: 10.1163/15406253-00702004
1980
The Conceptions of Self in Whitehead and Chu Hsi