
C. Schäfer

Here are all the papers by C. Schäfer that you can download and read on OA.mg.
Download C. Schäfer PDFs here.

DOI: 10.1103/physrevd.102.042001
2020
Cited 25 times
Limits from the FUNK experiment on the mixing strength of hidden-photon dark matter in the visible and near-ultraviolet wavelength range
We present results from the FUNK experiment in the search for hidden-photon dark matter. Near the surface of a mirror, hidden photons may be converted into ordinary photons. These photons are emitted perpendicular to the surface and have an energy equal to the mass of the dark matter hidden photon. Our experimental setup consists of a large, spherical mirror with an area of more than 14 m$^2$, which concentrates the emitted photons into its central point. Using a detector sensitive to visible and near-UV photons, we can exclude a kinetic-mixing coupling stronger than $\chi \approx 10^{-12}$ in the mass range of 2.5 to 7 eV, assuming hidden photons comprise all of the dark matter. The experimental setup and analysis used to obtain this limit are discussed in detail.
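The quoted mass window maps directly onto a photon wavelength window via $\lambda = hc/E$. As a quick illustrative check (a sketch using standard constants, not code from the experiment), the snippet below converts the 2.5 to 7 eV range into wavelengths, landing in the visible to near-UV band:

# Illustrative sketch (not from the paper): convert a hidden-photon mass in eV
# into the wavelength of the emitted ordinary photon, lambda = h*c / E.
H_C_EV_NM = 1239.84  # h*c in eV*nm

def mass_ev_to_wavelength_nm(mass_ev):
    """Photon wavelength in nm for a hidden-photon mass given in eV."""
    return H_C_EV_NM / mass_ev

for m in (2.5, 7.0):
    print(f"{m:4.1f} eV  ->  {mass_ev_to_wavelength_nm(m):6.1f} nm")
# 2.5 eV corresponds to about 496 nm (visible), 7 eV to about 177 nm (UV).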
DOI: 10.1051/0004-6361/202347781
2024
Metal-silicate mixing in planetesimal collisions
Impacts between differentiated planetesimals are ubiquitous in protoplanetary discs and may mix materials from the core, mantle, and crust of planetesimals, thus forming stony-iron meteorites. The surface composition of the asteroid (16) Psyche represents a mixture of metal and non-metal components. However, the velocities, angles, and outcome regimes of impacts that mixed metal and silicate from different layers of planetesimals are debated. Our aim is to investigate the impacts between planetesimals that can mix large amounts of metal and silicate, and the mechanism of stony-iron meteorite formation. We used smooth particle hydrodynamics to simulate the impacts between differentiated planetesimals with various initial conditions that span different outcome regimes. In our simulations, the material strength was included and the effects of the states of planetesimal cores were studied. Using a statistical approach, we quantitatively analysed the distributions of metal and silicate after impacts. Our simulations modelled the mass, depth, and sources of the metal-silicate mixture in different impact conditions. Our results suggest that the molten cores in planetesimals could facilitate mixing of metal and silicate. Large amounts of the metal-silicate mixture could be produced by low-energy accretional impacts and high-energy erosive impacts in the largest impact remnant, and by hit-and-run and erosive impacts in the second-largest impact remnant. After impact, most of the metal-silicate mixture was buried at depth, consistent with the low cooling rates of stony-iron meteorites. Our results indicate that mesosiderites potentially formed in an erosive impact, while pallasites potentially formed in an accretional or hit-and-run impact. The mixing of metal and non-metal components on Psyche may also be the result of impacts.
DOI: 10.1016/j.ijleo.2021.167169
2021
Cited 5 times
Use of a general purpose integrating sphere as a low intensity near-UV extended uniform light source
Nowadays, the need for extended uniform light sources is increasing in many different areas of optics, and integrating spheres offer some of the best properties among such sources. The paper focuses on verifying the practical use of so-called general purpose integrating spheres for the calibration of highly sensitive detectors in the near-UV spectrum. To achieve this, the theory of integrating spheres is summarized, the measurement methodology is described, and a universal experimental setup is designed. The setup is able to measure the real spatial and angular radiance uniformity of the exit port of a selected representative integrating sphere. The results of the radiance uniformity measurements are graphically presented and discussed.
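For orientation, the exit-port brightness of an idealized integrating sphere is usually described by the textbook relation (a standard result quoted here for context, not taken from the paper): $L = \frac{\Phi_i}{\pi A_s}\,\frac{\rho}{1-\rho(1-f)}$, where $\Phi_i$ is the input flux, $A_s$ the sphere surface area, $\rho$ the wall reflectance and $f$ the port fraction. The factor $\rho/[1-\rho(1-f)]$ is the sphere multiplier, and the deviations of a real general purpose sphere from this ideal behaviour are exactly what the radiance uniformity measurement probes.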
2015
Cited 5 times
Search for dark matter in the hidden-photon sector with a large spherical mirror
If dark matter consists of hidden-sector photons which kinetically mix with regular photons, a tiny oscillating electric-field component is present wherever we have dark matter. In the surface of conducting materials this induces a small probability to emit single photons almost perpendicular to the surface, with the corresponding photon frequency matching the mass of the hidden photons. We report on a construction of an experimental setup with a large ~14 m2 spherical metallic mirror that will allow for searches of hidden-photon dark matter in the eV and sub-eV range by application of different electromagnetic radiation detectors. We discuss sensitivity and accessible regions in the dark matter parameter space.
DOI: 10.22323/1.301.0880
2017
Cited 5 times
Search for hidden-photon Dark Matter with the FUNK experiment
Many extensions of the Standard Model of particle physics predict a parallel sector of a new U(1) symmetry, giving rise to hidden photons. These hidden photons are candidate particles for cold dark matter. They are expected to kinetically mix with regular photons, which leads to a tiny oscillating electric-field component accompanying dark matter particles. A conducting surface can convert such dark matter particles into photons which are emitted almost perpendicularly to the surface. The corresponding photon frequency follows from the mass of the hidden photons. In this contribution we present a preliminary result on a hidden photon search in the visible and near-UV wavelength range that was done with a large, 14 m2 spherical metallic mirror and discuss future dark matter searches in the eV and sub-eV range by application of different detectors for electromagnetic radiation.
DOI: 10.1016/j.nima.2003.08.126
2003
Cited 7 times
Experience with the L3 vertex drift chamber at LEP
The vertex drift chamber of the L3 Experiment at LEP, based on the time expansion principle, was in operation from the start-up of LEP in 1989 until the shutdown of LEP in 2000. The gas mixture used was 80% CO2 and 20% i-C4H10 at a pressure of 1200 mbar. We present the design of the chamber, the infrastructure and the performance during the 11 years of operation. The total radiation received on the anode wires was ∼10⁻⁴ C/cm. No degradation of the anode pulse amplitude, wire efficiencies and resolution was observed for the whole running period.
DOI: 10.1080/00295450.2017.1291227
2017
Cited 3 times
An Alternative Method for Thermal Plume–Induced Aerosol Release and Deposition Calculations in Large Geometries Using fireFoam
Being a particle physics laboratory, the European Organization for Nuclear Research (CERN) plans, constructs, and maintains installations emitting ionizing radiation during operation. Activation of the material present is a consequence. Hence, fire scenarios for certain CERN installations must take into account the presence of radioactive material. Releases of gaseous, liquid, or solid combustion products, e.g., attached to aerosols, have so far been taken into account by a worst-case approach. Scenarios taking place in underground installations hence assume a smoke transport coefficient of 100% of the release toward the surface level, independent of the local geometry. For a radioactive inventory identified in a certain fire load, this results in a conservative release. To overcome this conservative worst-case approach, a computational fluid dynamics model based on FM Global’s fireFoam 2.2.x is proposed. Its Lagrangian library was modified in order to provide aerosol release and deposition information based on more detailed interaction data between Lagrangian particles and their surrounding geometry. Results are shown for a CERN-typical large-scale experimental cavern placed 100 m below surface level. A simple diffusion burner is modeled inside the cavern to create a thermal plume emerging from a 1.5-MW fire over 14 min. Lagrangian particles are used to model aerosols with aerodynamic diameters of 1, 10, and 100 μm, injected into the emerging thermal plume. Results for particle release and deposition vary according to aerodynamic diameter. In the present case, maxima of ~32% and ~39% are found for 1- and 10-μm particles, respectively, being released to the surface level.
DOI: 10.22323/1.395.0220
2021
Cited 3 times
The XY Scanner - A Versatile Method of the Absolute End-to-End Calibration of Fluorescence Detectors
One of the crucial detector systems of the Pierre Auger Observatory is the fluorescence detector composed of 27 large-aperture wide-angle Schmidt telescopes. In the past, these telescopes were absolutely calibrated by illuminating the whole aperture with a uniform large-diameter light source. This absolute calibration was performed roughly once every three years, while a relative calibration was performed on a nightly basis. In this contribution, a new technique for an absolute end-to-end calibration of the fluorescence telescopes is presented. For this technique, a portable, calibrated light source mounted on a rail system is moved across the aperture of each telescope instead of illuminating the whole aperture at once. A dedicated setup using a combination of NIST traceable photodiodes to measure the mean intensity and a PMT for pulse-to-pulse stability tracking has been built for the absolute calibration of the light source. As a result of these complementary measurements, the pulse-to-pulse light source intensity can be known to the 3.5% uncertainty level. The analysis of the readout of the PMT camera at each position of the light source together with the knowledge of the light source emission provides an absolute end-to-end calibration of the telescope. We will give a brief overview of this novel calibration method and its current status as well as show preliminary results from the measurement campaigns performed so far.
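Conceptually, the end-to-end calibration constant is the ratio of the total camera response to the total number of photons delivered across all light-source positions. The following sketch illustrates that bookkeeping with entirely made-up numbers and array names (it is not the Observatory's analysis code):

import numpy as np

# Hypothetical illustration of the calibration bookkeeping: at each scan
# position we record the camera signal (ADC counts) and know the number of
# photons emitted toward the aperture from the light-source calibration.
rng = np.random.default_rng(42)
n_positions = 1700                                   # order of the number of scan positions
photons_per_flash = np.full(n_positions, 1.0e7)      # assumed known photon output per flash
adc_counts = rng.normal(5.0e4, 1.0e3, n_positions)   # fake camera readout per position

# Absolute end-to-end calibration constant: camera counts per delivered photon.
calib_constant = adc_counts.sum() / photons_per_flash.sum()
print(f"calibration constant ~ {calib_constant:.3e} ADC counts per photon")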
2015
The FUNK search for Hidden Photon Dark Matter in the eV range
We give a brief update on the search for Hidden Photon Dark Matter with FUNK. The experiment uses a large spherical mirror, which, if Hidden Photon Dark Matter exists in the accessible mass and coupling parameter range, would yield an optical signal in the mirror's center in an otherwise dark environment. After a test run with a CCD, preparations for a run with a low-noise PMT are under way and are described in these proceedings.
DOI: 10.1007/s00204-016-1719-6
2016
“Watching the Detectives” report of the general assembly of the EU project DETECTIVE Brussels, 24–25 November 2015
SEURAT-1 is a joint research initiative between the European Commission and Cosmetics Europe aiming to develop in vitro- and in silico-based methods to replace the in vivo repeated dose systemic toxicity test used for the assessment of human safety. As one of the building blocks of SEURAT-1, the DETECTIVE project focused on a key element on which in vitro toxicity testing relies: the development of robust and reliable, sensitive and specific in vitro biomarkers and surrogate endpoints that can be used for safety assessments of chronically acting toxicants, relevant for humans. The work conducted by the DETECTIVE consortium partners has established a screening pipeline of functional and "-omics" technologies, including high-content and high-throughput screening platforms, to develop and investigate human biomarkers for repeated dose toxicity in cellular in vitro models. Identification and statistical selection of highly predictive biomarkers in a pathway- and evidence-based approach constitute a major step in an integrated approach towards the replacement of animal testing in human safety assessment. To discuss the final outcomes and achievements of the consortium, a meeting was organized in Brussels. This meeting brought together data-producing and supporting consortium partners. The presentations focused on the current state of ongoing and concluding projects and the strategies employed to identify new relevant biomarkers of toxicity. The outcomes and deliverables, including the dissemination of results in data-rich "-omics" databases, were discussed as were the future perspectives of the work completed under the DETECTIVE project. Although some projects were still in progress and required continued data analysis, this report summarizes the presentations, discussions and the outcomes of the project.
DOI: 10.22323/1.444.0305
2023
A Novel Tool for the Absolute End-to-End Calibration of Fluorescence Telescopes -- The XY-Scanner
The Pierre Auger Observatory uses 27 large-aperture wide-angle Schmidt telescopes to measure the longitudinal profile of air showers using the air-fluorescence technique. Up to the year 2013, the absolute calibration of the telescopes was performed by mounting a uniform large-diameter light source on each of the telescopes and illuminating the entire aperture with a known photon flux. Due to the high amount of work and person-power required, this procedure was only carried out roughly once every three years, and a relative calibration was performed every night to track short-term changes. Since 2013, only the relative calibration has been performed. In this paper, we present a novel tool for the absolute end-to-end calibration of the fluorescence detectors, the XY-Scanner. The XY-Scanner uses a portable integrating sphere as a light source, which has been absolutely calibrated. This light source is installed onto a motorized rail system and moved across the aperture of each telescope. We mimic the illumination of the entire aperture by flashing the light source at ∼1700 positions evenly distributed across the telescope aperture. For the absolute calibration of the light source, we built a dedicated setup that uses a NIST-calibrated photodiode to measure the average photon flux and a PMT to track the pulse-to-pulse stability. We present the laboratory setups used to study the characteristics of the employed light sources and discuss the inter-calibration between selected telescopes.
DOI: 10.1016/j.ijleo.2023.171350
2023
Optical ray-tracing simulation method for the investigation of radiance non-uniformity of an integrating sphere
The paper deals with the possibilities and limits of optical simulations of extended uniform light sources. Optical simulations of ideal and real light sources are performed using optical ray-tracing software which is capable of simulating non-imaging systems such as a Lambertian disc and integrating spheres. The simulation methodology is validated by comparing the developed simulation model with a real experiment previously performed in an optical laboratory. The comparison is made on the radiance uniformity measurement and simulation results of a general purpose integrating sphere rather than on the less convenient and more frequent irradiance uniformity measurements. The aim is to develop a reliable simulation method for the investigation of the radiance non-uniformity of any real extended uniform light source.
2023
A qualitative analysis of the use of Book Creator functions while processing Fermi questions
DOI: 10.48550/arxiv.2209.06659
2022
Effects of impact and target parameters on the results of a kinetic impactor: predictions for the Double Asteroid Redirection Test (DART) mission
The Double Asteroid Redirection Test (DART) spacecraft will impact the asteroid Dimorphos on September 26, 2022 as a test of the kinetic impactor technique for planetary defense. The efficiency of the deflection following a kinetic impact can be represented using the momentum enhancement factor, Beta, which depends on factors such as the impact geometry and the specific target material properties. Currently, very little is known about Dimorphos and its material properties, which introduces uncertainty in the predictions of the deflection efficiency observables, including crater formation, ejecta distribution, and Beta. The DART Impact Modeling Working Group (IWG) is responsible for using impact simulations to better understand the results of the DART impact. Pre-impact simulation studies also provide considerable insight into how different properties and impact scenarios affect momentum enhancement following a kinetic impact. This insight provides a basis for predicting the effects of the DART impact and a first understanding of how to interpret results following the encounter. After the DART impact, the knowledge gained from these studies will inform the initial simulations that will recreate the impact conditions, including providing estimates for the potential material properties of Dimorphos and the Beta resulting from DART's impact. This paper summarizes, at a high level, what has been learned from the IWG simulations and experiments in preparation for the DART impact. While unknown, estimates of reasonable potential material properties of Dimorphos yield predictions for Beta of 1 to 5, depending on end-member cases in the strength regime.
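For context, the momentum enhancement factor Beta discussed above is commonly defined through the momentum balance of the impact (a standard relation, not a result of this paper). In the simplest head-on geometry, $M\,\Delta v = \beta\, m\, U$ and $\beta = 1 + p_{\mathrm{ej}}/(m\,U)$, where $m$ and $U$ are the impactor mass and speed, $M$ and $\Delta v$ the target mass and its change of velocity, and $p_{\mathrm{ej}}$ the net momentum carried away by ejecta. A value of $\beta = 1$ means no ejecta contribution, while the quoted predictions of 1 to 5 correspond to ejecta boosting the transferred momentum by up to several times.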
DOI: 10.48550/arxiv.1711.02958
2017
Search for hidden-photon dark matter with the FUNK experiment
Many extensions of the Standard Model of particle physics predict a parallel sector of a new U(1) symmetry, giving rise to hidden photons. These hidden photons are candidate particles for cold dark matter. They are expected to kinetically mix with regular photons, which leads to a tiny oscillating electric-field component accompanying dark matter particles. A conducting surface can convert such dark matter particles into photons which are emitted almost perpendicularly to the surface. The corresponding photon frequency follows from the mass of the hidden photons. In this contribution we present a preliminary result on a hidden photon search in the visible and near-UV wavelength range that was done with a large, 14 m2 spherical metallic mirror and discuss future dark matter searches in the eV and sub-eV range by application of different detectors for electromagnetic radiation.
DOI: 10.48550/arxiv.1711.02961
2017
Search for hidden-photon Dark Matter with FUNK
It has been proposed that an additional U(1) sector of hidden photons could account for the Dark Matter observed in the Universe. When passing through an interface of materials with different dielectric properties, hidden photons can give rise to photons whose wavelengths are related to the mass of the hidden photons. In this contribution we report on measurements covering the visible and near-UV spectrum that were done with a large, 14 m2 spherical metallic mirror and discuss future dark-matter searches in the eV and sub-eV range by application of different electromagnetic radiation detectors.
2008
Lusoria. Ein Römerschiff im Experiment. Rekonstruktion, Tests, Ergebnisse
DOI: 10.1016/0731-7085(92)80066-v
1992
Cited 3 times
Analysis of 3- and 4-monopivaloylepinephrine, degradation products in dipivefrin hydrochloride drug substance and ophthalmic formulations
DOI: 10.1007/s11623-015-0469-6
2015
Verschlüsseln in der Cloud
2015
Stadtpatrone – eine antike Erfindung
2014
ICE/AGE - Von der Anwendungsinsel zum digitalen Marktplatz und Hörsaal.
DOI: 10.2307/j.ctv1t4m1zc.44
2015
Inspiration and Impact of Seleucid Royal Representation
DOI: 10.48550/arxiv.1510.05869
2015
The FUNK search for Hidden Photon Dark Matter in the eV range
We give a brief update on the search for Hidden Photon Dark Matter with FUNK. The experiment uses a large spherical mirror, which, if Hidden Photon Dark Matter exists in the accessible mass and coupling parameter range, would yield an optical signal in the mirror's center in an otherwise dark environment. After a test run with a CCD, preparations for a run with a low-noise PMT are under way and are described in these proceedings.
2015
Die Abberufung des Mithridates durch Caligula aus Armenien. Ein Wendepunkt in der römischen Parther- und „Ostpolitik“?
DOI: 10.48550/arxiv.1509.02386
2015
Search for dark matter in the hidden-photon sector with a large spherical mirror
If dark matter consists of hidden-sector photons which kinetically mix with regular photons, a tiny oscillating electric-field component is present wherever we have dark matter. In the surface of conducting materials this induces a small probability to emit single photons almost perpendicular to the surface, with the corresponding photon frequency matching the mass of the hidden photons. We report on a construction of an experimental setup with a large ~14 m2 spherical metallic mirror that will allow for searches of hidden-photon dark matter in the eV and sub-eV range by application of different electromagnetic radiation detectors. We discuss sensitivity and accessible regions in the dark matter parameter space.
2013
Adaptiver, Interaktiver, Dynamischer Atlas zur Geschichte (AIDA). Visuelles Erkunden und interaktives Erleben der Geschichte
Adaptive, Interactive, Dynamic Atlas of History (AIDA): visual exploration and interactive experience of history. The main goal of the project is the development and maintenance of a database-generated, dynamic atlas of the history of Europe and the Mediterranean region, with comprehensive information on politics, the economy and social conditions, with a first major focus on Greco-Roman antiquity and a "vertical axis" through the Middle Ages, the modern era and contemporary history. By varying the query criteria, the historical interrelations are to be presented with a wealth of overview and detail maps in such a way that the atlas itself becomes a source of new insights. The maps are to provide access to extensive databases of source material and research results. These databases form the basis for maps that can be varied almost arbitrarily, so that, depending on the query parameters entered, thousands of variants can be visualized. Starting from the scholarly application for historical research and the provision of cartographic resources for academic teaching, the information system is to be opened up step by step for knowledge transfer via the Internet to schools and the general public. This also includes the development of new forms of usability and of innovative learning and training software by the Fraunhofer Institute in Erfurt and the Meme Media Laboratory of Hokkaido University in Sapporo. These new forms of knowledge transfer will themselves be a subject of investigation.
DOI: 10.1524/zfmw.2010.0023
2010
Maschinen aus Möglichkeiten: «Die Stadt ist unsere Fabrik»
2017
arXiv: Search for hidden-photon Dark Matter with FUNK
2017
Search for hidden-photon Dark Matter with FUNK
DOI: 10.1007/978-3-658-16301-3_25
2017
Maßnahmen
DOI: 10.2307/j.ctt183p4q9.10
2017
SALON PUBLIC HAPPINESS
DOI: 10.1007/978-3-658-16301-3_4
2017
EinBlick
DOI: 10.1007/978-3-8349-9409-7_6
2009
Kulturelle Bedeutung des Lesens
The book by the literary scholar Pierre Bayard could probably have prevented the escapades of one of the most bizarre neurotics of all time: Leonard Zelig, the charming, if fictional, title hero of Woody Allen's film classic. As a child in New York in the 1920s, he was asked by his classmates whether he knew Moby Dick. "I was ashamed to admit that I had not read it, so I lied." Zelig's fib is the prelude to a surreal transformation of his personality: from then on he involuntarily adapts to every environment like a chameleon, changing not only his voice but also his build and skin color, a compulsive virtuoso of inconspicuousness. Zelig would have been spared a great deal if he had had at hand, if not Moby Dick, then at least Bayard's essay: "How to Talk About Books You Haven't Read" (Bayard, 2007). The slim volume packs a punch: Bayard argues against the societal "compulsion to read". According to him, reading "is still the object of a form of sacralization". And competence in book discourse, that is, the ability to talk about books, is a social admission ticket and, conversely, produces exclusion. Those who cannot join in are left out.
DOI: 10.37307/j.2510-5116.2022.05.18
2022
Unterhalt: Neues aus Küche und Keller des BGH
1998
Spitzenmanagement in Republik und Kaiserzeit : die Prokuratoren von Privatpersonen im Imperium Romanum vom 2. Jh. v. Chr. bis zum 3. Jh. n. Chr.
2018
Potential sensitivity of dark-matter searches for hidden photons with the FUNK experiment
2018
FUNK: Search for hidden photon dark matter in visible range
DOI: 10.1115/gt2018-75033
2018
Aerodynamic Design of a Ported Shroud Casing Treatment for a Turbocharger Compressor of a Miller-Cycle Gasoline Engine
The emission laws for internal combustion engines are becoming more and more strict, so new concepts have to be implemented. In the so-called Miller approach the intake valve is closed before the intake stroke is finished, resulting in a lower combustion end temperature and pressure. As a negative consequence, the specific power of the engine is reduced. This disadvantage has to be compensated by an increased boost pressure delivered by a turbocharger compressor. For the turbomachinery, low-end-torque engine operation therefore means a compressor operating point at high pressure ratio and low mass flow, and thus an increased risk of surge. A cost-effective measure to establish a usable compressor map in this regime is a ported shroud casing treatment. Here, a circumferential cavity connects the flow channel at the inducer with the compressor housing inflow, allowing fluid to recirculate at low mass flows. Thereby, the gross inducer mass flow is increased, the flow is stabilized, and hence the surge line is improved. In this paper, a ported shroud casing treatment is developed employing CFD. The aim is to improve the surge line as well as the stability of the compressor characteristics while minimizing the impact on compressor efficiency at high flows and on the acoustic behaviour. In order to validate the performance of the design, standard hot gas measurements as well as acoustic measurements are conducted and analyzed. Furthermore, the impact of a commonly applied 90° inflow bend on the performance of the ported shroud cavity is investigated using experimental data.
DOI: 10.15496/publikation-25195
2018
Proceedings of the 4th bwHPC Symposium
DOI: 10.1007/978-3-658-26133-7
2019
Schnellstart Python
This book introduces programming with Python and enables a quick start toward the independent development of scripts. The modules NumPy, SciPy and Matplotlib are presented for users, and recent applications in big data science and machine learning are shown.
2019
Search for hidden-photon dark matter with FUNK
The FUNK (Finding $U(1)$s of Novel Kind) experiment was built to search for direct evidence of hidden-photon (HP) dark matter. We use a $15\,\mathrm{m}^2$ spherical metallic mirror as an HP-to-photon converter via a Maxwellian-like transition. Signals are expected as single-photon events at the radius point, where we mounted a low-noise photomultiplier. We scanned the whole optical frequency range, with an extension into the far-UV, and looked for HPs with masses between 2 and 8 eV. In our preliminary results, we found no event excess but provide the currently strongest limit for direct HP searches in this region.
DOI: 10.22323/1.358.0517
2019
Search for dark photons as candidates for Dark Matter with FUNK
An additional U(1) symmetry predicted in theories beyond the Standard Model of particle physics can give rise to hidden (dark) photons. Depending on the mass and density of these hidden photons, they could account for a large fraction of the Dark Matter observed in the Universe. When passing through an interface of materials with different dielectric properties, hidden photons are expected to produce a tiny flux of photons. The wavelength of these photons is directly related to the mass of the hidden photons. In this contribution we report on measurements covering the visible and near-UV spectrum, corresponding to a dark photon mass in the eV range. The data were taken with the FUNK experiment using a spherical mirror of ∼14 m2 total area built up of 36 aluminum segments.
2019
Recent Results from the Hera Impact Simulation Group: Benchmarking of Shock Physics Codes
DOI: 10.37307/j.2510-5116.2019.03.51
2019
Online-Scheidung? Gibt’s doch gar nicht!
DOI: 10.4324/9780429434105-13
2020
The Kleopatra problem
In considering the life and rule of Kleopatra VII, it is the source situation that proves to be the main problem. Ancient authors adapted, practically without exception, that view of the queen propagated by Octavian/Augustus as a femme fatale. Going beyond this tradition, highly charged with eroticism and polemic, to recognize the real ruler of the Ptolemaic Empire and her political goals and achievements, is a challenge. This, however, is exactly the goal of this chapter, in which a number of questions will be addressed.
DOI: 10.7767/9783205211365.191
2020
Flüsse als Verkehrswege der Römer in den Alpenraum
Chapter by Christoph Schäfer in: Montafoner Gipfeltreffen, Volume 4, 1st edition (ISBN 978-3-205-21134-1, eISBN 978-3-205-21136-5), published online October 2020.
DOI: 10.1007/978-3-658-26133-7_7
2019
Funktionen
In this chapter you will learn how a Python program can be structured into functions. Special functions such as Python's built-in functions, generators, iterators and decorators are introduced.
DOI: 10.37307/j.2366-2913.2007.01.11
2007
Vorlesepaten
2020
Galaxy cluster cores as seen with VLT/MUSE: new strong-lensing analyses of RX J2129.4+0009, MS 0451.6-0305 & MACSJ2129.4-0741
DOI: 10.48550/arxiv.1902.03252
2019
High Performance Computing for gravitational lens modeling: single vs double precision on GPUs and CPUs
Strong gravitational lensing is a powerful probe of cosmology and the dark matter distribution. Efficient lensing software is already a necessity to fully use its potential and the performance demands will only increase with the upcoming generation of telescopes. In this paper, we study the possible impact of High Performance Computing techniques on a performance-critical part of the widely used lens modeling software LENSTOOL. We implement the algorithm once as a highly optimized CPU version and once with graphics card acceleration for a simple parametric lens model. In addition, we study the impact of finite machine precision on the lensing algorithm. While double precision is the default choice for scientific applications, we find that single precision can be sufficiently accurate for our purposes and lead to a big speedup. Therefore we develop and present a mixed precision algorithm which only uses double precision when necessary. We measure the performance of the different implementations and find that the use of High Performance Computing Techniques dramatically improves the code performance both on CPUs and GPUs. Compared to the current LENSTOOL implementation on 12 CPU cores, we obtain speedup factors of up to 170. We achieve this optimal performance by using our mixed precision algorithm on a high-end GPU which is common in modern supercomputers. We also show that these techniques reduce the energy consumption by up to 98%. Furthermore, we demonstrate that a highly competitive speedup can be reached with consumer GPUs. While they are an order of magnitude cheaper than the high-end graphics cards, they are rarely used for scientific computations due to their low double precision performance. Our mixed precision algorithm unlocks their full potential. The consumer GPU delivers a speedup which is only a factor of four lower than the best speedup achieved by a high-end GPU.
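As a toy illustration of the precision trade-off studied here (a sketch with a made-up point-lens model, not LENSTOOL code), the snippet below evaluates the same deflection-angle formula in double and single precision and reports the relative difference:

import numpy as np

# Toy sketch (not LENSTOOL code): deflection angle of a point lens,
# alpha = theta_E^2 * r_vec / |r|^2, evaluated in float64 and float32.
def deflection(x, y, theta_e, dtype):
    x = np.asarray(x, dtype=dtype)
    y = np.asarray(y, dtype=dtype)
    te2 = dtype(theta_e) ** 2
    r2 = x * x + y * y
    return te2 * x / r2, te2 * y / r2

rng = np.random.default_rng(0)
x, y = rng.uniform(0.1, 10.0, size=(2, 1_000_000))   # random image-plane positions
ax64, _ = deflection(x, y, 1.0, np.float64)
ax32, _ = deflection(x, y, 1.0, np.float32)
rel_err = np.max(np.abs(ax32 - ax64) / np.abs(ax64))
print(f"max relative float32 error: {rel_err:.2e}")  # typically of order 1e-7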
1991
Der weströmische Senat als Träger antiker Kontinuität unter den Ostgotenkönigen (490-540 n. Chr.)
2003
Measurement of Branching Fractions of $\tau$ Hadronic Decays
DOI: 10.37307/j.2510-5116.2021.03.22
2021
Inkasso bei Pflichtteilsansprüchen?
DOI: 10.1007/978-3-658-33552-6_6
2021
Conditional Statements and Loops
In this chapter you will learn about the control structures such as conditional statements and loops of Python.
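A minimal, self-contained example of the control structures such a chapter covers (illustrative only, not taken from the book):

# Illustrative example of Python conditional statements and loops (not from the book).
numbers = [3, 7, 10, 15]
for n in numbers:                       # a for loop over a list
    if n % 2 == 0:
        print(n, "is even")
    elif n % 3 == 0:
        print(n, "is odd and divisible by 3")
    else:
        print(n, "is odd")

count = 0
while count < 3:                        # a simple while loop
    count += 1
print("the while loop ran", count, "times")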
DOI: 10.1007/978-3-658-33552-6_8
2021
Structuring with Modules
In this chapter, you will learn how to structure a larger Python program into modules and how to use other modules in your program.
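A small illustrative example of working with modules (not taken from the book), using only the standard library:

# Illustrative example of using modules (not from the book).
import math                              # import a whole standard-library module
from collections import Counter          # import a single name from a module

print(math.sqrt(2))                                  # call a function via the module name
print(Counter("abracadabra").most_common(2))         # use the imported name directly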
DOI: 10.1007/978-3-658-33552-6_5
2021
Data Types, Variables, Lists, Strings, Dictionaries, and Operators
In this chapter, you will learn about the different data types and variables of the programming language.
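An illustrative snippet (not from the book) touching the data types listed in the chapter title:

# Illustrative example of basic data types and operators (not from the book).
answer = 42                              # int
pi = 3.14159                             # float
name = "Ada"                             # string
primes = [2, 3, 5, 7]                    # list
constants = {"pi": pi, "e": 2.71828}     # dictionary

primes.append(11)                        # lists are mutable
message = name + " says: "               # string concatenation with the + operator
print(message, answer, len(primes), constants["pi"] * 2)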
DOI: 10.1007/978-3-658-33552-6_9
2021
Extensions for Scientists: NumPy, SciPy, Matplotlib, Pandas
In this chapter, you will learn about the three most important modules for scientists: NumPy, SciPy, and Matplotlib.
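A compact illustrative example (not from the book) that combines the three modules:

# Illustrative example combining NumPy, SciPy and Matplotlib (not from the book).
import numpy as np
from scipy import integrate
import matplotlib.pyplot as plt

x = np.linspace(0.0, 2.0 * np.pi, 200)         # NumPy array of sample points
y = np.sin(x)
area, _ = integrate.quad(np.sin, 0.0, np.pi)   # SciPy numerical integration
print("integral of sin(x) on [0, pi]:", area)  # expected value: 2.0

plt.plot(x, y, label="sin(x)")                 # Matplotlib line plot
plt.legend()
plt.savefig("sine.png")                        # save the figure to a file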
DOI: 10.1007/978-3-658-33552-6_7
2021
Functions
In this chapter, you will learn how to structure a Python program using functions. Special functions like the built-in functions of Python, generators, iterators, and decorators are introduced.
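A short illustrative example (not from the book) showing a regular function, a generator, and a decorator:

# Illustrative example of a function, a generator and a decorator (not from the book).
def shout(func):                         # a decorator wrapping another function
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):                         # a regular function, decorated above
    return f"hello, {name}"

def countdown(n):                        # a generator: yields values lazily
    while n > 0:
        yield n
        n -= 1

print(greet("world"))                    # -> HELLO, WORLD
print(list(countdown(3)))                # -> [3, 2, 1]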
DOI: 10.1007/978-3-322-85202-1_10
2000
Wissenstransfer mit Authorware für Geschichtsstudentinnen und -Studenten
Summary: "The new digital methods and technologies" represent a "powerful alternative to written culture"; multimedia transfer of knowledge and computer-supported work are increasingly taking hold of all areas of life in the post-industrial world, writes Mihai Nadin in "Jenseits der Schriftkultur". He also considers the study of visual techniques and forms of expression to be at least as important today as the study of language-related subjects, arguing that the scientific-technological revolution opens up new possibilities for communication, knowledge production and self-constitution. To advance these in practice, the philosopher and computer scientist Nadin, in an interview with a popular magazine, demands that universities not leave general computer training exclusively to the computing centers but integrate it into the individual disciplines. The traditionally isolated subjects of instruction are to give way to a holistic perspective of collaboration in shared areas of interest and experience.
2000
Alte Geschichte und neue Medien : zum EDV-Einsatz in der Altertumsforschung
DOI: 10.1016/s0920-5632(97)00982-1
1998
The Z lineshape at LEP
The e+e− collider LEP at CERN was operated in the years 1989–1995 at center-of-mass energies around 91 GeV. More than 16 million Z boson decays have been recorded by the four experiments ALEPH, DELPHI, L3 and OPAL. This vast amount of data allows a precise determination of the Standard Model parameters. An interpretation of the measurements within the Standard Model leads to a prediction of the top quark mass which is in agreement with the measurements at Tevatron. In addition the Higgs boson mass is constrained.