Unbiased analysis of geomagnetic data sets and comparison of historical data with paleomagnetic and archeomagnetic records

2017 ◽  
Vol 55 (1) ◽  
pp. 5-39 ◽  
Author(s):  
Patrick Arneitz ◽  
Ramon Egli ◽  
Roman Leonhardt


2021 ◽  
Vol 7 (s2) ◽  
Author(s):  
Alexander Bergs

Abstract
This paper focuses on the micro-analysis of historical data, which allows us to investigate language use across the lifetime of individual speakers. Certain concepts, such as social network analysis or communities of practice, put individual speakers and their social embeddedness and dynamicity at the center of attention. This means that intra-speaker variation can be described and analyzed in quite some detail in certain historical data sets. The paper presents some exemplary empirical analyses of the diachronic linguistic behavior of individual speakers/writers in fifteenth to seventeenth century England. It discusses the social factors that influence this behavior, with an emphasis on the methodological and theoretical challenges and opportunities when investigating intra-speaker variation and change.


1997 ◽  
Vol 102 (C13) ◽  
pp. 27835-27860 ◽  
Author(s):  
Alexey Kaplan ◽  
Yochanan Kushnir ◽  
Mark A. Cane ◽  
M. Benno Blumenthal

2019 ◽  
Vol 18 (2) ◽  
pp. 390-415
Author(s):  
Andrei Vorobev ◽  
Gulnara Vorobeva ◽  
Nafisa Yusupova

Today, the geomagnetic field and its variations are monitored mainly by a network of magnetic observatories and variation stations. A significant obstacle to processing and analyzing the resulting data, alongside its spatial anisotropy, is the presence of gaps and of values that do not conform to the established format. Such heterogeneity and anomalies exclude, or at least significantly complicate, automatic integration of the data and the application of frequency-analysis tools. Known solutions for integrating heterogeneous geomagnetic data are based mainly on the consolidation model and solve the problem only partially. The resulting data sets typically do not meet the requirements of real-time information systems and may include outliers, while gaps in the time series are handled by excluding missing or anomalous values from the final sample, which can lead to the loss of relevant information, violation of the sampling step, and heterogeneity of the time series. The paper proposes an approach to creating an integrated space of geomagnetic data based on a combination of the consolidation and federation models. It includes preliminary processing of the original time series, with an optional procedure for their recovery and verification, and is oriented toward cloud computing technologies and hierarchical data formats for fast processing of large data volumes, thereby providing users with more complete and homogeneous data.
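The recovery step described above can be illustrated with a minimal sketch: anomalous samples are masked with a robust z-score and the resulting gaps are filled by linear interpolation, preserving a uniform sampling step. The threshold and the interpolation method are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def recover_series(values, z_thresh=4.0):
    """Fill gaps (NaN) in a 1-D geomagnetic time series and mask outliers.

    Illustrative sketch only: outliers are detected with a robust z-score
    based on the median absolute deviation (MAD), then all missing values
    are filled by linear interpolation so the sampling step stays uniform.
    """
    x = np.asarray(values, dtype=float)
    # Robust z-score: scaled deviation from the median, normalised by MAD.
    med = np.nanmedian(x)
    mad = np.nanmedian(np.abs(x - med)) or 1.0  # guard against MAD == 0
    z = 0.6745 * (x - med) / mad
    # Mask anomalous samples as missing instead of dropping them.
    x = np.where(np.abs(z) > z_thresh, np.nan, x)
    # Linear interpolation over all gaps, keeping the original grid.
    idx = np.arange(len(x))
    good = ~np.isnan(x)
    return np.interp(idx, idx[good], x[good])
```

Unlike simply excluding missing or anomalous values, this keeps the time series homogeneous and equidistant, which is a precondition for the frequency-analysis tools mentioned in the abstract.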


2017 ◽  
Vol 21 (2) ◽  
pp. 317-340 ◽  
Author(s):  
HENDRIK DE SMET ◽  
FREEK VAN DE VELDE

While it is undoubtedly true that historical data do not lend themselves well to the reproduction of experimental findings, the availability of increasingly extensive data sets has brought some experimenting within practical reach. This means that certain predictions based on a combination of synchronic observations and uniformitarian thinking are now testable. Synchronic evidence shows a negative correlation between analysability in morphologically complex words and various measures of frequency. It is therefore expected that when the frequency of morphologically complex items changes, their analysability will change along with it. If analysability decreases, this should in turn be reflected in decreasing sensitivity to priming by items with analogous composition. The latter prediction is in principle testable on diachronic data, offering a way of verifying the diachronic effect of frequency change on analysability. In this spirit, the present article examines the relation between changing frequency and priming sensitivity, as a proxy for analysability. This is done for a sample of 250 English ly-adverbs, such as roughly, blindly and publicly, over the period 1950–2005, using data from the Hansard Corpus. Some of the expected relations between frequency and analysability can be shown to hold, albeit with great variation across lexical items. At the same time, much of the variation in our measure of analysability cannot be accounted for by frequency or frequency change alone.


1987 ◽  
Vol 44 (S2) ◽  
pp. s156-s165 ◽  
Author(s):  
Carl J. Walters

Stock assessment usually proceeds from the assumption that there are time-invariant relationships between stock size and rate processes such as recruitment, although such relationships are difficult to discern due to noise caused by factors other than stock size. There are good biological reasons not to trust this assumption in exploited populations, where persistent environmental changes and shifts in stock structure may cause various parameters to change. Graphical and statistical procedures can be used to detect this nonstationarity in historical data sets for which stock size has varied so as to repeatedly sample a range of sizes. The policy implications of nonstationarity depend on whether the changes are clearly observable as deviations from known, long-term baseline responses. If the changes are observable, it is usually best to pretend that the current deviation will persist unless strong constraints on policy change make it necessary to plan for changes that may occur far into the future. If the changes are not observable (the usual case), then it is necessary to make a difficult policy choice between passively waiting for informative stock responses versus actively experimenting with harvest rates so as to quickly get information about responses over a range of stock sizes.


2021 ◽  
pp. 133-140
Author(s):  
C. Hardner ◽  
K. Gasic ◽  
C. da Silva Linge ◽  
M. Worthington ◽  
D. Byrne ◽  
...  

Author(s):  
Arminée Kazanjian ◽  
Kathryn Friesen

Abstract
In order to explore the diffusion of the selected technologies in one Canadian province (British Columbia), two administrative data sets were analyzed. The data included over 40 million payment records for each fiscal year on medical services provided to British Columbia residents (2,968,769 in 1988) and information on physical facilities, services, and personnel from 138 hospitals in the province. Three specific time periods were examined in each data set, starting with 1979–80 and ending with the most current data available at the time. The detailed retrospective analysis of laboratory and imaging technologies provides historical data in three areas of interest: (a) patterns of diffusion and volume of utilization, (b) institutional profile, and (c) provider profile. The framework for the analysis focused, where possible, on the examination of determinants of diffusion that may be amenable to policy influence.


Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 226
Author(s):  
Marek Hermansa ◽  
Michał Kozielski ◽  
Marcin Michalak ◽  
Krzysztof Szczyrba ◽  
Łukasz Wróbel ◽  
...  

In this paper, the problem of the identification of undesirable events is discussed. Such events can be poorly represented in the historical data, making it largely impossible to learn from past examples. The issue is considered in the context of two use cases in which vibration and temperature measurements collected by wireless sensors are analysed: crushers at a coal-fired power plant and gantries in a steelworks converter. Cooperation with industry made clear the need for a system that works under cold-start conditions and does not flood the machine operator with alarms, and this motivated the proposal of a new predictive maintenance method. The proposed solution is based on outlier identification methods applied to the collected data after its transformation into a multidimensional feature vector. The novelty of the solution lies in a methodology for reducing false positive alarms, applied to a system identifying undesirable events. This methodology is based on adapting the system to the analysed data, interaction with the dispatcher, and the use of the XAI (eXplainable Artificial Intelligence) method. Experiments performed on several data sets showed that the proposed method reduced false alarms by 90.25% on average relative to the performance of the stand-alone outlier detection method. The obtained results allowed the developed method to be implemented in a system operating in a real industrial facility. The conducted research may be valuable for systems with a cold-start problem, where frequent alarms can lead to discouragement and disregard for the system by the user.
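One simple ingredient of such false-alarm reduction can be sketched as debouncing: an alarm reaches the operator only after several consecutive windows are flagged as outliers, so isolated spurious detections are suppressed. This is a hypothetical illustration of the principle only; the paper's actual methodology (adaptation to the data, dispatcher feedback, XAI) is considerably richer.

```python
def filter_alarms(outlier_flags, confirm=3):
    """Suppress isolated outlier detections before alarming the operator.

    Illustrative sketch: raise an alarm only when `confirm` consecutive
    windows have been flagged by the underlying outlier detector.
    Returns one boolean per input window (True = alarm raised).
    """
    alarms = []
    streak = 0
    for flag in outlier_flags:
        streak = streak + 1 if flag else 0  # count consecutive detections
        alarms.append(streak >= confirm)
    return alarms
```

With `confirm=3`, a single spurious detection among normal windows never produces an alarm, while a sustained anomaly still does, which is the trade-off motivating the cold-start-friendly design described above.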


2018 ◽  
Vol 96 (7) ◽  
pp. 753-759
Author(s):  
Paul A. Finigan ◽  
Nicholas E. Mandrak ◽  
Bruce L. Tufts

Biodiversity loss is a serious issue for freshwater fishes in temperate climates and there is a need for more information in this area. A study was conducted to assess fish community changes in the littoral zone of 22 lakes over a 45-year period, comparing the years 1969–1979 with 2014. To compare fish communities, historical seining records were compiled for 22 inland lakes and compared with contemporary data sampled using the same protocol. Fish abundance data analyzed using a multivariate approach identified a shift from cyprinid-dominated communities to centrarchid-dominated communities between time periods. There was no evidence to support a strong influence of invasive species on these communities, but there have been significant changes in temperature and land use around these lakes since the historical data sets were collected. This is an important contribution to our understanding of biodiversity change in North American freshwater fish communities and may influence fisheries management approaches in the future.


2000 ◽  
Vol 57 (4) ◽  
pp. 677-686 ◽  
Author(s):  
Michael J Bradford ◽  
Ransom A Myers ◽  
James R Irvine

We describe a simple scheme for the management of coho salmon (Oncorhynchus kisutch) population aggregates that uses reference points derived from an empirical analysis of freshwater production data. We fit a rectilinear "hockey stick" model to 14 historical data sets of female spawner abundance and resulting smolt production and found that at low spawner abundance, the average productivity was about 85 smolts per female spawner. Variation in productivity among streams may be related to the quality of the stream habitat. We show how freshwater productivity can be combined with forecasts of marine survival to provide a limit reference point harvest rate. Our method will permit harvest rates to track changes in ocean productivity. We also used the historical data to estimate that, on average, a density of 19 female spawners·km⁻¹ is required to fully seed freshwater habitats with juveniles. However, there was considerable variation among the streams that might limit the utility of this measure as a reference point. Uncertainty in the forecasts of marine survival and other parameters needs to be incorporated into our scheme before it can be considered a precautionary approach.
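The rectilinear "hockey stick" model described above can be sketched as a piecewise function: smolt production rises linearly with spawner abundance (the slope being smolts per female spawner) up to a breakpoint, then plateaus. The grid-search least-squares fit below is an illustrative sketch under that assumption, not the estimation procedure used in the paper.

```python
import numpy as np

def fit_hockey_stick(spawners, smolts):
    """Fit a rectilinear "hockey stick" stock-recruit model.

    Model (sketch): smolts_hat = slope * min(spawners, breakpoint),
    i.e. linear through the origin at low abundance, flat above the
    breakpoint. Fitted by grid search over candidate breakpoints with a
    closed-form least-squares slope at each candidate.
    """
    s = np.asarray(spawners, dtype=float)
    r = np.asarray(smolts, dtype=float)
    best = None
    for bp in np.linspace(s.min(), s.max(), 200):
        x = np.minimum(s, bp)                # saturate spawners at the breakpoint
        slope = (x @ r) / (x @ x)            # least-squares slope through origin
        sse = np.sum((r - slope * x) ** 2)   # residual sum of squares
        if best is None or sse < best[0]:
            best = (sse, slope, bp)
    _, slope, bp = best
    return slope, bp  # productivity (smolts per female spawner) and breakpoint
```

On data generated with a productivity of 85 smolts per female and a plateau, the fit recovers both the low-abundance slope and the breakpoint, which is the quantity that would then be combined with marine-survival forecasts to set a limit reference harvest rate.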

