Analysis of anomaly correlations

Geophysics ◽  
1997 ◽  
Vol 62 (1) ◽  
pp. 342-351 ◽  
Author(s):  
Ralph R. B. von Frese ◽  
Michael B. Jones ◽  
Jeong Woo Kim ◽  
Jeong‐Hee Kim

Recognizing correlations between data sets is the basis for rationalizing geophysical interpretation and theory. Procedures are presented that constitute an effective process for identifying correlative features between two or more digital data sets. The procedures include the development of normalization factors from the mean and variance properties of the data sets. Using these factors, the data sets may be transformed so that they have common amplitude ranges, means, and variances, thereby allowing a common graphical representation of the data sets that facilitates the visualization of feature correlations. Anomaly features that show direct, inverse, or no correlations between data sets may be separated by the application of correlation filters in the frequency domains of the data sets. The correlation filter passes or rejects wavenumbers between coregistered data sets based on the correlation coefficient between common wavenumbers as given by the cosine of their phase difference. Standardizing and summing the filtered outputs where directly correlative features have been enhanced yields local favorability indices that optimize the perception of these features. Differencing the standardized outputs where inversely correlative features have been enhanced, on the other hand, provides favorability indices that improve the perception of the inverse correlations. This study includes a generic example, as well as magnetic and gravity anomaly profile examples that illustrate the usefulness of these procedures for extracting correlative features between digital data sets.
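As a sketch of the filtering step described above, the fragment below applies a wavenumber-domain correlation filter to two synthetic 1-D profiles: wavenumbers are kept only where the cosine of the phase difference between the coregistered signals meets a threshold, and the standardized filtered outputs are summed into a local favorability index. The profiles, threshold value, and function name are illustrative, not taken from the paper.

```python
import numpy as np

def correlation_filter(a, b, threshold=0.0):
    """Keep only wavenumbers whose phase difference between the two
    coregistered profiles satisfies cos(dphi) >= threshold."""
    A, B = np.fft.rfft(a), np.fft.rfft(b)
    keep = np.cos(np.angle(A) - np.angle(B)) >= threshold
    fa = np.fft.irfft(A * keep, n=len(a))
    fb = np.fft.irfft(B * keep, n=len(b))
    return fa, fb

# Two profiles sharing one sinusoid (directly correlated) plus
# components present in only one of them
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
a = np.sin(3 * x) + 0.5 * np.sin(11 * x)
b = np.sin(3 * x) - 0.5 * np.sin(17 * x)
fa, fb = correlation_filter(a, b, threshold=0.9)

# Local favorability index: sum of the standardized filtered outputs
sfi = (fa - fa.mean()) / fa.std() + (fb - fb.mean()) / fb.std()
```

The same cosine criterion with a negated threshold would isolate inversely correlated wavenumbers, whose standardized outputs are then differenced rather than summed.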

2015 ◽  
Vol 31 (2) ◽  
pp. 357 ◽  
Author(s):  
Hela Miniaoui ◽  
Hameedah Sayani ◽  
Anissa Chaibi

We study the performance of Islamic and conventional indices of the Gulf Cooperation Council (GCC) countries in the wake of the 2008 financial crisis and test whether Islamic indices were less risky than conventional indices. We use data from the six GCC markets as well as the Dow Jones Islamic Market Index GCC. The mean and variance of each index are analyzed with augmented GARCH models. The results show that the financial crisis affected the mean returns of the Bahrain index only, while the other indices remained unaffected. The crisis did, however, affect volatility in three GCC markets (Kuwait, Bahrain, and the UAE), while its impact on the remaining markets (Saudi Arabia, Oman, and Qatar) and on the Islamic index was insignificant. More interestingly, we show that the Islamic index did not exhibit lower volatility than its conventional counterparts.
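The variance dynamics tested above can be illustrated with a minimal GARCH(1,1) simulation in which a dummy term is added to the variance equation from a hypothetical crisis date onward, in the spirit of the augmented GARCH specification; all parameter values here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def garch11_path(n, omega=0.05, alpha=0.10, beta=0.85,
                 crisis_at=None, crisis_shift=0.2):
    """Simulate returns whose conditional variance h[t] follows
    GARCH(1,1), with an optional dummy-variable shift in the variance
    equation from `crisis_at` onward (the 'augmented' term)."""
    r = np.empty(n)
    h = np.empty(n)
    h[0] = omega / (1.0 - alpha - beta)      # unconditional variance
    r[0] = np.sqrt(h[0]) * rng.standard_normal()
    for t in range(1, n):
        dummy = crisis_shift if crisis_at is not None and t >= crisis_at else 0.0
        h[t] = omega + dummy + alpha * r[t - 1] ** 2 + beta * h[t - 1]
        r[t] = np.sqrt(h[t]) * rng.standard_normal()
    return r, h

returns, variance = garch11_path(1000, crisis_at=500)
```

In an estimation setting the significance of the dummy coefficient, rather than a simulated shift, is what indicates a crisis effect on volatility.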


1987 ◽  
Vol 63 (5) ◽  
pp. 347-350 ◽  
Author(s):  
Stephen J. Titus

Boxplots are a useful enhancement to traditional summary statistics such as the mean and variance. Based on the median and other percentiles of the data distribution, they convey more information in a graphical format that is convenient for interpreting the nature of one or several data sets. The use of boxplots is illustrated with three common types of forestry data: 1) tree diameter distributions, 2) tree volume function residuals, and 3) forest inventory summaries.
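The quantities a boxplot draws can be computed directly; this sketch derives the five-number summary and Tukey's 1.5*IQR whisker limits for a small, invented sample of tree diameters (the data and variable names are illustrative only).

```python
import numpy as np

def five_number_summary(x):
    """Median, quartiles, and Tukey 1.5*IQR whisker limits that a
    boxplot displays."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    iqr = q3 - q1
    return {"q1": q1, "median": med, "q3": q3,
            "whisker_lo": q1 - 1.5 * iqr, "whisker_hi": q3 + 1.5 * iqr}

# Invented tree diameters at breast height (cm); 35.0 is an outlier
dbh = np.array([12.1, 14.3, 15.0, 15.2, 16.8, 17.1, 18.4, 19.0, 22.5, 35.0])
summary = five_number_summary(dbh)
```

Values beyond the whisker limits, such as the 35.0 cm tree here, are the points a boxplot marks individually as outliers.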


2020 ◽  
Author(s):  
Faezeh Bayat ◽  
Maxwell Libbrecht

Abstract

Motivation: A sequencing-based genomic assay such as ChIP-seq outputs a real-valued signal for each position in the genome that measures the strength of activity at that position. Most genomic signals lack the property of variance stabilization. That is, a difference between 100 and 200 reads usually has a very different statistical importance from a difference between 1,100 and 1,200 reads. A statistical model such as a negative binomial distribution can account for this pattern, but learning these models is computationally challenging. Therefore, many applications, including imputation and segmentation and genome annotation (SAGA), instead use Gaussian models and apply a transformation such as log or inverse hyperbolic sine (asinh) to stabilize variance.

Results: We show here that existing transformations do not fully stabilize variance in genomic data sets. To solve this issue, we propose VSS, a method that produces variance-stabilized signals for sequencing-based genomic signals. VSS learns the empirical relationship between the mean and variance of a given signal data set and produces transformed signals that normalize for this dependence. We show that VSS successfully stabilizes variance and that doing so improves downstream applications such as SAGA. VSS will eliminate the need for downstream methods to implement complex mean-variance relationship models, and will enable genomic signals to be easily understood by eye.

Contact: [email protected]
Availability: https://github.com/faezeh-bayat/Variance-stabilized-units-for-sequencing-based-genomic-signals
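A rough sketch of the principle, under the assumption of Poisson-like counts and two replicate tracks: estimate the mean-variance curve empirically from the replicates, then build a variance-stabilizing transform as the cumulative integral of 1 / sqrt(var(mean)). This is not the VSS implementation, only an illustration of the idea it is built on.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical replicate signal tracks with Poisson noise, so the
# variance grows with the mean (unstabilized)
mu = rng.uniform(1, 400, size=20000)
rep1 = rng.poisson(mu).astype(float)
rep2 = rng.poisson(mu).astype(float)

# Per-position mean and unbiased variance estimate from the replicates
pm = (rep1 + rep2) / 2
pv = (rep1 - rep2) ** 2 / 2

# Empirical mean-variance curve on binned mean signal values
bins = np.linspace(0, 400, 41)
idx = np.digitize(pm, bins)
grid, var = [], []
for b in range(1, len(bins)):
    sel = idx == b
    if sel.sum() > 50:
        grid.append(pm[sel].mean())
        var.append(pv[sel].mean())
grid, var = np.array(grid), np.array(var)

# Variance-stabilizing transform: t(x) = integral of d mu / sqrt(var(mu))
t = np.concatenate([[0.0], np.cumsum(np.diff(grid) / np.sqrt(var[:-1]))])
stabilized1 = np.interp(rep1, grid, t)
stabilized2 = np.interp(rep2, grid, t)
```

For Poisson data this learned transform approximates the classical 2*sqrt(x) stabilizer; the empirical construction is what lets it adapt to other mean-variance relationships.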


2021 ◽  
Vol 1 (2) ◽  
pp. 50-57
Author(s):  
Evaldo de Paiva Lima ◽  
Rosandro Boligon Minuzzi ◽  
Yuri de Almeida Lyra Corrêa ◽  
Camila Sanches De Oliveira

Soloteca is the term used in Brazil for the place where reference soil samples are stored. The Soloteca of Embrapa Solos-RJ, for example, stores soil samples from different regions of Brazil, and these samples need to be kept under conditions that preserve their intrinsic characteristics. In this context, the objective of this work was to determine the air temperature and relative humidity conditions in the Soloteca of Embrapa Solos-RJ. The data were collected by three thermohygrometers installed inside and outside the place where the samples are stored, from December 1, 2016, to March 31, 2017, corresponding to the summer of 2016/17. The differences between the means and variances of the data for each environment/height were evaluated at the 5% significance level by Student's t-test and Snedecor's F-test, respectively. The other statistical analyses were presented as boxplots. The air temperature, on average, did not differ between the indoor and outdoor environments at a height of 1.6 meters, but there was a difference with the sensor installed near the surface (0.5 meters). On the other hand, the dispersion of the data, as shown by the variance and the coefficient of variation, was greater in the external environment than indoors.
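The two tests used above can be sketched as follows: the pooled-variance t statistic compares means, and the ratio of sample variances gives Snedecor's F. The temperature samples are simulated stand-ins, not the Soloteca data.

```python
import numpy as np

def t_stat_equal_var(x, y):
    """Two-sample Student's t statistic with pooled variance."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) \
          / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(sp2 * (1 / nx + 1 / ny))

def f_stat(x, y):
    """Snedecor's F: ratio of sample variances, larger over smaller."""
    v1, v2 = np.var(x, ddof=1), np.var(y, ddof=1)
    return max(v1, v2) / min(v1, v2)

rng = np.random.default_rng(7)
indoor = rng.normal(24.0, 0.8, 120)    # invented hourly temperatures, degC
outdoor = rng.normal(26.5, 2.5, 120)   # warmer and more dispersed outside
t_value = t_stat_equal_var(indoor, outdoor)
f_value = f_stat(indoor, outdoor)
```

The statistics would then be compared against t and F critical values at the 5% level with the appropriate degrees of freedom.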


Author(s):  
Behzad Alipour ◽  
Ali Haroon Abadi

Service-oriented architecture (SOA) provides a framework in which system functions are defined as a series of distributed services sized to the needs of the organization. These services are called by other software and are also used to build new services. Although this architecture offers a simple solution for building loosely coupled distributed systems, it introduces some additional concerns. One of the main concerns in designing a SOA system is the overall reliability of the system, so a new technique for modeling reliability is needed to certify the services. To address this weakness, this paper presents a reliability-certification method in which services are modeled as discrete Markov chains: the mean and variance of the visit counts obtained by analyzing the Markov chain are integrated with the reliability characteristics of each individual service to estimate the reliability of the system. Results show that, compared with recent reliability-prediction methods, the proposed method requires less fault tolerance.
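A minimal sketch of this style of Markov-chain reliability model (in the spirit of Cheung's user-oriented model, not the authors' exact formulation): the fundamental matrix of the absorbing chain gives the mean and variance of visit counts per service, which are then combined with assumed per-service reliabilities.

```python
import numpy as np

# Hypothetical transfer-probability matrix among three services; each
# row's missing mass is the probability of terminating (absorbing state)
Q = np.array([
    [0.0, 0.6, 0.2],
    [0.1, 0.0, 0.5],
    [0.0, 0.3, 0.0],
])
r = np.array([0.999, 0.995, 0.990])  # assumed per-service reliabilities

# Fundamental matrix: N[i, j] is the mean number of visits to service j
# starting from service i
N = np.linalg.inv(np.eye(3) - Q)
visits = N[0]                        # mean visits starting at service 0

# Variance of the visit counts (standard absorbing-chain identity)
Ndg = np.diag(np.diag(N))
V = N @ (2 * Ndg - np.eye(3)) - N * N

# Cheung-style estimate: every visit to service i must succeed
system_reliability = float(np.prod(r ** visits))
```

The visit-count variances in V are what allow an interval, rather than a point, estimate of the system's reliability.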


1993 ◽  
Vol 69 (01) ◽  
pp. 035-040 ◽  
Author(s):  
A M H P van den Besselaar ◽  
R M Bertina

Summary

Four thromboplastin reagents were tested by 18 laboratories in Europe, North America, and Australasia, according to a detailed protocol. One thromboplastin was the International Reference Preparation for ox brain thromboplastin combined with adsorbed bovine plasma (coded OBT/79), and the second was a certified reference material for rabbit brain thromboplastin, plain (coded CRM 149R). The other two thromboplastin reagents were another plain rabbit brain thromboplastin (RP) with a lower ISI than CRM 149R and a rabbit brain thromboplastin combined with adsorbed bovine plasma (RC). Calibration of the latter two reagents was performed according to methods recommended by the World Health Organization (WHO).

The purpose of this study was to answer the following questions: 1) Is the calibration of the RC reagent more precise against the bovine/combined reagent (OBT/79) than against the rabbit/plain reagent (CRM 149R)? 2) Is the precision of calibration influenced by the magnitude of the International Sensitivity Index (ISI)?

The lowest interlaboratory variation of ISI was observed in the calibration of the rabbit/plain reagent (RP) against the other rabbit/plain reagent (CRM 149R) (CV 1.6%). The highest interlaboratory variation was obtained in the calibration of rabbit/plain (RP) against bovine/combined (OBT/79) (CV 5.1%). In the calibration of the rabbit/combined (RC) reagent, there was no difference in precision between OBT/79 (CV 4.3%) and CRM 149R (CV 4.2%). Furthermore, there was no significant difference in precision between the ISI of RC obtained with CRM 149R (ISI = 1.343) and that obtained with the rabbit/plain (RP) reagent with ISI = 1.14. In conclusion, the calibration of RC could be performed with similar precision against OBT/79, CRM 149R, or RP.

The mean ISI values calculated with OBT/79 and CRM 149R were practically identical, indicating that there is no bias in the ISI of these reference preparations and that they have been stable since their original calibration studies in 1979 and 1987, respectively. International Normalized Ratio (INR) equivalents were calculated for a lyophilized control plasma derived from patients treated with oral anticoagulants. There were small but significant differences in the mean INR equivalents between the bovine and rabbit thromboplastins. There were no differences in the interlaboratory variation of the INR equivalents when the four thromboplastins were compared.
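The INR equivalents discussed above follow from the standard WHO relation INR = (PT / MNPT)^ISI. The sketch below uses the two ISI values quoted in the abstract; the prothrombin times are invented examples (MNPT = mean normal prothrombin time).

```python
def inr(pt_seconds, mnpt_seconds, isi):
    """WHO model: INR = (patient PT / mean normal PT) ** ISI."""
    return (pt_seconds / mnpt_seconds) ** isi

# ISI values reported above; prothrombin times are invented examples
inr_rc = inr(30.0, 12.0, 1.343)   # ISI of RC obtained with CRM 149R
inr_rp = inr(30.0, 12.0, 1.14)    # ISI of the RP reagent
```

Because the PT ratio is raised to the ISI power, the same patient plasma yields a higher INR with the higher-ISI reagent, which is why interlaboratory precision of the ISI matters.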


1979 ◽  
Vol 42 (04) ◽  
pp. 1073-1114 ◽  

Summary

In collaborative experiments in 199 laboratories, nine commercial thromboplastins, four thromboplastins held by the National Institute for Biological Standards and Control (NIBS&C), London, and the British Comparative Thromboplastin were tested on fresh normal and coumarin plasmas, and on three series of freeze-dried plasmas. One series was made from coumarin plasmas and the other two were prepared from normal plasmas; in each series, one plasma was normal and the other two represented different degrees of coumarin defect.

Each thromboplastin was calibrated against NIBS&C rabbit brain 70/178, from the slope of the line joining the origin to the point of intersection of the mean ratios of coumarin/normal prothrombin times when the ratios obtained with the two thromboplastins on the same fresh plasmas were plotted against each other. From previous evidence, the slopes were calculated which would have been obtained against the NIBS&C “research standard” thromboplastin 67/40, and termed the “calibration constant” of each thromboplastin. Values obtained from the freeze-dried coumarin plasmas gave generally similar results to those from fresh plasmas for all thromboplastins, whereas values from the artificial plasmas agreed with those from fresh plasmas only when similar thromboplastins were being compared.

Taking into account the slopes of the calibration lines and the variation between laboratories, precision in obtaining a patient’s prothrombin time was similar for all thromboplastins.
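The calibration constant described above, the slope of the line through the origin and the point of mean coumarin/normal ratios, reduces to a ratio of mean PT ratios. The values below are invented for illustration.

```python
import numpy as np

# Invented coumarin/normal prothrombin-time ratios measured with a
# reference and a test thromboplastin on the same fresh plasmas
ref_ratio = np.array([1.8, 2.1, 2.6, 3.0, 3.5])
test_ratio = np.array([1.6, 1.8, 2.2, 2.5, 2.9])

# Slope of the line joining the origin to the point of mean ratios
calibration_slope = test_ratio.mean() / ref_ratio.mean()
```

A slope below 1, as here, indicates the test thromboplastin is less sensitive to the coumarin defect than the reference.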


1985 ◽  
Vol 54 (04) ◽  
pp. 739-743 ◽  
Author(s):  
Federica Delaini ◽  
Elisabetta Dejana ◽  
Ine Reyers ◽  
Elisa Vicenzi ◽  
Germana De Bellis Vitti ◽  
...  

Summary

We have investigated the relevance of some laboratory tests of platelet function in predicting conditions of thrombotic tendency. For this purpose, we studied platelet survival, platelet aggregation in response to different stimuli, and TxB2 and 6-keto-PGF1α production in the serum of rats bearing a nephrotic syndrome induced by adriamycin (ADR). These animals show a heavy predisposition to the development of both arterial and venous thrombosis. The mean survival time was normal in nephrotic rats in comparison to controls. As for the aggregation tests, a lower aggregating response was found in ADR-treated rats using ADP or collagen as stimulating agents. With arachidonic acid (AA) we observed similar aggregating responses at lower AA concentrations, whereas at higher AA concentrations a significantly lower response was found in nephrotic rats, despite their higher TxB2 production. TxB2 and 6-keto-PGF1α levels in the serum of nephrotic rats were also significantly higher than in controls. No consistent differences were found in PGI2 activity generated by vessels of control or nephrotic rats.

These data show that platelet function may appear normal or even impaired in rats with a markedly increased thrombotic tendency. On the other hand, the significance of high TxB2 levels in connection with mechanisms leading to thrombus formation remains a controversial issue.


2004 ◽  
Vol 9 (3) ◽  
pp. 233-240 ◽  
Author(s):  
S. Kim

This paper describes a Voronoi analysis method for analyzing a soccer game. It is important to assess quantitatively the contribution made by a player or a team, whether as individual or collective behavior. The mean number of vertices per Voronoi polygon is reported to be 5 to 6, slightly less than that of a perfectly random point system. Voronoi polygon areas can be used to evaluate the dominance of one team over the other. By introducing an excess Voronoi area, we can draw some fruitful, rather quantitative, conclusions for appraising a player or a team.
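A discrete approximation of the dominance measure described above: assign each cell of a gridded pitch to its nearest player (a rasterized Voronoi partition) and sum the cell areas per team; the excess area is a team's surplus over an even split. Player positions are random stand-ins and the pitch dimensions (105 m by 68 m) are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical player positions on a 105 m x 68 m pitch
team_a = rng.uniform(0, 1, (11, 2)) * np.array([105.0, 68.0])
team_b = rng.uniform(0, 1, (11, 2)) * np.array([105.0, 68.0])
players = np.vstack([team_a, team_b])

# Rasterized Voronoi partition: each grid cell belongs to its nearest player
xs, ys = np.meshgrid(np.linspace(0, 105, 210), np.linspace(0, 68, 136))
grid = np.stack([xs.ravel(), ys.ravel()], axis=1)
owner = np.linalg.norm(grid[:, None, :] - players[None, :, :], axis=2).argmin(axis=1)

cell_area = 105.0 * 68.0 / len(grid)
area_a = (owner < 11).sum() * cell_area     # area dominated by team A
area_b = (owner >= 11).sum() * cell_area
excess_a = area_a - 105.0 * 68.0 / 2        # excess Voronoi area of team A
```

Summing `owner` counts per individual player instead of per team gives the per-player dominance areas used to appraise a single player.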

