Quantifying Microtiming Patterning and Variability in Drum Kit Recordings

2015 ◽  
Vol 33 (2) ◽  
pp. 147-162 ◽  
Author(s):  
Kahl Hellmer ◽  
Guy Madison

Human performers introduce temporal variability into their performance of music. This variability consists of both long-range tempo changes and microtiming variability, that is, note-to-note deviations from the nominal beat time. In many contexts, microtiming is important for achieving certain preferred characteristics in a performance, such as hang, drive, or groove; but this variability is also, to some extent, stochastic. In this paper, we present a method for quantifying microtiming variability. First, we transcribed drum performance audio files into empirical data using a very precise onset detection system. Second, we separated the microtiming variability into two components: systematic variability (SV), defined as recurrent temporal patterns, and residual variability (RV), defined as the residual, unexplained temporal deviation. The method was evaluated using computer-performed audio drum tracks, and the results show a slight overestimation of the variability magnitude but proportionally correct ratios between SV and RV. Thereafter, two data sets were analyzed: drum performances from a MIDI drum kit and real-life performances from professional drum recordings. The results from these data sets show that up to 65 percent of the total microtiming variability can be explained by recurring and consistent patterns.
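The SV/RV split described above can be sketched in a few lines: group each onset's deviation by its metric position within the bar, take the per-position means as the recurring (systematic) pattern, and treat whatever remains as residual. A minimal Python sketch under assumed inputs (deviations in milliseconds, a fixed number of grid positions per bar, whole bars in order); the function name and interface are illustrative, not the authors' implementation:

```python
from statistics import mean, pstdev

def decompose_microtiming(deviations, positions_per_bar=8):
    """Split note-level timing deviations (ms from the nominal grid) into
    a systematic component (the recurring per-position mean pattern) and
    a residual component, in the spirit of the paper's SV/RV split.

    `deviations` is a flat list covering whole bars in playing order."""
    # Group deviations by their metric position within the bar.
    by_pos = [deviations[i::positions_per_bar] for i in range(positions_per_bar)]
    pattern = [mean(group) for group in by_pos]      # systematic pattern
    # Residual = what the recurring pattern does not explain.
    residuals = [d - pattern[i % positions_per_bar]
                 for i, d in enumerate(deviations)]
    sv = pstdev([pattern[i % positions_per_bar] for i in range(len(deviations))])
    rv = pstdev(residuals)
    return pattern, sv, rv
```

For a performance that repeats its bar-level timing pattern exactly, RV collapses to zero and all variability is systematic; real performances fall somewhere in between.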

2008 ◽  
Vol 2008 ◽  
pp. 1-13 ◽  
Author(s):  
Andrew Simsky ◽  
David Mertens ◽  
Jean-Marie Sleewaegen ◽  
Martin Hollreiser ◽  
Massimo Crisci

Analysis of GIOVE-A signals is an important part of the in-orbit validation phase of the Galileo program. GIOVE-A transmits ranging signals using all the code modulations currently foreseen for the future Galileo system and thus provides a foretaste of their performance in real-life applications. Owing to the use of advanced code modulations, the Galileo ranging signals offer a significant improvement in multipath performance compared with current GPS. In this paper, we summarize the results of about 1.5 years of observations using data from four antenna sites. The analysis of the elevation dependence of averaged multipath errors, and of the multipath time series for static data, indicates significant suppression of long-range multipath by the best Galileo codes. The E5AltBOC signal is confirmed to offer the best multipath suppression across all the data sets. Based on the observations, the Galileo signals can be classified into three groups: high performance (E5AltBOC, L1A, E6A), medium performance (E6BC, E5a, E5b), and L1BC, which has the lowest performance among the Galileo signals but is still better than GPS C/A. Car tests demonstrated that for kinematic multipath the intersignal differences are much less pronounced. The phase multipath performance is also discussed.


2016 ◽  
Vol 2016 ◽  
pp. 1-18 ◽  
Author(s):  
Mustafa Yuksel ◽  
Suat Gonul ◽  
Gokce Banu Laleci Erturkmen ◽  
Ali Anil Sinaci ◽  
Paolo Invernizzi ◽  
...  

Depending mostly on voluntarily submitted spontaneous reports, pharmacovigilance studies are hampered by the low quantity and quality of patient data. Our objective is to improve postmarket safety studies by enabling safety analysts to seamlessly access a wide range of EHR sources, collect deidentified medical data sets for selected patient populations, and trace reported incidents back to the original EHRs. We have developed an ontological framework in which EHR sources and target clinical research systems continue using their own local data models, interfaces, and terminology systems, while structural and semantic interoperability are handled through rule-based reasoning over formal representations of the different models and terminology systems maintained in the SALUS Semantic Resource Set. The SALUS Common Information Model at the core of this set acts as the common mediator. We demonstrate the capabilities of the framework through one of the SALUS safety analysis tools, the Case Series Characterization Tool, which has been deployed on top of the regional EHR data warehouse of the Lombardy Region, containing about 1 billion records from 16 million patients, and validated by several pharmacovigilance researchers with real-life cases. The results confirm significant improvements in signal detection and evaluation compared with traditional methods, which lack this background information.


2013 ◽  
Vol 96 (2) ◽  
pp. 392-398 ◽  
Author(s):  
Ted Hadfield ◽  
Valorie Ryan ◽  
Usha K Spaulding ◽  
Kristine M Clemens ◽  
Irene M Ota ◽  
...  

Abstract The RAZOR™ EX Anthrax Air Detection System was validated in a collaborative study for the detection of Bacillus anthracis in aerosol collection buffer. Phosphate-buffered saline was charged with 1 mg/mL standardized dust to simulate an authentic aerosol collection sample. The dust-charged buffer was spiked with either B. anthracis Ames at 2,000 spores/mL or Bacillus cereus at 20,000 spores/mL. Twelve collaborators participated in the study, with four collaborators at each of three sites. Each collaborator tested 12 replicates of B. anthracis in dust-charged buffer and 12 replicates of B. cereus in dust-charged buffer. All sample sets were randomized and blind-coded. All collaborators produced valid data sets (no collaborator displayed systematic errors), and there was only one invalid data point. After unblinding, the analysis revealed a cross-collaborator probability of detection (CPOD) of 1.00 (144 positive results from 144 replicates; 95% confidence interval 0.975–1.00) for the B. anthracis samples and a CPOD of 0.00 (0 positive results from 143 replicates; 95% confidence interval 0.00–0.0262) for the B. cereus samples. These data meet the requirements of AOAC Standard Method Performance Requirement 2010.003, developed by the Stakeholder Panel on Agent Detection Assays.
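The reported confidence intervals are of the exact (Clopper-Pearson) binomial type, which has simple closed forms at the two boundary outcomes observed here (all replicates positive, or none). A minimal sketch, assuming a two-sided 95% interval; the precise convention behind the paper's 0.0262 upper bound may differ slightly from what this closed form yields:

```python
def clopper_pearson_extremes(x, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval at the
    boundary cases, where closed forms exist:
      x == n: (lower, upper) = ((alpha/2)**(1/n), 1)
      x == 0: (lower, upper) = (0, 1 - (alpha/2)**(1/n))
    The general case requires a beta-distribution quantile."""
    if x == n:
        return (alpha / 2) ** (1 / n), 1.0
    if x == 0:
        return 0.0, 1 - (alpha / 2) ** (1 / n)
    raise ValueError("intermediate x needs the beta quantile")

# 144 positives out of 144 B. anthracis replicates: lower bound ~0.975.
lo, hi = clopper_pearson_extremes(144, 144)
# 0 positives out of 143 B. cereus replicates: upper bound a few percent.
lo0, hi0 = clopper_pearson_extremes(0, 143)
```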


Intrusion, gaining unauthorized access to data or to a network by exploiting a legitimate user's identity, back doors, or other vulnerabilities, is a major security threat. Intrusion Detection System (IDS) mechanisms are developed to detect intrusions at various levels. The objective of this work is to improve IDS performance by applying decision-tree-based machine learning techniques for the detection and classification of attacks. The adopted methodology processes the data sets in three stages. Experiments were conducted on the KDDCUP99 data sets with varying numbers of features, and three Bayesian modes were analyzed on data sets of different sizes according to the total number of attacks. Both the time the classifier takes to build the model and its accuracy are analyzed.
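Decision-tree learning of the kind described grows the tree by repeatedly choosing the feature whose split most reduces the entropy of the class labels (information gain). A minimal stdlib-only sketch on toy KDD-style records; the feature values and labels here are illustrative, not drawn from KDDCUP99:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature_index):
    """Split criterion used when growing a decision tree: reduction in
    label entropy after partitioning on one categorical feature."""
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[feature_index], []).append(label)
    remainder = sum(len(g) / len(labels) * entropy(g)
                    for g in by_value.values())
    return entropy(labels) - remainder

# Toy connection records: (protocol, service) -> attack / normal.
rows = [("tcp", "http"), ("udp", "dns"), ("tcp", "ftp"), ("udp", "dns")]
labels = ["normal", "attack", "attack", "attack"]
```

On these toy rows the `service` feature separates the classes perfectly, so a tree learner would split on it first; feature selection by this criterion is also how such systems rank which of the many KDDCUP99 features matter.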


Biosensors ◽  
2021 ◽  
Vol 11 (9) ◽  
pp. 343
Author(s):  
Chin-Teng Lin ◽  
Wei-Ling Jiang ◽  
Sheng-Fu Chen ◽  
Kuan-Chih Huang ◽  
Lun-De Liao

In the assistive research area, human–computer interface (HCI) technology is used to help people with disabilities by conveying their intentions and thoughts to the outside world. Many HCI systems based on eye movement have been proposed to assist people with disabilities. However, due to the complexity of the necessary algorithms and the difficulty of hardware implementation, there are few general-purpose designs that consider practicality and stability in real life. Therefore, to solve these limitations and problems, an HCI system based on electrooculography (EOG) is proposed in this study. The proposed classification algorithm provides eye-state detection, including the fixation, saccade, and blinking states. Moreover, this algorithm can distinguish among ten kinds of saccade movements (i.e., up, down, left, right, farther left, farther right, up-left, down-left, up-right, and down-right). In addition, we developed an HCI system based on an eye-movement classification algorithm. This system provides an eye-dialing interface that can be used to improve the lives of people with disabilities. The results illustrate the good performance of the proposed classification algorithm. Moreover, the EOG-based system, which can detect ten different eye-movement features, can be utilized in real-life applications.
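Eye-state detection of the kind described above can be sketched as amplitude thresholding on the horizontal and vertical EOG channels; the thresholds, units, and decision rules below are illustrative assumptions, not the authors' algorithm:

```python
def classify_eog(h, v, move_th=50.0, blink_th=200.0, far_th=120.0):
    """Toy threshold classifier in the spirit of the paper's eye-state
    detection. h, v: horizontal/vertical EOG amplitudes (microvolts,
    relative to baseline); all thresholds are illustrative."""
    # Blinks produce a large vertical spike with little horizontal change.
    if v > blink_th and abs(h) < move_th:
        return "blink"
    # Small deflection on both channels: the eye is fixating.
    if abs(h) < move_th and abs(v) < move_th:
        return "fixation"
    # Otherwise classify the saccade direction and extent.
    horiz = ""
    if h >= move_th:
        horiz = "farther right" if h >= far_th else "right"
    elif h <= -move_th:
        horiz = "farther left" if h <= -far_th else "left"
    vert = "up" if v >= move_th else ("down" if v <= -move_th else "")
    if vert and horiz:
        return f"{vert}-{horiz.split()[-1]}"   # diagonal, e.g. "up-right"
    return vert or horiz
```

The four cardinal directions, their "farther" variants, and the four diagonals give the ten saccade classes mentioned in the abstract.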


Author(s):  
Barinaadaa John Nwikpe ◽  
Isaac Didi Essi

A new two-parameter continuous distribution, called the Two-Parameter Nwikpe (TPAN) distribution, is derived in this paper. The new distribution is a mixture of gamma and exponential distributions. Several statistical properties of the new probability distribution have been derived, and the shape of its density for different parameter values has been established. The first four crude moments and the second and third moments about the mean were derived using the method of moment generating functions. Other statistical properties derived include the distribution of order statistics, the coefficient of variation, and the coefficient of skewness. The parameters of the new distribution were estimated by the method of maximum likelihood. The flexibility of the TPAN distribution was shown by fitting it to three real-life data sets; the goodness-of-fit results show that the new distribution outperforms the one-parameter exponential, Shanker, and Amarendra distributions on the data sets used in this study.
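The "mixture of gamma and exponential" construction can be illustrated with a generic two-component mixture density and its log-likelihood. This generic form is illustrative only and is not the paper's exact TPAN density, whose mixing weight and component parameters are tied together through the distribution's two parameters:

```python
import math

def mixture_pdf(x, p, shape, rate_g, rate_e):
    """Generic mixture of a gamma(shape, rate_g) and an exponential(rate_e)
    density with free mixing weight p on the gamma component. Illustrates
    the construction only; NOT the paper's exact TPAN density."""
    gamma_pdf = (rate_g ** shape) * x ** (shape - 1) \
        * math.exp(-rate_g * x) / math.gamma(shape)
    expo_pdf = rate_e * math.exp(-rate_e * x)
    return p * gamma_pdf + (1 - p) * expo_pdf

def log_likelihood(data, p, shape, rate_g, rate_e):
    """Objective maximized (over the parameters) in maximum likelihood
    estimation, as used to fit the distribution to the data sets."""
    return sum(math.log(mixture_pdf(x, p, shape, rate_g, rate_e))
               for x in data)
```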


2012 ◽  
Vol 9 (10) ◽  
pp. 13439-13496 ◽  
Author(s):  
M. J. Smith ◽  
M. C. Vanderwel ◽  
V. Lyutsarev ◽  
S. Emmott ◽  
D. W. Purves

Abstract. The feedback between climate and the terrestrial carbon cycle will be a key determinant of the dynamics of the Earth System over the coming decades and centuries. However, Earth System Model projections of the terrestrial carbon balance vary widely over these timescales, largely because of differences in their carbon cycle models. A major goal in biogeosciences is therefore to improve understanding of the terrestrial carbon cycle to enable better constrained projections. Essential to achieving this goal will be assessing the empirical support for alternative models of component processes, identifying key uncertainties and inconsistencies, and ultimately identifying the models that are most consistent with empirical evidence. To begin meeting these requirements, we data-constrained all parameters of all component processes within a global terrestrial carbon model. Our goals were to assess the climate dependencies obtained for different component processes when all parameters have been inferred from empirical data; to assess whether these were consistent with current knowledge and understanding; to assess the importance of different data sets and the model structure for inferring those dependencies; to assess the predictive accuracy of the model; and to identify a methodology by which alternative component models could be compared within the same framework in future. Although formulated as differential equations describing carbon fluxes through plant and soil pools, the model was fitted assuming the carbon pools were in states of dynamic equilibrium (input rates equal output rates); the parameterised model is thus of the equilibrium terrestrial carbon cycle. All but 2 of the 12 component processes in the model were inferred to have strong climate dependencies, although it was not possible to data-constrain all parameters, indicating some potentially redundant details.
Similar climate dependencies were obtained for most processes whether inferred individually from their corresponding data sets or using the full terrestrial carbon model and all available data sets, indicating a strong overall consistency in the information provided by different data sets under the assumed model formulation. A notable exception was plant mortality, for which qualitatively different climate dependencies were inferred depending on the model formulation and data sets used, highlighting this component as the major structural uncertainty in the model. All but two component processes predicted empirical data better than a null model in which no climate dependency was assumed. Equilibrium plant carbon was predicted especially well, explaining around 70% of the variation in the withheld evaluation data. We discuss the advantages of our approach for advancing understanding of the carbon cycle and enabling Earth System Models to make better constrained projections.
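The dynamic-equilibrium assumption (input rates equal output rates) is what turns each pool's differential equation into a closed-form expression for its stock. A minimal sketch for a single first-order pool, using a Q10 turnover response as an illustrative climate dependency (not the paper's fitted form):

```python
def equilibrium_pool(input_flux, k0, temp_c, q10=2.0, ref_temp_c=15.0):
    """For a first-order carbon pool, dC/dt = I - k(T) * C. Setting
    dC/dt = 0 (dynamic equilibrium) gives the stock C* = I / k(T).
    The Q10 form k(T) = k0 * q10**((T - Tref)/10) is a common
    illustrative turnover response, not the paper's fitted dependency.

    input_flux: carbon input I (e.g. kg C m^-2 yr^-1)
    k0: turnover rate (yr^-1) at the reference temperature"""
    k = k0 * q10 ** ((temp_c - ref_temp_c) / 10.0)
    return input_flux / k

# Warmer climate -> faster turnover -> smaller equilibrium carbon stock:
c_cool = equilibrium_pool(100.0, 0.1, 5.0)    # turnover halved at 5 deg C
c_warm = equilibrium_pool(100.0, 0.1, 25.0)   # turnover doubled at 25 deg C
```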


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Yi Lu ◽  
Menghan Liu ◽  
Jie Zhou ◽  
Zhigang Li

An Intrusion Detection System (IDS) is an important part of ensuring network security. When the system faces network attacks, it can identify the source of threats in a timely and accurate manner and adjust its strategies to prevent hackers from intruding. An efficient IDS can identify external threats well, but traditional IDS have poor performance and low recognition accuracy. To improve the detection rate and accuracy of IDS, this paper proposes a novel ACGA-BPNN method based on an adaptive clonal genetic algorithm (ACGA) and a backpropagation neural network (BPNN). ACGA-BPNN is simulated on the KDD-CUP'99 and UNSW-NB15 data sets. The simulation results indicate that, in contrast to methods based on simulated annealing (SA) and a genetic algorithm (GA), the detection rate and accuracy of ACGA-BPNN are much higher than those of GA-BPNN and SA-BPNN. In the classification results on KDD-CUP'99, the classification accuracy of ACGA-BPNN is 11% higher than that of GA-BPNN and 24.2% higher than that of SA-BPNN, and its F-score reaches 99.0%. In addition, ACGA-BPNN has good global search ability, and its convergence speed is higher than that of GA-BPNN and SA-BPNN. Overall, ACGA-BPNN significantly improves the detection performance of IDS.
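The clonal/adaptive flavor of such an algorithm can be illustrated with a minimal genetic algorithm that clones elite individuals and mutates the clones with a step size that adapts over generations. This toy sketch optimizes a simple two-variable function rather than BPNN weights, and it is not the authors' ACGA:

```python
import random

def evolve(fitness, dim, pop_size=30, generations=60, seed=0):
    """Minimal clonal-style genetic algorithm: keep the fittest third as
    elites, refill the population with mutated clones of random elites,
    and shrink the mutation step as the search progresses. A toy sketch,
    not the paper's ACGA-BPNN method."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for gen in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 3]
        step = 0.5 * (1 - gen / generations)   # adaptive mutation scale
        clones = [[g + rng.gauss(0, step) for g in rng.choice(elite)]
                  for _ in range(pop_size - len(elite))]
        pop = elite + clones
    return max(pop, key=fitness)

# Maximize a concave toy fitness peaked at (0.5, -0.25):
best = evolve(lambda w: -((w[0] - 0.5) ** 2 + (w[1] + 0.25) ** 2), dim=2)
```

In the BPNN setting, each individual would encode the network's weight vector and the fitness would be its classification performance on the training data.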

