Recognizing Events in Spatiotemporal Soccer Data

2020 ◽  
Vol 10 (22) ◽  
pp. 8046
Author(s):  
Victor Khaustov ◽  
Maxim Mozgovoy

Spatiotemporal datasets based on player tracking are widely used in sports analytics research. Common research tasks often require the analysis of game events, such as passes, fouls, tackles, and shots on goal. However, spatiotemporal datasets usually do not include event information, which means it has to be reconstructed automatically. We propose a rule-based algorithm for identifying several basic types of events in soccer, including ball possession, successful and unsuccessful passes, and shots on goal. Our aim is to provide a simple procedure that can be used for practical soccer data analysis tasks, and also serve as a baseline model for algorithms based on more advanced approaches. The resulting algorithm is fast, easy to implement, achieves high accuracy on the datasets available to us, and can be used in similar scenarios without modification.
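As an illustration of the kind of rule-based procedure the abstract describes (the paper's own rules and thresholds are not given here, so the radius and event logic below are assumptions), a minimal detector might assign possession to the player nearest the ball within a fixed radius and emit a pass or interception whenever the holder changes:

```python
import math

POSSESSION_RADIUS = 2.0  # metres; assumed threshold, not from the paper

def nearest_player(ball, players):
    """Return (player_id, team, distance) of the player closest to the ball."""
    pid, team, pos = min(players, key=lambda p: math.dist(ball, p[2]))
    return pid, team, math.dist(ball, pos)

def detect_events(frames):
    """frames: list of (ball_xy, [(player_id, team, xy), ...]) per time step.

    Emits ("pass", from_id, to_id) when possession moves within a team
    and ("interception", from_id, to_id) when it crosses teams.
    """
    events, holder = [], None          # holder = (player_id, team)
    for ball, players in frames:
        pid, team, d = nearest_player(ball, players)
        if d <= POSSESSION_RADIUS and (pid, team) != holder:
            if holder is not None:
                kind = "pass" if holder[1] == team else "interception"
                events.append((kind, holder[0], pid))
            holder = (pid, team)
    return events
```

Shots on goal could be added in the same style, e.g. by testing whether the ball's velocity vector points into the goal mouth above a speed threshold.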

2013 ◽  
Vol 281 ◽  
pp. 23-27
Author(s):  
Mei Yuan ◽  
Si Si Xiong ◽  
Shao Peng Dong

A new self-compensating capacitive fuel level sensor is proposed in this paper. Through mathematical derivation and theoretical analysis, we design the self-compensating structure of the capacitive level sensor. Its multi-segment structure makes compensation for both temperature and medium possible. Furthermore, we analyze the error introduced when fuel adhering to the sensor electrodes fails to drain away as the aircraft's attitude changes. Additionally, based on RF admittance theory, a transducer that eliminates this adhesion effect has been designed and implemented using a phase-locked sampling technique. Level experiments and data analysis confirm that the fuel level sensor achieves all of its design goals, including compensation for temperature and medium and elimination of the adhesion effect. Hence, the accuracy of level measurement has been improved.
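A minimal sketch of the self-compensation idea (the geometry and formula below are generic assumptions, not the paper's design): a fully submerged reference segment measures the fuel's relative permittivity, which cancels the medium and temperature dependence of the measuring segment's reading:

```python
def fuel_level(c_meas, c_air, c_ref_wet, c_ref_dry, height):
    """Estimate fuel height from a segmented capacitive probe.

    Assumes the measuring segment's capacitance varies linearly from
    c_air (empty) to k * c_air (full), where k is the fuel's relative
    permittivity supplied by the always-submerged reference segment.
    Because k is measured in situ, changes of medium or temperature
    cancel out of the level estimate.
    """
    k = c_ref_wet / c_ref_dry                      # relative permittivity of the fuel
    return height * (c_meas - c_air) / (c_air * (k - 1.0))
```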


1994 ◽  
Vol 38 ◽  
pp. 47-57 ◽  
Author(s):  
D. L. Bish ◽  
Steve. J. Chipera

Abstract Accuracy, or how well a measurement conforms to the true value of a parameter, is important in XRD analyses in three primary areas: 1) 2θ position or d-spacing; 2) peak shape; and 3) intensity. Instrumental factors affecting accuracy include zero-point, axial-divergence, and specimen-displacement errors, step size, and even uncertainty in X-ray wavelength values. Sample factors affecting accuracy include specimen transparency, structural strain, crystallite size, and preferred-orientation effects. In addition, a variety of other sample-related factors influence the accuracy of quantitative analyses, including variations in sample composition and order/disorder. The conventional method of assessing accuracy during experimental diffractometry measurements is through the use of certified internal standards. However, it is possible to obtain highly accurate d-spacings without an internal standard using a well-aligned powder diffractometer coupled with data analysis routines that allow analysis of, and correction for, important systematic errors. The first consideration in such measurements is the use of methods yielding precise peak positions, such as profile fitting. High accuracy can be achieved if specimen-displacement, specimen-transparency, axial-divergence, and possibly zero-point corrections are included in data analysis. It is also important to consider that most common X-ray wavelengths (other than Cu Kα1) have not been measured with high accuracy. Accuracy in peak-shape measurements is important in the separation of instrumental and sample contributions to profile shape, e.g., in crystallite-size and strain measurements. The instrumental contribution must be determined accurately using a standard material free from significant sample-related effects, such as NIST SRM 660 (LaB6).
Although full-pattern fitting methods for quantitative analysis are available, the presence of numerous systematic errors makes the use of an internal standard, such as α-alumina, mandatory to ensure accuracy; accuracy is always suspect when using external-standard, constrained-total quantitative analysis methods. One of the most significant problems in quantitative analysis remains the choice of representative standards. Variations in sample chemistry, order-disorder, and preferred orientation can be accommodated only with a thorough understanding of the coupled effects of all three on intensities. It is important to recognize that sample preparation methods that optimize accuracy for one type of measurement may not be appropriate for another. For example, the very fine crystallite size that is optimum for quantitative analysis is unnecessary, and can even be detrimental, in d-spacing and peak-shape measurements.
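The displacement and zero-point corrections discussed above can be sketched numerically. In Bragg-Brentano geometry, a specimen displaced by s from the focusing circle of radius R shifts the observed peak by approximately Δ2θ = −2s·cosθ/R radians (sign depending on displacement direction); correcting for this before applying Bragg's law gives the true d-spacing. The constants below are illustrative:

```python
import math

CU_KA1 = 1.5405929  # Angstroms; Cu Ka1, one of the few precisely known wavelengths

def corrected_d_spacing(two_theta_obs_deg, displacement_mm, radius_mm,
                        zero_offset_deg=0.0, wavelength=CU_KA1):
    """Apply zero-point and specimen-displacement corrections, then Bragg's law.

    Illustrative only: real refinement programs fit these corrections
    (and specimen transparency, axial divergence, ...) to many peaks
    simultaneously rather than correcting one peak in isolation.
    """
    theta = math.radians(two_theta_obs_deg - zero_offset_deg) / 2.0
    # flat-specimen displacement error in 2-theta (radians)
    shift = -2.0 * displacement_mm * math.cos(theta) / radius_mm
    theta_true = theta - shift / 2.0
    return wavelength / (2.0 * math.sin(theta_true))       # Bragg: d = lambda / (2 sin theta)
```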


2012 ◽  
Vol 608-609 ◽  
pp. 433-436 ◽  
Author(s):  
Yan Fang Li ◽  
Chang Wang ◽  
Yu Bei Wei ◽  
Yan Jie Zhao ◽  
Tingting Zhang ◽  
...  

A fiber-laser CH4 detection system based on spectral absorption is presented. The system offers advantages such as high accuracy, fast response, and high specificity. However, the absorption coefficient of methane and the characteristics of the electronic and optical devices are all affected by temperature, so the measured value depends on the ambient temperature. In this paper, we describe the measurement principle of the detector and analyze data from one month of uninterrupted operation. Temperature compensation is then introduced to improve the performance of the detector.
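A hedged sketch of the compensation step (the coefficients below are placeholders, not the paper's calibration): invert the Beer-Lambert absorption law using an absorption coefficient corrected linearly for temperature:

```python
import math

def methane_concentration(i_trans, i_ref, path_len_m, temp_c,
                          alpha25=0.35, k_temp=-0.002):
    """Invert Beer-Lambert absorption for CH4 concentration.

    i_trans / i_ref = exp(-alpha(T) * C * L), so
    C = -ln(i_trans / i_ref) / (alpha(T) * L).
    alpha25 is a hypothetical absorption coefficient at 25 C and
    k_temp its fractional change per degree; a real detector would
    calibrate both, as the paper's month-long data run does.
    """
    alpha = alpha25 * (1.0 + k_temp * (temp_c - 25.0))   # temperature compensation
    return -math.log(i_trans / i_ref) / (alpha * path_len_m)
```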


2009 ◽  
Vol 13 (1) ◽  
pp. 31-38 ◽  
Author(s):  
Alexander Ilic ◽  
Thomas Andersen ◽  
Florian Michahelles

2014 ◽  
Vol 2014 ◽  
pp. 1-6
Author(s):  
Shasha Li ◽  
Zhongmei Zhou ◽  
Weiping Wang

The Internet of Things (IoT) has been a hot topic in recent years. IoT users accumulate large amounts of data, which makes mining useful knowledge from IoT a great challenge. Classification is an effective strategy for predicting the needs of IoT users. However, many traditional rule-based classifiers cannot guarantee that every instance is covered by at least two classification rules, so these algorithms cannot achieve high accuracy on some datasets. In this paper, we propose a new rule-based classifier, CDCR-P (Classification based on the Pruning and Double Covered Rule sets). CDCR-P induces two different rule sets, A and B, such that every instance in the training set is covered by at least one rule in rule set A and by at least one rule in rule set B. To improve the quality of rule set B, we prune the length of its rules. Our experimental results indicate that CDCR-P is not only feasible but also achieves high accuracy.
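The double-covering property can be stated concretely. The toy check below (not the CDCR-P induction or pruning algorithm itself, whose details the abstract does not give) verifies that every instance matches at least one rule in each of the two sets, with rules represented as attribute-value conditions:

```python
def covers(rule, instance):
    """A rule is a dict of attribute -> required value; it covers an
    instance when every condition matches."""
    return all(instance.get(a) == v for a, v in rule.items())

def doubly_covered(instances, rules_a, rules_b):
    """Check the double-covering property: every instance is matched by
    at least one rule in set A AND at least one rule in set B."""
    return all(any(covers(r, x) for r in rules_a) and
               any(covers(r, x) for r in rules_b)
               for x in instances)
```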


Author(s):  
Roman Shults ◽  
Khaini-Kamal Kassymkanova ◽  
Shugyla Burlibayeva ◽  
Daria Skopinova ◽  
Roman Demianenko ◽  
...  

The first stage of any construction project is excavation, which is expensive and time-consuming. For geodetic control of these works, surveyors mostly use total stations and GNSS equipment. Over the last decade, UAV technology has been a breakthrough in the geodetic technology market, and one possible application of UAVs is the monitoring of excavation works. This article studies the capabilities and accuracy of UAV data for that purpose. Earthwork volumes were surveyed midway through construction using a DJI Phantom 4 UAV, and the data were processed with two photogrammetric software packages: Agisoft Metashape and PhotoModeler Premium. For comparison, the site was also surveyed with a conventional total station. A 3D model was generated from each data source, and the models were compared with one another in CloudCompare software. The comparison revealed that the UAV data achieve an accuracy that satisfies the customer's requirements. Of the two software packages, PhotoModeler is preferable: it allows in-depth data analysis and blunder detection.
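The volume comparison at the heart of such a survey boils down to a cut-and-fill estimate between two surfaces. A simplified grid-differencing sketch of what CloudCompare-style tools compute (uniform rectangular cells assumed; real tools rasterize point clouds or meshes first):

```python
def earthwork_volume(dem_before, dem_after, cell_area):
    """Grid-differencing volume estimate between two surveys.

    dem_before and dem_after are equal-shaped 2-D lists of surface
    heights. Per-cell height differences times cell area are summed
    into cut (material removed) and fill (material added) volumes.
    """
    cut = fill = 0.0
    for row_b, row_a in zip(dem_before, dem_after):
        for hb, ha in zip(row_b, row_a):
            dh = hb - ha               # positive: surface dropped -> cut
            if dh > 0:
                cut += dh * cell_area
            else:
                fill += -dh * cell_area
    return cut, fill
```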


2016 ◽  
Author(s):  
Hannes L Röst ◽  
Ruedi Aebersold ◽  
Olga T Schubert

Targeted mass spectrometry comprises a set of methods able to quantify protein analytes in complex mixtures with high accuracy and sensitivity. These methods, e.g., Selected Reaction Monitoring (SRM) and SWATH MS, use specific mass spectrometric coordinates (assays) for reproducible detection and quantification of proteins. In this protocol, we describe how to analyze in a targeted manner data from a SWATH MS experiment aimed at monitoring thousands of proteins reproducibly over many samples. We present a standard SWATH MS analysis workflow, including manual data analysis for quality control (based on Skyline) as well as automated data analysis with appropriate control of error rates (based on the OpenSWATH workflow). We also discuss considerations to ensure maximal coverage, reproducibility and quantitative accuracy.
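The error-rate control step can be sketched generically. The snippet below is a plain target-decoy FDR filter, not OpenSWATH's actual scoring machinery (which uses semi-supervised learning via pyProphet): estimate the FDR at each score cutoff as the ratio of decoy to target hits above it, and keep the most permissive cutoff that stays under the threshold:

```python
def target_decoy_fdr_filter(targets, decoys, threshold_fdr=0.01):
    """Keep target peak-group scores passing an estimated FDR threshold.

    FDR at a cutoff is approximated as
        (#decoy scores >= cutoff) / (#target scores >= cutoff),
    assuming decoys model the null score distribution (monotone FDR
    is assumed for this sketch).
    """
    kept = []
    for cutoff in sorted(targets, reverse=True):
        n_t = sum(t >= cutoff for t in targets)
        n_d = sum(d >= cutoff for d in decoys)
        if n_d / n_t <= threshold_fdr:
            kept = [t for t in targets if t >= cutoff]
    return kept
```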

