Complexity analysis in the sport of boxing

2017 ◽  
Vol 5 (6) ◽  
pp. 953-963 ◽  
Author(s):  
Adam G Tennant ◽  
Nasir Ahmad ◽  
Sybil Derrible

Abstract A general model of the complexity of the sport of boxing, exploring the match play between combatants, has yet to be produced. The sport has a long history, dating back to the eighth century before common era (BCE) and ancient Greece. Although boxing is known as the ‘sweet science’, most research has legitimately focused on the combat sport’s long-term health effects, particularly brain trauma. The present study explores the complexity of the sport using a data set of welterweights (63.5–67 kg). This data set was used to build a contact network with the boxers as nodes and the actual fights as the links. Additionally, a PageRank algorithm was applied to the contact network to rank the boxers. Devon Alexander was calculated as the top welterweight in the data set. This ranking was compared with those of the sport’s notoriously corrupt sanctioning bodies, journalistic rankings, and a more standard non-network-based ranking system. The network visualization displayed features typical of many others seen in the literature. A closer look was taken at several of the boxers using the visualization technique known as the rank clock, which tracks each boxer’s rank history and offers insight into career trajectories. Timothy Bradley and Vyacheslav Senchenko had rank clocks showing them to be the most consistent boxers of the 2004–2014 decade. These findings supply further confirmation of the value of the network-based approach to athletic match play.
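As an illustration of the ranking step, the sketch below builds a small directed contact network and runs PageRank over it with networkx. The fight records, the edge orientation (loser pointing to winner), and the damping factor are assumptions made for this example, not details taken from the study's data pipeline.

```python
# A minimal sketch of the network-based ranking, assuming fights are
# available as (loser, winner) pairs; an edge from the loser to the winner
# lets PageRank treat each win as an "endorsement" of the winner.
import networkx as nx

# Hypothetical fight records (generic names, not results from the study).
fights = [
    ("Boxer B", "Boxer A"),
    ("Boxer C", "Boxer A"),
    ("Boxer C", "Boxer B"),
    ("Boxer D", "Boxer B"),
    ("Boxer D", "Boxer C"),
]

G = nx.DiGraph()
G.add_edges_from(fights)  # boxers as nodes, fights as links

# Damping factor 0.85 is the conventional default, not necessarily the
# value used by the authors.
scores = nx.pagerank(G, alpha=0.85)
for boxer, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{boxer}: {score:.3f}")
```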

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Nabil Serrano ◽  
Marc Kissling ◽  
Hannah Krafft ◽  
Karl Link ◽  
Oliver Ullrich ◽  
...  

Abstract Background For optimal prosthetic anchoring in omarthritis surgery, detailed knowledge of the mineralisation distribution of the glenoid is important. However, data on the mineralisation of diseased joints and its potential relation to glenoid angles are limited. Methods Shoulder specimens from ten female and nine male body donors with an average age of 81.5 years were investigated. Using 3D-CT multiplanar reconstruction, glenoid inclination and retroversion angles were measured and osteoarthritis signs graded. Computed Tomography-Osteoabsorptiometry (CT-OAM) is an established method for determining subchondral bone plate mineralisation, which has been demonstrated to serve as a marker for the long-term loading history of joints. Based on mineralisation distribution mappings of healthy shoulder specimens, physiological and deviating CT-OAM patterns were compared with glenoid angles. Results Osteoarthritis grades were 0-I in 52.6% of the 3D-CT scans, grades II-III in 34.3%, and grade IV in 13.2%; higher grades (III, IV) occurred twice as frequently in females (45%) as in males (22%, grade III). The average inclination angle was 8.4°. In glenoids with inclination ≤10°, mineralisation was predominantly centrally distributed and tended to shift more cranially as the inclination rose to >10°. The average retroversion angle was −5.2°. A dorsally enhanced mineralisation distribution was found in glenoids with versions from −15.9° to +1.7°. A predominantly centrally distributed mineralisation was accompanied by a narrower range of retroversion angles, between −10° and −0.4°. Conclusions This study is one of the first to combine CT-based analyses of glenoid angles and mineralisation distribution in an elderly population. Although the data set is limited to 19 individuals, it indicates that superior inclination between 0° and 10°-15° and dorsal version between −9° and −3° may be predominantly associated with the anterior and central mineralisation patterns previously classified as physiological for the shoulder joint. These findings may serve as a reference data set for future studies addressing glenoid geometry in treatment planning for omarthritis.


2020 ◽  
Vol 13 (3) ◽  
pp. 381-393
Author(s):  
Farhana Fayaz ◽  
Gobind Lal Pahuja

Background: The Static VAR Compensator (SVC) can improve the reliability, operation, and control of a transmission system, thereby enhancing the dynamic performance of the power system. The SVC is a widely used shunt FACTS device and an important tool for reactive power compensation in high-voltage AC transmission systems. Transmission lines compensated with an SVC may experience faults and hence need a protection system that guards against fault damage while maintaining an uninterrupted supply of power. Methods: The research work reported in this paper is a successful attempt to reduce the time to detect faults on an SVC-compensated transmission line to less than a quarter of a cycle. The relay algorithm involves two ANNs, one for detection and the other for classification of faults, including identification of the faulted phase or phases. RMS (Root Mean Square) values of line voltages and ratios of sequence components of line currents are used as inputs to the ANNs. Extensive training and testing of the two ANNs have been carried out using data generated by simulating an SVC-compensated transmission line in PSCAD at a signal sampling frequency of 1 kHz; the back-propagation method was used for training. Additionally, criticality analysis of the existing relay and the modified relay has been carried out using three fault-tree importance measures: Fussell-Vesely (FV) importance, Risk Achievement Worth (RAW), and Risk Reduction Worth (RRW). Results: The relay detects any type of fault occurring anywhere on the line with 100% accuracy within 4 ms. It also classifies the type of fault and indicates the faulted phase or phases, as the case may be, with 100% accuracy within 15 ms, well before a circuit breaker can clear the fault. As demonstrated, fault detection and classification using ANNs is reliable and accurate when a large data set is available for training. The criticality analysis shows that the criticality ranking differs between the two designs (the existing relay and the modified relay), with the ranking of the improved measurement system in the modified relay changing from 2 to 4. Conclusion: A relaying algorithm is proposed for the protection of transmission lines compensated with a Static VAR Compensator (SVC), and a criticality ranking of the different failure modes of a digital relay is carried out. The proposed scheme has significant advantages over more traditional relaying algorithms: it is suitable for high-resistance faults and is affected neither by the fault inception angle nor by the fault location.
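To make the two-ANN structure of the relay concrete, here is a minimal sketch using scikit-learn's MLPClassifier (a back-propagation-trained feed-forward network) as a generic stand-in. The feature layout, label encodings, network sizes, and data are placeholders; the authors trained on PSCAD simulation data rather than the synthetic values used here.

```python
# A minimal sketch of the two-ANN relay scheme: ANN 1 detects a fault,
# ANN 2 classifies its type and faulted phase(s). All values are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 1000
# 6 placeholder features: 3 RMS line voltages + 3 sequence-current ratios.
X = rng.normal(size=(n, 6))
y_detect = rng.integers(0, 2, size=n)      # 0 = no fault, 1 = fault
y_classify = rng.integers(0, 10, size=n)   # fault type + faulted phase(s)

# ANN 1: fault detection.
detector = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500).fit(X, y_detect)
# ANN 2: fault classification, consulted only when a fault is detected.
classifier = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500).fit(X, y_classify)

sample = X[:1]
if detector.predict(sample)[0] == 1:
    print("fault class:", classifier.predict(sample)[0])
```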


2021 ◽  
pp. 108602662110316
Author(s):  
Tiziana Russo-Spena ◽  
Nadia Di Paola ◽  
Aidan O’Driscoll

Effective climate change action involves the critical role that companies must play in assuring the long-term human and social well-being of future generations. In our study, we offer a more holistic, inclusive, both–and approach to the challenge of environmental innovation (EI), using a novel methodology to identify relevant configurations for firms engaging in a superior EI strategy. A conceptual framework is proposed that identifies six sets of driving characteristics of EI and two sets of beneficial outcomes, all inherently tensional. Our analysis adopts a complementary rather than an oppositional point of view. A data set of 65 companies in the ICT value chain is analyzed via fuzzy-set qualitative comparative analysis (fsQCA) and a post-QCA procedure. The results reveal that a superior EI strategy can be achieved in several scenarios. Specifically, after close examination, two main configuration groups emerge, referred to as technological environmental innovators and organizational environmental innovators.
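For readers unfamiliar with fsQCA, the sketch below computes the standard fuzzy-set consistency and coverage measures that underpin the identification of sufficient configurations. The membership scores are invented for illustration; they are not the study's calibrated data for the 65 companies.

```python
# A minimal sketch of fuzzy-set consistency and coverage, assuming
# calibrated membership scores in [0, 1] for a candidate configuration (x)
# and the outcome "superior EI strategy" (y). Scores are illustrative.
import numpy as np

def consistency(x, y):
    """Degree to which configuration membership x is a subset of outcome y."""
    return np.minimum(x, y).sum() / x.sum()

def coverage(x, y):
    """Share of the outcome y accounted for by configuration x."""
    return np.minimum(x, y).sum() / y.sum()

x = np.array([0.9, 0.7, 0.8, 0.2, 0.6])
y = np.array([0.8, 0.9, 0.7, 0.3, 0.9])

print(f"consistency = {consistency(x, y):.2f}, coverage = {coverage(x, y):.2f}")
```

Configurations whose consistency clears a chosen threshold (0.8 is a common convention) are typically retained as candidate paths to the outcome.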


2021 ◽  
pp. 002224372110092
Author(s):  
Zhenling Jiang ◽  
Dennis J. Zhang ◽  
Tat Chan

This paper studies how receiving a bonus changes consumers’ demand for auto loans and the risk of future delinquency. Unlike traditional consumer products, auto loans have a long-term impact on consumers’ financial state because of the monthly payment obligation. Using a large consumer panel data set of credit and employment information, the authors find that receiving a bonus increases auto loan demand by 21 percent. These loans, however, are associated with higher risk, as the delinquency rate increases by 18.5–31.4 percent depending on the measure. In contrast, an increase in consumers’ base salary increases the demand for auto loans but not the delinquency risk. By comparing consumers who receive bonuses with those who do not, the authors find that bonus payments lead to both demand expansion and demand shifting in auto loans. The empirical findings shed light on how consumers make financial decisions and have important implications for financial institutions regarding when demand for auto loans, and the associated risk, arises.
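As a hedged illustration of the comparison between bonus recipients and non-recipients, the sketch below fits a simple logit of loan take-up on a bonus indicator and base salary using statsmodels. The data are synthetic and the specification is only a stand-in for the authors' actual panel model.

```python
# A minimal sketch, assuming a consumer-level cross-section with a bonus
# indicator and base salary; not the authors' data or specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
bonus = rng.integers(0, 2, size=n)               # received a bonus (0/1)
salary = rng.normal(50_000, 10_000, size=n)      # annual base salary

# Synthetic take-up probability rising with both bonus and salary.
p = np.clip(0.05 + 0.05 * bonus + 0.01 * (salary - 50_000) / 10_000, 0.01, 0.99)
loan = rng.binomial(1, p)                        # took out an auto loan (0/1)

X = sm.add_constant(np.column_stack([bonus, salary]))
model = sm.Logit(loan, X).fit(disp=False)
print(model.params)  # second coefficient captures the bonus demand response
```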


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2532
Author(s):  
Encarna Quesada ◽  
Juan J. Cuadrado-Gallego ◽  
Miguel Ángel Patricio ◽  
Luis Usero

Anomaly detection research focuses on the development and application of methods that identify data points different enough, compared with the rest of the data set being analyzed, to be considered anomalies (or, as they are more commonly called, outliers). These values mainly originate from two sources: they may be errors introduced during the collection or handling of the data, or they may be correct but very different from the rest of the values. It is essential to identify each type correctly: in the first case the values must be removed from the data set, while in the second they must be carefully analyzed and taken into account. The correct selection and use of a model for a specific problem is fundamental to the success of an anomaly detection study, and in many cases a single model cannot provide sufficient results; these can only be reached with a mixture model that integrates existing and/or ad hoc-developed models. This is the kind of model developed and applied in this paper. The study defines and applies an anomaly detection model that combines statistical models with a new method defined by the authors, the Local Transilience Outlier Identification Method, to improve the identification of outliers in sensor-obtained values of variables that affect the operation of wind tunnels. Correct outlier detection for these variables is very important for the industrial ventilation systems industry, especially for vertical wind tunnels, which are used as training facilities for indoor skydiving, where incorrect performance may put human lives at risk. Consequently, the presented model for outlier detection may have a high impact in this industrial sector. In this research work, a proof of concept is carried out using data from a real installation to test the proposed anomaly analysis method and its application to controlling the correct performance of wind tunnels.
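The sketch below shows one way to combine a global statistical test with a local-density detector into a simple mixture rule, in the spirit of the integrated model described above. LocalOutlierFactor is a generic stand-in only: the authors' Local Transilience Outlier Identification Method is their own contribution and is not reproduced here, and the sensor readings are synthetic.

```python
# A minimal sketch of a mixture approach to outlier detection on a single
# sensor channel: a global z-score test plus a local-density method.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(2)
# Synthetic sensor values with two injected anomalies.
readings = np.concatenate([rng.normal(30.0, 1.0, 500), [45.0, 12.0]])

# Global statistical flag: |z| > 3.
z = (readings - readings.mean()) / readings.std()
global_flag = np.abs(z) > 3

# Local flag: Local Outlier Factor on the 1-D series.
lof = LocalOutlierFactor(n_neighbors=20)
local_flag = lof.fit_predict(readings.reshape(-1, 1)) == -1

# Mixture rule (illustrative): flag a point if either model flags it;
# points flagged by both deserve the closest inspection.
outliers = np.where(global_flag | local_flag)[0]
print(outliers, readings[outliers])
```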


Author(s):  
Marcus Pietsch ◽  
Pierre Tulowitzki ◽  
Colin Cramer

Both organizational and management research suggest that schools and their leaders need to be ambidextrous to secure prosperity and long-term survival in dynamic environments characterized by competition and innovation. In this context, ambidexterity refers to the ability to simultaneously pursue exploitation and exploration and thus to deliver efficiency, control and incremental improvements while embracing flexibility, autonomy and discontinuous innovation. Using a unique, randomized and representative data set of N = 405 principals, we present findings on principals’ exploitation and exploration. The results indicate: (a) that principals engage far more often in exploitative than in explorative activities; (b) that exploitative activities in schools are executed at the expense of explorative activities; and (c) that explorative and ambidextrous activities of principals are positively associated with the (perceived) competition between schools. The study brings a novel perspective to educational research and demonstrates that applying the concept of ambidexterity has the potential to further our understanding of effective educational leadership and management.


2014 ◽  
Vol 112 (11) ◽  
pp. 2729-2744 ◽  
Author(s):  
Carlo J. De Luca ◽  
Joshua C. Kline

Over the past four decades, various methods have been implemented to measure the synchronization of motor-unit firings. In this work, we provide evidence that prior reports of the existence of universal common inputs to all motoneurons and of the presence of long-term synchronization are misleading, because they did not use sufficiently rigorous statistical tests to detect synchronization. We developed a statistically based method (SigMax) for computing synchronization and tested it with data from 17,736 motor-unit pairs containing 1,035,225 firing instances from the first dorsal interosseous and vastus lateralis muscles, a data set one order of magnitude greater than those reported in previous studies. Only firing data obtained from surface electromyographic signal decomposition with >95% accuracy were used in the study, and the data were not subjectively selected in any manner. Because of the size of our data set and the statistical rigor inherent to SigMax, we have confidence that the synchronization values we calculated provide an improved estimate of physiologically driven synchronization. Comparison with three other commonly used techniques revealed three types of discrepancies that result from failing to use sufficiently rigorous statistical tests to detect synchronization. 1) On average, the z-score method falsely detected synchronization at 16 separate latencies in each motor-unit pair. 2) The cumulative sum method missed one out of every four synchronization identifications found by SigMax. 3) The common input assumption method identified synchronization in 100% of the motor-unit pairs studied, whereas SigMax revealed that only 50% of motor-unit pairs actually manifested synchronization.
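To illustrate the general idea of a statistically thresholded synchronization test (not the SigMax algorithm itself, which is the authors' contribution), the sketch below builds a cross-interval histogram of firing-time differences for a simulated motor-unit pair and flags latency bins whose counts exceed a chance-level threshold.

```python
# A minimal sketch: histogram firing-time differences between two motor
# units and flag latency bins above mean + 3 SD. Firing times are simulated.
import numpy as np

rng = np.random.default_rng(3)
train_a = np.sort(rng.uniform(0, 60, 600))   # firing times (s), unit A
train_b = np.sort(rng.uniform(0, 60, 600))   # firing times (s), unit B

# Differences between every firing of A and of B, kept within +/-100 ms.
diffs = (train_b[None, :] - train_a[:, None]).ravel()
diffs = diffs[np.abs(diffs) <= 0.1]

# 2-ms latency bins across the +/-100 ms window.
counts, edges = np.histogram(diffs, bins=np.arange(-0.1, 0.101, 0.002))

# Chance level from the histogram itself; flag bins above mean + 3 SD.
threshold = counts.mean() + 3 * counts.std()
sync_latencies = edges[:-1][counts > threshold]
print("latencies with above-chance counts (s):", sync_latencies)
```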


Paleobiology ◽  
10.1666/12050 ◽  
2013 ◽  
Vol 39 (4) ◽  
pp. 628-647 ◽  
Author(s):  
Leah J. Schneider ◽  
Timothy J. Bralower ◽  
Lee R. Kump ◽  
Mark E. Patzkowsky

The Paleocene-Eocene Thermal Maximum (PETM; ca. 55.8 Ma) is thought to coincide with a profound but entirely transient change in nannoplankton communities throughout the ocean. Here we explore the ecology of nannoplankton during the PETM using multivariate analyses of a global data set based on the distribution of taxa in time and space. We use these results, coupled with stable isotope data and geochemical modeling, to reinterpret the ecology of key genera. The multivariate analyses suggest that the community was perturbed more significantly at coastal and high-latitude sites than in the open ocean, and that the relative influence of temperature and nutrient availability on the assemblage varies regionally. The open ocean became more stratified and less productive during the PETM, and the oligotrophic assemblage responded primarily to changes in nutrient availability. In contrast, assemblages at the equator and in the Southern Ocean responded more to temperature than to nutrient reduction. In addition, the assemblage change at the PETM was not merely transient: there is evidence of adaptation and of a long-term change in the nannoplankton community that persists after the PETM and results in the disappearance of a high-latitude assemblage. The long-term effect on communities caused by transient warming during the PETM has implications for modern-day climate change, suggesting similarly permanent changes to nannoplankton community structure as the oceans warm.
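As an indication of the kind of multivariate ordination typically applied to such assemblage data, the sketch below computes Bray-Curtis dissimilarities on a synthetic sites-by-taxa abundance matrix and ordinates the sites with non-metric multidimensional scaling. The matrix and parameter choices are illustrative assumptions, not the study's global data set or its exact methods.

```python
# A minimal sketch of community ordination, assuming a sites-by-taxa
# abundance matrix; all counts here are synthetic.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(4)
abundances = rng.poisson(5, size=(12, 20)).astype(float)  # 12 sites x 20 taxa

# Bray-Curtis dissimilarity, a standard choice for community composition.
dissim = squareform(pdist(abundances, metric="braycurtis"))

# Non-metric multidimensional scaling (NMDS) onto two ordination axes.
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           random_state=0)
coords = nmds.fit_transform(dissim)
print(coords[:3])  # site scores on the first two axes
```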

