Parametric definition of the influence of a paper in a citation network using communicability functions

2019 ◽  
Vol 7 (4) ◽  
pp. 623-640 ◽  
Author(s):  
Juan A Pichardo-Corpus ◽  
J Guillermo Contreras ◽  
José A de la Peña

Abstract Communicability functions quantify the flow of information between two nodes of a network. In this work, we use them to explore the concept of the influence of a paper in a citation network. These functions depend on a parameter. By varying the parameter in a continuous way, we explore different definitions of influence. We study six citation networks, three from physics and three from computer science. As a benchmark, we compare our results against two frequently used measures: the number of citations of a paper and the PageRank algorithm. We show that the ranking of the articles in a network can be varied from being equivalent to the ranking obtained from the number of citations to a behaviour tending towards the eigenvector centrality; these limits correspond to small and large values of the communicability-function parameter, respectively. At an intermediate value of the parameter, a PageRank-like behaviour is recovered. As a test case, we apply communicability functions to two sets of articles, where at least one author of each paper was awarded a Nobel Prize for the research presented in the corresponding article.
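The parameter-dependent interpolation described above can be illustrated with a toy sketch. This is a minimal, hypothetical example, not the paper's code: it ranks papers in a small directed citation graph by total communicability, the column sums of exp(beta·A). For small beta, exp(beta·A) ≈ I + beta·A, so the ranking tracks raw citation counts; for large beta, longer citation chains dominate.

```python
import numpy as np
from scipy.linalg import expm

# Toy directed citation graph: entry A[i, j] = 1 means paper i cites paper j.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

def communicability_ranking(A, beta):
    """Rank papers by incoming communicability, most influential first."""
    C = expm(beta * A)
    scores = C.sum(axis=0)   # column sums: weighted count of citation paths into each paper
    return np.argsort(-scores)

for beta in (0.01, 1.0, 10.0):
    print(beta, communicability_ranking(A, beta))
```

For this acyclic toy graph the small-beta ranking is led by the most-cited paper (node 2), while at large beta the paper at the end of the longest citation chains (node 3) takes the top spot, mirroring the shift between the two limits described in the abstract.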

2021 ◽  
Vol 6 (1) ◽  
Author(s):  
Ruaridh A. Clark ◽  
Malcolm Macdonald

Abstract Contact networks provide insights on disease spread due to the duration of close proximity interactions. For systems governed by consensus dynamics, network structure is key to optimising the spread of information. For disease spread over contact networks, the structure would be expected to be similarly influential. However, metrics that are essentially agnostic to the network’s structure, such as weighted degree (strength) centrality and its variants, perform near-optimally in selecting effective spreaders. These degree-based metrics outperform eigenvector centrality, despite disease spread over a network being a random walk process. This paper improves eigenvector-based spreader selection by introducing the non-linear relationship between contact time and the probability of disease transmission into the assessment of network dynamics. This approximation of disease spread dynamics is achieved by altering the Laplacian matrix, which in turn highlights why nodes with a high degree are such influential disease spreaders. From this approach, a trichotomy emerges on the definition of an effective spreader where, for susceptible-infected simulations, eigenvector-based selections can either optimise the initial rate of infection, the average rate of infection, or produce the fastest time to full infection of the network. Simulated and real-world human contact networks are examined, with insights also drawn on the effective adaptation of ant colony contact networks to reduce pathogen spread and protect the queen ant.
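The core idea, feeding a non-linear duration-to-transmission-probability map into an eigenvector assessment, can be sketched as follows. This is an illustrative approximation under an assumed map p = 1 − exp(−beta·w), not the paper's exact Laplacian modification:

```python
import numpy as np

# Toy contact-duration matrix (symmetric): W[i, j] = contact time between i and j.
W = np.array([[0, 5, 1],
              [5, 0, 2],
              [1, 2, 0]], dtype=float)

def spreader_ranking(W, beta=0.5):
    """Rank nodes by the dominant eigenvector of the transmission-probability matrix."""
    P = 1.0 - np.exp(-beta * W)           # non-linear duration -> probability map
    np.fill_diagonal(P, 0.0)
    vals, vecs = np.linalg.eigh(P)        # P is symmetric, so eigh applies
    v = np.abs(vecs[:, np.argmax(vals)])  # dominant (Perron) eigenvector
    return np.argsort(-v)

print(spreader_ranking(W))
```

Because the map saturates for long contacts, it damps the advantage of a single very long contact relative to several moderate ones, which is one way a degree-like ordering can emerge from an eigenvector computation.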


2001 ◽  
Vol 49 ◽  
pp. 105-125 ◽  
Author(s):  
Ruth Garrett Millikan

‘According to informational semantics, if it's necessary that a creature can't distinguish Xs from Ys, it follows that the creature can't have a concept that applies to Xs but not Ys.’ (Fodor, 1994, p. 32) There is, indeed, a form of informational semantics that has this verificationist implication. The original definition of information given in Dretske's Knowledge and the Flow of Information (1981, hereafter KFI), when employed as a base for a theory of intentional representation or ‘content,’ has this implication. I will argue that, in fact, most of what an animal needs to know about its environment is not available as natural information of this kind. It is true, I believe, that there is one fundamental kind of perception that depends on this kind of natural information, but more sophisticated forms of inner representation do not. It is unclear, however, exactly what ‘natural information’ is supposed to mean, certainly in Fodor's, and even in Dretske's writing. In many places, Dretske seems to employ a softer notion than the one he originally defines. I will propose a softer view of natural information that is, I believe, at least hinted at by Dretske, and show that it does not have verificationist consequences. According to this soft informational semantics, a creature can perfectly well have a representation of Xs without being able to discriminate Xs from Ys.


2020 ◽  
Vol 6 (2) ◽  
pp. 210-217
Author(s):  
Radouane Azennar ◽  
Driss Mentagui

Abstract In this paper, we prove that the intermediate value theorem remains true for the conformable fractional derivative, and we prove some useful results using the definition of the conformable fractional derivative given by R. Khalil, M. Al Horani, A. Yousef and M. Sababheh [4].
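For reference, the definition from [4] that the results rely on is, for t > 0 and order α ∈ (0, 1]:

```latex
T_\alpha(f)(t) = \lim_{\varepsilon \to 0} \frac{f\!\left(t + \varepsilon\, t^{1-\alpha}\right) - f(t)}{\varepsilon}, \qquad t > 0,\ \alpha \in (0, 1].
```

When f is differentiable, this reduces to \(T_\alpha(f)(t) = t^{1-\alpha} f'(t)\), which is why classical results about ordinary derivatives, such as the intermediate value theorem, can be expected to carry over.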


Author(s):  
Xavi Marsellach

The current state of biological knowledge contains an unresolved paradox: life as a continuity in the face of the phenomenon of ageing. In this manuscript I propose a theoretical framework that offers a solution for this apparent contradiction. The framework proposed is based on a rethinking of what ageing is at a molecular level, as well as on a rethinking of the mechanisms in charge of the flow of information from one generation to the following ones. I propose an information-based conception of ageing instead of the widely accepted damage-based conception of ageing, and propose a full recovery of the chromosome theory of inheritance to describe the intergenerational flow of information. Altogether, the proposed framework allows a precise and unique definition of what life is: a continuous flow of biological information. The proposed framework also implies that ageing is merely a consequence of the way in which epigenetically-coded phenotypic characteristics are passed from one generation to the next.


Author(s):  
Rémi Berriet ◽  
René Fillod ◽  
Noureddine Bouhaddi

Abstract In order to take into account information from test data not only at the resonances, but also in the other parts of the measured frequency spectrum, it is of interest to use directly measured Frequency Response Functions (FRF) instead of modal data. This way, an experimental modal analysis is also avoided. In return, damping terms have to be introduced into the analytical model, the FRF data have to be weighted in a systematic manner, and a large amount of data has to be computed simultaneously. The presented procedure addresses all three of these aspects: definition of modal damping parameters, definition of weighted FRF data, and condensation of the problem. Particular attention is paid to this last aspect. The condensation is performed in two steps: a static condensation of the model on the degrees of freedom corresponding to the location of the sensors, and a simultaneous condensation of experimental and analytical FRF data by a common transformation matrix. The first applications are performed on a simulated test case with large stiffness, mass and modal damping perturbations introduced in the initial model, as well as strong noise pollution of the measured responses and applied forces.
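The first condensation step, static condensation of the model onto the sensor degrees of freedom, can be sketched as below. This is a minimal illustration of standard Guyan condensation with an assumed stiffness matrix, not the authors' implementation:

```python
import numpy as np

def guyan_condense(K, master_idx):
    """Statically condense stiffness matrix K onto the master (measured) DOFs:
    K_red = K_mm - K_ms * K_ss^{-1} * K_sm."""
    n = K.shape[0]
    slave_idx = [i for i in range(n) if i not in master_idx]
    Kmm = K[np.ix_(master_idx, master_idx)]
    Kms = K[np.ix_(master_idx, slave_idx)]
    Ksm = K[np.ix_(slave_idx, master_idx)]
    Kss = K[np.ix_(slave_idx, slave_idx)]
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# Toy 3-DOF chain stiffness matrix; sensors sit on DOFs 0 and 2.
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
print(guyan_condense(K, [0, 2]))
```

The interior (unmeasured) DOF is eliminated while its stiffness contribution is redistributed onto the measured DOFs, so the reduced model remains exact for static loads applied at the sensors.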


2019 ◽  
Vol 2019 ◽  
pp. 1-6
Author(s):  
Miguel Angel Sanchez-Tena ◽  
Cristina Alvarez-Peregrina ◽  
Jose Sanchez-Valverde ◽  
Cesar Villa-Collar

Introduction. Citation network analysis is a powerful tool that allows for a visual and objective representation of the past, present, and potential future directions of a research field. The objective of this study is to use citation network analysis to analyse the evolution of knowledge in the field of orthokeratology. Materials and Methods. The database used in this citation network analysis study was Scopus. The descriptor used was “orthokeratology”, limited to three fields: title, keywords, and/or abstract, analysing the five most cited authors. Only articles cited at least twenty times were used. The computer software used was UCINET, with two types of analysis, qualitative and quantitative. Results. 27 nodes were included according to the search and inclusion criteria. In the qualitative analysis, the relationships among the nodes and their positions and connections show that the study of Cho et al. in 2005 is clearly positioned as a central cutoff point in the network. The quantitative analysis, based on the normalized values of the sample, shows that the study of Cho et al. in 2005 presents the highest percentage of input connections. Conclusions. This study shows the state of the flow of information in the orthokeratology field by providing links in bibliographic citations from a qualitative and quantitative point of view.
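The quantitative step, ranking nodes by their share of input connections, can be sketched with assumed toy data (the node names below are hypothetical placeholders, not the study's dataset):

```python
# Each edge (citing, cited) is one directed citation link.
edges = [("A", "Cho2005"), ("B", "Cho2005"), ("C", "Cho2005"), ("C", "B")]

nodes = {n for e in edges for n in e}
indeg = {n: sum(1 for _, cited in edges if cited == n) for n in nodes}
total = sum(indeg.values())
share = {n: indeg[n] / total for n in nodes}  # normalized input connections

print(max(share, key=share.get))  # node with the largest fraction of inputs
```

Here "Cho2005" receives 3 of the 4 citation links, so its normalized share of input connections is 0.75, the analogue of the highest-percentage node reported in the Results.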


Author(s):  
Everton Note Narciso ◽  
Márcio Eduardo Delamaro ◽  
Fátima De Lourdes Dos Santos Nunes

Time and resource constraints should be taken into account in software testing activities, and thus optimizing the test suite is fundamental in the development process. In this context, test case selection aims to eliminate redundant or unnecessary test data, which is crucial for the definition of test strategies. This paper presents a systematic review of test case selection, conducted over 449 articles published in leading journals and conferences in Computer Science. We address the state of the art by collecting and comparing existing evidence on the methods used in different software domains and the methods used to evaluate test case selection. Our study identified 32 papers that met the research objectives; these featured 18 different selection methods and were evaluated through 71 case studies. The most commonly reported methods are adaptive random testing, genetic algorithms and the greedy algorithm. Most approaches rely on heuristics, such as diversity of test cases and code or model coverage. This paper also discusses the key concepts and approaches, areas of application and evaluation metrics inherent to the methods of test case selection available in the literature.
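One of the commonly reported methods, the greedy algorithm for coverage-based selection, can be sketched as follows; the test names and requirement sets here are invented for illustration:

```python
def greedy_select(tests):
    """Greedy coverage-based test case selection.
    tests: dict mapping test name -> set of covered requirements.
    Repeatedly pick the test covering the most not-yet-covered requirements."""
    selected, covered = [], set()
    while True:
        best = max(tests, key=lambda t: len(tests[t] - covered))
        gain = len(tests[best] - covered)
        if gain == 0:          # no remaining test adds coverage
            break
        selected.append(best)
        covered |= tests[best]
    return selected

tests = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {4, 5, 6}, "t4": {1}}
print(greedy_select(tests))
```

On this toy suite the greedy pass selects t1 and t3, covering all six requirements with two of the four tests; t2 and t4 are dropped as redundant, which is exactly the kind of suite reduction the selection methods in the review target.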


2017 ◽  
Vol 20 (6) ◽  
pp. 1268-1285 ◽  
Author(s):  
Masoud Arami Fadafan ◽  
Masoud-Reza Hessami Kermani

Abstract The moving particle semi-implicit (MPS) method is one of the Lagrangian methods widely used in engineering problems. This method, however, suffers from unphysical oscillations in its original form. In the present study, a modified incompressible MPS method is proposed to suppress these oscillations and is used for simulating free-surface problems. To demonstrate the stability of the presented method, different kernel functions are used in a numerical dam-break benchmark simulation. A simple definition of curved wall boundaries is suggested that eliminates dummy particles and subsequently saves CPU time. Flow over an ogee spillway, a new test case whose geometry contains several curved boundaries, is simulated for the first time with the I-MPS method. Comparisons between theoretical solutions/experimental data and simulation results, in terms of free surface and pressure, show the accuracy of the method.
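As an illustration of the kernel functions compared in such benchmarks, below is a sketch of the standard MPS kernel in the Koshizuka–Oka form, w(r) = r_e/r − 1 for 0 < r < r_e and 0 otherwise. This is a generic textbook example, not the authors' modified scheme:

```python
import numpy as np

def mps_kernel(r, r_e):
    """Standard MPS weight function: w(r) = r_e/r - 1 inside the cutoff radius r_e."""
    r = np.asarray(r, dtype=float)
    # Guard against division by zero at r = 0 (the kernel excludes the particle itself).
    return np.where((r > 0) & (r < r_e),
                    r_e / np.maximum(r, 1e-12) - 1.0,
                    0.0)

print(mps_kernel([0.25, 0.5, 1.0, 2.0], r_e=1.0))
```

The weight diverges as neighbouring particles approach (r → 0), which repels clustering, and drops to zero at the cutoff radius; the choice of this function is one of the levers affecting pressure oscillations that the study's benchmark probes.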


2014 ◽  
Vol 5 (2) ◽  
Author(s):  
Yin Liu

The definition of text is still a live issue with important implications for emerging forms of digital textuality. This paper proposes that no single definition of text is sufficient to account for all manifestations of textuality. Medieval textuality is a test case: four different models for text are offered, corresponding to ways in which modern medievalists approach medieval texts. Studying medieval texts has value not only to support historically informed theories of reading and writing, but also to suggest alternative models of organizing, representing, and processing textual information.


2015 ◽  
Author(s):  
B. Ian Hutchins ◽  
Xin Yuan ◽  
James M. Anderson ◽  
George M. Santangelo

Abstract Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method that makes novel use of the co-citation network of each article to field-normalize the number of citations it has received. The resulting Relative Citation Ratio is article-level and field-independent, and provides an alternative to the invalid practice of using Journal Impact Factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010, and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research, and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
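The field-normalization idea can be conveyed with a deliberately simplified toy (the actual Relative Citation Ratio derives its expected citation rate from a regression over the co-citation network, not a plain mean): divide an article's citations per year by the average citations per year of the papers co-cited with it.

```python
def relative_citation_ratio(article_cpy, cocited_cpys):
    """Toy field-normalized citation score.
    article_cpy:  the article's citations per year.
    cocited_cpys: citations per year of papers in its co-citation neighbourhood,
                  used as a proxy for the field's expected citation rate."""
    expected = sum(cocited_cpys) / len(cocited_cpys)
    return article_cpy / expected

# An article earning 12 citations/year in a neighbourhood averaging 6/year
# scores 2.0: twice the field-expected rate.
print(relative_citation_ratio(12.0, [4.0, 6.0, 8.0]))
```

Because the denominator comes from each article's own co-citation neighbourhood rather than its journal, the score stays article-level and comparable across fields with very different baseline citation rates.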

