Information theory on the shape of the scanning objective aperture and the quality of color-separation image

1998 ◽  
Author(s):  
Yixin Zhang ◽  
Qiang Xu ◽  
Linghua Chen


1980 ◽ 
Vol 17 (4) ◽  
pp. 407-422 ◽  
Author(s):  
Eli P. Cox

A conceptual framework employing the distinction between stimulus-centered and subject-centered scales is presented as a basis for reviewing 80 years of literature on the optimal number of response alternatives for a scale. Concepts and research from information theory and the absolute judgment paradigm of psychophysics are used. The author reviews the major factors influencing the quality of scaled information, points out areas in particular need of additional research, and makes some recommendations for the applied researcher.
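The "information transmitted" measure from the absolute-judgment paradigm that this review draws on is the mutual information between stimulus and response categories, estimated from a confusion matrix. A minimal sketch (the function name and the toy matrices are illustrative, not from the article):

```python
import math

def transmitted_information(counts):
    """Mutual information I(S;R) in bits, estimated from a
    stimulus-by-response count matrix (rows: stimuli, cols: responses)."""
    total = sum(sum(row) for row in counts)
    p_joint = [[c / total for c in row] for row in counts]
    p_stim = [sum(row) for row in p_joint]
    p_resp = [sum(col) for col in zip(*p_joint)]
    info = 0.0
    for i, row in enumerate(p_joint):
        for j, p in enumerate(row):
            if p > 0:
                info += p * math.log2(p / (p_stim[i] * p_resp[j]))
    return info

# Perfect identification of 4 equiprobable stimuli transmits log2(4) = 2 bits;
# pure guessing transmits roughly 0 bits.
perfect = [[5, 0, 0, 0], [0, 5, 0, 0], [0, 0, 5, 0], [0, 0, 0, 5]]
guessing = [[3, 3], [3, 3]]
```

Adding response alternatives raises transmitted information only while subjects can still discriminate them, which is why the optimal number of scale points plateaus.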


1996 ◽  
Vol 83 (3_suppl) ◽  
pp. 1127-1138
Author(s):  
Horacio J. A. Rimoldi ◽  
Elsa Inés Bei de Libonatti

The performance of 30 subjects when solving problems built around logical connectives (Conjunction, Inclusive Disjunction, Exclusive Disjunction, Conditional, and Biconditional) was compared with the results obtained when the same logical connectives were presented using a multiple-choice approach. The processes followed by the subjects in solving 20 problems were evaluated in terms of information theory, making it possible to investigate (a) the processes followed by the subjects and (b) the quality of the final answer. Analysis indicated that the problem-solving processes do not necessarily provide the same information as that obtained from the final answers. The knowledge obtained by examining the questions subjects ask is different from the knowledge obtained when examining the answers to multiple-choice items.
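The information-theoretic evaluation of a question-asking process sketched in this abstract amounts to measuring how much each question reduces uncertainty over the remaining candidate answers. A hedged illustration, assuming equally likely alternatives (the function names are mine, not the authors'):

```python
import math

def entropy_bits(n_alternatives):
    """Uncertainty, in bits, over n equally likely alternatives."""
    return math.log2(n_alternatives)

def info_gained_by_question(before, after):
    """Information (bits) gained by a question that narrows the set of
    equally likely candidate answers from `before` to `after`."""
    return entropy_bits(before) - entropy_bits(after)
```

A question that splits 16 equally likely hypotheses in half gains exactly 1 bit; a sequence of questions can therefore be scored by its cumulative gain, independently of whether the final answer is correct.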


1989 ◽  
Vol 42 (2) ◽  
pp. 161-186 ◽  
Author(s):  
John Charnley

1. INTRODUCTION. In his Presidential address last year J. E. D. Williams treated us to a most absorbing, largely philosophical review of the development of the science of navigation over the last five centuries. In discussing the profound and enduring influence of radio waves on air navigation over the last century he commended as of ‘transcending importance’ the ‘homing quality’ of the radio beam in which the accuracy increases as distance from the transmitter decreases. He illustrated this ‘homing quality’ with references to the radio range for en-route navigation and the instrument landing system (ILS) for guidance to the runway. But he went further and remarked ‘Automatic landings in regular airline service are an example of how a perceived potential of the homing quality of radio has drawn to navigation a whole range of diverse applied sciences including, in this case, information theory, the properties of semi-conductors and the theory of servomechanisms and control, among many others.’


Entropy ◽  
2019 ◽  
Vol 21 (4) ◽  
pp. 405 ◽  
Author(s):  
Kyumin Moon

Integrated information theory (IIT) asserts that both the level and the quality of consciousness can be explained by the ability of physical systems to integrate information. Although the scientific content and empirical prospects of IIT have attracted interest, this paper focuses on another aspect of IIT: its unique theoretical structure, which relates the phenomenological axioms to the ontological postulates. In particular, the relationship between the exclusion axiom and the exclusion postulate is unclear. Moreover, the exclusion postulate leads to a serious problem in IIT: the qualia underdetermination problem. Therefore, in this paper, I will explore answers to the following three questions: (1) How does the exclusion axiom lead to the exclusion postulate? (2) How does the exclusion postulate cause the qualia underdetermination problem? (3) Is there a solution to this problem? I will provide proposals and arguments for each question. If successful, IIT can be confirmed with respect not only to its theoretical foundation but also to its practical application.


2018 ◽  
Vol 224 ◽  
pp. 04006 ◽  
Author(s):  
Aleksander Dulesov ◽  
Denis Karandeev ◽  
Natalia Dulesova

Improving the quality of technical system performance, both in design and in operation, is considered through the application of information-theory models. The models are based on a mathematical description of the probabilistic states of system elements and on the possibility of determining the information entropy. To implement the model, the role of the stochastic behavior of a technical object, which determines its probability of being in a given state, is emphasized. The determination of the entropy of an object's state rests on a series of scientific and conceptual positions developed by well-known scientists in connection with the realization of physical processes. A mathematical model is proposed that allows classical methods to be used to determine the amount of information entropy and to apply it to several problems: choosing the preferred structure with less uncertainty; evaluating the behavior ("aging") of a system; identifying problem areas of structures so that preventive maintenance of equipment can be performed in time; and constructing the optimal system structure.
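The entropy of an element's probabilistic state mentioned above follows Shannon's formula; for independent elements the entropies add, which supports comparing candidate structures by total uncertainty. The functions below are an illustration under those assumptions, not the authors' model:

```python
import math

def state_entropy(p_up):
    """Shannon entropy (bits) of a two-state element that operates with
    probability p_up and fails with probability 1 - p_up."""
    h = 0.0
    for p in (p_up, 1.0 - p_up):
        if p > 0:
            h -= p * math.log2(p)
    return h

def structure_entropy(element_probs):
    """Entropy of a structure of independent elements: entropies add,
    so the structure with the smaller sum carries less uncertainty."""
    return sum(state_entropy(p) for p in element_probs)
```

For example, `structure_entropy([0.99, 0.99, 0.99])` is lower than `structure_entropy([0.9, 0.9, 0.9])`, so under the "less uncertainty first" criterion the former structure would be preferred; a rising entropy over time would likewise signal "aging".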


2018 ◽  
Vol 15 (05) ◽  
pp. 1850045 ◽  
Author(s):  
Charles M. Weber ◽  
Rainer P. Hasenauer ◽  
Nitin V. Mayande

Aristotle’s dictum scio nescio (I know that I don’t know) may serve as a source of enhanced performance for organizations. Awareness of nescience sets the direction for further inquiry, as managers tend to move in the direction that they believe will reduce nescience most. However, nescience is difficult to quantify, so, to date, managers have primarily relied on intuition. This paper introduces a theoretical framework for managing nescience that is based on information theory. This framework is tested in three exploratory empirical studies that take place in highly contrasting settings: semiconductor manufacturing, medical diagnostics and social media analytics. All three studies demonstrate that metrics related to information entropy can be used to quantify nescience. However, practitioners value the framework and its metrics more highly in the settings where the quality of or access to information drives successful product development. The problems encountered in these settings tended to be well-structured, or they were converted from being ill-structured to being well-structured. Further study of more highly contrasting practical settings will be required to determine whether frameworks based on information theory can serve as foundations for a broadly based, pragmatic theory for managing nescience.


2004 ◽  
Vol 8 (1) ◽  
pp. 54-58
Author(s):  
Jin L. Hu

In this paper, the author applies the principles of information theory to probe the role that experience plays in the process of fabric hand evaluation. Two groups of judges, 8 experts and 8 non-experts, assessed 8 medium-weight worsted fabrics for men's suiting, and the results were analysed. The data show that the expert group is better at discerning the intrinsic attributes of a fabric than the non-expert group; in other words, the experts appear more sensitive to changes in a fabric's physical and mechanical properties than the non-experts. Moreover, when an expert assesses a fabric by hand, few subjective factors are introduced, so an expert's evaluation of the touch quality of a fabric has higher objectivity and reliability. This may provide a foundation, or at least a line of thought, for the objective assessment of fabric hand in textile science and technology.


2013 ◽  
Vol 11 (9) ◽  
pp. 2987-2993 ◽  
Author(s):  
Sathesh Sathesh ◽  
Dr. J. Samuel Manoharan

Noise reduction is one of the most important processes for enhancing the quality of images. This paper proposes a statistical filter, the decorrelation stretch filter, for the reduction of Poisson noise, which occurs frequently in galaxy images. The primary purpose of decorrelation stretch is visual enhancement. Decorrelation stretch is applied here to three-band images but can also operate on an arbitrary number of bands. This filter enhances the color separation of an image with significant band-to-band correlation. The effectiveness of the proposed filter is compared on the basis of Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE).
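The two comparison metrics named in the abstract, MSE and PSNR, are standard and can be sketched as follows (a pure-Python version over flat pixel lists; the paper's actual implementation is not shown here):

```python
import math

def mse(original, filtered):
    """Mean Square Error between two equal-length flat pixel lists."""
    n = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, filtered)) / n

def psnr(original, filtered, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means the filtered
    image is closer to the noise-free original."""
    err = mse(original, filtered)
    if err == 0.0:
        return float("inf")
    return 10.0 * math.log10(peak * peak / err)
```

A denoising filter is judged better when it yields a lower MSE and a higher PSNR against the reference image; identical images give zero MSE and an infinite PSNR.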


RBRH ◽  
2019 ◽  
Vol 24 ◽  
Author(s):  
Luiz Henrique Resende de Pádua ◽  
Nilo de Oliveira Nascimento ◽  
Francisco Eustáquio Oliveira e Silva ◽  
Leonardo Alfonso

ABSTRACT In this work a comparative study was carried out of different methods from the literature that seek to evaluate the number of stations and the quality of the information generated by the hydrometric network of a watershed, using Information Theory concepts. The underlying idea is the so-called optimal network, whose function, according to the World Meteorological Organization (WMO), is to meet the primary goal of hydrometry optimally and inexpensively: to provide the necessary information with a minimum number of stations correctly positioned in the basin. Methodologies based on Information Theory have emerged to fill the gap left by the absence of a standard method for the design of hydrometric networks. The evaluated methods were applied to the sub-basin of the Rio das Velhas, which belongs to the São Francisco River basin in Brazil. The results showed that the analyzed methods, which use the concept of entropy, are adequate and efficient for the evaluation of existing fluviometric networks, since they allow possible redundancies to be reduced while seeking to maximize the information generated. It was possible to compare them, indicate the most appropriate method for application within the national context, and point to new methods for future use.
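Entropy-based network evaluation of the kind described above typically rests on marginal entropy, joint entropy, and transinformation (mutual information) between discretized streamflow records: a station whose transinformation with the rest of the network is high adds little new information. A minimal sketch under those assumptions (function names are mine, not from the paper):

```python
import math
from collections import Counter

def entropy(series):
    """Marginal entropy (bits) of a discretized streamflow series."""
    n = len(series)
    return -sum(c / n * math.log2(c / n) for c in Counter(series).values())

def transinformation(x, y):
    """Mutual information T(X;Y) = H(X) + H(Y) - H(X,Y): the share of
    one station's information that is redundant with another's."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))
```

A duplicated station is fully redundant (its transinformation with the original equals the original's entropy), while statistically independent records share none, which is the basis for ranking candidate stations for removal or addition.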

