FSM Decomposition and Functional Verification of FSM Networks

VLSI Design ◽  
1995 ◽  
Vol 3 (3-4) ◽  
pp. 249-265 ◽  
Author(s):  
Zafar Hasan ◽  
Maciej J. Ciesielski

Here we present a new method for the decomposition of a Finite State Machine (FSM) into a network of interacting FSMs and a framework for the functional verification of the FSM network at different levels of abstraction. The problem of decomposition is solved by output partitioning and state space decomposition using a multiway graph partitioning technique. The number of submachines is determined dynamically during the partitioning process. The verification algorithm can be used to verify (a) the result of FSM decomposition on a behavioral level, (b) the encoded FSM network, and (c) the FSM network after logic optimization. Our verification technique is based on an efficient enumeration-simulation method which involves traversal of the state transition graph of the prototype machine and simulation of the decomposed machine network. Both the decomposition and verification/simulation algorithms have been implemented as part of an interactive FSM synthesis system and tested on a set of benchmark examples.
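The enumeration-simulation idea can be sketched on a toy machine (the FSMs below are invented stand-ins, not the paper's benchmarks): traverse the prototype's state transition graph breadth-first while simulating the decomposed network in lockstep, flagging any output mismatch.

```python
# Toy illustration of enumeration-simulation verification: walk the
# prototype FSM's reachable states breadth-first and, in lockstep,
# simulate the decomposed two-submachine network, checking outputs agree.
from collections import deque

# Prototype: a mod-4 counter. On input 1 it advances, on input 0 it
# holds; the output is the low bit of the next state.
def proto_step(state, x):
    nxt = (state + x) % 4
    return nxt, nxt & 1

# Decomposed network: two interacting 1-bit submachines (low and high
# bit). The high-bit machine toggles on the carry from the low-bit one.
def network_step(lo, hi, x):
    carry = lo & x
    lo2 = lo ^ x
    hi2 = hi ^ carry
    return lo2, hi2, lo2  # network output = low bit

def verify(inputs=(0, 1)):
    start = (0, (0, 0))  # (prototype state, network state)
    seen = {start}
    queue = deque([start])
    while queue:
        p, (lo, hi) = queue.popleft()
        for x in inputs:
            p2, out_p = proto_step(p, x)
            lo2, hi2, out_n = network_step(lo, hi, x)
            if out_p != out_n:
                return False  # behavioral mismatch found
            nxt = (p2, (lo2, hi2))
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

print(verify())  # True: the network reproduces the prototype's behavior
```

Joint traversal keeps the check linear in the reachable product state space rather than enumerating all input sequences.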

2021 ◽  
Vol 2021 (2) ◽  
Author(s):  
Jake R Hanson ◽  
Sara I Walker

Abstract The scientific study of consciousness is currently undergoing a critical transition in the form of a rapidly evolving debate over whether currently proposed theories can be assessed for their scientific validity. At the forefront of this debate is Integrated Information Theory (IIT), widely regarded as the preeminent theory of consciousness because it quantifies subjective experience in a scalar mathematical measure, Φ, that is in principle measurable. Epistemological issues in the form of the “unfolding argument” have provided a concrete refutation of IIT by demonstrating how it permits functionally identical systems to differ in their predicted consciousness. The implication is that IIT, and any other proposed theory based on a physical system’s causal structure, may already be falsified even in the absence of experimental refutation. However, many of the arguments surrounding the epistemological foundations of falsification, such as the unfolding argument, have so far been too abstract to determine the full scope of their implications. Here, we make these abstract arguments concrete by providing a simple example of functionally equivalent machines, realizable with table-top electronics, that take the form of isomorphic digital circuits with and without feedback. This allows us to demonstrate explicitly the different levels of abstraction at which a theory of consciousness can be assessed. Within this computational hierarchy, we show how IIT is simultaneously falsified at the finite-state automaton level and unfalsifiable at the combinatorial-state automaton level. We use this example to illustrate a more general set of falsification criteria for theories of consciousness: to avoid being already falsified, or conversely unfalsifiable, scientific theories of consciousness must be invariant with respect to changes that leave the inference procedure fixed at a particular level in a computational hierarchy.
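The setup can be made concrete with a pair of toy circuits (ours, not the exact circuits of the paper): a stateful parity machine with feedback, and an "unfolded" feedforward counterpart that recomputes from the full input history. The two are indistinguishable at the input-output level despite having different causal structures.

```python
# Hypothetical illustration of the unfolding argument's setup: two systems
# with identical input-output behavior but different causal structure --
# one stateful with feedback, one memoryless over the full input history.
from functools import reduce

class FeedbackParity:
    """Stateful circuit: a single flip-flop fed back through an XOR gate."""
    def __init__(self):
        self.state = 0

    def step(self, bit):
        self.state ^= bit  # feedback: output depends on stored state
        return self.state

def unfolded_parity(history):
    """Feedforward circuit: XOR over the entire input history, no memory."""
    return reduce(lambda a, b: a ^ b, history, 0)

stream = [1, 0, 1, 1, 0, 1]
fb = FeedbackParity()
fb_out = [fb.step(b) for b in stream]
ff_out = [unfolded_parity(stream[:i + 1]) for i in range(len(stream))]
print(fb_out == ff_out)  # True: identical at the input-output level
```

Any theory whose verdict depends on the feedback wiring assigns these two systems different predictions while no behavioral experiment can tell them apart.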


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5136 ◽
Author(s):  
Bassem Ouni ◽  
Christophe Aussagues ◽  
Saadia Dhouib ◽  
Chokri Mraidha

Sensor-based digital systems for Instrumentation and Control (I&C) of nuclear reactors are quite complex in terms of architecture and functionality. A high-level framework is needed to pre-evaluate system performance, check the consistency between different levels of abstraction, and address the concerns of various stakeholders. In this work, we integrate the development process of I&C systems and the involvement of stakeholders within a model-driven methodology. The proposed approach introduces a new architectural framework that defines various concepts, allowing system implementations and encompassing the different development phases, all actors, and system concerns. In addition, we define a new I&C Modeling Language (ICML) and a set of methodological rules needed to build the different architectural framework views. To illustrate the methodology, we extend the open-source systems engineering tool Eclipse Papyrus to carry out many automation and verification steps at different levels of abstraction. The framework's architectural modeling capabilities are validated on a realistic use case: a system for the protection of nuclear reactors. The proposed framework reduces overall system development cost by improving the links between different specification tasks and providing a high abstraction level for system components.


2016 ◽  
Author(s):  
Arnold Gehlen

Moral and Hypermoral, Arnold Gehlen's final book-length publication, elaborates on basic theses first put forward in his anthropological magnum opus, "Der Mensch". In this respect, this draft of a "pluralistic ethics" is conceived as both an elaboration and a concretion of his doctrine of man. In this book, Gehlen set himself the task of combining anthropology, behavioral science, and sociology in a “genealogy of morality”, thereby exposing four interdependent forms of ethics: from an ethos of "reciprocity" via “eudaimonism” and “humanitarianism” to an ethos of institutions, including the state. Gehlen took a decisive stand against the "abstract ethics of the Enlightenment": systematically, his book is primarily an anthropological justification of ethics, conceived as a "plurality of moral authorities" and "social regulations." These are not given an evolutionary interpretation, that is, they are not read as progress from an ethics of proximity to a world-encompassing morality. Moralities, whether based on instinct or arising from the needs of particular institutions, are always culturally shaped and set at different levels of abstraction. With its broad scope, the book belongs in the context of the basic philosophical-sociological research known as philosophical anthropology.


2013 ◽  
Vol 2013 ◽  
pp. 1-14 ◽  
Author(s):  
Yanlong Sun ◽  
Hongbin Wang

According to the data-frame theory, sensemaking is a macrocognitive process in which people try to make sense of, or explain, their observations by processing a number of explanatory structures called frames until the observations and frames become congruent. During the sensemaking process, the parietal cortex has been implicated in various cognitive tasks through functions related to spatial and temporal information processing, mathematical thinking, and spatial attention. In particular, the parietal cortex plays important roles by extracting multiple representations of magnitudes at the early stages of perceptual analysis. Through a series of neural network simulations, we demonstrate that the dissociation of different types of spatial information can start early with a rather similar structure (i.e., sensitivity to a common metric), but that accurate representations require specific goal-directed top-down control owing to interference in selective attention. Our results suggest that the roles of the parietal cortex rely on the hierarchical organization of multiple spatial representations and their interactions. The dissociation of, and interference between, different types of spatial information are essentially the result of competition at different levels of abstraction.
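The gating idea can be caricatured in a few lines (our own toy, not the paper's simulations; the quantities and gains are invented): a single pooled "common metric" unit responds to both spatial dimensions, so a readout of one dimension is contaminated unless a top-down gain suppresses the irrelevant channel.

```python
# Toy caricature (ours, not the paper's model) of a shared early magnitude
# code that dissociates only under goal-directed top-down gating.
def early_stage(size, loc, gate=(1.0, 1.0)):
    """One pooled 'common metric' unit; top-down gains scale each channel."""
    g_size, g_loc = gate
    return g_size * size + g_loc * loc

# Without selective attention, both dimensions drive the common code,
# so a readout of object size is contaminated by location:
mixed = early_stage(size=3.0, loc=5.0)                      # 8.0, not 3.0
# A goal-directed gate suppressing the irrelevant channel recovers it:
attended = early_stage(size=3.0, loc=5.0, gate=(1.0, 0.0))  # 3.0
print(mixed, attended)
```

The contaminated readout stands in for interference at the shared level of abstraction; the gated readout stands in for accurate, goal-directed representation.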


2002 ◽  
Vol 11 (2) ◽  
pp. 175-189 ◽  
Author(s):  
Anne van Kleeck ◽  
Amy Beckley-McCall

Many studies have demonstrated that adults fine-tune book-sharing discussions to the developmental levels of preschoolers, but little is known about how reading simultaneously to different-aged preschoolers is negotiated. We observed five mothers of different-aged preschoolers sharing books with each child individually and with both children together. Analyses focused on the linguistic complexity of the book, the amount of time spent sharing a book, and several aspects of the mothers' book-sharing mediation. Results revealed developmental differences on several measures of how mothers mediated with younger as compared to older children individually. Book complexity, the time spent sharing books, and the percent of utterances at higher levels of abstraction were higher when reading to the older children; the number of mediation strategies per minute and the percent of mothers' behaviors used to get and maintain attention were higher when reading to the younger children. When reading to both children simultaneously, which aspects of the mediation fell at these different levels varied among the mothers, suggesting that different mothers reach different solutions to the task of simultaneously reading to preschoolers of different ages. One mother approached the simultaneous book sharing much as she did sharing a book with her older child, one mother approached it as she did with her younger child, one mother simply read and did little mediation, and two mothers appeared to use a mixed strategy in the simultaneous reading condition.


2013 ◽  
Vol 10 (3) ◽  
pp. 2089-2103 ◽  
Author(s):  
T. Wutzler ◽  
M. Reichstein

Abstract. Interactions between different qualities of soil organic matter (SOM) that affect their turnover are rarely represented in models. In this study, we propose three mathematical strategies, at different levels of abstraction, for representing those interactions. By implementing these strategies in the Introductory Carbon Balance Model (ICBM) and applying them to several scenarios of litter input, we show that the different levels of abstraction are applicable at different timescales. We present a simple one-parameter equation of substrate limitation that can straightforwardly be implemented in other models of SOM dynamics at the decadal timescale. The study demonstrates how substrate quality interactions can explain patterns of priming effects, accelerated turnover in FACE experiments, and the slowdown of decomposition in long-term bare fallow experiments as effects of energy limitation of the microbial biomass. The mechanisms of those interactions need further empirical scrutiny for a more complete understanding. Overall, substrate quality interactions contribute both to understanding and to quantitatively modelling SOM dynamics.
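As a rough sketch of how such a strategy plugs into ICBM's two-pool structure (young pool Y, old pool O), old-pool turnover can be scaled by a one-parameter limitation factor. The factor used below, Y/(Y+m), and all parameter values are our own hypothetical stand-ins, not necessarily the paper's equation.

```python
# Hypothetical sketch of the two-pool ICBM with a one-parameter substrate
# limitation: old-pool decomposition slows as the young pool (fresh
# substrate, hence microbial energy) is depleted. Parameters are invented.
def icbm_step(Y, O, i, kY=0.8, kO=0.006, h=0.13, m=0.5, dt=1.0):
    """One Euler step (annual). Y: young pool, O: old pool, i: litter input.
    The factor Y/(Y+m) throttles old-pool turnover when Y is scarce."""
    limit = Y / (Y + m)          # hypothetical one-parameter limitation term
    dY = i - kY * Y
    dO = h * kY * Y - kO * limit * O
    return Y + dt * dY, O + dt * dO

# Bare-fallow scenario (no litter input): as Y is depleted, the limitation
# factor approaches zero and decomposition of O nearly stops.
Y, O = 0.3, 4.0
for year in range(30):
    Y, O = icbm_step(Y, O, i=0.0)
print(round(Y, 6), round(O, 3))
```

Without the limitation factor, O would keep decaying at rate kO indefinitely; with it, the model reproduces the qualitative slowdown seen in long-term bare fallow experiments.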


10.29007/gpsh ◽  
2018 ◽  
Author(s):  
Abdulbasit Ahmed ◽  
Alexei Lisitsa ◽  
Andrei Nemytykh

It has been known for a while that program transformation techniques, in particular program specialization, can be used to prove properties of programs automatically. For example, if a program actually implements (in a given context of use) a constant function, a sufficiently powerful, semantics-preserving program transformation may reduce the program to a syntactically trivial "constant" program, pruning unreachable branches and thereby proving the property. The viability of such an approach to verification has been demonstrated in previous works, where it was applied to the verification of parameterized cache coherence protocols and Petri net models. In this paper we further extend the method and present a case study on its application to the verification of a cryptographic protocol. The protocol is modeled by functional programs at different levels of abstraction, and verification via program specialization is done using Turchin's supercompilation method.
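The constant-function idea can be illustrated with a toy hand-specialization (far simpler than supercompilation; the program and its context are invented for illustration):

```python
# Toy illustration of verification by specialization: in a context that
# fixes key_matches to True, the unsafe branch is unreachable, and the
# residual program is a syntactic constant -- which proves the property.
def safe(n, key_matches):
    # Original program: the safety verdict seems to depend on runtime data.
    if key_matches:
        return True
    return n >= 0  # reachable only when the key check fails

def safe_specialized(n):
    # Residual program after propagating key_matches == True and pruning
    # the dead branch: a constant function, so safety holds trivially.
    return True

# The residual program agrees with the original in the given context:
assert all(safe(n, True) == safe_specialized(n) for n in range(-10, 10))
print("property proved: residual program is the constant True")
```

A supercompiler performs this propagation and pruning automatically over the driven configurations of the program, rather than by hand as here.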

