NON-FUNCTIONAL REQUIREMENTS (NFRs) TRACEABILITY METAMODEL FOR AGILE DEVELOPMENT

2015 ◽  
Vol 77 (9) ◽  
Author(s):  
Adila Firdaus ◽  
Imran Ghani ◽  
Dayang Norhayati Abg Jawawi ◽  
Wan Mohd Nasir Wan Kadir

Agile methodologies are well known for early and frequent releases, and they also handle requirement changes without causing delays. However, it has been observed that changes to functional requirements can affect non-functional requirements (NFRs) such as security and performance. The agile team may not even be aware of these effects, resulting in a dysfunctional system. This issue can be addressed by offering a traceability mechanism that helps trace the effect of functional requirement changes on the non-functional requirements. Unfortunately, few researchers have conducted studies on this issue. Thus, this study presents a Traceability Process Model (TPM) to tackle the problem of tracing NFRs, especially security and performance. However, to materialize a full-scale TPM, a metamodel is necessary. Therefore, in this paper we present a metamodel built by integrating two existing metamodels. We then validate the newly built metamodel with precision and recall methods. Lastly, we also develop a traceability tool based on the proposed metamodel.
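The precision and recall used to validate the metamodel can be sketched over sets of traceability links. The link sets below are hypothetical, purely for illustration; they are not the paper's data:

```python
# Illustrative precision/recall over recovered traceability links
# (functional requirement -> NFR pairs); the sets are made up.
retrieved = {("FR1", "SEC1"), ("FR2", "PERF1"), ("FR3", "SEC2")}  # links the tool reports
relevant = {("FR1", "SEC1"), ("FR2", "PERF1"), ("FR4", "PERF2")}  # true links (ground truth)

true_positives = retrieved & relevant
precision = len(true_positives) / len(retrieved)  # fraction of reported links that are correct
recall = len(true_positives) / len(relevant)      # fraction of true links that were found

print(f"precision={precision:.2f} recall={recall:.2f}")  # precision=0.67 recall=0.67
```

A tool that over-reports links inflates recall at the cost of precision, which is why both measures are needed for validation.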

2000 ◽  
Vol 16 (2) ◽  
pp. 107-114 ◽  
Author(s):  
Louis M. Hsu ◽  
Judy Hayman ◽  
Judith Koch ◽  
Debbie Mandell

Summary: In the United States' normative population for the WAIS-R, differences (Ds) between persons' verbal and performance IQs (VIQs and PIQs) tend to increase with an increase in full scale IQs (FSIQs). This suggests that norm-referenced interpretations of Ds should take FSIQs into account. Two new graphs are presented to facilitate this type of interpretation. One of these graphs estimates the mean of absolute values of D (called typical D) at each FSIQ level of the US normative population. The other graph estimates the absolute value of D that is exceeded only 5% of the time (called abnormal D) at each FSIQ level of this population. A graph for the identification of conventional "statistically significant Ds" (also called "reliable Ds") is also presented. A reliable D is defined in the context of classical true score theory as an absolute D that is unlikely (p < .05) to be exceeded by a person whose true VIQ and PIQ are equal. As conventionally defined, reliable Ds do not depend on the FSIQ. The graphs of typical and abnormal Ds are based on quadratic models of the relation of sizes of Ds to FSIQs. These models are generalizations of models described in Hsu (1996). The new graphical method of identifying abnormal Ds is compared to the conventional Payne-Jones method of identifying these Ds. Implications of the three juxtaposed graphs for the interpretation of VIQ-PIQ differences are discussed.
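The conventional "reliable D" described above follows from classical true score theory: the standard error of a difference score depends on the two subscales' reliabilities, not on FSIQ. A minimal sketch, using assumed (not the article's) reliability values:

```python
import math

SD = 15.0               # Wechsler IQ standard deviation
r_v, r_p = 0.97, 0.93   # VIQ and PIQ reliabilities (assumed values for illustration)

# Classical true score theory: standard error of the VIQ-PIQ difference score.
se_diff = SD * math.sqrt(2 - r_v - r_p)

# Reliable D: the |D| unlikely (p < .05, two-tailed) for a person
# whose true VIQ and true PIQ are equal.
reliable_d = 1.96 * se_diff
print(round(reliable_d, 1))  # ~9.3 with these assumed reliabilities
```

Note that FSIQ appears nowhere in the formula, which is exactly why the article's typical-D and abnormal-D graphs, which do condition on FSIQ, add information beyond the reliable-D criterion.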


1996 ◽  
Vol 12 (1) ◽  
pp. 27-32 ◽  
Author(s):  
Louis M. Hsu

The difference (D) between a person's Verbal IQ (VIQ) and Performance IQ (PIQ) has for some time been considered clinically meaningful (Kaufman, 1976, 1979; Matarazzo, 1990, 1991; Matarazzo & Herman, 1985; Sattler, 1982; Wechsler, 1984). Particularly useful is information about the degree to which a difference (D) between scores is "abnormal" (i.e., deviant in a standardization group) as opposed to simply "reliable" (i.e., indicative of a true score difference) (Mittenberg, Thompson, & Schwartz, 1991; Silverstein, 1981; Payne & Jones, 1957). Payne and Jones (1957) proposed a formula to identify "abnormal" differences, which has been used extensively in the literature and has generally yielded good approximations to empirically determined "abnormal" differences (Silverstein, 1985; Matarazzo & Herman, 1985). However, applications of this formula have not taken into account the dependence (demonstrated by Kaufman, 1976, 1979, and Matarazzo & Herman, 1985) of Ds on Full Scale IQs (FSIQs). This has led to overestimation of "abnormality" of Ds of high-FSIQ children, and underestimation of "abnormality" of Ds of low-FSIQ children. This article presents a formula for identification of abnormal WISC-R Ds which overcomes these problems by explicitly taking into account the dependence of Ds on FSIQs.
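The Payne-Jones criterion discussed above rests on the standard deviation of a difference between two correlated scores. A minimal sketch, with an assumed VIQ-PIQ correlation (not a value from the article):

```python
import math

SD = 15.0     # Wechsler IQ standard deviation
r_vp = 0.67   # VIQ-PIQ correlation (assumed value for illustration)

# Standard deviation of the difference between two correlated scores
# with equal SDs: SD_D = SD * sqrt(2 - 2*r).
sd_diff = SD * math.sqrt(2 - 2 * r_vp)

# Payne-Jones "abnormal" threshold: the |D| exceeded by only ~5%
# of the standardization group.
abnormal_d = 1.96 * sd_diff
print(round(abnormal_d, 1))  # ~23.9 with this assumed correlation
```

Because this threshold is a single population-wide cutoff, it cannot reflect the dependence of Ds on FSIQ, which is the limitation the article's FSIQ-conditional formula addresses.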


1990 ◽  
Vol 22 (7-8) ◽  
pp. 35-43
Author(s):  
K. D. Tracy ◽  
S. N. Hong

The anaerobic selector of the A/O™ process offers many advantages over conventional activated sludge processes with respect to process performance and operational stability. This high-rate, single-sludge process has been successfully demonstrated in full-scale operations for biological phosphorus removal and total nitrogen control, in addition to BOD and TSS removal. This process can be easily utilized in upgrading existing treatment plants to meet stringent discharge limitations and to provide capacity expansion. Upgrades of two full-scale installations are described and performance data from the two facilities are presented.


2021 ◽  
Vol 13 (7) ◽  
pp. 168781402110343
Author(s):  
Mei Yang ◽  
Yimin Xia ◽  
Lianhui Jia ◽  
Dujuan Wang ◽  
Zhiyong Ji

Modular design, Axiomatic design (AD), and the Theory of inventive problem solving (TRIZ) have become increasingly popular in the concept design of modern mechanical products. Each method has its own advantages and drawbacks: the benefit of modular design is reducing the product design period, AD provides the capability for problem analysis, while TRIZ's strength is innovative idea generation. Exploiting the complementarity of these three approaches, an innovative and systematic methodology is proposed for designing large, complex mechanical systems. First, module partition is executed based on scenario decomposition. Then, the behavior attributes of modules, including motion form, spatial constraints, and performance requirements, are listed to find design contradictions. TRIZ tools are employed to resolve the contradictions between behavior attributes. The decomposition and mapping of functional requirements and design parameters are carried out to construct the structural hierarchy of each module. Then, modules are integrated considering the connections among them. Finally, the operation steps in the application scenario are designed in temporal and spatial dimensions. The design of a cutter-changing robot for a shield tunneling machine is taken as an example to validate the feasibility and effectiveness of the proposed method.


2021 ◽  
pp. 016555152098549
Author(s):  
Donghee Shin

The recent proliferation of artificial intelligence (AI) gives rise to questions about how users interact with AI services and how algorithms embody the values of users. Despite the surging popularity of AI, how users evaluate algorithms, how people perceive algorithmic decisions, and how they relate to algorithmic functions remain largely unexplored. Invoking the idea of embodied cognition, we characterize the core constructs of algorithms that drive the value of embodiment and conceptualize these factors in reference to trust by examining how they influence the user experience of personalized recommendation algorithms. The findings elucidate the embodied cognitive processes involved in reasoning about algorithmic characteristics – fairness, accountability, transparency, and explainability – with regard to their fundamental linkages with trust and ensuing behaviors. Users employ a dual-process model, whereby a sense of trust is built on a combination of normative values and performance-related qualities of algorithms. Embodied algorithmic characteristics are significantly linked to trust and performance expectancy. Heuristic and systematic processes grounded in embodied cognition provide a concise guide to the conceptualization of AI experiences and interaction. The identified user cognitive processes provide information on users' cognitive functioning and patterns of behavior, as well as a basis for subsequent metacognitive processes.


Author(s):  
A.N. Belikov ◽  
S.A. Belikova

The existing approach to requirements extraction is that the requirements are formed by the system developer through direct interaction with the customer using a number of methods (for example, interviewing, prototyping, analysis of use cases, user stories, and seminars). In this case, the requirements are most often formed by the developer himself, taking into account the opinion of the customer's representative. The disadvantage of this approach is the loss of knowledge transferred from the customer's representatives to the developer, which results in project failures, as recorded by existing statistics. As statistical studies show, more than half of projects for the creation of information systems (IS) fail or require changes (in terms of budget, time, and customer satisfaction). In modern research in the field of design and development of information systems, there is a tendency to involve the end user (customer) in the design process. To develop this idea, an approach is proposed that involves the user in the process of extracting requirements, so that the developer is no longer the person forming the requirements. The main idea of the approach is to develop special tools that allow the customer to independently transform natural-language descriptions into a representation of the model of the process of solving professional problems, from which an interface is then built; functional requirements can be extracted from this unity of process model and interface.


Author(s):  
Dheeraj Chhillar ◽  
Kalpana Sharma

<span>There are various root causes of software failures. A few years ago, software used to fail mainly due to functionality-related bugs, caused by requirement misunderstandings, code issues, and a lack of functional testing. A lot of work has been done on this in the past, and software engineering has matured over time, so software now rarely fails due to functionality-related bugs. To understand the most recent failures, we had to understand recent software development methodologies and technologies. In this paper we discuss the background of these technologies and the progression of testing over time. A survey of more than 50 senior IT professionals was conducted to understand the root causes of their software project failures. It was found that most software these days fails due to a lack of testing of non-functional parameters. Extensive research was also done to find the most recent and most severe software failures. Our study reveals that the main reason for software failures these days is a lack of testing of non-functional requirements, of which security and performance parameters are the main constituents. This has become more challenging due to developments in new technologies such as the Internet of things (IoT), Cloud of things (CoT), artificial intelligence, machine learning, and robotics, and the excessive use of mobile devices and technology in everything by the masses. Finally, we propose a software development model, called the T-model, to ensure that both the breadth and depth of software are considered during its design and testing. </span>

