Hasselt

2016 ◽  
Vol 5 (1) ◽  
pp. 19-38 ◽  
Author(s):  
Fredy Cuenca ◽  
Jan Van den Bergh ◽  
Kris Luyten ◽  
Karin Coninx

Implementing multimodal interactions with event-driven languages results in a 'callback soup': source code littered with a multitude of flags that must be maintained consistently across different event handlers. Prototyping multimodal interactions adds to the complexity and error-proneness, since the program code has to be refined iteratively as developers explore different possibilities and solutions. The authors present a declarative language for rapidly prototyping multimodal interactions: Hasselt permits declaring composite events, i.e., sets of events that are logically related because of the interaction they support, which can easily be bound to dedicated event handlers for separate interactions. The authors' approach describes multimodal interactions at a higher level of abstraction than event languages, saving developers from the typical 'callback soup' and thereby improving programming efficiency and reducing errors in event-handling code. They compared Hasselt against a traditional programming language with strong support for events in a study with 12 participants, each with a solid background in software development. When performing equivalent modifications to a multimodal interaction, using Hasselt led to higher completion rates, lower completion times, and less code testing than using a mainstream event-driven language.

Author(s):  
Tran Thanh Luong ◽  
Le My Canh

JavaScript has become increasingly popular in recent years because of its rich features: it is dynamic, interpreted, and object-oriented with first-class functions. Furthermore, JavaScript is designed around an event-driven, non-blocking I/O model that boosts the performance of the overall application, especially in the case of Node.js. To take advantage of these characteristics, many design patterns that implement asynchronous programming for JavaScript have been proposed. However, choosing the right pattern and implementing good asynchronous source code is a challenge, and failing to do so can easily lead to less robust applications and low-quality source code. Extending our previous work on exception handling code smells in JavaScript and on exception handling code smells in JavaScript asynchronous programming with promises, this research studies the impact of three JavaScript asynchronous programming patterns on the quality of source code and applications.


Author(s):  
Marco Konersmann ◽  
Michael Goedicke

As software architecture is a main driver of software quality, source code is often accompanied by software architecture specifications. When the implementation is changed, the architecture specification is often not updated along with the code, which introduces inconsistencies between these artifacts. Such inconsistencies imply a risk of misunderstandings and errors during development, maintenance, and evolution, causing serious degradation over the lifetime of the system. In this chapter we present the Explicitly Integrated Architecture approach and its tool Codeling, which remove the necessity for a separate representation of software architecture by integrating software architecture information with the program code. Using our approach, the specification can be extracted from the source code, and changes in the specification can be propagated to the code. The integration of architecture information with the code leaves no room for inconsistencies between the artifacts and creates links between them. We evaluate the approach and the tool in a use case with real software under development and with a benchmark software system, accompanied by a performance evaluation.


2021 ◽  
Vol 33 (4) ◽  
pp. 195-210
Author(s):  
Roman Vyacheslavovich Baev ◽  
Leonid Vladlenovich Skvortsov ◽  
Evgeny Alekseevich Kudryashov ◽  
Ruben Arturovich Buchatskiy ◽  
Roman Aleksandrovich Zhuykov

Aggressive optimization in modern compilers may uncover vulnerabilities in program code that did not lead to bugs prior to optimization. The source of these vulnerabilities is code with undefined behavior. Programmers use such constructs relying on the particular behavior they have observed before, but the compiler is not obliged to stick to that behavior and may change it when needed for optimization, since the behavior is left undefined by the language standard. This article describes approaches to detecting and eliminating vulnerabilities arising from optimization in the case where source code is available but its modification is undesirable or impossible. The concept of a safe compiler (i.e., a compiler that ensures no vulnerability is added to the program during optimization) is presented, and an implementation of such a compiler on top of the GCC compiler is described. The safe compiler's functionality is divided into three security levels, whose applicability is discussed in the article. The feasibility of using the safe compiler on real-world codebases is demonstrated, and possible performance losses are estimated.


2005 ◽  
Vol 2 (8) ◽  
Author(s):  
Joyce M. Lieberman ◽  
Nina G. Dorsch

Doctoral completion rates are a concern across disciplines. This paper describes how Curriculum Leadership faculty redesigned their doctoral program, from coursework through completion, to include a strong intellectual and emotional support system. This culminated in the creation of the "Big Paper Network," designed to support candidates from proposal writing through defense.


2019 ◽  
Vol 8 (2S3) ◽  
pp. 1004-1009

Without a doubt, multicore processors have become mainstream in parallel computing. Parallelism will therefore play a pivotal role in future generations of applications. Compilers and programmers could benefit immensely from program source code classified in a structured manner: such a classification helps programmers identify parallelization opportunities, reason about the program code, and communicate with other programmers. To address the challenge of parallel programming, we worked on the source-to-source compiler Bones and developed the species extraction tool extended A-Darwin to ease parallel programming. We present 'Algorithmic Species', a new algorithm classification that encapsulates the information required for parallelization in classes and embeds memory transfer requirements for optimizing communication on heterogeneous platforms. The evaluation of algorithmic species and the validation of extended A-Darwin are performed by testing the tool against the HPCC benchmark suite. The approach is used to generate code automatically for parallel target machines.


2009 ◽  
Vol 14 (6) ◽  
pp. 720-777 ◽  
Author(s):  
Gürcan Güleşir ◽  
Klaas van den Berg ◽  
Lodewijk Bergmans ◽  
Mehmet Akşit

2021 ◽  
Vol 16 (93) ◽  
pp. 69-78

The present study is devoted to the development of a software module that converts computational meshes created with the OpenFOAM platform into the msh format used in numerical experiments with the ANSYS FLUENT package. Thanks to this conversion, the user is able to use both products in parallel. The ANSYS FLUENT functionality can, for example, be used for post-processing of a numerical model in most fundamental problems of continuum mechanics (CM), including hydrodynamics, aerodynamics, and solid mechanics. Existing analogues of the OpenFOAM platform, such as Salome, Helyx-OS, and Visual-CFD, already implement tools for solving this problem, but because of their partially commercial distribution, the need to pay for technical support services, and the lack of full-fledged Russian documentation, the absence of a graphical shell simplifying the conversion procedure remains a relevant problem. The process of converting computational meshes generated by the OpenFOAM platform into the msh format used in the ANSYS FLUENT package is the subject of this study. The purpose of the work is to develop the source code of a software module that automates determining the conversion parameters and starting the conversion process. The work presents a diagram of the algorithm of a specialist's work with the software module, the technology stack used to write, debug, and run the program code, and the tool stack required to use the module. The results of the research are described, and its scientific novelty and expected practical significance are formulated. The results of testing the application are presented using one of the classic experiments based on the OpenFOAM platform as an example.


Author(s):  
Andrew Sears ◽  
Julie A. Jacko

We report on an investigation of the effects of hardware performance, application design, and cognitive demands on user productivity and perceptions. This investigation focuses on clerical tasks typical of the activities that many lower-level organizational workers encounter. This was accomplished by engaging one hundred seventy-five representative participants in a field-based experiment. Participants worked one eight-hour shift and completed a variety of realistic tasks involving the creation and modification of documents using Microsoft® Word, Excel, and PowerPoint®. Motivation was ensured through the use of a quantity/quality-based financial incentive. An analysis of both task-completion times and error rates revealed significant effects for cognitive demands, with more demanding tasks resulting in longer task-completion times and higher error rates. The analysis also confirmed that, under the right circumstances, providing individuals with a more powerful computing platform can lead to an increase in productivity. Participants also expressed a preference for more powerful computing platforms. Finally, the results provide strong support for the importance of navigational activities even when the users' primary goal is not navigation. Implications for user training, task design, and future research are discussed.


1988 ◽  
Vol 5 (3) ◽  
pp. 203-211 ◽  
Author(s):  
E. Jane Watkinson ◽  
Sock Miang Koh

Moderately mentally handicapped children ages 10 to 12 and youths 13 years and older ran the endurance run of the Canada Fitness Awards Adapted Format under two testing conditions. The current test protocol is one in which subjects select a pace for the entire race and are prompted only by verbal encouragement. A second testing protocol was used in which subjects were paced by a runner at a pace slightly faster than the one they displayed during their runs under the current protocol. In the pacing protocol, instructors ran in front of the subjects and verbally and visually prompted them to keep up. The objective of the pacing protocol was to reduce the degree to which the subjects had to plan their runs, and to increase motivation to continue. Completion rates improved with the pacing protocol for both groups. Completion times improved for the younger group. Heart rate responses under both testing conditions were very high, with only small differences observed between the two conditions. Heart rates of subjects in both conditions were at vigorous to severe intensity levels throughout the runs, indicating that subjects were lacking in fitness and were performing at or near maximal capacities.


Contracts provide a pre-emptive approach to identifying programming errors at run-time using assertions, through formal static analysis tools, or via manual source code reviews. They describe the expected software behavior. Contracts written by developers have a greater error-detection ability than the generic ones created automatically, but may involve strenuous effort for larger codebases. The intent of this paper is a concise study of prevalent approaches to contract generation, and a proposal for deriving programming rules for real-time concurrent Java source code automatically with reduced effort. The proposed method extracts scalar variables and computed constants through static program code analysis, then identifies various dependencies dynamically and generates declarative contracts automatically by decision-tree modeling of the computed dependencies. These rules can then be utilized for software verification.

