Template Metaprogramming Techniques for Concept-Based Specialization

2013 ◽  
Vol 21 (1-2) ◽  
pp. 43-61
Author(s):  
Bruno Bachelet ◽  
Antoine Mahul ◽  
Loïc Yon

In generic programming, software components are parameterized on types. When available, a static specialization mechanism allows selecting, for a given set of parameters, a more suitable version of a generic component than its primary version. The normal C++ template specialization mechanism is based on the type pattern of the parameters, which is not always the best way to guide the specialization process: type patterns lack some information about types that could be relevant for defining specializations. The notion of a concept, which represents a set of requirements (including syntactic and semantic aspects) for a type, is known to be an interesting approach to control template specialization. For various reasons, concepts were dropped from the C++11 standard; this article therefore describes template metaprogramming techniques for declaring concepts, modeling relationships (meaning that a type fulfills the requirements of a concept), and refinement relationships (meaning that a concept refines the requirements of another concept). From a taxonomy of concepts and template specializations based on concepts, an automatic mechanism selects the most appropriate version of a generic component for a given instantiation. Our purely library-based solution is also open for retroactive extension: new concepts, relationships, and template specializations can be defined at any time; such additions will then be picked up by the specialization mechanism.

Author(s):  
Ratneshwer

To develop software components that are reusable across pervasive computing applications, one must consider the variations and properties (mobility, adaptability, composability, context awareness, etc.) that different pervasive computing applications (application types) may require. These requirements and variations are not always known a priori, so developing all the variants in advance may not be possible or feasible, and it is quite unlikely that every pervasive computing application could always reuse a component 'as-is'. One idea is to use lightweight components, so that overheads not required by a particular pervasive computing application are not transported with the body of the component. Based on this idea, a model of a "Generic Component" with a 'Component Generator' is proposed that generates components according to the requirements of a specific pervasive computing application. This work opens a discussion and calls for more extensive research-oriented studies by professionals and academicians to refine the model.


2012 ◽  
Vol 20 (2) ◽  
pp. 115-128 ◽  
Author(s):  
C.G. Baker ◽  
M.A. Heroux

We present Tpetra, a Trilinos package for parallel linear algebra primitives implementing the Petra object model. We describe Tpetra's design, based on generic programming via C++ templated types and template metaprogramming. We discuss some benefits of this approach in the context of scientific computing, with illustrations consisting of code and notable empirical results.


Author(s):  
Shubh Shah

Abstract: The central idea of component-based engineering is to develop system software by selecting well-defined software components and assembling them within a given system architecture. Software development today differs considerably from earlier approaches, as many new concerns are taken into consideration, e.g., quality assurance (QA). This term paper gives a detailed description of current component-based software techniques, together with their advantages and disadvantages. We also address the quality assurance issue of component-based software engineering.


Author(s):  
Arthur V. Jones

In comparison with the developers of other forms of instrumentation, scanning electron microscope manufacturers are among the most conservative of people. New concepts usually must wait many years before being exploited commercially. The field emission gun, developed by Albert Crewe and his coworkers in 1968, is only now becoming widely available in commercial instruments, while the innovative lens designs of Mulvey are still waiting to be commercially exploited. The associated electronics is still largely based on operating procedures that have changed little since the original microscopes of Oatley and his co-workers. The current interest in low-voltage scanning electron microscopy will, if sub-nanometer resolution is to be obtained in a usable instrument, lead to fundamental changes in the design of the electron optics. Perhaps this is an opportune time to consider other fundamental changes in scanning electron microscopy instrumentation.


1971 ◽  
Vol 4 (2) ◽  
pp. 431-443
Author(s):  
LaVonne Bergstrom ◽  
Janet Stewart

1988 ◽  
Vol 3 (5) ◽  
pp. 171 ◽  
Author(s):  
Patrick A.V. Hall