Tolerancing Informatics: Towards Automatic Tolerancing Information Processing in Geometrical Variations Management

2020, Vol 11 (1), pp. 198
Author(s):  
Benjamin Schleich ◽  
Nabil Anwer

The management of geometrical variations throughout the product life cycle relies strongly on the gathering, processing, sharing, and dissemination of tolerancing information and knowledge. While this is performed today with many manual interventions, future geometrical variations management requires new means for automatic information processing to make full use of new digitalization paradigms such as Industry 4.0 and digital twins. To this end, the paper proposes the term tolerancing informatics and investigates new concepts and means for automatic information processing, novel information-sharing workflows, and the integration of tools for next-generation geometrical variations management. In this regard, the main aim of the paper is to structure existing tolerancing informatics workflows and to derive future research potentials and challenges in this domain. The novelty of the paper lies in providing a comprehensive overview of tolerancing informatics as an important enabler for future geometrical variations management.

Author(s):  
David E. Lee ◽  
Michel A. Melkanoff

Abstract: Traditional engineering analysis of product designs has focused primarily on a product's operational performance without considering the costs of manufacturing and other stages downstream from design. In contrast, life cycle analysis of a product during its initial development can play a crucial role in determining the product's overall life cycle cost and useful life span. This paper examines product life cycle engineering analysis: the measurement of product operational performance in a life cycle context. Life cycle engineering analysis is thus considered both an extension of traditional engineering analysis methods and a subset of a total product life cycle analysis. The issues critical to life cycle engineering analysis are defined and include product life cycle data modeling and analysis, analysis tools and their performance regimes, performance tradeoff measurement, and problems of life cycle engineering analysis in an organizational context. Recommendations are provided for future research directions into life cycle engineering analysis in the context of integration architectures for concurrent engineering.


Author(s):  
David Edward Jones ◽  
Chris Snider ◽  
Lee Kent ◽  
Ben Hicks

Abstract: While extensive modelling, both physical and virtual, is imperative to develop right-first-time products, the parallel use of virtual and physical models gives rise to two interrelated issues: the lack of revision control for physical prototypes, and the need for designers to manually inspect, measure, and interpret modifications to either virtual or physical models for subsequent update of the other. The Digital Twin paradigm addresses similar problems later in the product life cycle, and while these digital twins, or the "twinning" process, have shown significant value, there is little work to date on their implementation in the earlier design stages. With large prospective benefits in increased product understanding and performance, and reduced design cycle time and cost, this paper explores the concept of using the Digital Twin in early design, including an introduction to digital twinning, an examination of the opportunities for and challenges of their implementation, a presentation of the structure of Early Stage Twins, and an evaluation via two implementation cases.


Processes, 2020, Vol 8 (9), pp. 1088
Author(s):  
Yingjie Chen ◽  
Ou Yang ◽  
Chaitanya Sampat ◽  
Pooja Bhalode ◽  
Rohit Ramachandran ◽  
...  

The development and application of emerging technologies of Industry 4.0 enable the realization of digital twins (DT), which facilitates the transformation of the manufacturing sector to a more agile and intelligent one. DTs are virtual constructs of physical systems that mirror the behavior and dynamics of such physical systems. A fully developed DT consists of physical components, virtual components, and information communications between the two. Integrated DTs are being applied in various processes and product industries. Although the pharmaceutical industry has evolved recently to adopt Quality-by-Design (QbD) initiatives and is undergoing a paradigm shift of digitalization to embrace Industry 4.0, there has not been a full DT application in pharmaceutical manufacturing. Therefore, there is a critical need to examine the progress of the pharmaceutical industry towards implementing DT solutions. The aim of this narrative literature review is to give an overview of the current status of DT development and its application in pharmaceutical and biopharmaceutical manufacturing. State-of-the-art Process Analytical Technology (PAT) developments, process modeling approaches, and data integration studies are reviewed. Challenges and opportunities for future research in this field are also discussed.
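The abstract's definition of a fully developed DT, physical components, virtual components, and information communication between the two, can be illustrated with a minimal toy sketch. All names here (`PhysicalTank`, `DigitalTwin`) are hypothetical and not taken from the reviewed papers; a noisy tank level stands in for a real process, and the twin mirrors its state and suggests a correction:

```python
import random


class PhysicalTank:
    """Toy stand-in for a physical process: a tank whose level drifts."""

    def __init__(self, level=50.0):
        self.level = level

    def step(self):
        # The level changes with some process noise each time step.
        self.level += random.uniform(-1.0, 1.0)
        return self.level  # acts as the "sensor reading"


class DigitalTwin:
    """Virtual counterpart: mirrors the tank's state and records its history."""

    def __init__(self, initial_level=50.0):
        self.mirrored_level = initial_level
        self.history = []

    def ingest(self, sensor_reading):
        # Information flow, physical -> virtual: update the mirrored state.
        self.mirrored_level = sensor_reading
        self.history.append(sensor_reading)

    def recommend_setpoint(self, target=50.0):
        # Information flow, virtual -> physical: a simple corrective suggestion
        # computed from the virtual model rather than the plant itself.
        return target - self.mirrored_level


# Couple the two: each physical step is mirrored into the twin.
random.seed(0)
tank = PhysicalTank()
twin = DigitalTwin()
for _ in range(10):
    twin.ingest(tank.step())

correction = twin.recommend_setpoint()
```

Real DT deployments replace the direct method call with PAT sensor streams and a mechanistic or data-driven process model, but the two-way coupling shown here is the defining structure.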


Author(s):  
Sebastian K. Fixson

Product families and product platforms have been suggested as design strategies to serve heterogeneous markets via mass customization. Numerous, individual cost advantages of these strategies have been identified for various life cycle processes such as product design, manufacturing, or inventory. However, these advantages do not always occur simultaneously, and sometimes even counteract each other. To develop a better understanding of these phenomena, this paper investigates the cost implications of the underlying design decision: the product architecture choice. The investigation includes factors such as product life cycle phases, allocation rules, and cost models, all of which impact the cost analysis results. Based on this investigation, directions for future research on product architecture costing are provided.


2006, Vol 20 (4), pp. 267-275
Author(s):  
O. Laloyaux ◽  
M. Ansseau ◽  
M. Hansenne

Repetitive transcranial magnetic stimulation (rTMS) is considered a powerful method for studying the relationships between cortical activity and cognitive processes. Previous ERP studies focused on the P300 response have shown that inhibitory/excitatory effects on the prefrontal cortex (PFC), induced by low- and high-frequency rTMS, can modulate controlled but not automatic information processing. The present study assessed the impact of rTMS-induced inhibition over the left and right PFC on mismatch negativity (MMN), which is known to reflect automatic cerebral processes for detecting change. Auditory MMN was recorded in 20 subjects before and after application of 1-Hz rTMS over the left and right PFC for 15 min. MMN was also recorded before and after sham-occipital 1-Hz rTMS as a control condition. Results showed that 1-Hz rTMS induced no modification of either MMN latency or amplitude. In addition, the N100 and P200 components to the frequent tones were not affected by rTMS. These results are consistent with previous findings showing that rTMS over either PFC is unable to disrupt automatic information processing. However, since only two sites were stimulated in the present study, no definite conclusions can be drawn about the inability of rTMS to disrupt automatic processing.

