Manufacturing System Design Meets Big Data Analytics for Continuous Improvement

Procedia CIRP ◽  
2016 ◽  
Vol 50 ◽  
pp. 647-652 ◽  
Author(s):  
David S. Cochran ◽  
Don Kinard ◽  
Zhuming Bi
Author(s):  
David S. Cochran ◽  
Steve Hendricks ◽  
Jason Barnes ◽  
Zhuming Bi

This paper offers an extension of axiomatic design theory to ensure that leaders, managers, and engineers can sustain manufacturing systems throughout the product lifecycle. The paper has three objectives: to provide a methodology for designing and implementing manufacturing systems to be sustainable in the context of the enterprise, to define the use of performance metrics and investment criteria that sustain manufacturing, and to provide a systems engineering approach that enables continuous improvement (CI) and adaptability to change. The systems engineering methodology developed in this paper seeks to replace the use of the word “lean” to describe the result of manufacturing system design. Current research indicates that within three years of launch, ninety percent of “lean implementations” fail. This paper provides a methodology that leaders, managers, and engineers may use to sustain their manufacturing system design and implementation.
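As a rough illustration of the axiomatic design reasoning the paper builds on, the sketch below classifies a matrix mapping functional requirements (FRs) to design parameters (DPs) as uncoupled, decoupled, or coupled, the standard taxonomy under the independence axiom. The function and example matrices are illustrative, not taken from the paper.

```python
# Hypothetical sketch of axiomatic design's independence axiom check:
# a design maps FRs to DPs via a design matrix. The design is
# "uncoupled" if the matrix is diagonal, "decoupled" if triangular,
# and "coupled" otherwise. All names here are illustrative.

def classify_design(matrix):
    """Classify an FR-to-DP design matrix (list of rows of 0/1)."""
    n = len(matrix)
    diagonal = all(
        matrix[i][j] == (1 if i == j else 0)
        for i in range(n) for j in range(n)
    )
    if diagonal:
        return "uncoupled"
    lower = all(matrix[i][j] == 0 for i in range(n) for j in range(i + 1, n))
    upper = all(matrix[i][j] == 0 for i in range(n) for j in range(i))
    if lower or upper:
        return "decoupled"
    return "coupled"

# Example: two FRs, each satisfied by its own DP -> uncoupled.
print(classify_design([[1, 0], [0, 1]]))  # uncoupled
print(classify_design([[1, 0], [1, 1]]))  # decoupled
print(classify_design([[1, 1], [1, 1]]))  # coupled
```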


Author(s):  
S. J. Pavnaskar ◽  
D. Weaver ◽  
J. K. Gershenson

Lean has become a “must-use” philosophy for businesses today. Lean manufacturing focuses on the elimination of waste in manufacturing operations. Similarly, companies have started using lean engineering to eliminate waste from their engineering processes. Both lean manufacturing and lean engineering yield dramatic improvements in quality, cost, and delivery. However, the philosophy of lean (manufacturing and engineering) revolves around the continuous improvement of existing processes. Costs associated with continuous improvement can be significantly reduced by incorporating “lean” considerations when designing a product, process, or manufacturing system. This is known as design for lean manufacturing (DfLM). DfLM guides the design of a product, process, or manufacturing system to enable lean operations once in production, just as design for assembly (DFA) guides the design of a product to allow easier assembly during production. Currently, there are no guidelines that help a product or process designer consider lean operations during design. Note that the word “product” in this paper must be interpreted in a general sense and not as a “widget”: the “product” of a manufacturing engineering process is a complete manufacturing system. In this paper, we consider manufacturing system design and propose a novel set of structured DfLM guidelines for designing a manufacturing system. These guidelines will be a valuable resource for manufacturing engineers designing manufacturing systems for new products, enabling lean operations once the system is in production. DfLM guidelines for system design will also help plant engineers and rapid continuous improvement managers assess existing manufacturing systems and identify and prioritize improvement efforts. The proposed DfLM guidelines are then validated for accuracy, completeness, and redundancy by using them to evaluate an existing benchmark manufacturing system.
The initial DfLM guidelines show promise for designing manufacturing systems that are easy to manage, flexible, and safe; that build quality into the products; and that optimize material flow, fully utilize all resources, maximize throughput, and continuously produce what the customer wants just in time. Similar guidelines can be proposed for product and process design to further enhance the efficiency of operations and reduce the overhead of continuous improvement efforts.
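A hypothetical sketch of how such guidelines might be applied in practice: scoring an existing system against weighted DfLM guidelines and ranking the largest gaps to prioritize improvement efforts. The guideline names, weights, and scores below are invented for illustration and do not come from the paper.

```python
# Illustrative only (not the paper's method): weight each guideline,
# score how well an existing system meets it, and rank the gaps.

GUIDELINES = {  # guideline -> hypothetical importance weight
    "optimize material flow": 3,
    "build quality into the product": 3,
    "maximize throughput": 2,
    "keep the system flexible": 2,
    "keep the system easy to manage": 1,
}

def prioritize(scores):
    """Rank guidelines by improvement potential = weight * (1 - score),
    where score in [0, 1] says how well the system meets the guideline."""
    gaps = {g: w * (1.0 - scores.get(g, 0.0)) for g, w in GUIDELINES.items()}
    return sorted(gaps, key=gaps.get, reverse=True)

ranked = prioritize({
    "optimize material flow": 0.9,
    "build quality into the product": 0.4,
    "maximize throughput": 0.5,
    "keep the system flexible": 0.8,
    "keep the system easy to manage": 0.7,
})
print(ranked[0])  # build quality into the product
```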


2022 ◽  
Author(s):  
Song Guo ◽  
Zhihao Qu

Discover this multi-disciplinary and insightful work, which integrates machine learning, edge computing, and big data. Presents the basics of training machine learning models, key challenges and issues, as well as comprehensive techniques including edge learning algorithms, and system design issues. Describes architectures, frameworks, and key technologies for learning performance, security, and privacy, as well as incentive issues in training/inference at the network edge. Intended to stimulate fruitful discussions, inspire further research ideas, and inform readers from both academia and industry backgrounds. Essential reading for experienced researchers and developers, or for those who are just entering the field.
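One of the edge learning algorithms a book in this area typically covers is federated averaging (FedAvg), in which edge nodes train locally and a server aggregates their model weights in proportion to local data size. A minimal pure-Python sketch of the aggregation step, illustrative rather than taken from the book:

```python
# Minimal FedAvg aggregation sketch: the server averages per-node
# weight vectors, weighted by how much data each edge node holds.

def fed_avg(local_weights, data_sizes):
    """Weighted average of per-node weight vectors (lists of floats)."""
    total = sum(data_sizes)
    dim = len(local_weights[0])
    return [
        sum(w[i] * n for w, n in zip(local_weights, data_sizes)) / total
        for i in range(dim)
    ]

# Three edge nodes; the third holds twice as much data as the others.
global_w = fed_avg(
    [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
    [10, 10, 20],
)
print(global_w)  # [3.5, 4.5]
```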


2015 ◽  
Vol 115 (9) ◽  
pp. 1666-1682 ◽  
Author(s):  
Kun Chen ◽  
Xin Li ◽  
Huaiqing Wang

Purpose – Although big data analytics has reaped great business rewards, big data system design and integration still face challenges arising from a demanding environment characterized by variety, uncertainty, and complexity. These characteristics demand flexible and agile integration architectures, and a formal model is needed to support design and verification. The purpose of this paper is to resolve these two problems with a collective intelligence (CI) model.
Design/methodology/approach – In the conceptual CI framework proposed by Schut (2010), a CI design comprises a general model, which has a formal form for verification and validation, and a specific model, which is an implementable system architecture. After analyzing the requirements of system integration in big data environments, the authors apply the CI framework to the integration problem. In the model instantiation, the authors use the multi-agent paradigm as the specific model and the hierarchical colored Petri net (PN) as the general model.
Findings – First, the multi-agent paradigm is a good implementation for reusing and integrating big data analytics modules in an agile, loosely coupled way. Second, the PN models provide effective simulation results during the system design period, informing business process design and workload balance control. Third, the CI framework provides an incrementally built and deployed method for system integration, making it especially suitable for dynamic data analytics environments. These findings have both theoretical and managerial implications.
Originality/value – The authors propose a CI framework, comprising both practical architectures and theoretical foundations, to solve the system integration problem in big data environments. It provides a new point of view for dynamically integrating large-scale modules in an organization. The paper also offers practical suggestions for chief technical officers who want to employ big data technologies in their companies.
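As a toy illustration of the general-model idea, the sketch below simulates a plain (uncolored) Petri net representing a small analytics workflow at design time, firing transitions whose input places hold enough tokens. The paper itself uses hierarchical colored Petri nets; the place and transition names here are hypothetical.

```python
# Toy Petri net simulation: places hold token counts; a transition
# fires when its pre-set is satisfied, consuming and producing tokens.

def enabled(marking, pre):
    """A transition is enabled if every input place has enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume tokens from input places, produce tokens in output places."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical workflow: raw data -> analyze -> features -> report -> done.
transitions = {
    "analyze": ({"raw": 1}, {"features": 1}),
    "report":  ({"features": 1}, {"done": 1}),
}

marking = {"raw": 1}
for name, (pre, post) in transitions.items():
    if enabled(marking, pre):
        marking = fire(marking, pre, post)

print(marking)  # {'raw': 0, 'features': 0, 'done': 1}
```

Even at this scale, the simulation answers a design-time question: whether the workflow can reach its final marking from the initial one.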


2019 ◽  
Vol 54 (5) ◽  
pp. 20
Author(s):  
Dheeraj Kumar Pradhan

2020 ◽  
Vol 49 (5) ◽  
pp. 11-17
Author(s):  
Thomas Wrona ◽  
Pauline Reinecke

Big Data & Analytics (BDA) has become a largely unquestioned institution for corporate efficiency and competitive advantage. Too many prominent examples, such as the success of Google or Amazon, seem to confirm the importance of data and algorithms for achieving long-term competitive advantage. Both practice and academia appear almost euphoric in jumping on the “data train.” When risks are discussed at all, they are usually ethical questions. What is often overlooked is that the advantages under discussion arise primarily from an operational-efficiency perspective. Strategic effects are discussed at most with respect to business model innovations, whose actual degree of innovation remains to be assessed. In what follows, we show that while BDA can indeed generate competitive advantages, it also carries major strategic risks that currently receive little attention.


2019 ◽  
Vol 7 (2) ◽  
pp. 273-277
Author(s):  
Ajay Kumar Bharti ◽  
Neha Verma ◽  
Deepak Kumar Verma

2017 ◽  
Vol 49 (004) ◽  
pp. 825-830
Author(s):  
A. AHMED ◽  
R.U. AMIN ◽  
M. R. ANJUM ◽  
I. ULLAH ◽  
I. S. BAJWA
