PDQ (Product Data Quality): Representation of Data Quality for Product Data and Specifically for Shape Data

Author(s):  
Yoshihito Kikuchi ◽  
Hiroyuki Hiraoka ◽  
Akihiko Otaka ◽  
Fumiki Tanaka ◽  
Kazuya G. Kobayashi ◽  
...  

In the communication and sharing of product data, a significant difference between the required data quality and the quality actually embodied in the data causes various problems. A creator of low-quality data often does not realize it unless it harms his or her own job. In most cases, low-quality data passed to subsequent processes, such as manufacturing, cause problems because the data are not appropriate from the standpoint of machining precision or detailed shape modeling. In these cases, the data must be reworked or repaired before the target process can begin, resulting in significant economic loss and delay of product development. Today’s product model data are dumb data because design intent and the incorporated data quality are not explicitly represented. Receiving systems cannot tell whether the data they are passed possess sufficient quality for the target job. Another problem is that engineers in later processes, such as manufacturing, cannot issue data-quality requests in advance in a commonly agreed manner. These problems are caused by the lack of a commonly agreed representation of product data quality (PDQ) information. Our proposed solution is designed to enable the communication and sharing of data quality information. This paper reports the development of a PDQ standard (ISO 10303-59), which is a resource part of ISO 10303, the Standard for the Exchange of Product Model Data (STEP) (2008, “ISO 10303-59, Industrial Automation Systems and Integration. Product Data Representation and Exchange. Part 59: Integrated Generic Resource: Quality of Product Shape Data,” International Organization for Standardization, Geneva). The objective of ISO 10303-59 is to establish a PDQ model and to enable the use of PDQ data independently or in combination with product data.
The developed PDQ information model represents concepts such as data quality criteria, measurement requirements, and measured results. Based on the PDQ model, the PDQ for shape data model, which is a specialization of the PDQ model to 3D shape data quality, is also developed.
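The three concepts the PDQ information model names (criteria, measurement requirements, measured results) can be pictured with a minimal sketch. The class and attribute names below are illustrative assumptions, not the actual ISO 10303-59 schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of the concepts the PDQ model names: a quality
# criterion, a measurement requirement (an acceptance threshold), and a
# measured result reported against that requirement.

@dataclass
class QualityCriterion:
    name: str                    # e.g. "gap between adjacent faces"
    unit: str                    # e.g. "mm"

@dataclass
class MeasurementRequirement:
    criterion: QualityCriterion
    threshold: float             # largest acceptable measured value

@dataclass
class MeasuredResult:
    requirement: MeasurementRequirement
    value: float

    def conforms(self) -> bool:
        """True when the measured value satisfies the requirement."""
        return self.value <= self.requirement.threshold

gap = QualityCriterion("gap between adjacent faces", "mm")
req = MeasurementRequirement(gap, threshold=0.005)
result = MeasuredResult(req, value=0.002)
print(result.conforms())  # True: the shape data meets this criterion
```

A receiving system holding such records could check conformance mechanically instead of discovering quality defects during machining.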

Author(s):  
Yoshihito Kikuchi ◽  
Hiroyuki Hiraoka ◽  
Akihiko Ohtaka ◽  
Fumiki Tanaka ◽  
Kazuya G. Kobayashi ◽  
...  

Problems of inappropriate data caused by mismatched product data quality are always possible in data communication and sharing. The data are evaluated as valid within the original system and their initial usage, but problems break out during the reuse phase of the same data. Such delayed problems require data repair or rework in the subsequent manufacturing process and result in significant economic loss. The cause is that current product data are a dumb model whose quality is not guaranteed: the receiving system cannot identify whether the data have sufficient quality for reuse. Furthermore, manufacturers cannot give advance notice of their requests for data quality. This paper proposes the establishment of PDQ (product data quality) information for the communication of data quality that can be used independently or in combination with product data. The PDQ information model represents quality measurements according to given criteria and the measured portion of product data. Based on the PDQ model, a PDQ-S (PDQ for shape data) model has been developed. For PDQ-S, industrial guidelines on shape data communication are systematically categorized as the baseline of the guarantee. The PDQ model is intended to be independent of and neutral to the calculation algorithm of the measurement. As the development method, the STEP (ISO 10303: Standard for the Exchange of Product Model Data) architecture was adopted and the PDQ model was developed as a resource of the STEP standards. We show an example of the combination of shape data and the associated quality-guaranteeing data.


Author(s):  
László Horváth ◽  
Imre J. Rudas

This paper presents a novel methodology for modeling manufacturing processes of mechanical parts. The aim was to develop a manufacturing process model that describes all possible process variants in a single model and involves a generic process description for a cluster of manufacturing tasks. It must fit into the product model concept. A four-level generic manufacturing process model has been developed using Petri net representation for the model entities. Advanced shape models do not describe the intent of the designer or other information necessary for the application of the model. As a contribution to solving this problem, we propose a methodology for attaching designer intent information and knowledge to geometric and form feature models. This improves communication between the product designer and the production engineer. First, the importance of the manufacturing process model and its interconnections with other product-related models are emphasized. Then, the structure, entities, creation, evaluation, and application of the manufacturing process model are explained. Next, product and production process modeling procedures are analyzed from the point of view of the design intent information to be transferred between product designers and manufacturing engineers. Finally, characteristics of communication between engineers and modeling of human intent are outlined.


2002 ◽  
Vol 2 (2) ◽  
pp. 132-135 ◽  
Author(s):  
Allison Barnard Feeney

The first Technical Note in this series [1] introduced the international standard ISO 10303, informally known as STEP (STandard for the Exchange of Product model data). Subsequent Technical Notes discussed various issues faced by users of STEP and how the ISO TC184/SC4 committee is addressing them. This paper presents the current move to modularize the STEP application protocol architecture. It describes the initial STEP architecture, the requirements for improvements to that architecture, the features of the new modular architecture, and its status and open issues.


1994 ◽  
Vol 10 (01) ◽  
pp. 24-30
Author(s):  
James Murphy

The use of computer-aided design (CAD) technology in the U.S. Navy and marine industry has evolved from a drafting-based design tool to a three-dimensional (3D) product-oriented information base, used for design, production and service life support. One of the most significant enhancements to current CAD technology has been the incorporation or integration of non-graphic attribute information with traditional graphics data. This expanded information base or product model has enabled the marine industry to expand CAD use to include such activities as engineering analysis, production control, and logistics support. While significant savings can be achieved through the exchange of digital product model data between different agents, current graphics-based CAD data exchange standards do not support this expanded information content. The Navy/Industry Digital Data Exchange Standards Committee (NIDDESC) was formed as a cooperative effort of the Naval Sea Systems Command (NAVSEA) and the National Shipbuilding Research Program to develop an industry consensus on product data and to ensure these industry requirements are incorporated into national and international data exchange standards. The NIDDESC effort has resulted in the development of a suite of product model specifications or application protocols (APs) defining marine industry product model data. These APs have been submitted for inclusion into the next generation of data exchange standards.


2006 ◽  
Vol 2006.16 (0) ◽  
pp. 291-292
Author(s):  
Hiroyuki HIRAOKA ◽  
Fumiki TANAKA ◽  
Kazuya G. KOBAYASHI ◽  
Atsuto SOMA

Author(s):  
Soonjo Kwon ◽  
Laetitia Monnier ◽  
Raphael Barbau ◽  
William Bernstein

Abstract Barbau et al. (2012) proposed OntoSTEP that translates the STandard for the Exchange of Product Model Data (STEP) schema and its instances to an ontology and knowledge graphs represented in the Web Ontology Language (OWL). OntoSTEP models can be integrated with any OWL models to enrich their semantics. However, the current implementation has several limitations, mainly in (1) supporting the latest ISO 10303 schemas and (2) generating various representation types depending on the purpose of use. We present an improved implementation of OntoSTEP to overcome these limitations. In this paper, we demonstrate that the new implementation can successfully translate STEP schemas and instances in a faster and more flexible way, thus furthering the adoption of the full capabilities of ISO 10303. By encoding STEP entities in OWL, we facilitate integration with other standards through knowledge graphs.
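The kind of mapping OntoSTEP performs can be suggested with a toy sketch: each STEP entity instance becomes an individual typed by a class derived from its EXPRESS entity name, and each attribute becomes a triple. The function, prefixes, and the tiny instance below are illustrative assumptions, not the real tool's output format:

```python
# Hypothetical sketch of translating one STEP instance (as found in an
# ISO 10303-21 file) into subject-predicate-object triples of the sort
# that populate an OWL knowledge graph. Prefixes are illustrative only.

def step_instance_to_triples(inst_id, entity_name, attrs):
    """Translate one STEP instance into a list of RDF-style triples."""
    subject = f"inst:{inst_id}"
    triples = [(subject, "rdf:type", f"step:{entity_name}")]
    for attr, value in attrs.items():
        triples.append((subject, f"step:{attr}", repr(value)))
    return triples

# A toy cartesian point instance (say, #10 in a STEP Part 21 file):
triples = step_instance_to_triples(
    "10", "CartesianPoint", {"name": "", "coordinates": [0.0, 1.0, 2.0]}
)
for t in triples:
    print(t)
```

Once entities are expressed as triples, they can be merged with triples from any other OWL model, which is the integration benefit the abstract describes.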


2005 ◽  
Vol 21 (03) ◽  
pp. 160-169
Author(s):  
T. Briggs ◽  
B. Gischner ◽  
P. Lazo ◽  
A. Royal ◽  
...  

Successful and efficient exchange of product model data has been a major challenge in the shipbuilding industry for the past two decades. The Standard for the Exchange of Product Model Data (STEP) has been developed to enable this capability. Four STEP application protocols (APs) to facilitate the exchange of structural and distributed systems models in shipbuilding were completed in 2003 and were adopted by the International Organization for Standardization (ISO) by mid-2004. In August 2003, ISO 10303-216: Ship Moulded Forms (AP216) became the first shipbuilding STEP AP to be published as an international standard. Participants involved in these efforts represent several major US shipyards, the Navy, and their computer-aided design/engineering (CAD/CAE) vendors. The thrust of shipbuilding data exchange efforts has now shifted from development to implementation. This paper will report on efforts to develop and use translators for this AP to exchange hull form product data in the ship modeling and simulation arena. In addition, process simulation is becoming common in the design of new ships to validate that the design meets the customer's specifications. Current technology requires that the ship be modeled first in the computer-aided design (CAD) environment and then again in the simulation workbench. Not only is this effort inefficient, but it is inherently error prone. Through the National Shipbuilding Research Program (NSRP)-sponsored Integrated Shipbuilding Environment (ISE) projects, we have developed tool sets that use AP227: Plant Spatial Configuration to permit the design to flow smoothly from the CAD workbench to the simulation workbench. This paper summarizes the efforts to develop and use a suite of tools that enables US shipyards to become more productive. It details the specific successes in using AP216 and AP227 for modeling and simulation, as well as efforts to exchange design data electronically between CAD systems. 
The report also outlines efforts that are underway to use other APs to successfully exchange data describing ship electrical; heating, ventilation, and air-conditioning (HVAC); and controls systems.


2019 ◽  
Vol 9 (23) ◽  
pp. 5054
Author(s):  
Jang ◽  
Lee ◽  
Kim ◽  
Gim

In the era of the Fourth Industrial Revolution, companies are focusing on securing artificial intelligence (AI) technology to enhance their competitiveness via machine learning, the core technology of AI, which allows computers to acquire high-quality data through self-learning. Securing good-quality big data is becoming a very important asset for companies seeking to enhance their competitiveness. The volume of digital information is expected to grow rapidly around the world, reaching 90 zettabytes (ZB) by 2020. It is very meaningful to present a value quality index for each data attribute, as this lets a user evaluate whether the data are suitable for use from the user’s point of view. As a result, the user can decide whether to accept the data based on the data quality index. In this study, we propose a quality index calculation model for structured and unstructured data, as well as a calculation method for the attribute value quality index (AVQI) and the structured data value quality index (SDVQI).
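The abstract does not give the AVQI formula, so the following is only a hypothetical illustration of the general idea: score an attribute's values against simple checks (here completeness and validity) and report the passing fraction as the index. The function name and the example data are assumptions:

```python
# Hypothetical attribute-level quality index: the fraction of values
# that are present (not missing) and pass a caller-supplied validity
# check. This is an illustration, not the paper's actual AVQI method.

def attribute_value_quality_index(values, is_valid):
    """Return the fraction of values that are present and valid."""
    if not values:
        return 0.0
    good = sum(1 for v in values if v is not None and is_valid(v))
    return good / len(values)

ages = [34, None, 28, -5, 51]  # one missing value, one invalid value
avqi = attribute_value_quality_index(ages, lambda v: 0 <= v <= 120)
print(round(avqi, 2))  # 0.6: three of five values are usable
```

A per-attribute score of this kind is what would let a prospective data consumer decide, attribute by attribute, whether a dataset is fit for their use.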


Author(s):  
Allison Barnard Feeney ◽  
Simon P. Frechette ◽  
Vijay Srinivasan

The International Organization for Standardization (ISO) has just completed a major effort on a new standard ISO 10303-242 titled “Managed Model Based 3D Engineering.” It belongs to a family of standards called STEP (STandard for the Exchange of Product model data). ISO 10303-242 is also called the STEP Application Protocol 242 (STEP AP 242, for short). The intent of STEP AP 242 is to support a manufacturing enterprise with a range of standardized information models that flow through a long and wide “digital thread” that makes the manufacturing systems in the enterprise smart. One such standardized information model is that of tolerances specified on a product’s geometry so that the product can be manufactured according to the specifications. This paper describes the attributes of smart manufacturing systems, the capabilities of STEP AP 242 in handling tolerance information associated with product geometry, and how these capabilities enable the manufacturing systems to be smart.
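The idea of tolerance information riding along with product geometry can be pictured with a simplified sketch. The classes and field names below are illustrative assumptions for exposition only, not the actual STEP AP 242 information model:

```python
from dataclasses import dataclass

# Hypothetical, simplified picture of a tolerance specification attached
# to a geometric feature, of the kind AP 242 standardizes. Not the real
# AP 242 schema; names are illustrative.

@dataclass
class GeometricTolerance:
    tolerance_type: str   # e.g. "flatness", "position"
    value: float          # size of the tolerance zone
    unit: str             # e.g. "mm"

@dataclass
class ToleratedFeature:
    feature_id: str                  # identifies a face or feature in the model
    tolerance: GeometricTolerance

    def within(self, measured_deviation: float) -> bool:
        """Check a measured deviation against the specified tolerance."""
        return measured_deviation <= self.tolerance.value

hole = ToleratedFeature("face_42", GeometricTolerance("position", 0.1, "mm"))
print(hole.within(0.08))  # True: the as-built deviation is acceptable
```

When such specifications travel in the same standardized model as the geometry, downstream manufacturing and inspection systems can consume them without manual re-entry, which is the "digital thread" benefit the abstract describes.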


High-quality data are the precondition for analyzing and using big data and for guaranteeing the value of the data. At present, comprehensive analysis and research on quality standards and quality assessment methods for big data are lacking. First, this paper summarizes reviews of data quality research. Second, it analyzes the data characteristics of the big data environment, presents the quality challenges faced by big data, and formulates a hierarchical data quality framework from the perspective of data users. This framework comprises big data quality dimensions, quality characteristics, and quality indexes. Finally, on the basis of this framework, the paper constructs a dynamic assessment process for data quality. This process has good expansibility and adaptability and can meet the needs of big data quality assessment. Several studies have shown that maintaining the quality of data is often recognized as problematic, yet at the same time is considered essential to effective decision making in building resource management. Big data sources are extremely wide and data structures are complex, so the data obtained may have quality problems such as data errors, missing data, inconsistencies, and noise. The purpose of data cleaning (data scrubbing) is to detect and remove errors and inconsistencies from data in order to enhance their quality. Data cleaning can be divided into four patterns based on implementation methods and scope: manual execution, writing of special-purpose application programs, data cleaning unrelated to any specific application field, and handling the problems of one specific application field. Of these four approaches, the third has great practical value and can be applied effectively.
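The rule-based detection step that such data cleaning relies on can be sketched generically. The rule set, field names, and records below are illustrative assumptions, not drawn from the paper:

```python
# A small, generic sketch of rule-based data cleaning of the kind the
# text describes: detect missing values and invalid values in records,
# keep the clean records, and report the rejected ones with reasons.

def clean_records(records, rules):
    """Split records into those passing every rule and those failing, with reasons."""
    kept, rejected = [], []
    for rec in records:
        failures = [name for name, rule in rules.items() if not rule(rec)]
        if failures:
            rejected.append((rec, failures))
        else:
            kept.append(rec)
    return kept, rejected

rules = {
    "id present": lambda r: r.get("id") is not None,
    "age in range": lambda r: r.get("age") is not None and 0 <= r["age"] <= 120,
}
records = [{"id": 1, "age": 30}, {"id": None, "age": 40}, {"id": 3, "age": 200}]
good, bad = clean_records(records, rules)
print(len(good), len(bad))  # 1 2
```

Writing the rules independently of any one dataset corresponds to the third, application-field-independent pattern that the text singles out as having the greatest practical value.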

