A method for measuring the width of welding seam based on machine vision

Author(s): Wei Jin, Haibin Sun

2021, Vol 63 (9), pp. 547-553
Author(s): Jing Ye, Guisuo Xia, Fang Liu, Ping Fu, Qiangqiang Cheng

This study proposes a weld defect inspection method that combines machine vision and weak-magnetic technology to inspect the quality of weld formation comprehensively. In accordance with the principle of laser triangulation, surface information about the weldment is obtained; the weld area is extracted using the mutation characteristics of the weld edge; and an algorithm is proposed for identifying defects with abnormal average height on the weld surface. A welding seam inspection system is then developed and implemented, composed of a camera, a structured-light sensor, a magnetic sensor and a motion control system. Inspection results from an austenitic stainless steel weldment show that the method combining machine vision and magnetism can locate defects accurately. Comprehensive analysis of the test results can effectively classify surface and internal defects, estimate the equivalent sizes of defects and evaluate the quality of weld formation in multiple dimensions.
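The abnormal-average-height check described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the function name, the moving-average window, and the threshold `k` are all assumptions, and a 1-D height profile stands in for the structured-light surface data.

```python
import numpy as np

def flag_abnormal_height(profile, window=15, k=2.0):
    """Flag positions along the weld whose local mean height deviates
    from the seam's overall mean by more than k standard deviations.
    `profile` is a 1-D array of surface heights (an illustrative
    stand-in for laser-triangulation measurements)."""
    # Moving average gives the local mean height at each position.
    kernel = np.ones(window) / window
    local_mean = np.convolve(profile, kernel, mode="same")
    mu, sigma = profile.mean(), profile.std()
    # A defect candidate is any position with an abnormal local mean.
    return np.abs(local_mean - mu) > k * sigma
```

In practice the window size and `k` would be tuned to the weld geometry and sensor noise; the paper's method additionally uses the mutation characteristics of the weld edge to restrict the search to the weld area.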


Author(s): Yanbiao Zou, Jinchao Li, Xiangzhi Chen

Purpose – This paper aims to propose a six-axis robot-arm welding seam tracking experimental platform, based on the Halcon machine vision library, to resolve the curved-seam tracking issue.
Design/methodology/approach – The robot-based and image coordinate systems are converted using the mathematical model of three-dimensional structured-light measurement and the conversion relations between the robot-based and camera coordinate systems. An object tracking algorithm based on weighted local cosine similarity is adopted to detect the seam feature points and to effectively suppress interference from arc light and spatter. The algorithm models the target state variable and the corresponding observation vector within a Bayesian framework and finds the optimal region, i.e. the one with the highest cosine similarity to the image-selected template.
Findings – Experimental results show that, using metal inert-gas (MIG) welding at a maximum welding current of 200 A, the platform achieves real-time, accurate curved-seam tracking under strong arc light and spatter. The minimal distance between the laser stripe and the weld pool reaches 15 mm, and the sensor sampling frequency reaches 50 Hz.
Originality/value – A six-axis robot-arm welding seam tracking platform with a structured-light sensor system is designed on the basis of the Halcon machine vision library, and an object tracking algorithm is added to the seam tracking system to detect image feature points. With this technology, the system can track a curved seam while welding.
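The core of the cosine-similarity tracker can be sketched as a template search over a local neighbourhood. This is a simplified, unweighted illustration under stated assumptions, not the paper's full Bayesian formulation: the function names, the search radius, and the plain (unweighted) cosine score are all assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two patches, flattened to vectors."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def track_feature(frame, template, center, search=10):
    """Search a (2*search+1)^2 neighbourhood around `center` for the
    patch most similar to `template`; return its top-left corner and
    the best similarity score."""
    h, w = template.shape
    cy, cx = center
    best, best_pos = -1.0, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = cy + dy, cx + dx
            patch = frame[y:y + h, x:x + w]
            if patch.shape != template.shape:  # skip out-of-bounds crops
                continue
            s = cosine_similarity(template, patch)
            if s > best:
                best, best_pos = s, (y, x)
    return best_pos, best
```

The cosine score is robust to multiplicative brightness changes, which is one reason similarity-based matching tolerates arc-light flicker better than raw intensity differencing; the paper's weighted variant further down-weights pixels corrupted by spatter.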


Author(s): Wesley E. Snyder, Hairong Qi

2018, pp. 143-149
Author(s): Ruijie CHENG

To further improve the energy efficiency of classroom lighting, a classroom lighting energy-saving control system based on machine vision technology is proposed. First, in line with the characteristics of machine vision design technology, a quantum image storage model algorithm is proposed and analysed using a Back Propagation (BP) neural network; on this basis, a multi-feedback model for energy-saving control of classroom lighting is constructed. Finally, the algorithm and the lighting model are simulated. The test results show that the design optimizes the classroom lighting control system: different numbers of signals can comprehensively control the brightness of the classroom lights, reducing the waste of lighting resources and achieving energy saving and emission reduction. The technology is worth popularizing further in practice.
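The multi-feedback idea — dimming lights according to both detected occupancy and available daylight — can be illustrated with a toy control rule. This is purely a sketch of the feedback concept; the function, its inputs, and the linear shortfall rule are assumptions and have nothing to do with the paper's quantum image storage or BP network details.

```python
def dimming_level(occupied_zones, total_zones, daylight_lux, target_lux=300):
    """Toy feedback rule: scale artificial light to cover the shortfall
    between a target illuminance and measured daylight, weighted by the
    fraction of classroom zones that vision detects as occupied.
    Returns a dimming level in [0, 1]."""
    occupancy = occupied_zones / total_zones if total_zones else 0.0
    shortfall = max(target_lux - daylight_lux, 0) / target_lux
    return round(min(1.0, occupancy * shortfall), 2)
```

An empty room or a fully daylit one drives the level to zero, which is the resource-waste reduction the abstract describes.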


1997, Vol 117 (10), pp. 1339-1344
Author(s): Katsuhiko Sakaue, Hiroyasu Koshimizu

2005, Vol 125 (11), pp. 692-695
Author(s): Kazunori UMEDA, Yoshimitsu AOKI
Fast track article for IS&T International Symposium on Electronic Imaging 2020: Stereoscopic Displays and Applications proceedings.


2020, Vol 64 (5), pp. 50411-1-50411-8
Author(s): Hoda Aghaei, Brian Funt

Abstract: For research in the field of illumination estimation and color constancy, there is a need for ground-truth measurement of the illumination color at many locations within multi-illuminant scenes. A practical approach to obtaining such ground-truth illumination data is presented here. The proposed method involves using a drone to carry a gray ball of known percent surface spectral reflectance throughout a scene while photographing it frequently during the flight using a calibrated camera. The captured images are then post-processed. In the post-processing step, machine vision techniques are used to detect the gray ball within each frame. The camera RGB of light reflected from the gray ball provides a measure of the illumination color at that location. In total, the dataset contains 30 scenes with 100 illumination measurements on average per scene. The dataset is available for download free of charge.
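Once the gray ball is detected in a frame, turning its pixel values into an illuminant measurement is straightforward. The sketch below shows the idea under simple assumptions (a linear camera response and a hypothetical 50% reflectance); the function name and interface are illustrative, not the authors' pipeline.

```python
import numpy as np

def illuminant_from_gray_ball(rgb_pixels, reflectance=0.5):
    """Estimate the illuminant colour from RGB pixels sampled on a gray
    ball of known surface reflectance: average the pixels, divide out
    the reflectance, and normalise to an rgb chromaticity summing to 1.
    Assumes a linear (calibrated) camera response."""
    mean_rgb = np.asarray(rgb_pixels, dtype=float).reshape(-1, 3).mean(axis=0)
    illum = mean_rgb / reflectance   # undo the ball's gray reflectance
    return illum / illum.sum()       # chromaticity is scale-invariant
```

Because chromaticity is normalised, the known reflectance only matters if absolute illuminant intensity is also wanted; for color-constancy ground truth, the normalised ratio is the quantity of interest.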


Author(s): Sunita Nadella, Lloyd A. Herman

Video traffic data were collected in 24 combinations of four different camera position parameters. A machine vision processor was used to detect vehicle speeds and volumes from the videotapes, and the machine vision results were then compared with the actual vehicle volumes and speeds to give the percentage error in each case. The results of the study provide a procedure, with specific reference points, for establishing camera position parameters, helping machine vision users select suitable camera positions and develop appropriate measurement error expectations. The camera position parameters most likely to produce the least overall volume and speed errors, for the specific site, field setup and parameter ranges used in this study, were a low camera height of approximately 7.6 m (25 ft), an upstream orientation (traffic moving toward the camera), a 50-mm (mid-angle) focal length, and a 15° vertical angle.
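The study's comparison — percentage errors per camera configuration, combined across volume and speed — can be sketched in a few lines. The helper names and the simple sum-of-absolute-errors ranking are assumptions for illustration, not the study's exact scoring.

```python
def percent_error(measured, actual):
    """Percentage error of a machine-vision measurement vs. ground truth."""
    return 100.0 * (measured - actual) / actual

def rank_configurations(results):
    """Rank camera configurations by combined absolute volume and speed
    error. `results` maps a config name to a tuple:
    (measured_volume, actual_volume, measured_speed, actual_speed)."""
    scores = {
        name: abs(percent_error(mv, av)) + abs(percent_error(ms, a_s))
        for name, (mv, av, ms, a_s) in results.items()
    }
    return sorted(scores, key=scores.get)  # best (lowest error) first
```

With 24 configurations, a ranking like this makes it easy to see which height/orientation/focal-length/angle combination minimises overall error, as the study reports for the low, upstream, mid-angle setup.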

