
Research on welding seam tracking algorithm for automatic welding process of X-shaped tip of concrete piles using laser distance sensor

Published online by Cambridge University Press:  18 April 2024

Cao Tri Huynh
Affiliation:
Faculty of Mechanical Engineering, Ho Chi Minh University of Technology (HCMUT), Ho Chi Minh City, Vietnam Vietnam National University Ho Chi Minh City, Ho Chi Minh City, Vietnam
Tri Cong Phung*
Affiliation:
Faculty of Mechanical Engineering, Ho Chi Minh University of Technology (HCMUT), Ho Chi Minh City, Vietnam Vietnam National University Ho Chi Minh City, Ho Chi Minh City, Vietnam
*
Corresponding author: Tri Cong Phung; Email: [email protected]

Abstract

The manufacturing of the X-shaped tip of prestressed centrifugal concrete piles is currently semi-automatic, combining manual workers and an automatic welding robot. To make this welding process fully automatic, a welding seam tracking algorithm is considered. Many types of sensors can be used to detect the welding seam, such as vision sensors, laser vision sensors, arc sensors, or touch sensors. Each type of sensor has its advantages and disadvantages. In this paper, an algorithm for welding seam tracking using a laser distance sensor is proposed. Firstly, the fundamental mathematical theory of the algorithm is presented. Next, the positioning table system that supports the procedure is designed and manufactured. The object of this research is the fillet joint because of the characteristics of the X-shaped tip of the concrete piles. This paper proposes a new method to determine the welding trajectory of the tip using a laser distance sensor. After that, experimental results are obtained to verify the proposed idea. Finally, an improvement to the algorithm is considered to increase its accuracy.

Type
Research Article
Copyright
© The Author(s), 2024. Published by Cambridge University Press

1. Introduction

Prestressed centrifugal concrete piles are manufactured by modern technology compared with traditional concrete piles. This type of concrete pile is widely used in building construction. Thus, increasing the productivity and quality of the piles is very important. To do this, an automatic process for manufacturing the piles is necessary. In this paper, the X-shaped pile tip is considered in the automatic welding procedure. Gas metal arc welding (GMAW) is the welding method used in this research. An industrial manipulator is used to move the welding gun to perform the welding process.

One big problem for the automatic welding process is how to determine the trajectory of the welding seam. Because the dimensions and the shape of the X-shaped pile tip are not the same for all welding parts, the same position and orientation values cannot be used for all parts. In the manual welding process, the welder can compensate for these errors. In the automatic welding process, sensors are used to determine the trajectory of the welding seam.

There are many types of sensors that can be used to detect welding seam paths. Each type of sensor has its own advantages and disadvantages. Rout et al. surveyed the types of sensors used for detecting the welding seam, such as the arc sensor, vision sensor, laser vision sensor, ultrasonic sensor, electromagnetic sensor, infrared sensor, tactile sensor, and touch sensor [Reference Rout, Deepak and Biswal1]. Mahajan et al. presented a novel approach for seam tracking using an ultrasonic sensor [Reference Mahajan and Figueroa2]. An ultrasonic seam tracking system was developed for robotic welding which tracks a seam that curves freely on a two-dimensional surface. Among these, three types of sensors are used most often for detecting the welding seam in GMAW because of their high precision: the arc sensor, the vision sensor, and the laser vision sensor.

The arc sensor is typically designed for the metal arc welding process and relies on a key arc property: the arc voltage and current change according to the torch height, which comprises the arc length and the wire extension [Reference Rout, Deepak and Biswal1]. Ushio et al. developed a non-linear model to describe the relationship between the output (welding current and voltage) and input (torch height) of a through-the-arc sensor (arc sensor) for DC MIG/MAG welding in open arc mode [Reference Ushio and Mao3]. Jiluan described a mathematical model of the static and dynamic properties of the sensor derived from control theory and experimental methods [Reference Jiluan4]. Moon et al. introduced a new piece of automatic welding equipment used in the dual tandem welding process for pipeline construction [Reference Moon, Ko and Kim5]. The arc sensor was also developed for a narrow welding groove to achieve higher seam tracking accuracy and fully automatic operation. Fridenfalk et al. presented the design and validation of a novel and universal 6D seam tracking system that reduces the need for accurate robot trajectory programming and geometrical databases in robotic arc welding [Reference Fridenfalk and Bolmsjö6]. However, the arc sensor has some disadvantages: seam tracking is performed only in real time and detects only deviations in the torch position.

A vision sensor can be used to recognize and locate the position of welding seams, which can then be used to plan a path to weld the parts automatically [Reference Rout, Deepak and Biswal1]. Ali et al. introduced a new supervised learning technique for programming a 4-degree of freedom (DOF) welding arm robot with an automatic feeding electrode [Reference Ali and Atia7]. Xu et al. proposed a visual control system to locate the start welding position and track the narrow butt-welding seam in container manufacturing [Reference Xu, Fang, Chen, Yan and Tan8]. Nele et al. presented an automatic seam tracking system, where the automatic tracking of the welding path and torch positioning is performed by a newly developed image acquisition system [Reference Nele, Sarno and Keshari9]. Xu et al. presented a method for real-time image capturing and processing for the application of robotic seam tracking [Reference Xu, Fang, Chen, Zou and Ye10]. By analyzing the characteristics of robotic GMAW, the real-time weld images are captured clearly by the passive vision sensor. Xu et al. introduced research on the application of computer vision technology for real-time seam tracking in robotic gas tungsten arc welding and GMAW [Reference Xu, Fang, Lv, Chen and Zou11]. The key aspect in using vision techniques to track welding seams is to acquire clear real-time weld images and to process them accurately. Dinham et al. introduced an autonomous robotic arc welding system that can detect realistic weld joints and calculate their position in the robot workspace with minimal human interaction [Reference Dinham and Fang12]. The proposed method is capable of detecting and localizing butt and fillet weld joints regardless of base material, surface finish, or imperfections. Lin et al. proposed a hybrid CNN and adaptive ROI operation algorithm for intelligent seam tracking of an ultranarrow gap during K-TIG welding [Reference Lin, Shi, Wang, Li and Chen13]. Fei et al. suggested using machine vision methods to analyze various welding processes, in order to understand how much machine vision technology can improve the efficiency of the welding industry [Reference Fei, Tan and Yuan14]. Liang et al. proposed a weld track identification method based on illumination correction and center point extraction to extract welds with different shapes and non-uniform illumination [Reference Liang, Wu, Hu, Bu, Liang, Feng and Ma15]. Sawano et al. introduced a new robot system which was designed to seal the seams of car body panels using a solid-state camera [Reference Sawano, Ikeda, Utsumi, Ohtani, Kikuchi, Ito and Kiba16].

The vision sensor using only a CCD camera for seam tracking becomes very complicated, as the algorithms proposed for extracting the weld seam position from the two-dimensional image plane are not very accurate and are time-consuming. In addition, the measurement of weld groove depth, weld gap, and detailed shape condition is quite difficult. Therefore, many researchers have investigated laser vision sensors with structured light emitters, that is, laser diodes, as a solution to the above problems [Reference Rout, Deepak and Biswal1]. He et al. presented a method of autonomously detecting weld seam profiles from the molten pool background in metal active gas (MAG) arc welding using a novel model of saliency-based visual attention and a vision sensor based on structured light [Reference He, Chen, Xu, Huang and Chen17]. Gu et al. proposed an automatic welding tracking system of an arc welding robot for multi-pass welding [Reference Gu, Xiong and Wan18]. The developed system includes an image acquisition module, an image processing module, a tracking control unit, and their software interfaces. Huang et al. introduced a laser welding experimental platform for narrow and burred seam welding (with seam width less than 0.1 mm) of complex three-dimensional (3D) surfaces based on an eight-axis machine tool [Reference Huang, Xiao, Wang and Li19]. Wu et al. presented a method to remove the noise due to the complexity of the welding environment using noise filters on the welding seam image captured from the CCD camera [Reference Wu, Lee, Park, Park and Kim20]. He et al. presented a scheme for extracting feature points of the weld seam profile to implement automatic multi-pass route planning and guidance of the initial welding position in each layer during MAG arc welding using a vision sensor based on structured light [Reference He, Xu, Chen, Chen and Chen21]. Ma et al.
proposed an efficient and robust complex weld seam feature point extraction method based on a deep neural network (Shuffle-YOLO) for seam tracking and posture adjustment [Reference Ma, Fan, Yang, Wang, Xing, Jing and Tan22]. The Shuffle-YOLO model can accurately extract the feature points of butt joints, lap joints, and irregular joints, and the model can also work well despite strong arc radiation and spatters. Ma et al. suggested a fast and robust seam tracking method for spatial circular welding based on laser visual sensors [Reference Ma, Fan, Yang, Yang, Ji, Jing and Tan23]. Fan et al. presented a seam feature point acquisition method based on efficient convolution operator and particle filter, which could be applied to different weld types and could achieve fast and accurate seam feature point acquisition even under the interference of welding arc light and spatter noises [Reference Fan, Deng, Ma, Zhou, Jing and Tan24]. Fan et al. proposed an initial point alignment and seam tracking system for narrow weld [Reference Fan, Deng, Jing, Zhou, Yang, Long and Tan25]. Changyong et al. developed a laser welding seam tracking sensing technology based on swing mirrors [Reference Changyong, Xuhao, Tie and Yi26]. Xiao et al. suggested a laser stripe feature point tracker (LSFP tracker) based on Siamese network for robotic welding seam tracking [Reference Xiao, Xu, Xu, Hou, Zhang and Chen27]. Xu et al. proposed an automatic weld seam tracking method based on laser vision and machine learning [Reference Xu28]. Chen et al. presented an economical 3D measurement sensor based on structured light technique [Reference Chen, Okeke and Zhang29]. Tian et al. designed and implemented a machine vision recognition algorithm system for feature points on the surface of oil pipeline welds [Reference Tian, Zhou, Yin and Zhang30]. Huissoon presented the design of a calibration system with which these frames may be precisely defined with respect to each other [Reference Huissoon31]. 
This calibration can be difficult to perform since the sensor and laser frames are virtual, in the sense that they are defined in space relative to the physical hardware, and the wrist frame of the robot is often not physically accessible. Liu et al. designed a structured light stereo vision system based on a line laser sensor to realize real-time collection of images of weld lines [Reference Liu and Wang32]. Zou et al. proposed a two-stage seam tracking method named the Heatmap method, combining welding image inpainting and feature point locating from a heatmap [Reference Zou, Wei, Chen, Zhu and Zhou33]. Yang et al. presented a novel image-denoising method of seam images for automatic laser stripe extraction to serve intelligent robot welding applications, such as seam tracking, seam type detection, weld bead detection, etc. [Reference Yang, Fan, Huo, Li and Liu34]. Dong et al. designed and developed a seam tracking system for human-computer interactive mobile robots [Reference Dong, Qin, Fu, Xu, Wang, Wu and Wang35]. The spatial coordinates were obtained by camera calibration, line laser calibration, and hand-eye calibration. Ma et al. developed a hybrid vision sensor that integrates monocular vision, laser structured light vision, and coded structured light vision simultaneously [Reference Ma, Fan, Zhou, Zhao, Jing and Tan36].

In almost all the above papers, the butt joint and the V-groove joint are considered. In this research, the fillet welding joint is considered because of the shape of the X-shaped tip. There are few studies concentrating on the fillet joint. Chen et al. presented a method to recognize whole seams in a complex robotic welding environment [Reference Chen, Chen and Lin37]. The authors used the welding parameters to judge the degree of image noise, and then filter windows of different sizes were selected to preprocess the image. Dinham et al. presented a method for the automatic identification and location of welding seams for robotic welding using computer vision [Reference Dinham and Fang38]. The methods developed in the paper enable the robust identification of narrow weld seams for ferrous materials, combined with reliable image matching and triangulation using 2D homography. Quan et al. proposed a visible and intelligent method to monitor the fillet welding of corrugated sheets and the associated equipment [Reference Quan and Bi39]. The results show that the authors can improve on traditional welding methods that cannot meet the high-quality and efficient welding requirements of corrugated sheets and other large workpieces. Takubo et al. proposed a method for welding line detection using point cloud data to automate welding operations combined with a contact sensor [Reference Takubo, Miyake, Ueno and Kubo40]. The proposed system targets a fillet weld, in which the joint line between two metal plates attached vertically is welded. Dinham presented an autonomous weld joint detection and localization method using computer vision in robotic arc welding, including fillet joints [Reference Dinham41]. Zeng et al. proposed a weld joint type identification method for visual sensors based on image features and SVM [Reference Zeng, Cao, Peng and Huang42]. Fridenfalk et al. designed and validated a sensor-guided control system for robot welding in shipbuilding [Reference Fridenfalk and Bolmsjo43]. Huang et al. proposed a Gaussian-weighted PCA-based laser center line extraction method to identify feature points in fillet joints [Reference Huang, Xu, Gao, Wei, Zhang and Li44]. Le et al. proposed a method for rectangular fillet weld tracking based on rotating arc sensors [Reference Le, Zhang and Chen45]. Cibicik et al. presented a novel procedure for robotic scanning of weld grooves in large tubular T-joints [Reference Cibicik, Njaastad, Tingelstad and Egeland46]. Fang et al. proposed vision-based modeling and control for fillet weld seam tracking [Reference Fang and Xu47].

This paper suggests a new algorithm and a new method to detect the welding seam position of a fillet joint using a laser distance sensor. Laser distance sensors are cheaper and simpler than laser vision sensors or vision sensors alone. To the authors' knowledge, no previous research has used laser distance sensors to track the weld seam trajectory of a fillet joint. The first section presents the introduction of the paper. Section 2 describes the proposed solution. The third section shows the proposed system configuration. Section 4 presents the seam tracking algorithm. The experimental results are shown in section 5.

2. Problem statement and proposed solution

This paper proposes two contributions to the automatic welding process of the X-shaped tip of the prestressed centrifugal concrete pile. The first is the design and manufacture of the positioning table for the automatic welding process of the concrete pile tip. The second is the automatic weld seam tracking algorithm for the X-shaped tip using a laser distance sensor.

2.1. Problem statement

The X-shaped tip of the prestressed centrifugal concrete piles is shown in Fig. 1. This tip is made from 4 parts: one square base part, one large triangle part, and 2 small triangle parts. In this research, the dimensions of the square base part are 150 × 150 × 6 (mm), those of the large triangle part are 212 × 66 × 6 (mm), and those of the small triangle parts are 103 × 64 × 6 (mm). The material of the tip is carbon steel.

Figure 1. The X-shaped tip of the prestressed centrifugal concrete piles.

To manufacture the X-shaped concrete tip, the welding paths include 8 horizontal lines in the 2F position and 4 vertical lines in the 3F position. The manufacturing of the X-shaped tip of prestressed centrifugal concrete piles is currently semi-automatic, combining manual workers and an automatic welding robot. The horizontal lines are welded automatically by an industrial robot using touch sensors. After that, the vertical lines are welded manually. Thus, the quality of the product is not good and is not consistent across products.

2.2. Proposed solution

In this research, the authors propose a fully automatic process for manufacturing the X-shaped tip. To solve this problem, a positioning table is used to support the welding process of the industrial manipulator. All necessary welding lines can be divided into 4 zones: zone I, zone II, zone III, and zone IV. In each zone, there are 2 horizontal lines and 1 vertical line. Fig. 1 also shows the zones and the numbering of the welding lines. The position of each welding zone can be reached by rotating the positioning table 90 degrees. Fig. 2 shows the idea of using the industrial robot and the positioning table to automatically weld the X-shaped tip.

Figure 2. The automatic welding of the X-shaped concrete tip.

The use of motion coordination between the robot and the positioning table helps expand the working area of the robot and gives the programmer more options in programming the welding trajectories. Welding positions farther from the base of the robot, for example positions located near the limits of the robot's workspace, are more difficult to program because fewer robot configurations satisfy the welding requirements. Using the positioning table system moves the welding lines into a working area convenient for robot programming. The proposed positioning table is shown in Fig. 3.

Figure 3. The proposed positioning table for the automatic welding process of the concrete pile tip.

In this research, the laser distance sensor was chosen because it has several advantages compared to the laser stripe sensor or laser vision sensor. Although the laser vision sensor is the most popular sensor used in welding seam tracking today, it still has several weak points. The cost of a laser vision sensor system is high, and the control algorithm for such a system is complicated. This sensor is often attached to the robot to track the welding seam of a static part. Another problem of the laser vision sensor is that it is not applicable to narrow welding seams: the deformation of the laser stripe is unobvious at a narrow weld of 0.2 mm width. The X-shaped tip, the object of this research, has only narrow welding seams. The proposed solution using a laser distance sensor can overcome this challenge. This kind of sensor is cheap and simple to use. By adding a rotational motion, this sensor can detect the welding seam easily. The big advantage is that this method can detect narrow welding seams.

Fig. 3 shows more details of the proposed positioning table of the X-shaped tip. The idea of the proposed solution is to combine the motion of the welding part and the sensor system. This motion is separated from the robot to reduce the effect of vibration compared to attaching the sensor system to the robot. Each laser distance sensor has two motions: one translational motion and one rotational motion. The rotational motion lets the sensor detect the welding point at one position, and the translational motion lets the sensor detect the whole welding line. In this system, two laser distance sensors are used, one for detecting the horizontal welding line and one for detecting the vertical welding line. Combined with the rotational motion of the welding part, the proposed system can detect all the necessary welding lines of the X-shaped tip. Using only a laser vision sensor attached to the robot cannot track all these welding lines because of the limited movement of the robot.

2.3. Mathematical fundamental of proposed method

The second contribution of this paper is a new weld seam tracking algorithm using a laser distance sensor. The object of this research is the fillet welding joint. A laser distance sensor combined with an angle sensor is used to determine the distance between the workpiece and the laser sensor. In one vertical plane, the laser sensor is controlled to scan over a desired angle. This angle is divided into many positions at which the distance from the sensor to the workpiece is measured. The main idea of this method is that the largest distance corresponds to a point on the intersection line between the vertical plane and the horizontal plane. This point also lies on the welding line. In Fig. 4, the hypotenuse OA is larger than the legs OB and OC because OAC and OAB are right triangles. With a limited rotation angle from OC to OB, OA is the maximum distance from the sensor to the workpiece, and A is a point on the welding line. The laser distance sensor is mounted on a translation axis; at each translation step, the largest distance is measured and a point on the welding line is detected. Using this method, both the horizontal and the vertical welding lines can be determined using laser distance sensors.

Figure 4. Idea of the method using laser distance sensors.

Fig. 5 shows the coordinate frame attached to the initial position of the sensor system. In this coordinate frame, O is the center of the sensor at the initial position, the axis OX is along the moving axis of the sensor, the axis OY is perpendicular to the vertical plane, and the axis OZ points downward, perpendicular to the horizontal plane.

Figure 5. The coordinate frame is attached to the initial position of the sensor system.

In the proposed algorithm, L is the distance between the sensor and the object, α is the initial angle between the sensor and OY, and θ is the small angle of each rotation step of the sensor. If β is the rotation angle of the sensor and n is the number of rotation steps, β can be calculated using equation (1).

(1) \begin{equation}\beta =n\theta\end{equation}

Assume that Lmax is the maximum distance, obtained when the sensor has rotated k times; the corresponding rotation angle is (α + kθ). Let Δx be the movement of the sensor along the X-axis per step. The coordinates of the welding point at the i-th movement are calculated by equations (2a), (2b), and (2c).

(2a) \begin{equation}X=i\Delta x\end{equation}
(2b) \begin{equation}Y=L_{max}\cos\left(\alpha +k\theta \right)\end{equation}
(2c) \begin{equation}Z=L_{max}\sin\left(\alpha +k\theta \right)\end{equation}
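The scan-and-project computation of equations (1)-(2c) can be sketched in a few lines of code. This is a minimal illustration under the paper's geometry, not the authors' implementation; the function name and the list-of-readings interface are assumptions.

```python
import math

def detect_weld_point(distances, alpha, theta, i, dx):
    """Locate one weld point from a single angular scan.

    distances: range readings L, one per rotation step
    alpha:     initial sensor angle from OY (radians)
    theta:     angular increment per rotation step (radians)
    i:         index of the current translation step along OX
    dx:        translation per step along OX (mm)
    """
    # The largest reading corresponds to the corner point A on the seam.
    k = max(range(len(distances)), key=lambda j: distances[j])
    L_max = distances[k]
    # Equations (2a)-(2c): project L_max into the sensor frame.
    x = i * dx
    y = L_max * math.cos(alpha + k * theta)
    z = L_max * math.sin(alpha + k * theta)
    return x, y, z
```

The caller sweeps the sensor through its angular range, collects the readings, and passes them in; the returned point is expressed in the sensor frame of Fig. 5.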

3. The proposed system configuration

3.1. The hardware system

Fig. 6 shows an overview of the hardware configuration of the automatic welding system for the X-shaped concrete tip. This system has three main components: the welding positioning table, the sensor system with the controller of the positioning table, and the welding robot.

Figure 6. The system hardware configuration of the experiment.

The welding robot used in this research is the MOTOMAN UP6, a 6-DOF welding robot from Yaskawa. This robot has a horizontal reach of 1373 mm, a vertical reach of 1673 mm, a repeatability of 0.08 mm, and a load capacity of 6 kg. The controller of the robot is the MOTOMAN XRC controller.

Fig. 7 shows the welding positioning table, which provides one rotational movement and two translational movements. The table uses a cam indexer to rotate in steps of 90 degrees. A stepper motor combined with a belt drive is used to rotate the X-shaped tip into the working area of the sensors. Each laser sensor is translated by a ball screw. Two ball screws are used, one for detecting the horizontal welding line and one for detecting the vertical welding line.

Figure 7. The real welding positioning table and its components.

3.2. The laser distance sensor calibration and electrical diagram system

In this research, two laser distance sensors are used to detect the horizontal welding line and the vertical welding line. The measurement is based on the time-of-flight principle. Each sensor is attached to a motor that controls the rotation angle of the sensor. The laser distance sensor used is the TW10S-UART, and its parameters are shown in Table I.

Table I. The parameters of the laser distance sensor TW10S-UART.

Because the coordinates of the welding point depend on the zero position of the sensor, it is very important to determine the zero position of the laser distance sensor. To remove the errors from manufacturing and assembling the system, a zero-position calibration is performed. The sensor is rotated one degree at a time, and the distance between the sensor and the object is measured at each step. The zero position of the sensor corresponds to the angle at which the measured distance is minimal. Fig. 8 shows the calibration process of the sensor that determines the horizontal welding line.

Figure 8. Calibration process of the sensor determining the horizontal welding line.
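The calibration sweep described above can be sketched as follows. The `read_distance` and `rotate_to` callbacks are hypothetical placeholders for the sensor driver and the motor command; the sweep range is an assumption.

```python
def find_zero_position(read_distance, rotate_to, sweep_deg=90):
    """Zero-position calibration sketch.

    read_distance(): returns the current range reading (hypothetical driver call)
    rotate_to(deg):  moves the sensor to an absolute angle (hypothetical)

    The zero position is the angle at which the distance to the reference
    surface is minimal, i.e. the beam is perpendicular to the surface.
    """
    best_angle, best_dist = 0, float("inf")
    for deg in range(sweep_deg + 1):   # step one degree at a time
        rotate_to(deg)
        d = read_distance()
        if d < best_dist:
            best_angle, best_dist = deg, d
    return best_angle
```

In practice the returned angle would be stored as the sensor's zero offset and subtracted from all subsequent angle commands.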

Fig. 9 shows the electrical wiring diagram of the system. Two supply voltages are used. The 24 V supply powers 3 stepper motor drives and 3 proximity sensors. One motor rotates the welding part, and the two other motors translate the laser distance sensors along the horizontal and vertical axes. The 5 V supply powers a microcontroller (MCU), 2 RC motors, and 2 laser distance sensors. The MCU receives the signals from the proximity sensors and the laser distance sensors. The signal from the laser distance sensor is transmitted to the personal computer for calculating the welding trajectory. The MCU also controls the stepper motors and RC motors.

Figure 9. The electrical diagram of the system.

3.3. The graphical user interface of the robot

The data of the welding trajectory are measured by the laser distance sensors and the MCU. The position of the welding path is expressed in the coordinate frame of the laser sensor; thus, it is necessary to determine the transformation matrix between the sensor coordinate frame and the robot coordinate frame. The MCU transmits the signal to the personal computer via the RS232 protocol, and the data are then transmitted from the personal computer to the robot controller, also via RS232. Fig. 10 shows the graphical user interface of the welding robot on the personal computer. Using this interface, the position data of the welding trajectory can be transmitted to the controller of the robot.

Figure 10. The graphical user interface of the welding robot.
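Before transmission over RS232, the trajectory points must be serialized into a byte stream. The frame layout below ("X,Y,Z;" per point) is a purely hypothetical example for illustration; the paper does not specify the actual message format of the XRC controller link.

```python
def format_trajectory_frame(points):
    """Pack welding points into an ASCII frame for RS232 transfer.

    points: iterable of (x, y, z) coordinates in mm.
    The "X,Y,Z;..." layout and the two-decimal precision are
    assumptions, not the protocol actually used by the authors.
    """
    return ";".join(f"{x:.2f},{y:.2f},{z:.2f}" for x, y, z in points) + "\n"
```

With a library such as pySerial, the resulting string could then be sent with something like `serial.Serial(port, baudrate).write(frame.encode())`, where the port and baud rate depend on the controller configuration.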

4. Automatic weld seam tracking algorithms

4.1. Proposed detecting weld seam algorithms using laser distance sensor

Fig. 11 shows the frame attached to the welding positioning table. The origin O is the center of the circular disk. OX is parallel to the axis of the ball screw that translates the laser sensor detecting the horizontal welding line. OY lies in the horizontal plane, with its positive direction pointing from the center outward. OZ is parallel to the axis of the ball screw that translates the sensor detecting the vertical welding line, pointing upward.

Figure 11. The attached frame of the welding positioning table.

Figure 12. The main program algorithm.

Fig. 12 shows the main program algorithm for detecting the automatic welding paths of the X-shaped tip. At the beginning, the horizontal and vertical ball screws are moved to their HOME positions, and the reference position of the rotational table is detected. From these, the start position of the horizontal sensor (X0n, Y0n, Z0n) and the start position of the vertical sensor (X0d, Y0d, Z0d) are determined. After that, the horizontal welding line and the vertical welding line are determined by the sensors. The X-shaped concrete tip has 4 parts; thus, the positioning table must be rotated 3 times to detect all the welding parts of the tip. The position data of the tip are transmitted to the control computer before being transferred to the robot controller.
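The main program flow can be summarized as the sketch below. All five callbacks are hypothetical placeholders for the homing, detection, table rotation, and data transfer routines; only the control flow (four zones, three table rotations) is taken from the algorithm described above.

```python
def main_scan_cycle(home_axes, detect_horizontal, detect_vertical,
                    rotate_table_90, send_to_robot):
    """Control-flow sketch of the main program (all callbacks hypothetical).

    Each zone contains two horizontal lines and one vertical line; the
    table is rotated 90 degrees three times to cover all four zones.
    """
    home_axes()                     # home both ball screws, find table zero
    trajectory = []
    for zone in range(4):
        trajectory += detect_horizontal()   # points on the horizontal lines
        trajectory += detect_vertical()     # points on the vertical line
        if zone < 3:                        # only 3 rotations are needed
            rotate_table_90()
    send_to_robot(trajectory)               # transfer via the PC link
    return trajectory
```

The detection callbacks would return lists of (x, y, z) points so that the accumulated trajectory can be handed to the robot controller in one transfer.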

Figure 13. The procedure for detecting horizontal welding paths.

Fig. 13 shows the procedure for detecting the horizontal welding paths. The necessary welding paths include two horizontal lines, and the proposed algorithm detects 13 points lying on these two lines. The distance between these points along the X-axis is 7.5 mm. To detect one point of the desired welding path, the distance L between the sensor and the welding part is measured 24 times, corresponding to 24 rotation angles starting from the initial angle α0n. Among the 24 values of the distance L, the maximum value is Lmax, corresponding to rotation number k. The rotation angle of each step is θ. The coordinates of the detected welding point can be calculated using equations (3a), (3b), and (3c).

(3a) \begin{equation}X_{n}=X_{0n}+i\Delta x\end{equation}
(3b) \begin{equation}Y_{n}=Y_{0n}+L_{max}\cos\left(\alpha _{0n}+k\theta \right)\end{equation}
(3c) \begin{equation}Z_{n}=Z_{0n}+L_{max}\sin\left(\alpha _{0n}+k\theta \right)\end{equation}
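The whole horizontal-line procedure, i.e. 13 translation steps of 7.5 mm with 24 angular measurements each and equations (3a)-(3c) applied at every step, can be sketched as follows. The `measure(i, j)` callback is a hypothetical stand-in for the actual distance acquisition at translation step i and angle step j.

```python
import math

def scan_horizontal_line(measure, start, alpha0, theta,
                         n_points=13, n_angles=24, dx=7.5):
    """Sketch of the horizontal-line detection procedure.

    measure(i, j): hypothetical callback returning the distance L at
                   translation step i and rotation step j
    start:         sensor start position (X0n, Y0n, Z0n)
    alpha0:        initial angle of the sensor (radians)
    theta:         rotation angle of each step (radians)
    """
    X0, Y0, Z0 = start
    points = []
    for i in range(n_points):
        readings = [measure(i, j) for j in range(n_angles)]
        k = max(range(n_angles), key=lambda j: readings[j])
        L = readings[k]
        # Equations (3a)-(3c)
        points.append((X0 + i * dx,
                       Y0 + L * math.cos(alpha0 + k * theta),
                       Z0 + L * math.sin(alpha0 + k * theta)))
    return points
```

The vertical-line procedure of Fig. 14 would follow the same pattern with the vertical sensor's start position and translation axis.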

Fig. 14 shows the procedure for detecting the vertical welding paths. The process is similar to the procedure for detecting the horizontal welding paths.

Figure 14. The procedure for detecting vertical welding paths.

4.2. The improved algorithms

The accuracy of the above algorithm depends on the rotation angle of each rotation step. The smaller the rotation angle, the more accurate the welding coordinates. However, the step angle cannot be made arbitrarily small because it is limited by the resolution of the motor. In a plane perpendicular to the translational axis of the laser sensor, there exists a step i such that point i and point (i + 1) lie on the two sides of the welding line, as shown in Fig. 15. In this case, Li is detected as the maximum distance from the sensor to the welding part; however, there is an error between the detected point and the desired welding point. Thus, a large rotation angle at each step gives a large error in the welding point coordinates.

Figure 15. The explanation about the error of the proposed algorithm.

This section presents an improved algorithm that can reduce the error of the detected welding point. In this algorithm, the coordinates of each measured point are collected. In total, there are n measured points, from O1 to On. These points are divided into two groups. The first group includes the points lying on the vertical plane, numbered from O1 to Oi. The second group includes the points lying on the horizontal plane, numbered from Oi + 1 to On. From each group, one intersection line is interpolated, and the intersection point of these two lines is the desired welding point. Fig. 16 shows the real welding point (red), the detected welding point (black), and the improved welding point (blue). The welding point obtained by the improved algorithm (blue) has a smaller error than the detected welding point (black).

Figure 16. The explanation about the improved algorithm.

In the improved algorithm, the X coordinate is fixed within one scanning plane of the laser, so only the Y and Z coordinates vary and each fitted line can be treated as a 2D line relating them. The least squares method is used to fit the line in the vertical plane and the line in the horizontal plane; both lines lie in the same plane, the scanning plane of the laser sensor. Assume the equations of these lines are given in equations (4a) and (4b).

(4a) \begin{equation}y_{1}=a_{1}z_{1}+b_{1}\end{equation}
(4b) \begin{equation}y_{2}=a_{2}z_{2}+b_{2}\end{equation}

The intersection point of the two fitted lines can be calculated using equations (5a) and (5b). The position of the welding point is then obtained from Yi and Zi.

(5a) \begin{equation}Z_{i}=\frac{b_{2}-b_{1}}{a_{1}-a_{2}}\end{equation}
(5b) \begin{equation}Y_{i}=a_{1}\frac{b_{2}-b_{1}}{a_{1}-a_{2}}+b_{1}\end{equation}

Finally, the coordinates of the welding point using improved algorithm are as follows:

(6a) \begin{equation}X_{n}=X_{0n}+i x\end{equation}
(6b) \begin{equation}Y_{n}=Y_{0n}+Y_{i}\end{equation}
(6c) \begin{equation}Z_{n}=Z_{0n}+Z_{i}\end{equation}
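The fitting-and-intersection step of the improved algorithm can be sketched in Python as below. The grouping of points and the function names are illustrative; the paper does not prescribe an implementation.

```python
def fit_line(points):
    """Least-squares fit of y = a*z + b to a list of (z, y) pairs."""
    n = len(points)
    mz = sum(z for z, _ in points) / n
    my = sum(y for _, y in points) / n
    a = sum((z - mz) * (y - my) for z, y in points) / \
        sum((z - mz) ** 2 for z, _ in points)
    return a, my - a * mz

def weld_point(vertical_pts, horizontal_pts):
    """Intersection of the two fitted lines, Eqs. (5a)-(5b)."""
    a1, b1 = fit_line(vertical_pts)    # points O1..Oi
    a2, b2 = fit_line(horizontal_pts)  # points Oi+1..On
    z_i = (b2 - b1) / (a1 - a2)        # Eq. (5a)
    y_i = a1 * z_i + b1                # Eq. (5b)
    return y_i, z_i
```

With noise-free points sampled from, say, y = 2z + 1 and y = -0.5z + 6, the recovered corner is (Y, Z) = (5, 2); with noisy points, the least-squares fit averages the noise out over each group.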

5. Experimental results

This section presents the experimental results used to verify the proposed algorithm in Section 4. The hardware system used for the experiment is shown in Fig. 6. Figs. 17, 18, and 19 show the experimental process of detecting the welding part of the X-shaped concrete pile; in these figures, only one-fourth of the tip is detected. Figs. 17 and 18 show the detection of the horizontal welding paths, and Fig. 19 shows the detection of the vertical welding path.

Figure 17. The procedure of detecting the 1st horizontal welding path.

Figure 18. The procedure of detecting the 2nd horizontal welding path.

Figure 19. The procedure of detecting the vertical welding path.

Next, four types of welding path are compared: the real welding path, the detected welding path, the filtered welding path, and the welding path from the improved algorithm. Figs. 20, 21, 22, and 23 show the four welding paths for the four zones, respectively. The real welding path, shown in red, is determined by the robot with high precision. The detected welding path, shown in green, is obtained by processing the data from the sensor system. To remove sensor noise, a mean filter is applied to the detected path, giving the path shown in black. Finally, the welding paths obtained from the improved algorithm are shown in blue.
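The mean filter used here is a simple moving average over each coordinate of the detected path. A minimal sketch, in which the window size and the edge handling are our assumptions:

```python
def mean_filter(values, window=3):
    """Centered moving-average filter; near the edges, only the
    available neighbors are averaged."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out
```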

Figure 20. The comparison of the real path and detected path of zone 1.

Figure 21. The comparison of the real path and detected path of zone 2.

Figure 22. The comparison of the real path and detected path of zone 3.

Figure 23. The comparison of the real path and detected path of zone 4.

Figure 24. The tracking 1st horizontal welding path using the robot.

It can be concluded that the welding path detected using the laser distance sensors tracks the real welding paths of the X-shaped concrete tip. However, the error of the detected path is about 6 mm, which is too large for GMAW welding applications. This error is mainly caused by noise in the measurement signal. The error of the vertical welding path is larger than that of the horizontal path; one reason is that the laser sensor detecting the vertical welding line is mounted farther from the welding part than the sensor detecting the horizontal welding path. In the mean filter method, a mean filter is added to the controller to remove the noise signal, reducing the error to about 4 mm. The final method uses the improved algorithm described above to find the intersection point of the two fitted lines, giving an error of about 2 mm. This error is acceptable for GMAW welding applications.

Several previous studies using laser vision sensors report seam tracking errors of about 0.5 mm [39, 46, 47]. Other studies using vision sensors report errors of about 1 to 2.5 mm [40, 41]. A rotating arc sensor has also been used for seam tracking, with an error of about 1 mm [45]. Methods using laser vision sensors or vision sensors can therefore achieve smaller errors than the proposed method; however, the proposed system is cheaper than these advanced weld seam tracking methods. One source of error in the proposed method is the mechanical system: the motor that rotates the laser distance sensor is an RC servo motor, whose low resolution can cause large errors at small rotation angles. In the next step, a stepper motor will be used to reduce this type of error and improve the seam tracking accuracy of the whole system.

Figs. 24, 25, and 26 show the robot tracking the welding path in one zone after receiving the welding path from the improved algorithm. The experiment confirms that the welding gun can track the welding paths.

Figure 25. The tracking 2nd horizontal welding path using the robot.

Figure 26. The tracking vertical welding path using the robot.

6. Conclusion

In this paper, an algorithm for welding seam tracking using a laser distance sensor is proposed. The fundamental mathematics of the algorithm is presented, and the positioning table system that supports the procedure is designed and manufactured. Two algorithms for detecting the welding part are proposed: the basic method using the laser distance sensor and the improved method. The experimental results show that the accuracy of the proposed method is acceptable. In the future, welding experiments will be done to further verify the proposed method.

Acknowledgements

We acknowledge the support of time and facilities from Ho Chi Minh City University of Technology (HCMUT), VNU-HCM for this study.

Author contributions

Cao Tri Huynh and Tri Cong Phung conceived and designed the study. Cao Tri Huynh conducted data gathering. Tri Cong Phung performed statistical analyses. Tri Cong Phung wrote the article.

Financial support

This research is funded by Vietnam National University Ho Chi Minh City (VNU-HCM) under grant number B2021-20-03.

Competing interests

The authors declare that no competing interests exist.

Ethical approval

None.

References

[1] Rout, A., Deepak, B. B. V. L. and Biswal, B. B., "Advances in weld seam tracking techniques for robotic welding: A review," Rob Comp Integr Manufact 56, 12–37 (2019).
[2] Mahajan, A. and Figueroa, F., "Intelligent seam tracking using ultrasonic for robotic welding," Robotica 15(3), 275–281 (1997).
[3] Ushio, M. and Mao, W., "Modelling of an arc sensor for dc mig/mag welding in open arc mode: Study of improvement of sensitivity and reliability of arc sensors in GMA welding (1st report)," Weld Int 10(8), 622–631 (1996). doi: 10.1080/09507119609549059.
[4] Jiluan, P., "Arc sensing system for automatic weld seam tracking-mathematic model," Sci China Ser E: Tech Sci 44, 251–257 (2001).
[5] Moon, H.-S., Ko, S.-H. and Kim, J.-C., "Automatic seam tracking in pipeline welding with narrow groove," Int J Adv Manuf Technol 41(3), 234–241 (2009).
[6] Fridenfalk, M. and Bolmsjö, G., "Design and validation of a universal 6d seam-tracking system in robotic welding using arc sensing," Adv Rob 18(1), 1–21 (2004).
[7] Ali, M. H. M. and Atia, M. R., "A lead through approach for programing a welding arm robot using machine vision," Robotica 40(3), 464–474 (2022).
[8] Xu, D., Fang, Z., Chen, H., Yan, Z. and Tan, M., "Compact visual control system for aligning and tracking narrow butt seams with CO2 gas-shielded arc welding," Int J Adv Manufact Technol 62(9), 1157–1167 (2012).
[9] Nele, L., Sarno, E. and Keshari, A., "An image acquisition system for real-time seam tracking," Int J Adv Manufact Technol 69, 2099–2110 (2013).
[10] Xu, Y., Fang, G., Chen, S., Zou, J. J. and Ye, Z., "Real-time image processing for vision based weld seam tracking in robotic GMAW," Int J Adv Manufact Technol 73, 1413–1425 (2014).
[11] Xu, Y., Fang, G., Lv, N., Chen, S. and Zou, J. J., "Computer vision technology for seam tracking in robotic GTAW and GMAW," Robot Comput Integr Manuf 32, 25–36 (2015).
[12] Dinham, M. and Fang, G., "The Development of a Low Cost Autonomous Robotic Arc Welding System," In: International Conference on Robotic Welding, Intelligence and Automation (Springer, 2015) pp. 541–550. doi: 10.1007/978-3-319-18997-0_46.
[13] Lin, Z., Shi, Y., Wang, Z., Li, B. and Chen, Y., "Intelligent seam tracking of an ultranarrow gap during K-TIG welding: A hybrid CNN and adaptive ROI operation algorithm," IEEE Trans Instrum Meas 72, 1–14 (2022). doi: 10.1109/TIM.2022.3230475.
[14] Fei, X., Tan, C. and Yuan, Z., "Machine Vision Analysis of Welding Region and its Application to Seam Tracking in Arc Welding," In: 7th International Conference on Cloud Computing and Big Data Analytics (ICCCBDA) (IEEE, 2022) pp. 445–452. doi: 10.1109/ICCCBDA55098.2022.9778907.
[15] Liang, D., Wu, Y., Hu, K., Bu, J. J., Liang, D. T., Feng, Y. F. and Ma, J. Q., "Weld seam track identification for industrial robot based on illumination correction and center point extraction," J Adv Mech Des, Syst, Manufact 16(3), JAMDSM0028 (2022). doi: 10.1299/jamdsm.2022jamdsm0028.
[16] Sawano, S., Ikeda, J., Utsumi, N., Ohtani, Y., Kikuchi, A., Ito, Y. and Kiba, H., "A sealing robot system with visual seam tracking," Robotica 2(1), 41–46 (1984).
[17] He, Y., Chen, Y., Xu, Y., Huang, Y. and Chen, S., "Autonomous detection of weld seam profiles via a model of saliency-based visual attention for robotic arc welding," J Intell Robot Syst 81, 395–406 (2016).
[18] Gu, W., Xiong, Z. and Wan, W., "Autonomous seam acquisition and tracking system for multi-pass welding based on vision sensor," Int J Adv Manufact Technol 69, 451–460 (2013).
[19] Huang, Y., Xiao, Y., Wang, P. and Li, M., "A seam-tracking laser welding platform with 3d and 2d visual information fusion vision sensor system," Int J Adv Manufact Technol 67(1-4), 415–426 (2013).
[20] Wu, Q.-Q., Lee, J.-P., Park, M.-H., Park, C.-K. and Kim, I.-S., "A study on development of optimal noise filter algorithm for laser vision system in GMA welding," Procedia Eng 97, 819–827 (2014).
[21] He, Y., Xu, Y., Chen, Y., Chen, H. and Chen, S., "Weld seam profile detection and feature point extraction for multi-pass route planning based on visual attention model," Robot Comput Integr Manuf 37, 251–261 (2016).
[22] Ma, Y., Fan, J., Yang, H., Wang, H., Xing, S., Jing, F. and Tan, M., "An efficient and robust complex weld seam feature point extraction method for seam tracking and posture adjustment," IEEE Trans Ind Inform 19(11), 10704–10715 (2023).
[23] Ma, Y., Fan, J., Yang, H., Yang, L., Ji, Z., Jing, F. and Tan, M., "A fast and robust seam tracking method for spatial circular weld based on laser visual sensor," IEEE Trans Instrum Meas 70, 1–11 (2021). doi: 10.1109/TIM.2021.3106685.
[24] Fan, J., Deng, S., Ma, Y., Zhou, C., Jing, F. and Tan, M., "Seam feature point acquisition based on efficient convolution operator and particle filter in GMAW," IEEE Trans Ind Inform 17(2), 1220–1230 (2021). doi: 10.1109/TII.2020.2977121.
[25] Fan, J., Deng, S., Jing, F., Zhou, C., Yang, L., Long, T. and Tan, M., "An initial point alignment and seam-tracking system for narrow weld," IEEE Trans Ind Inform 16(2), 877–886 (2020). doi: 10.1109/TII.2019.2919658.
[26] Changyong, T., Xuhao, S., Tie, Y. and Yi, Z., "Laser Weld Seam Tracking Sensing Technology Based on Swing Mirror," In: IEEE 7th Optoelectronics Global Conference (OGC) (IEEE, 2022) pp. 278–281. doi: 10.1109/OGC55558.2022.10050920.
[27] Xiao, R., Xu, Y., Xu, F., Hou, Z., Zhang, H. and Chen, S., "LSFP-tracker: An autonomous laser stripe feature point extraction algorithm based on siamese network for robotic welding seam tracking," IEEE Trans Ind Electron 71(1), 1037–1048 (2024). doi: 10.1109/TIE.2023.3243265.
[28] Xu, W., "Research and Design of Weld Seam Tracking Algorithm Based on Machine Vision and Machine Learning," In: IEEE 5th International Conference on Power, Intelligent Computing and Systems (ICPICS) (IEEE, 2023) pp. 359–363. doi: 10.1109/ICPICS58376.2023.10235645.
[29] Chen, H., Okeke, H. and Zhang, B., "Development of an Economical 3D Sensor for Weld Seam Tracking in Robotic Welding," In: 12th International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER) (IEEE, 2022) pp. 168–173. doi: 10.1109/CYBER55403.2022.9907251.
[30] Tian, C., Zhou, C., Yin, T. and Zhang, Y., "Research on Feature Point Recognition of Laser Welding Seam Based on Machine Vision," In: 21st International Conference on Optical Communications and Networks (ICOCN) (IEEE, 2023) pp. 1–3. doi: 10.1109/ICOCN59242.2023.10236175.
[31] Huissoon, J. P., "Robotic laser welding: Seam sensor and laser focal frame registration," Robotica 20(3), 261–268 (2002).
[32] Liu, Y. and Wang, T., "Research on Structured Light Vision Seam Tracking System Based on RCNN," In: 2022 IEEE International Conference on Advances in Electrical Engineering and Computer Applications (AEECA) (IEEE, 2022) pp. 1158–1161. doi: 10.1109/AEECA55500.2022.9918843.
[33] Zou, Y., Wei, X., Chen, J., Zhu, M. and Zhou, H., "A high-accuracy and robust seam tracking system based on adversarial learning," IEEE Trans Instrum Meas 71, 1–13 (2022). doi: 10.1109/TIM.2022.3186085.
[34] Yang, L., Fan, J., Huo, B., Li, E. and Liu, Y., "Image denoising of seam images with deep learning for laser vision seam tracking," IEEE Sens J 22(6), 6098–6107 (2022). doi: 10.1109/JSEN.2022.3147489.
[35] Dong, M., Qin, X., Fu, Y., Xu, H., Wang, J., Wu, D. and Wang, Z., "A Vision Algorithm for Robot Seam Tracking Based on Laser Ranging," In: 3rd International Conference on Computer, Control and Robotics (ICCCR) (IEEE, 2023) pp. 175–179. doi: 10.1109/ICCCR56747.2023.10193955.
[36] Ma, Y., Fan, J., Zhou, Z., Zhao, S., Jing, F. and Tan, M., "Development of a hybrid vision sensor and its application in robotic welding," In: IEEE 19th International Conference on Automation Science and Engineering (CASE) (IEEE, 2023) pp. 1–8. doi: 10.1109/CASE56687.2023.10260386.
[37] Chen, X., Chen, S. and Lin, T., "Recognition of Macroscopic Seam for Complex Robotic Welding Environment," In: Robotic Welding, Intelligence and Automation, Lecture Notes in Control and Information Sciences, vol. 362 (Springer, Berlin, Heidelberg, 2007) pp. 171–178. doi: 10.1007/978-3-540-73374-4_19.
[38] Dinham, M. and Fang, G., "Autonomous weld seam identification and localisation using eye-in-hand stereo vision for robotic arc welding," Robot Comput Integr Manuf 29(5), 288–301 (2013).
[39] Quan, Y. and Bi, Q., "Tracking and monitoring of 3-dimensions of welding seam and width in fillet welding of corrugated sheet," Int J Control Autom 8(5), 337–350 (2015).
[40] Takubo, T., Miyake, E., Ueno, A. and Kubo, M., "Welding line detection using point clouds from optimal shooting position," J Rob Mechatr 35(2), 492–500 (2023).
[41] Dinham, M., Autonomous Weld Joint Detection and Localization Using Computer Vision in Robotic Arc Welding, PhD thesis (School of Computing, Engineering and Mathematics, University of Western Sydney, 2013).
[42] Zeng, J., Cao, G.-Z., Peng, Y.-P. and Huang, S.-D., "A weld joint type identification method for visual sensor based on image features and SVM," Sensors 20(2), 471 (2020). doi: 10.3390/s20020471.
[43] Fridenfalk, M. and Bolmsjö, G., "Design and Validation of a Sensor Guided Control System for Robot Welding in Shipbuilding," In: Proceedings of ICCAS (2002) pp. 457–472.
[44] Huang, Y., Xu, S., Gao, X., Wei, C., Zhang, Y. and Li, M., "Feature point identification in fillet weld joints using an improved CPDA method," Appl Sci 13(18), 10108 (2023). doi: 10.3390/app131810108.
[45] Le, J., Zhang, H. and Chen, X. Q., "Realization of rectangular fillet weld tracking based on rotating arc sensors and analysis of experimental results in gas metal arc welding," Robot Com-Int Manuf 49, 263–276 (2018).
[46] Cibicik, A., Njaastad, E. B., Tingelstad, L. and Egeland, O., "Robotic weld groove scanning for large tubular T-joints using a line laser sensor," Int J Adv Manufact Techn 120(7-8), 4525–4538 (2022). doi: 10.1007/s00170-022-08941-7.
[47] Fang, Z. and Xu, D., "Vision Based Modeling and Control for Fillet Weld Seam Tracking," In: The 5th International Conference on the Advanced Mechatronics (ICAM2010) (2010) pp. 557–562.