AHEAD OF PRINT
Journal details
Format: Journal
eISSN: 2444-8656
First published: 01 Jan 2016
Publication frequency: 2 times per year
Languages: English
Access type: Open Access

Research of neural network for weld penetration control

Published online: 31 Mar 2022
Volume & Issue: AHEAD OF PRINT
Pages: -
Received: 22 Jul 2021
Accepted: 06 Dec 2021
Abstract

A method for predicting the weld penetration status is proposed in this paper. First, an experimental system was set up and welding experiments were performed, from which groups of welding images were obtained. A composite filtering system, composed of a neutral density filter and a narrow band filter, was developed to suppress the weld arc disturbance. The images were then processed by median filtering and gray transformation. Next, a three-layer neural network was set up. The inner pool width xn, the outer pool width xw, the difference e between the inner and outer pool widths, and the ratios Rn and Rw of the inner and outer pool widths between two adjacent images were chosen as the input parameters. The penetration parameter p was chosen as the output. Groups of pool-parameter data extracted from the images were used to train the network, yielding a weld penetration prediction model. Finally, verification tests were carried out. They show that the penetration status predicted by the model fits the real condition, with an accuracy of up to 96%, which affords a new way for penetration detection.

Introduction

In some fields of manufacturing, such as the automobile and shipbuilding industries, metal parts are usually joined by welding. It is reported that about 40% of all steel structures are connected by welding [1]. The weld quality at the joint is very important for the final product and may directly affect human safety. Nowadays, weld quality control relies on following the correct procedures, monitoring the welding process and inspecting the final quality, which cannot satisfy the needs of mass production.

In practice, skilled welders watch the pool shape and adjust the torch posture and the welding speed to control the weld quality in real time. From this fact, we can infer that the weld pool contains much information, reflecting the heat-transfer situation, the process stability, the penetration condition and so on. Therefore, information about the welding process can be indirectly inferred from the formation of the weld pool, which affords a new way for weld quality control.

Weld pool information has therefore become very important in welding, and much elaborate research has been devoted to obtaining it. Many methods are applied to weld pool detection, such as vision sensing, weld arc sound sensing, molten pool oscillation, infrared detection, optical sensing and so on [2, 3, 4, 5]. The vision method has been widely used in weld pool detection, as it is contactless and gives access to a large amount of information [6]. Yongchao et al. [7] analysed the evolution of the pool surface for weld monitoring and trained a convolutional neural network model for root-pass penetration estimation. Zitouni et al. [8] applied morphology to pool image processing: the weld pool was surveyed to learn the weld properties, the heat-transfer condition of the weld assembly was inferred for the Tungsten Inert Gas (TIG) process, and a mathematical model was developed to study the effect of molten-liquid movement on the weld pool. Dukun [9] studied the weld pool centroid and proposed a method for weld deviation prediction. Lidong et al. [10] proposed a structured-light vision sensing system that captures images of the 3D formation of the pool surface; a long short-term memory (LSTM) neural network was then used for pool measurement so that automatic weld control could be realised. Nicholas et al. [11] investigated weld pool characteristics from high-resolution thermal imaging of the welding process for welding control.

Using pool images, a method for weld penetration control is investigated in this paper. First, an experimental platform was set up for TIG welding and several welding experiments were performed, from which weld pool images were acquired. The pool images were then analysed; the main factors affecting weld penetration were identified and composed into the weld penetration control parameters.

On this basis, a BP neural network was constructed. The inner pool width xn, the outer pool width xw, the difference e between the inner and outer pool widths, the ratio Rn of the inner pool widths between two adjacent images and the ratio Rw of the outer pool widths between two adjacent images, all of which affect penetration, are chosen as the input parameters. The welding penetration status p is chosen as the output parameter. Data obtained from the welding experiments are used to train the network. In this way, a model is set up that can be used for weld penetration prediction and affords a new way for weld quality control.

Hardware of experimental system

First, the experimental system for TIG welding was set up. The key components of the system are the welding robot, the working table, the visual camera and the controller, which are shown in Figure 1. The control system is used for motion control of the mechanical table and for image capture. A six-axis robot is used for welding. The mechanical table has two axes, x and y, driven by stepper motors. The welding fixture on the table clamps the weld assembly.

Fig. 1

Hardware structure of experimental system

Travel switches on each axis ensure that the work table moves within a certain range. In the experiments, one axis of the work table provides the feed motion. To reduce the strong light produced by the weld arc, an optical filter is installed in front of the visual camera, which captures images during the welding process.

Calibration of the CCD system

The CCD system of the welding platform includes the CCD visual sensor and the camera lens. The visual sensor is a SANYO VCC-6570P with a 1/3″ chip and 795 × 596 pixels. A T10Z0513CS lens from the Japanese company Computar is used with the sensor; its 1/3″ format matches the scale of the CCD sensor. The detailed lens parameters are as follows:

Size of T10Z0513CS lens

Item Numerical value

Scale 1/3″
Focal length 5~50 mm
Aperture F1.3-C
Angle of view 51.8–56°
Nearest object distance 0.8 m

Lens distortion exists in any visual sensor application, but its effect on the penetration model is very small and is therefore ignored in this paper. The calibration proceeds as follows: a ruler is first placed on the welding table, and the CCD sensor then captures its image, which is shown in Figure 2.

Fig. 2

Calibration of the visual camera

The positions of the calibration lines are shown in Figure 2. The lines are spaced 1 mm apart, and the eight intervals span 29, 32, 30, 30, 29, 31, 30 and 29 pixels. The pixel equivalent can therefore be calculated as

$$k = \frac{1 \times 8}{29 + 32 + 30 + 30 + 29 + 31 + 30 + 29} \approx 0.033\ \mathrm{mm/pixel} \tag{1}$$
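The pixel-equivalent calculation above can be checked in a couple of lines (the pixel counts are the ones read from the calibration image):

```python
# Pixel counts measured between adjacent ruler lines in the calibration image
pixel_counts = [29, 32, 30, 30, 29, 31, 30, 29]

# Eight intervals of 1 mm each, divided by the total pixel run
k = (1 * 8) / sum(pixel_counts)  # pixel equivalent in mm/pixel
print(round(k, 3))  # 0.033
```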

Software of welding system

The motion control and image capture cards in the controller communicate over a PCI bus. Open software for the welding system was developed in the C++ programming language. The software controls the motion of the welding table along its axes; it can also capture welding images, save them to the hard disk of a computer and perform some image processing operations at the same time. The software of the welding system is shown in Figure 3.

Fig. 3

Software of welding experimental system

Acquisition of pool images

Clear pool images are a key factor for penetration control. A strong arc light is present during welding; since this light obscures the weld pool, optical filters are used to reduce the disturbance. Figure 4a shows the result of the narrow band filter alone. The arc light is still very strong and the weld pool is covered by it, so information that is very important for penetration control, such as the shape of the pool and its width and length, cannot be distinguished.

Fig. 4

Results of narrow band filtering and neutral filtering

Figure 4b, in contrast, shows the result of the traditional neutral density filtering method. The contrast of the weld pool image has become very weak, the whole image is vague, details of the pool cannot be detected and part of the pool edge has been lost.

From Figure 4, it can be seen that the two filtering methods have their own merits and demerits. In order to combine the advantages of narrow band filtering with those of neutral filtering, a composite filtering system, composed of a 650 nm narrow band filter and a 7# neutral density filter, is used in this paper. The filtering result is shown in Figure 5. The composite filter reduces the arc-light disturbance more effectively, allowing clearer weld pool images to be obtained.

Fig. 5

Filtering result of composite filter

Weld pool image processing

A median filter with a 3 × 3 template is applied to reduce noise, as shown in Figure 6a. A gray transformation is then performed, which makes the contrast between the weld pool and the seam more obvious, so that clearer weld pool images are obtained. Figure 6b shows the processing result.
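The two preprocessing steps can be sketched with plain NumPy (a minimal illustration, not the authors' implementation; the gray-stretch window `lo`/`hi` is an assumed parameterisation of the gray transformation):

```python
import numpy as np

def median_filter_3x3(img):
    """3 x 3 median filter with edge replication."""
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views and take the per-pixel median
    windows = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                        for i in range(3) for j in range(3)])
    return np.median(windows, axis=0).astype(img.dtype)

def gray_stretch(img, lo=50, hi=200):
    """Linear gray transformation: stretch [lo, hi] to the full [0, 255] range."""
    out = (img.astype(float) - lo) * 255.0 / (hi - lo)
    return np.clip(out, 0, 255).astype(np.uint8)

noisy = np.full((5, 5), 100, dtype=np.uint8)
noisy[2, 2] = 255            # a single impulse-noise pixel
clean = median_filter_3x3(noisy)
print(clean[2, 2])           # 100 -- the impulse is removed
```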

Fig. 6

Processing results of median filtering and gray enhancement

Welding penetration status

The penetration condition of the steel plate cannot be detected directly, but it can be inferred from other parameters. Many researchers use the width of the weld seam at the back side to determine the penetration condition [12, 13, 14, 15]. In this paper, based on the back-side seam width, the penetration levels are divided into unfused, fused and over-fused status, as shown in Table 2.

Weld penetration status

Penetration status: Unfused | Fused | Over fused
Width of weld seam at the back: <1.82 mm | 1.82–2.64 mm | >2.64 mm
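The three-way classification above maps directly onto a small helper function (a sketch; the thresholds are those given in Table 2):

```python
def penetration_status(back_width_mm):
    """Classify weld penetration from the back-side seam width (mm), per Table 2."""
    if back_width_mm < 1.82:
        return "unfused"
    elif back_width_mm < 2.64:
        return "fused"
    return "over fused"

print(penetration_status(2.1))  # fused
```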
Set up of weld penetration model
Acquisition of welding data

Using the experimental system, the welding conditions shown in Table 3 were determined, and several groups of weld pool images were acquired to train the model for weld penetration measurement.

Welding experiment conditions

Material and size (mm × mm × mm) Argon flow (L/min) Weld current (A) Weld speed (mm/s) Sample time (ms)
Q535(200 × 150 × 2) 9 80 2.50 40

At the beginning of the welding process, the torch is pointed at the seam; the feed axis then drives the working table, and the visual camera captures images in real time. About 400 welding images can be acquired in one welding run. The front side of the welding seam is shown in Figure 7, the back side in Figure 8 and the back-side seam width in Figure 9.

Fig. 7

Front side of welding seam

Fig. 8

Back side of welding seam

Fig. 9

Widths of weld seam on the back side

Set up of weld penetration model

In practical applications, the back-side seam width is usually chosen as the parameter for determining weld penetration. From the captured pool images, it can be inferred that the back-side seam width is greatly influenced by the width of the front weld pool; in other words, the penetration status can be indirectly inferred from the front-side pool width. During welding, outer and inner weld pools form because of the uneven heating under the weld arc and in its surrounding areas. The outer and inner weld pools are shown in Figure 6, and the inner pool width xn and outer pool width xw are shown in Figure 10.

Fig. 10

Widths of inner pool and outer pool

In this paper, the inner pool width xn, the outer pool width xw, the difference e between the inner and outer pool widths, the ratio Rn of the inner pool widths between two adjacent images and the ratio Rw of the outer pool widths between two adjacent images are chosen as the input vector. The weld penetration status p of the weld assembly, determined according to Table 2, is the output parameter. A neural network is constructed to build the relationship between the input vector and the output parameter.
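Given the inner and outer pool widths measured from two adjacent frames, the five-element input vector can be assembled as follows (a sketch with hypothetical width values):

```python
def build_input_vector(xn, xw, xn_prev, xw_prev):
    """Assemble the five input features from two adjacent pool images:
    inner width xn, outer width xw, width difference e,
    and the adjacent-frame ratios Rn and Rw."""
    e = xw - xn
    Rn = xn / xn_prev
    Rw = xw / xw_prev
    return [xn, xw, e, Rn, Rw]

# Hypothetical widths (mm) measured from two adjacent frames
print(build_input_vector(3.2, 5.0, 3.0, 4.8))
```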

A BP network is a multilayer feed-forward neural network [12, 13, 14, 15]. Figure 11 shows the structure of the BP network set up in this paper, which contains input, hidden and output layers. The input layer takes five parameters: the inner pool width xn, the outer pool width xw, the difference e between the inner and outer pool widths, the ratio Rn of the inner pool widths between two adjacent images and the ratio Rw of the outer pool widths between two adjacent images. The hidden layer has 20 neurons, and the output layer contains a single neuron, the welding penetration status p. The network is trained with the resilient gradient descent method. The transfer functions of the hidden and output layers are the tansig and purelin functions, shown in formulas (2) and (3), respectively [8, 9]:

$$f(x) = \frac{2}{1 + e^{-2x}} - 1 \tag{2}$$

$$f(x) = x \tag{3}$$

The seam penetration model can be expressed as follows:

$$\left\{ \begin{aligned} \boldsymbol{\mu}^{H} &= \boldsymbol{\omega}^{H} \times \mathbf{X}^{T} + \mathbf{b}^{H}, \qquad \boldsymbol{\nu}^{H} = \frac{1 - e^{-\boldsymbol{\mu}^{H}}}{1 + e^{-\boldsymbol{\mu}^{H}}} \\ \boldsymbol{\mu}^{O} &= \left[ \boldsymbol{\nu}^{H} \right]^{T} \times \boldsymbol{\omega}^{O} + b^{O}, \qquad y = \boldsymbol{\mu}^{O} \end{aligned} \right. \tag{4}$$

In Eq. (4), X is the input vector of the five parameters xn, xw, e, Rn and Rw. ω^H and b^H are the weight coefficients and thresholds of the hidden layer, while ω^O and b^O are those of the output layer. y is the output value, which is 1, 2 or 3, standing for the three penetration statuses: unfused, fused and over fused.
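Equation (4) is a single tansig hidden layer followed by a linear output; a minimal NumPy sketch of the forward pass (the weights below are random placeholders standing in for the trained values, and the input widths are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
wH, bH = rng.normal(size=(20, 5)), rng.normal(size=(20, 1))  # hidden layer (5 -> 20)
wO, bO = rng.normal(size=(20, 1)), 0.0                       # output layer (20 -> 1)

def predict(x):
    """Forward pass of Eq. (4): tansig hidden layer, purelin output."""
    x = np.asarray(x, dtype=float).reshape(5, 1)       # X^T, a 5 x 1 column
    muH = wH @ x + bH                                  # hidden pre-activation
    nuH = (1 - np.exp(-muH)) / (1 + np.exp(-muH))      # tansig activation
    return (nuH.T @ wO + bO).item()                    # linear output y

y = predict([3.2, 5.0, 1.8, 1.07, 1.04])
print(type(y).__name__)  # float
```

In use, the raw output would be rounded to the nearest of the status codes 1, 2 or 3.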

Groups of data were selected and input into the network for training. The training results are shown in Figure 12.

Fig. 11

Structure of the neural network for weld penetration prediction

Fig. 12

Training results

The trained parameters of the model are as follows:

$$\boldsymbol{\omega}^{H} = \begin{bmatrix} -0.1772 & 0.0636 & 0.3348 & -30.5430 & 21.0823 \\ 2.6741 & -2.2341 & -4.8141 & 10.1627 & -17.2458 \\ -0.0285 & -0.0080 & 0.1223 & 34.0417 & -20.7241 \\ \vdots & \vdots & \vdots & \vdots & \vdots \\ 1.1706 & -0.2809 & -1.3060 & -11.3749 & -29.1361 \\ 0.0383 & -0.0413 & -0.0559 & -12.9800 & 14.5041 \end{bmatrix} \quad \mathbf{b}^{H} = \begin{bmatrix} 6.3768 \\ 13.2016 \\ -24.7479 \\ \vdots \\ 31.9276 \\ 12.3297 \end{bmatrix} \quad \boldsymbol{\omega}^{O} = \begin{bmatrix} -0.0159 \\ 0.0068 \\ -3.0137 \\ \vdots \\ 0.0326 \\ -5.1854 \end{bmatrix} \quad b^{O} = 3.5833$$

Testing experiments

To verify its accuracy, further groups of data were input into the model; the verification result is shown in Figure 13. The average accuracy of the model is defined as follows:

$$\overline{\mathrm{accuracy}} = \frac{l}{L} \tag{5}$$

In Eq. (5), L is the total number of detected points (100 in this paper) and l is the number of points whose predicted values fit the measured values.
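Equation (5) reduces to counting the points whose predicted status matches the measured one (a sketch with hypothetical labels that mirror the reported 96/100 result):

```python
def accuracy(predicted, measured):
    """Fraction of points whose predicted status matches the measured status, Eq. (5)."""
    matches = sum(p == m for p, m in zip(predicted, measured))
    return matches / len(measured)

# 96 of 100 hypothetical points agree with the measured status
pred = [2] * 96 + [1] * 4
meas = [2] * 100
print(accuracy(pred, meas))  # 0.96
```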

From Figure 13, it can be seen that there are 4 failed points, possibly due to disturbance from the weld arc. The final accuracy is 96%, which shows that the model predicts the welding penetration status well.

Fig. 13

Verification result for the penetration prediction model

Conclusions

A new method for weld penetration prediction based on weld pool images has been proposed. Groups of weld pool images were captured as training data by the experimental system and processed by median filtering and gray transformation. A three-layer BP neural network was set up, with the inner pool width xn, the outer pool width xw, the difference e between the inner and outer pool widths, and the adjacent-image ratios Rn and Rw as the input parameters, and the welding penetration status p as the output parameter. In this way, a prediction model for weld penetration detection was set up. The final verification experiments show that the accuracy of the model is up to 96%, with 4 failed points out of 100; the model predicts the welding penetration status well.



References

[1] Ma Zengqiang, Qian Rongwei, Xu Dandan, Du Wei. Denoising of line structured light welded seams image based on adaptive top-hat transform. Transactions of the China Welding Institution, 2021, 42(2): 8–15.

[2] Dai Xinxin, Gao Xiangdong, Zheng Qiaoqiao, Ji Yukun. A method of fuzzy clustering identification for weld defects by magneto-optical imaging. Transactions of the China Welding Institution, 2021, 42(1): 54–57.

[3] Agarwal G, Gao H, Amirthalingam M, et al. In situ strain investigation during laser welding using digital image correlation and finite-element-based numerical simulation. Science and Technology of Welding and Joining, 2018, 23(2): 134–139.

[4] Lei Zhenglong, Guo Hengtongl, Zhang Dengming, Li Qian. Study on Melt Flow and Grain Refining Ultrasonic-assisted Laser Filler Wire Welding Process of 5A06 Aluminum Alloy. Journal of Mechanical Engineering, 2021, 57(6): 78–84.

[5] Wu Shengchuan, Xie Cheng, Hu Yanan, et al. Defect tolerance assessment method of fusion welded medium and high strength Al alloy joints. Journal of Mechanical Engineering, 2020, 56(8): 46–59.

[6] Xue Kunxi, Wang Zhijiang, Shen Junqi, et al. Robotic seam tracking system based on vision sensing and human-machine interaction for multi-pass MAG welding. Journal of Manufacturing Processes, 2021, 63: 48–59.

[7] Cheng Yongchao, Chen Shujun, Xiao Jun, Zhang YuMing. Dynamic estimation of joint penetration by deep learning from weld pool image. Science and Technology of Welding and Joining, 2021, 26(4): 279–285.

[8] Zitouni Abdel Halim, Spiteri Pierre, Aissani Mouloud, et al. Heat Transfer Mode and Effect of Fluid Flow on the Morphology of the Weld Pool. Defect and Diffusion Forum, 2021, 406: 66–77.

[9] Ding Dukun. Visual Neural Network Model for Welding Deviation Prediction Based on Weld Pool Centroid. International Journal of Pattern Recognition and Artificial Intelligence, 2018, 32(8).

[10] Li Lidong, Cheng Fangjie, Wu Shaojie. An LSTM-based measurement method of 3D weld pool surface in GTAW. Measurement, 2021, 171: 32–34.

[11] Boone Nicholas, Davies Matthew, Willmott Jon Raffe, et al. High-Resolution Thermal Imaging and Analysis of TIG Weld Pool Phase Transitions. Sensors, 2020, 20(23): 1–11.

[12] Yang Shuyin. Pattern recognition and intelligent computing. Beijing: Electronic Industry Press, 2015.

[13] Kang Lishan. Computational intelligence. Beijing: Science Press, 2016.

[14] Zhu Daqi, Shi Hui. The principle and application of artificial neural network. Beijing: Science Press, 2006.

[15] Gao Juan. The principle and simulation example of artificial neural network. Beijing: China Machine Press, 2003.
