Welding seam tracking control method for a welding seam tracking system of an underwater welding robot
Technical Field
The invention relates to the technical field of underwater welding seam tracking control, and in particular to a welding seam tracking control method for a welding seam tracking system of an underwater welding robot.
Background
With China's foreign trade becoming increasingly frequent and its land-based resources being steadily depleted, greater attention is being paid to the comprehensive utilization of marine resources and the development of renewable seabed energy. Large ocean cruise ships, seabed petroleum pipelines, drilling and construction platforms, and the like are growing in number, and their planning, design, and construction cannot do without underwater welding manufacturing technology. Automatic weld seam tracking is the technical foundation of, and the key to, welding automation; its prerequisite is obtaining the world coordinates of the underwater weld seam, which is also the chief difficulty of welding automation. Among non-contact sensors, visual recognition is a typical representative, with advantages such as strong adaptability to the working environment and high recognition flexibility; underwater, however, light attenuates severely, which greatly hinders weld seam recognition and tracking.
How to solve the above technical problems is the subject of the present invention.
Disclosure of Invention
The invention aims to provide a welding seam tracking control method for a welding seam tracking system of an underwater welding robot; the method is built around a laser sensor and is suitable for underwater welding robots.
The invention is realized by the following measures: a welding seam tracking control method of a welding seam tracking system of an underwater welding robot comprises the following steps:
step one, building an underwater welding tracking system;
step two, extracting the optical central line of the laser stripe line structure;
step three, positioning a welding starting point;
and step four, confirming a welding seam tracking track.
Further, in the first step, the whole underwater welding tracking system mainly comprises a welding gun, a CCD camera, a line laser and a welding seam, wherein the welding gun welds the seam, the CCD camera photographs the underwater line structured light, and the line laser generates the line structured light.
Further, in the second step, the laser emits line structured light and an underwater camera collects the line structured light images, from which the center line of the weld seam is extracted; the method specifically comprises the following steps:
2-1), first, a Laplacian difference operator is applied:
∇²f(x, y) = f(x+1, y) + f(x-1, y) + f(x, y+1) + f(x, y-1) - 4f(x, y),
which increases the difference between neighboring pixels and yields the Laplacian-filtered image, where f(x, y) is the original image and ∇²f(x, y) is the image after Laplacian processing;
2-2), the formula I(x, y) = f(x, y) + α∇²f(x, y) is adopted to superimpose the original image and the Laplacian image, highlighting the edge information of the original image, where I(x, y) is the final edge-enhanced image retaining the original content and α is the superposition coefficient;
2-3), a certain number of weld seam templates are made to establish a template library, and the pixel matrix T_i corresponding to each template is stored. The matching degree at position (x, y) in the original image I is calculated by the cross-correlation formula
R(x, y) = Σ_{x', y'} T_i(x', y') · I(x + x', y + y'),
where (x', y') are the positions of the pixels of the template image, T_i(x', y') is the pixel value of template i at (x', y'), and I(x + x', y + y') is the pixel value at position (x + x', y + y') in image I. The larger R(x, y) is, the better image I matches the template at (x, y); the center of the best-matching pixel block [I(x + x', y + y')]_{x', y'} is taken as the weld seam center point, completing weld seam identification.
Further, in the third step, the welding starting point is identified and positioned by using a structured light scanning method, which specifically comprises the following steps:
3-1), initializing hardware equipment including a camera and a laser to ensure the normal work of the hardware equipment;
3-2), calculating the distance deviation between the weld seam and the welding gun from their coordinates in a Cartesian coordinate system, solving the inverse kinematics of the robot to obtain the motion variables of each joint, and controlling the joints to adjust the end pose of the manipulator so that the weld seam image lies at the center of the camera's field of view;
3-3), identifying the trend of the weld seam in the field of view, calculating its spatial coordinates, and controlling the manipulator to move the laser perpendicular to the weld seam until the whole seam has been scanned, thereby determining the coordinates of the welding start point.
Further, in the fourth step, the centerline point coordinates of the line structured light are stored sequentially in a circular queue whose size N satisfies N ≥ F / (v·t), where F is the look-ahead distance from the current laser center point to the tip of the welding gun, v is the welding speed, and t is the image processing time per frame. The advancing direction and speed of the welding gun are controlled by consuming and updating the queued points within the look-ahead distance, thereby realizing tracking of the weld seam.
Compared with the prior art, the invention has the following beneficial effects: (1) the green color of the laser stripes is exploited to segment the target area from the green channel, improving the segmentation accuracy; (2) digital image processing techniques, including the dark channel method, binarization, and line template matching, improve the accuracy of the result while keeping recognition reliable; the detection principle is simple, the detection speed is high, and the tracking is accurate.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
Fig. 1 is an overall flowchart of a weld tracking control method of the weld tracking system of the underwater welding robot provided by the invention.
Fig. 2 is a schematic view of an underwater welding tracking system of a weld tracking control method of the weld tracking system of the underwater welding robot provided by the invention.
Fig. 3 is a flowchart of extracting the center of the line laser stripe in the seam tracking control method of the seam tracking system of the underwater welding robot provided by the invention.
Fig. 4 is an original image of a weld joint irradiated by laser stripes in the weld joint tracking control method of the weld joint tracking system of the underwater welding robot provided by the invention.
Fig. 5 shows the center line extraction result in the seam tracking control method of the seam tracking system of the underwater welding robot provided by the invention.
Fig. 6 is a flow chart of positioning a welding start point in a weld tracking control method of the weld tracking system of the underwater welding robot provided by the invention.
Fig. 7 is a schematic diagram of a weld tracking circular queue in the weld tracking control method of the weld tracking system of the underwater welding robot provided by the invention.
Wherein the reference numerals are: 1. welding gun; 2. CCD camera; 3. line laser; 4. weld seam.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. Of course, the specific embodiments described herein are merely illustrative of the invention and are not intended to be limiting.
Example 1
Referring to fig. 1 to 7, the present invention provides a technical solution that a weld tracking control method of a weld tracking system of an underwater welding robot includes the following four steps:
step 1, building an underwater welding tracking system;
step 2, extracting the optical central line of the laser stripe line structure;
step 3, positioning a welding starting point;
and 4, confirming a welding seam tracking track.
As shown in fig. 2, the whole underwater welding tracking system mainly comprises a welding gun 1, a CCD camera 2, a line laser 3 and a welding seam 4, wherein the welding gun 1 welds the seam, the CCD camera 2 photographs the underwater line structured light, and the line laser 3 generates the line structured light.
As shown in fig. 3, the laser emits line structured light and the underwater camera collects the line structured light image shown in fig. 4; the extraction of the weld seam center line is completed by the following steps, with the result shown in fig. 5:
1), first, a Laplacian difference operator is applied:
∇²f(x, y) = f(x+1, y) + f(x-1, y) + f(x, y+1) + f(x, y-1) - 4f(x, y),
which increases the difference between neighboring pixels and yields the Laplacian-filtered image, where f(x, y) is the original image and ∇²f(x, y) is the image after Laplacian processing;
2), the formula I(x, y) = f(x, y) + α∇²f(x, y) is adopted to superimpose the original image and the Laplacian image, highlighting the edge information of the original image, where I(x, y) is the final edge-enhanced image retaining the original content and α is the superposition coefficient;
3), a certain number of weld seam templates are made to establish a template library, and the pixel matrix T_i corresponding to each template is stored. The matching degree at position (x, y) in the original image I is calculated by the cross-correlation formula
R(x, y) = Σ_{x', y'} T_i(x', y') · I(x + x', y + y'),
where (x', y') are the positions of the pixels of the template image, T_i(x', y') is the pixel value of template i at (x', y'), and I(x + x', y + y') is the pixel value at position (x + x', y + y') in image I. The larger R(x, y) is, the better image I matches the template at (x, y); the center of the best-matching pixel block [I(x + x', y + y')]_{x', y'} is taken as the weld seam center point, completing weld seam identification.
The laser emits line structured light and the underwater camera collects the line structured light image shown in fig. 4; sharpening with the Laplace operator enhances the image edge information, and the center line of the laser stripe's structured light is finally obtained by the line template matching method, as shown in fig. 5.
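The sharpening and template-matching steps above can be sketched in Python with NumPy. The toy image, the one-row template, and the value of α below are hypothetical placeholders, not the patent's actual data or template library:

```python
import numpy as np

def laplacian_sharpen(f, alpha=0.5):
    """Apply the 4-neighbor Laplacian and superimpose it on the original
    image: I(x, y) = f(x, y) + alpha * lap(x, y).
    np.roll gives periodic boundaries, kept for brevity."""
    f = f.astype(float)
    lap = (np.roll(f, -1, 0) + np.roll(f, 1, 0) +
           np.roll(f, -1, 1) + np.roll(f, 1, 1) - 4 * f)
    return f + alpha * lap

def match_template(img, tpl):
    """Slide the template over the image and return the position (x, y)
    that maximizes the cross-correlation R(x, y)."""
    h, w = tpl.shape
    H, W = img.shape
    best, best_pos = -np.inf, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            r = np.sum(tpl * img[y:y + h, x:x + w])
            if r > best:
                best, best_pos = r, (x, y)
    return best_pos

# Toy example: a bright horizontal "stripe" and a matching template.
img = np.zeros((8, 8))
img[3, 2:6] = 1.0          # 4-pixel bright line segment on row 3
tpl = np.ones((1, 4))      # template of a 4-pixel bright line
print(match_template(img, tpl))   # -> (2, 3)
```

With real images one would normally replace these explicit loops with cv2.Laplacian and cv2.matchTemplate from OpenCV; the sketch only illustrates the arithmetic.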
As shown in fig. 6, the method for identifying and positioning the welding start point by using the structured light scanning method specifically includes the following steps:
1) initializing hardware equipment including a camera and a laser to ensure normal operation of the hardware equipment;
2), calculating the distance deviation between the weld seam and the welding gun from their coordinates in a Cartesian coordinate system, solving the inverse kinematics of the robot to obtain the motion variables of each joint, and controlling the joints to adjust the end pose of the manipulator so that the weld seam image lies at the center of the camera's field of view;
3), identifying the trend of the weld seam in the field of view, calculating its spatial coordinates, and controlling the manipulator to move the laser perpendicular to the weld seam until the whole seam has been scanned, thereby determining the coordinates of the welding start point.
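The deviation computed in step 2) boils down to a vector difference in Cartesian space, which is then fed to the robot's inverse-kinematics solver. A minimal sketch, with purely hypothetical coordinates (the IK solve itself is robot-specific and omitted):

```python
import numpy as np

def distance_deviation(seam_xyz, torch_xyz):
    """Return the per-axis deviation vector and the Euclidean distance
    between a weld seam point and the welding gun tip, both given in
    the same Cartesian frame."""
    seam = np.asarray(seam_xyz, dtype=float)
    torch_tip = np.asarray(torch_xyz, dtype=float)
    delta = seam - torch_tip      # per-axis deviation passed to the IK solver
    return delta, float(np.linalg.norm(delta))

# Hypothetical coordinates in millimetres.
delta, dist = distance_deviation([120.0, 45.0, -30.0], [117.0, 41.0, -30.0])
print(delta, dist)   # -> [3. 4. 0.] 5.0
```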
Because the structured light projected by the optical vision sensor has a certain look-ahead distance from the tip of the welding gun, the track point coordinates acquired by the sensor cannot be used immediately during tracking. The invention therefore stores the centerline point coordinates of the structured light on the weld seam sequentially in a circular queue whose size N satisfies N ≥ F / (v·t), where F is the look-ahead distance from the current laser center point to the tip of the welding gun, v is the welding speed, and t is the image processing time per frame. The advancing direction and speed of the welding gun are controlled by consuming and updating the queued points within the look-ahead distance, thereby realizing tracking of the weld seam.
As shown in fig. 7, S is a welding seam, a point p is a current welding gun point, l is a straight line formed by the projection of the structured light on the welding plate, and intersects with S at a point p', the point is collected to enter the tail end of the welding tracking queue, and the direction of v is the current moving direction of the tail end of the manipulator, and the direction points to the next point in the tracking queue. During welding, the robot tip will move in the direction v, at which point the weld has left the center of the field of view, and continuing to move in the direction v will cause p' to leave the measurement volume for the structured light.
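A minimal sketch of the look-ahead circular queue, assuming its size is taken as N = ceil(F / (v·t)) so that it buffers exactly the centerline points lying between the laser line and the torch; the values of F, v, and t below are illustrative, not from the patent:

```python
import math
from collections import deque

F = 20.0   # look-ahead distance, laser center to torch tip (mm), illustrative
v = 5.0    # welding speed (mm/s), illustrative
t = 0.2    # image processing time per frame (s), illustrative

# A point measured now is welded about F/(v*t) frames later, so the
# queue must buffer at least that many centerline points.
N = math.ceil(F / (v * t))
queue = deque(maxlen=N)

def on_new_frame(point):
    """Push the newest centerline point; once the queue is full, the
    oldest buffered point is the next target the torch moves toward."""
    queue.append(point)
    if len(queue) == N:
        return queue[0]   # next target for the welding gun tip
    return None           # still filling the look-ahead buffer

# Simulate frames along a straight seam: points (0,0), (1,0), ...
target = None
for i in range(25):
    target = on_new_frame((float(i), 0.0))
print(N, target)   # -> 20 (5.0, 0.0)
```

The deque's maxlen discards the oldest point automatically as new ones arrive, matching the circular-queue behavior of fig. 7.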
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.