CN115409808A - Weld joint recognition method and device, welding robot and storage medium - Google Patents

Weld joint recognition method and device, welding robot and storage medium

Info

Publication number
CN115409808A
CN115409808A
Authority
CN
China
Prior art keywords
cloud data
point
point cloud
points
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211063959.0A
Other languages
Chinese (zh)
Inventor
植美浃
陈泓亨
许曦
李俊渊
高建文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Original Assignee
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qianhai Ruiji Technology Co ltd, China International Marine Containers Group Co Ltd, CIMC Containers Holding Co Ltd
Priority to CN202211063959.0A
Publication of CN115409808A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30152 Solder
    • G06T 2207/30164 Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a weld seam identification method and device, a welding robot and a storage medium. The weld seam identification method comprises the following steps: acquiring three-dimensional point cloud data of a right-angle workpiece; if it is detected, based on the three-dimensional point cloud data, that intersecting straight lines exist among a plurality of planes, extracting, for the planes having intersecting straight lines, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data; for the boundary points in an adjacency relationship, determining the feet of the perpendiculars from the boundary points to the corresponding intersecting straight lines; searching the three-dimensional point cloud data for the target point closest to each end point, the end points being the two perpendicular feet on an intersecting straight line that are farthest apart; and correcting the end points based on the target points, and identifying the corrected points as weld seam feature points of the right-angle workpiece. The method and device can improve both the accuracy and the efficiency of weld seam identification for right-angle workpieces.

Description

Weld joint recognition method and device, welding robot and storage medium
Technical Field
The present application relates to the technical field of welding, and in particular to a weld seam identification method and device, a welding robot, and a storage medium.
Background
In robotic welding, the weld seam of a right-angle workpiece must be identified before the workpiece can be welded. At present, line-laser-scanning robotic welding and manually taught (teach-and-playback) robotic welding are mainly used. In line-laser-scanning welding, the weld seam is extracted from two-dimensional camera images, so the located seam position is prone to deviation and misidentification, and the accuracy is poor. Manually taught welding depends on human operation, consumes considerable effort and is inefficient. Existing weld seam identification approaches for right-angle workpieces therefore cannot achieve both efficiency and accuracy.
Disclosure of Invention
An object of the present application is to provide a weld seam identification method and device, a welding robot, and a storage medium that improve, at least to some extent, both the accuracy and the efficiency of weld seam identification for right-angle workpieces.
According to an aspect of an embodiment of the present application, there is provided a weld seam identification method, including:
acquiring three-dimensional point cloud data of a right-angle workpiece;
if it is detected, based on the three-dimensional point cloud data, that intersecting straight lines exist among a plurality of planes, extracting, for the planes having intersecting straight lines, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data; the boundary points in an adjacency relationship are boundary points of two different planes that are adjacent to each other;
for the boundary points in an adjacency relationship, determining the feet of the perpendiculars from the boundary points to the corresponding intersecting straight lines;
searching the three-dimensional point cloud data for the target point closest to each end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line; the maximum perpendicular-foot distance is the maximum of the pairwise distances between the perpendicular feet on the intersecting straight line;
correcting the end points based on the target points, and identifying the corrected points as weld seam feature points of the right-angle workpiece; the weld seam feature points are feature points at the weld seam of the right-angle workpiece.
According to an aspect of an embodiment of the present application, there is provided a weld seam identification apparatus, including:
an acquisition module, used for acquiring three-dimensional point cloud data of a right-angle workpiece;
a boundary point extraction module, used for extracting, if it is detected based on the three-dimensional point cloud data that intersecting straight lines exist among a plurality of planes, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data for the planes having intersecting straight lines; the boundary points in an adjacency relationship are boundary points of two different planes that are adjacent to each other;
a perpendicular foot determination module, used for determining, for the boundary points in an adjacency relationship, the feet of the perpendiculars from the boundary points to the corresponding intersecting straight lines;
a target point search module, used for searching the three-dimensional point cloud data for the target point closest to each end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line; the maximum perpendicular-foot distance is the maximum of the pairwise distances between the perpendicular feet on the intersecting straight line;
a weld seam identification module, used for correcting the end points based on the target points and identifying the corrected points as weld seam feature points of the right-angle workpiece; the weld seam feature points are feature points at the weld seam of the right-angle workpiece.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
acquiring, from the target points, the three points that are closest to one another;
performing sphere fitting on these three closest points by means of a minimum enclosing sphere to obtain sphere data;
extracting the center of the enclosing sphere from the sphere data;
and taking the center of the enclosing sphere and the target points as weld seam feature points of the right-angle workpiece.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
performing plane segmentation on the three-dimensional point cloud data to obtain a first plane, a second plane and a third plane; the first plane is the plane of the right-angle workpiece parallel to the XY plane, the second plane is the plane of the right-angle workpiece parallel to the YZ plane, and the third plane is the plane of the right-angle workpiece parallel to the XZ plane, where the XY plane is the plane formed by the X axis and the Y axis, the YZ plane is the plane formed by the Y axis and the Z axis, and the XZ plane is the plane formed by the X axis and the Z axis;
detecting, for the first plane, the second plane and the third plane, whether an intersecting straight line exists between each pair of planes;
if an intersecting straight line exists between each pair of planes, extracting a first boundary point of the first plane, a second boundary point of the second plane and a third boundary point of the third plane from the three-dimensional point cloud data;
taking the first boundary point, the second boundary point and the third boundary point in turn as reference points, and detecting the distances between the reference points and the non-reference points; a non-reference point is any of the first boundary point, the second boundary point and the third boundary point other than the current reference point;
acquiring the reference points and the non-reference points whose mutual distance is smaller than a preset distance threshold; the preset distance threshold is a preset distance indicating that a non-reference point is adjacent to a reference point;
and taking the reference points and the non-reference points whose mutual distance is smaller than the preset distance threshold as the plurality of boundary points in an adjacency relationship.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
determining the pairwise distances between the perpendicular feet on the intersecting straight line;
comparing the pairwise distances between the perpendicular feet on the intersecting straight line to obtain the maximum perpendicular-foot distance;
acquiring the end points corresponding to the maximum perpendicular-foot distance on the intersecting straight line;
calculating the Euclidean distance between each point in the three-dimensional point cloud data and each end point to obtain the distance between each point and the end point;
and taking the point in the three-dimensional point cloud data with the smallest distance to the end point as the target point.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
acquiring original three-dimensional point cloud data for the right-angle workpiece;
performing coordinate conversion on the original three-dimensional point cloud data to obtain actual three-dimensional point cloud data under a welding robot base coordinate system;
and clustering the actual three-dimensional point cloud data to obtain the three-dimensional point cloud data.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
acquiring a value range of the actual three-dimensional point cloud data on a preset dimension;
traversing each point in the actual three-dimensional point cloud data, and determining the value of the point on the preset dimension; the preset dimension is a preset dimension used for indicating the direction towards the ground;
acquiring all points of which the values are not in the value range;
filtering all points with values not in the value range to obtain real three-dimensional point cloud data;
sampling the real three-dimensional point cloud data to obtain sampled three-dimensional point cloud data;
and clustering the sampled three-dimensional point cloud data to obtain a spatial noise point and target three-dimensional point cloud data, and taking the target three-dimensional point cloud data as the three-dimensional point cloud data.
In some embodiments of the present application, based on the above technical solutions, the weld identifying apparatus is configured to:
and performing coordinate conversion on the original three-dimensional point cloud data based on the attitude matrix of the tool center point at the end of the welding robot and the robot hand-eye matrix, to obtain the actual three-dimensional point cloud data in the welding robot base coordinate system.
According to an aspect of an embodiment of the present application, there is provided a welding robot including: one or more processors; a storage device for storing one or more programs that, when executed by the one or more processors, cause the welding robot to implement the methods provided in the various alternative implementations described above.
According to an aspect of embodiments of the present application, there is provided a computer program medium having stored thereon computer readable instructions, which, when executed by a processor of a computer, cause the computer to perform the method provided in the above various alternative implementations.
According to an aspect of an embodiment of the present application, there is provided a computer program product or a computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method provided in the above-mentioned various alternative implementation modes.
In the technical solutions provided by the embodiments of the present application, planes having intersecting straight lines are detected from the three-dimensional point cloud data of a right-angle workpiece, the boundary points of each such plane that are adjacent to another plane are extracted, the two perpendicular feet farthest apart on each intersecting straight line are obtained as end points, the target point corresponding to each end point is found in the three-dimensional point cloud data, and the end points are corrected based on the target points to obtain the weld seam feature points of the right-angle workpiece. On the one hand, processing three-dimensional data reflects the characteristics of the workpiece in three-dimensional space, so the accuracy of weld seam feature point identification can be improved compared with extraction from a two-dimensional camera. On the other hand, processing the three-dimensional data in this way can improve the efficiency of weld seam feature point identification compared with relying on manual teaching.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1 shows a schematic flow chart of a weld joint identification method according to a first embodiment of the present application.
Fig. 2 shows a schematic flow chart of a weld joint identification method according to the second embodiment of the present application.
Fig. 3 shows a schematic diagram of the relationship between three planes according to the second embodiment of the present application.
Fig. 4 shows a relationship diagram between three intersecting straight lines according to the second embodiment of the present application.
Fig. 5 is a schematic diagram illustrating a relationship between corresponding boundary points of each plane according to the second embodiment of the present application.
Fig. 6 shows a relationship diagram between adjacent boundary points according to the second embodiment of the present application.
Fig. 7 is a schematic diagram illustrating a relationship between weld feature points according to a second embodiment of the present application.
FIG. 8 is a flowchart illustrating steps involved in a specific weld identification process according to example two of the present application.
Fig. 9 is a schematic structural diagram of a weld joint identification device according to a third embodiment of the present application.
Fig. 10 shows a schematic structural diagram of a welding robot according to the fourth embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The drawings are merely schematic illustrations of the present application and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more example embodiments. In the following description, numerous specific details are provided to give a thorough understanding of example embodiments of the application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, steps, and so forth. In other instances, well-known structures, methods, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 shows a schematic flow chart of a weld joint identification method according to a first embodiment of the present application. The weld joint identification method comprises the following steps:
step S101: acquiring three-dimensional point cloud data of a right-angle workpiece;
the three-dimensional point cloud data of the right-angle workpiece can be obtained by shooting the right-angle workpiece through the three-dimensional camera, and the three-dimensional camera can be arranged at the tail end of a welding gun of the welding robot.
As an optional implementation manner, the original three-dimensional point cloud data of the right-angle workpiece photographed by the three-dimensional camera is used as the three-dimensional point cloud data to obtain the three-dimensional point cloud data quickly.
As an optional implementation manner, in order to make the three-dimensional point cloud data more accurate, ground point cloud and spatial noise may be filtered to obtain the three-dimensional point cloud data, so as to further improve the accuracy of the weld characteristic point identification.
As an optional implementation manner, in order to enable the weld joint identification process to be fast and the identification result to be high in accuracy, the point cloud may be sampled on the basis of filtering the ground point cloud and the spatial noise, and a part of point cloud data is obtained as the three-dimensional point cloud data and is processed to obtain the weld joint feature points.
Step S102: if it is detected, based on the three-dimensional point cloud data, that intersecting straight lines exist among a plurality of planes, extracting, for the planes having intersecting straight lines, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data; the boundary points in an adjacency relationship are boundary points of two different planes that are adjacent to each other;
As an alternative embodiment, the plurality of planes includes three planes, and three intersecting straight lines exist between them. For a right-angle workpiece the three planes intersect one another; each plane has boundary points near an intersecting straight line, and the boundary points of the two planes on either side of the line are adjacent. To obtain the adjacent boundary points of two planes, a distance range indicating adjacency may be preset; if the distance between two boundary points falls within this range, the two boundary points are determined to be adjacent. Note that when boundary points in an adjacency relationship are extracted, distances are evaluated between boundary points of two different planes, so adjacent boundary points are obtained between planes; boundary points of the same plane are not regarded as being in an adjacency relationship even if the distance between them falls within the range.
As an optional implementation, if it is detected, based on the three-dimensional point cloud data, that intersecting straight lines exist among a plurality of planes, extracting, for the planes having intersecting straight lines, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data includes: performing plane segmentation on the three-dimensional point cloud data to obtain a first plane, a second plane and a third plane, where the first plane is the plane of the right-angle workpiece parallel to the XY plane, the second plane is the plane parallel to the YZ plane, the third plane is the plane parallel to the XZ plane, the XY plane is the plane formed by the X axis and the Y axis, the YZ plane is the plane formed by the Y axis and the Z axis, and the XZ plane is the plane formed by the X axis and the Z axis; detecting, for the first plane, the second plane and the third plane, whether an intersecting straight line exists between each pair of planes; if an intersecting straight line exists between each pair of planes, extracting a first boundary point of the first plane, a second boundary point of the second plane and a third boundary point of the third plane from the three-dimensional point cloud data; taking the first boundary point, the second boundary point and the third boundary point in turn as reference points, and detecting the distances between the reference points and the non-reference points, where a non-reference point is any of the first, second and third boundary points other than the current reference point; acquiring the reference points and non-reference points whose mutual distance is smaller than a preset distance threshold, the preset distance threshold being a preset distance indicating that a non-reference point is adjacent to a reference point; and taking these reference points and non-reference points as the plurality of boundary points in an adjacency relationship.
The first boundary point is a boundary point on the first plane, the second boundary point is a boundary point on the second plane, and the third boundary point is a boundary point on the third plane. The reference point is the boundary point used as the reference from which the distances to the other boundary points are measured.
Distance detection may be performed with the first boundary point as the reference point and the second and third boundary points as non-reference points, giving the distances between this reference point and the non-reference points; then with the second boundary point as the reference point and the first and third boundary points as non-reference points; and finally with the third boundary point as the reference point and the first and second boundary points as non-reference points.
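The adjacency test described above can be illustrated with a short sketch. The following Python code is not taken from the disclosure: it assumes the boundary points of the three planes are already available as NumPy arrays, and the function name and the distance threshold are illustrative assumptions.

    # Sketch only: collects boundary points of different planes that are adjacent,
    # i.e. closer than a preset threshold. The threshold value is illustrative.
    import numpy as np
    from scipy.spatial import cKDTree

    def adjacent_boundary_points(b1, b2, b3, threshold=2.0):
        """b1, b2, b3: (N, 3) arrays of boundary points of plane1/2/3 (robot base frame)."""
        planes = [b1, b2, b3]
        adjacent = [np.zeros(len(b), dtype=bool) for b in planes]
        for i in range(3):
            for j in range(3):
                if i == j:                    # points of the same plane never count as adjacent
                    continue
                tree = cKDTree(planes[j])
                dist, _ = tree.query(planes[i], k=1)   # nearest point of the other plane
                adjacent[i] |= dist < threshold        # keep points within the preset distance
        return [p[m] for p, m in zip(planes, adjacent)]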
Step S103: for the boundary points in an adjacency relationship, determining the feet of the perpendiculars from the boundary points to the corresponding intersecting straight lines;
For two boundary points in an adjacency relationship, the intersecting straight line lying between them is the intersecting straight line corresponding to those two boundary points.
Step S104: searching the three-dimensional point cloud data for the target point closest to each end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line; the maximum perpendicular-foot distance is the maximum of the pairwise distances between the perpendicular feet on the intersecting straight line;
The end points are the two perpendicular feet on the intersecting straight line that are farthest apart. The target point is the point in the three-dimensional point cloud data closest to an end point.
As an optional implementation, searching the three-dimensional point cloud data for the target point closest to the end point, based on the end points corresponding to the maximum perpendicular-foot distance on the intersecting straight line, includes: determining the pairwise distances between the perpendicular feet on the intersecting straight line; comparing these distances to obtain the maximum perpendicular-foot distance; acquiring the end points corresponding to the maximum perpendicular-foot distance on the intersecting straight line; calculating the Euclidean distance between each point in the three-dimensional point cloud data and each end point; and taking the point in the three-dimensional point cloud data with the smallest distance to the end point as the target point.
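As a minimal sketch of this step (illustrative only, not the disclosed implementation), the following Python code assumes an intersecting straight line given by a point a and a direction d, projects the adjacent boundary points onto it, takes the two farthest perpendicular feet as end points, and corrects each end point with the closest point of the cloud; all names are assumptions.

    import numpy as np

    def perpendicular_feet(points, a, d):
        """Feet of the perpendiculars from `points` onto the line x = a + t*d."""
        d = d / np.linalg.norm(d)
        t = (points - a) @ d                  # signed position of each projection along the line
        return a + np.outer(t, d), t

    def endpoints_and_targets(boundary_pts, a, d, cloud):
        feet, t = perpendicular_feet(boundary_pts, a, d)
        # The two feet farthest apart along the line are the end points of the seam.
        end_a, end_b = feet[np.argmin(t)], feet[np.argmax(t)]
        targets = []
        for end in (end_a, end_b):
            dist = np.linalg.norm(cloud - end, axis=1)   # Euclidean distance to every cloud point
            targets.append(cloud[np.argmin(dist)])       # closest measured point corrects the end point
        return (end_a, end_b), targets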
Step S105: correcting the end points based on the target points, and identifying the corrected points as weld seam feature points of the right-angle workpiece; the weld seam feature points are feature points at the weld seam of the right-angle workpiece.
As an optional embodiment, correcting the end points based on the target points and identifying the corrected points as weld seam feature points of the right-angle workpiece includes: acquiring, from the target points, the three points that are closest to one another; performing sphere fitting on these three closest points by means of a minimum enclosing sphere to obtain sphere data; extracting the center of the enclosing sphere from the sphere data; and taking the center of the enclosing sphere and the target points as weld seam feature points of the right-angle workpiece.
The target points may specifically include two target points corresponding to the two end points of the first intersecting straight line, two target points corresponding to the two end points of the second intersecting straight line, and two target points corresponding to the two end points of the third intersecting straight line. The first intersecting straight line is the intersecting straight line between the first plane and the second plane, the second intersecting straight line is the intersecting straight line between the second plane and the third plane, and the third intersecting straight line is the intersecting straight line between the first plane and the third plane. The distances among each triple of the six target points may be evaluated, and the triple with the shortest distance is obtained by comparison. The distance among three target points may be measured as follows: calculate the distance between each pair of the three target points and take the sum of these pairwise distances as the distance among the three points. The three points that are closest together lie near the common intersection of the three planes, while the other three points lie farther away from it.
The center of the enclosing sphere reflects the characteristics of the weld seam of the right-angle workpiece at the corner, so the target points and the center of the enclosing sphere are together taken as the weld seam feature points, which makes the set of weld seam feature points more complete.
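A possible realization of the minimum-enclosing-sphere step is sketched below in Python; the helper for picking the three mutually closest target points and the function names are illustrative assumptions, not code from the disclosure. For three points the minimum enclosing sphere is either the sphere whose diameter is the longest pairwise segment or the circumscribed sphere through all three points.

    import numpy as np
    from itertools import combinations

    def three_closest(targets):
        """Pick, among the target points, the triple with the smallest sum of pairwise distances."""
        best = min(combinations(range(len(targets)), 3),
                   key=lambda idx: sum(np.linalg.norm(targets[i] - targets[j])
                                       for i, j in combinations(idx, 2)))
        return np.asarray([targets[i] for i in best])

    def min_enclosing_sphere_3pts(p):
        """Minimum enclosing sphere (center, radius) of three points p[0], p[1], p[2]."""
        # If one pair already encloses the third point, the sphere on that diameter is minimal.
        for i, j in combinations(range(3), 2):
            c = (p[i] + p[j]) / 2.0
            r = np.linalg.norm(p[i] - p[j]) / 2.0
            k = 3 - i - j
            if np.linalg.norm(p[k] - c) <= r + 1e-9:
                return c, r
        # Otherwise the minimum enclosing sphere passes through all three points (circumcenter).
        u, v = p[1] - p[0], p[2] - p[0]
        w = np.cross(u, v)
        c = p[0] + np.cross(np.dot(u, u) * v - np.dot(v, v) * u, w) / (2.0 * np.dot(w, w))
        return c, np.linalg.norm(c - p[0])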
As an optional embodiment, correcting the end points based on the target points and identifying the corrected points as weld seam feature points of the right-angle workpiece includes: taking the target points themselves as the weld seam feature points of the right-angle workpiece, which completes the correction of the end points.
As an optional embodiment, after the end points are corrected based on the target points and the corrected points are identified as weld seam feature points of the right-angle workpiece, the method further includes: welding the right-angle workpiece according to its weld seam feature points.
In this embodiment, planes having intersecting straight lines are detected from the three-dimensional point cloud data of a right-angle workpiece, the boundary points of each such plane that are adjacent to another plane are extracted, the two perpendicular feet farthest apart on each intersecting straight line are obtained as end points, the target point corresponding to each end point is found in the three-dimensional point cloud data, and the end points are corrected based on the target points to obtain the weld seam feature points of the right-angle workpiece. On the one hand, processing three-dimensional data reflects the characteristics of the workpiece in three-dimensional space, so the accuracy of weld seam feature point identification can be improved compared with extraction from a two-dimensional camera. On the other hand, processing the three-dimensional data in this way can improve the efficiency of weld seam feature point identification compared with relying on manual teaching.
Fig. 2 shows a schematic flow chart of a weld joint identification method according to the second embodiment of the present application. The weld joint identification method comprises the following steps:
step S201: acquiring original three-dimensional point cloud data for a right-angle workpiece;
This embodiment explains how to obtain more accurate three-dimensional point cloud data, so that the identified weld seam feature points are more accurate than when the originally acquired three-dimensional point cloud data is used directly for weld seam identification.
Step S202: carrying out coordinate conversion on the original three-dimensional point cloud data to obtain actual three-dimensional point cloud data under a welding robot base coordinate system;
As an optional implementation, performing coordinate conversion on the original three-dimensional point cloud data to obtain the actual three-dimensional point cloud data in the welding robot base coordinate system includes: performing coordinate conversion on the original three-dimensional point cloud data based on the attitude matrix of the tool center point at the end of the welding robot and the robot hand-eye matrix, to obtain the actual three-dimensional point cloud data in the welding robot base coordinate system.
As an alternative embodiment, the tool center point is the tip point of the welding torch of the welding robot.
Step S203: clustering actual three-dimensional point cloud data to obtain three-dimensional point cloud data;
as an optional implementation manner, clustering actual three-dimensional point cloud data to obtain three-dimensional point cloud data includes: acquiring a value range of actual three-dimensional point cloud data on a preset dimension; traversing each point in the actual three-dimensional point cloud data, and determining the value of the point on a preset dimension; the preset dimension is a preset dimension used for indicating the direction towards the ground; acquiring all points of which the values are not in the value range; filtering all points with values not within the value range to obtain real three-dimensional point cloud data; sampling the real three-dimensional point cloud data to obtain sampled three-dimensional point cloud data; and clustering the sampled three-dimensional point cloud data to obtain a spatial noise point and target three-dimensional point cloud data, and taking the target three-dimensional point cloud data as the three-dimensional point cloud data.
The real three-dimensional point cloud data is three-dimensional point cloud data from which the ground point cloud data is removed. The sampled three-dimensional point cloud data is data partially sampled in the real three-dimensional point cloud data. The target three-dimensional point cloud data is point cloud data except spatial noise points in the sampled three-dimensional point cloud data.
In this way, the ground point cloud and the spatial noise points are removed, and more accurate three-dimensional point cloud data of the workpiece is obtained. Sampling also reduces the computational load of weld seam identification, so that, compared with processing all points, the identification is more efficient.
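A hedged sketch of this preprocessing chain is given below; the disclosure does not name a specific library, so NumPy and scikit-learn's DBSCAN are used here, and the ground axis, value range, sampling step and clustering parameters are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def preprocess(cloud, ground_axis=2, keep_range=(10.0, 1500.0),
                   sample_step=4, eps=5.0, min_samples=20):
        """cloud: (N, 3) points in the robot base frame. Returns the workpiece point cloud."""
        # 1. Remove ground points: keep only points whose value on the ground-facing
        #    dimension lies inside the preset value range.
        z = cloud[:, ground_axis]
        real = cloud[(z >= keep_range[0]) & (z <= keep_range[1])]
        # 2. Uniform sampling: keep every sample_step-th point to cut computation.
        sampled = real[::sample_step]
        # 3. Clustering: DBSCAN labels sparse spatial noise as -1; keeping the largest
        #    cluster as the workpiece is an illustrative choice.
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(sampled)
        valid = labels[labels >= 0]
        if valid.size == 0:
            return sampled
        keep = labels == np.bincount(valid).argmax()
        return sampled[keep]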
Step S204: if the fact that intersecting straight lines exist among the planes is detected based on the three-dimensional point cloud data, extracting a plurality of boundary points with adjacent relations from the three-dimensional point cloud data aiming at the planes with the intersecting straight lines; the plurality of boundary points in which the adjacent relationship exists are a plurality of boundary points adjacent to each other between the boundary points of the two planes;
step S205: aiming at a plurality of boundary points with adjacent relation, respectively determining the vertical feet of the boundary points on the corresponding intersecting straight lines;
step S206: searching a target point closest to the end point from the three-dimensional point cloud data based on the end point corresponding to the maximum foot hanging distance on the intersected straight line; the maximum foot drop distance is the maximum value of the distance between every two foot drops on the intersecting straight lines;
step S207: correcting the end points based on the target points, and identifying the corrected points as weld joint feature points aiming at the right-angle workpiece; the weld joint characteristic points are characteristic points at the weld joint of the right-angle workpiece.
Referring now to Figs. 3 to 8, the weld seam identification process for a right-angle workpiece in a specific scene is explained. Fig. 8 is a flowchart of the steps of this weld seam identification process.
Acquiring original three-dimensional point cloud data of the right-angle workpiece: a three-dimensional camera is used to collect the point cloud pcd0 of the right-angle workpiece.
Converting the original three-dimensional point cloud data into the robot base coordinate system: pcd0 is converted into the welding robot base coordinate system through the attitude matrix toolPos of the TCP at the end of the welding robot and the hand-eye matrix handEye, yielding the actual point cloud pcd1 of the right-angle workpiece. The conversion relationship is as follows:
pcd1 = transform(pcd0, toolPos * handEye)
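Interpreting toolPos and handEye as 4x4 homogeneous matrices (an assumption; only the product relationship above is given), the conversion can be sketched as:

    import numpy as np

    def transform(points, T):
        """Apply a 4x4 homogeneous transform T to an (N, 3) point array."""
        homo = np.hstack([points, np.ones((len(points), 1))])   # to homogeneous coordinates
        return (homo @ T.T)[:, :3]

    # pcd0: raw camera points; toolPos: TCP pose in the base frame; handEye: camera pose in the TCP frame.
    # pcd1 = transform(pcd0, toolPos @ handEye)   # points expressed in the welding robot base frame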
Filtering the ground point cloud: the point cloud irrelevant to the right-angle workpiece in pcd1, namely the ground point cloud, is filtered out to obtain the real point cloud pcd2 of the right-angle workpiece.
Sampling the right-angle workpiece point cloud: pcd2 is uniformly sampled to obtain the point cloud pcd3. This reduces the number of points, which lowers the computational load of weld seam extraction for the right-angle workpiece and improves the extraction efficiency.
Clustering the right-angle workpiece point cloud: pcd3 is segmented by clustering, the spatial noise points outside the actual workpiece point cloud are removed, and only the point cloud pcd4 related to the right-angle weld seam is retained, which avoids interference with the extraction of the right-angle weld seam.
Performing plane segmentation on the right-angle workpiece point cloud to obtain three independent point cloud planes: the actual point cloud pcd4 of the right-angle workpiece is segmented into the point cloud planes plane1, plane2 and plane3. The relationship among plane1, plane2 and plane3 is shown in Fig. 3.
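The disclosure does not specify a particular plane segmentation algorithm; one possible sketch, assuming pcd4 is an Open3D point cloud, a recent Open3D version and illustrative thresholds, peels the three dominant planes off pcd4 by RANSAC:

    import open3d as o3d

    def segment_three_planes(pcd4, dist_thresh=1.0, ransac_n=3, iters=1000):
        """Iteratively segment the three dominant planes of the right-angle workpiece cloud."""
        rest, planes = pcd4, []
        for _ in range(3):
            model, inliers = rest.segment_plane(distance_threshold=dist_thresh,
                                                ransac_n=ransac_n,
                                                num_iterations=iters)
            planes.append((model, rest.select_by_index(inliers)))   # (a, b, c, d) and its points
            rest = rest.select_by_index(inliers, invert=True)       # remove the plane, keep the rest
        return planes   # e.g. plane1, plane2, plane3 with their plane equations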
Acquiring the intersecting straight line between every two point cloud planes of the right-angle workpiece: the segmented point cloud planes plane1, plane2 and plane3 are traversed, whether an intersecting straight line exists between each pair of planes is judged, and if so, the corresponding line equations L1, L2 and L3 are solved. The relationship among the straight lines L1, L2 and L3 is shown in Fig. 4.
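The line equations can be derived from the fitted plane coefficients. A NumPy sketch, assuming each plane is given as coefficients (a, b, c, d) of a*x + b*y + c*z + d = 0, is shown below: the cross product of the two normals gives the line direction, and the minimum-norm least-squares solution of the two plane equations gives a point on the line.

    import numpy as np

    def plane_intersection(plane_a, plane_b):
        """plane_*: (a, b, c, d) with a*x + b*y + c*z + d = 0. Returns (point, direction) or None."""
        n1, d1 = np.asarray(plane_a[:3], float), plane_a[3]
        n2, d2 = np.asarray(plane_b[:3], float), plane_b[3]
        direction = np.cross(n1, n2)
        if np.linalg.norm(direction) < 1e-9:
            return None                      # planes are parallel, no intersection line
        # The minimum-norm solution of n1.x = -d1, n2.x = -d2 lies on the intersection line.
        point, *_ = np.linalg.lstsq(np.vstack([n1, n2]), np.array([-d1, -d2]), rcond=None)
        return point, direction / np.linalg.norm(direction)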
Extracting the boundary points of the three independent point cloud planes: the boundary points b1, b2 and b3 of the point cloud planes plane1, plane2 and plane3 are obtained respectively. The relationship among the boundary points b1, b2 and b3 is shown in Fig. 5.
Acquiring the adjacent points between every two sets of boundary points: all the boundary points b1, b2 and b3 are traversed, and the mutually adjacent point sets p1, p2 and p3 between each pair of boundary point sets are computed. The relationship among the adjacent point sets p1, p2 and p3 is shown in Fig. 6.
Computing the feet of the perpendiculars from the adjacent points to the intersecting straight lines of the point cloud planes: the feet of the perpendiculars from the adjacent boundary points p1, p2 and p3 to the corresponding intersecting straight lines L1, L2 and L3 are obtained.
Computing the end points on the straight lines and the correction points: on each straight line, the two perpendicular feet with the maximum Euclidean distance between them are taken as the end points of the weld seam, giving the points pt1, pt2, pt3, pt4, pt5 and pt6. The Euclidean distance is calculated as follows:
d = sqrt((x1 - x2)^2 + (y1 - y2)^2 + (z1 - z2)^2)
In the actual point cloud pcd4 of the right-angle workpiece, the points pt1', pt2', pt3', pt4', pt5' and pt6' with the smallest Euclidean distance to the end points pt1, pt2, pt3, pt4, pt5 and pt6 respectively are searched for and used to correct the end points; then sphere fitting is performed on pt4', pt5' and pt6' by the minimum enclosing sphere method to obtain the sphere center cir0.
Outputting the final weld seam feature points: the corrected true points pt1', pt2', pt3' of the right-angle workpiece and the sphere center cir0 are finally output. The relationship among pt1', pt2', pt3', pt4', pt5', pt6' and the sphere center cir0 is shown in Fig. 7.
In this way, the accuracy and the efficiency of weld seam identification for right-angle workpieces can both be improved.
Fig. 9 shows a weld identifying apparatus according to a third embodiment of the present application, the apparatus including:
the acquisition module 301 is used for acquiring three-dimensional point cloud data of the right-angle workpiece;
a boundary point extraction module 302, configured to extract, if it is detected based on the three-dimensional point cloud data that intersecting straight lines exist among a plurality of planes, a plurality of boundary points in an adjacency relationship from the three-dimensional point cloud data for the planes having intersecting straight lines; the boundary points in an adjacency relationship are boundary points of two different planes that are adjacent to each other;
a perpendicular foot determination module 303, configured to determine, for the boundary points in an adjacency relationship, the feet of the perpendiculars from the boundary points to the corresponding intersecting straight lines;
a target point search module 304, configured to search the three-dimensional point cloud data for the target point closest to each end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line; the maximum perpendicular-foot distance is the maximum of the pairwise distances between the perpendicular feet on the intersecting straight line;
a weld seam identification module 305, configured to correct the end points based on the target points and identify the corrected points as weld seam feature points of the right-angle workpiece; the weld seam feature points are feature points at the weld seam of the right-angle workpiece.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
acquiring, from the target points, the three points that are closest to one another;
performing sphere fitting on these three closest points by means of a minimum enclosing sphere to obtain sphere data;
extracting the center of the enclosing sphere from the sphere data;
and taking the center of the enclosing sphere and the target points as weld seam feature points of the right-angle workpiece.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
performing plane segmentation on the three-dimensional point cloud data to obtain a first plane, a second plane and a third plane; the first plane is the plane of the right-angle workpiece parallel to the XY plane, the second plane is the plane of the right-angle workpiece parallel to the YZ plane, and the third plane is the plane of the right-angle workpiece parallel to the XZ plane, where the XY plane is the plane formed by the X axis and the Y axis, the YZ plane is the plane formed by the Y axis and the Z axis, and the XZ plane is the plane formed by the X axis and the Z axis;
detecting, for the first plane, the second plane and the third plane, whether an intersecting straight line exists between each pair of planes;
if an intersecting straight line exists between each pair of planes, extracting a first boundary point of the first plane, a second boundary point of the second plane and a third boundary point of the third plane from the three-dimensional point cloud data;
taking the first boundary point, the second boundary point and the third boundary point in turn as reference points, and detecting the distances between the reference points and the non-reference points; a non-reference point is any of the first boundary point, the second boundary point and the third boundary point other than the current reference point;
acquiring the reference points and the non-reference points whose mutual distance is smaller than a preset distance threshold; the preset distance threshold is a preset distance indicating that a non-reference point is adjacent to a reference point;
and taking the reference points and the non-reference points whose mutual distance is smaller than the preset distance threshold as the plurality of boundary points in an adjacency relationship.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
determining the pairwise distances between the perpendicular feet on the intersecting straight line;
comparing the pairwise distances between the perpendicular feet on the intersecting straight line to obtain the maximum perpendicular-foot distance;
acquiring the end points corresponding to the maximum perpendicular-foot distance on the intersecting straight line;
calculating the Euclidean distance between each point in the three-dimensional point cloud data and each end point to obtain the distance between each point and the end point;
and taking the point in the three-dimensional point cloud data with the smallest distance to the end point as the target point.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
acquiring original three-dimensional point cloud data for a right-angle workpiece;
carrying out coordinate conversion on the original three-dimensional point cloud data to obtain actual three-dimensional point cloud data under a welding robot base coordinate system;
and clustering the actual three-dimensional point cloud data to obtain the three-dimensional point cloud data.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
acquiring a value range of actual three-dimensional point cloud data on a preset dimension;
traversing each point in the actual three-dimensional point cloud data, and determining the value of the point on a preset dimension; the preset dimension is a preset dimension used for indicating the direction towards the ground;
acquiring all points of which the values are not in the value range;
filtering all points with values not within the value range to obtain real three-dimensional point cloud data;
sampling the real three-dimensional point cloud data to obtain sampled three-dimensional point cloud data;
and clustering the sampled three-dimensional point cloud data to obtain a spatial noise point and target three-dimensional point cloud data, and taking the target three-dimensional point cloud data as the three-dimensional point cloud data.
In an exemplary embodiment of the present application, the weld identifying apparatus is configured to:
and performing coordinate conversion on the original three-dimensional point cloud data based on the attitude matrix of the tool center point at the end of the welding robot and the robot hand-eye matrix, to obtain the actual three-dimensional point cloud data in the welding robot base coordinate system.
A welding robot 40 according to the fourth embodiment of the present application is described below with reference to fig. 10. The welding robot 40 shown in fig. 10 is only an example, and should not bring any limitation to the function and the range of use of the embodiment of the present application.
As shown in fig. 10, the welding robot 40 is in the form of a general purpose computing device. The components of the welding robot 40 may include, but are not limited to: the at least one processing unit 410, the at least one memory unit 420, and a bus 430 that couples various system components including the memory unit 420 and the processing unit 410.
Wherein the storage unit stores a program code, which can be executed by the processing unit 410, to cause the processing unit 410 to perform the steps according to various exemplary embodiments of the present application described in the description part of the above exemplary methods of the present specification. For example, processing unit 410 may perform various steps as shown in fig. 1.
The storage unit 420 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 4201 and/or a cache memory unit 4202, and may further include a read only memory unit (ROM) 4203.
The storage unit 420 may also include a program/utility 4204 having a set (at least one) of program modules 4205, such program modules 4205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 430 may be any bus representing one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The welding robot 40 can also communicate with one or more devices that enable a user to interact with the welding robot 40, and/or with any device (e.g., router, modem, etc.) that enables the welding robot 40 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 450. An input/output (I/O) interface 450 is connected to the display unit 440. Also, the welding robot 40 can communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 460. As shown, the network adapter 460 communicates with the other modules of the welding robot 40 via the bus 430. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the welding robot 40, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (including a welding robot) execute the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the method described in the above method embodiment section.
According to an embodiment of the present application, there is also provided a program product for implementing the method in the above-described method embodiment, which can employ a portable compact disc read only memory (CD-ROM) and include program code, and can be run on a welding robot. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In situations involving remote computing devices, the remote computing devices may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to external computing devices (e.g., through the internet using an internet service provider).
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present application, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods in this application are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in that particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be broken down into multiple steps.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.

Claims (10)

1. A weld joint recognition method, comprising:
acquiring three-dimensional point cloud data of a right-angle workpiece;
if it is detected, based on the three-dimensional point cloud data, that an intersecting straight line exists between a plurality of planes, extracting, for the planes having the intersecting straight line, a plurality of boundary points having an adjacent relationship from the three-dimensional point cloud data; the plurality of boundary points having the adjacent relationship are boundary points of the two planes that are adjacent to each other;
for the plurality of boundary points having the adjacent relationship, respectively determining feet of perpendiculars from the boundary points to the corresponding intersecting straight line;
based on an end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line, searching the three-dimensional point cloud data for a target point closest to the end point; the maximum perpendicular-foot distance is the maximum value of the distances between every two feet of perpendiculars on the intersecting straight line;
correcting the end point based on the target point, and identifying the corrected point obtained by the correction as a weld joint feature point for the right-angle workpiece; the weld joint feature point is a feature point at the weld joint of the right-angle workpiece.
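By way of non-limiting illustration only, the following Python sketch (NumPy only; the function and variable names are hypothetical and do not appear in this application) shows how boundary points could be projected onto a parametrized intersection line to obtain their feet of perpendiculars, as recited in claim 1 above.

```python
import numpy as np

def feet_of_perpendiculars(boundary_pts, line_point, line_dir):
    """Project each boundary point onto the intersection line
    (parametrized as line_point + t * line_dir) and return the feet."""
    d = line_dir / np.linalg.norm(line_dir)   # unit direction of the line
    t = (boundary_pts - line_point) @ d       # signed offset of each projection
    return line_point + np.outer(t, d)        # feet of perpendiculars on the line
```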
2. The method of claim 1, wherein correcting the end point based on the target point, and identifying the corrected point obtained by the correction as the weld joint feature point for the right-angle workpiece, comprises:
acquiring, from the target points, the three points that are closest to one another;
performing sphere fitting on the three closest points based on a minimum bounding sphere to obtain sphere data;
extracting the center of the bounding sphere from the sphere data;
and taking the center of the bounding sphere and the target point as weld joint feature points for the right-angle workpiece.
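The minimum bounding sphere of three points admits a compact construction: if the sphere whose diameter is one side of the triangle already encloses the third point, the midpoint of that side is the center; otherwise the circumcenter of the triangle is used. The sketch below assumes this standard construction and is not taken from the application; the function name is illustrative only, and the returned center would play the role of the bounding sphere center recited above.

```python
import numpy as np

def min_bounding_sphere_3pts(p0, p1, p2):
    """Minimum bounding sphere of three points: either the sphere whose
    diameter is one side of the triangle, or the triangle's circumsphere."""
    pts = np.array([p0, p1, p2], dtype=float)
    for a, b, c in [(0, 1, 2), (0, 2, 1), (1, 2, 0)]:
        center = (pts[a] + pts[b]) / 2.0
        radius = np.linalg.norm(pts[a] - pts[b]) / 2.0
        if np.linalg.norm(pts[c] - center) <= radius + 1e-9:
            return center, radius              # one side acts as the diameter
    # Otherwise all three points lie on the sphere: use the circumcenter.
    u, v = pts[1] - pts[0], pts[2] - pts[0]
    w = np.cross(u, v)
    center = pts[0] + np.cross(np.dot(u, u) * v - np.dot(v, v) * u, w) / (2.0 * np.dot(w, w))
    return center, np.linalg.norm(center - pts[0])
```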
3. The method of claim 1, wherein, if it is detected based on the three-dimensional point cloud data that an intersecting straight line exists between a plurality of planes, extracting, for the planes having the intersecting straight line, a plurality of boundary points having an adjacent relationship from the three-dimensional point cloud data comprises:
performing plane segmentation on the three-dimensional point cloud data to obtain a first plane, a second plane and a third plane; the first plane is a plane parallel to an XY plane corresponding to the right-angle workpiece, the second plane is a plane parallel to a YZ plane corresponding to the right-angle workpiece, the third plane is a plane parallel to an XZ plane corresponding to the right-angle workpiece, the XY plane is a plane formed by an X axis and a Y axis, the YZ plane is a plane formed by a Y axis and a Z axis, and the XZ plane is a plane formed by an X axis and a Z axis;
detecting, for the first plane, the second plane and the third plane, whether an intersecting straight line exists between every two of the planes;
if an intersecting straight line exists between every two planes, extracting a first boundary point of the first plane, a second boundary point of the second plane and a third boundary point of the third plane from the three-dimensional point cloud data;
respectively taking the first boundary point, the second boundary point and the third boundary point as reference points, and detecting the distance between the reference points and non-reference points to obtain the distance between each reference point and the non-reference point; the non-reference point is a boundary point other than the reference point among the first boundary point, the second boundary point, and the third boundary point;
acquiring the reference point with the distance smaller than a preset distance threshold and the non-reference point with the distance smaller than the preset distance threshold; the preset distance threshold is a preset distance indicating that the non-reference point is adjacent to the reference point;
and taking the reference point with the distance smaller than a preset distance threshold value and the non-reference point with the distance smaller than the preset distance threshold value as the plurality of boundary points with the adjacent relation.
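For the adjacency test of claim 3, one possible realization (the names and the threshold value are assumptions, not taken from this application) keeps only those boundary points whose minimum distance to the other plane's boundary falls below the preset distance threshold; the check would be repeated for each pair among the three planes.

```python
import numpy as np

def adjacent_boundary_points(bnd_a, bnd_b, dist_threshold=2.0):
    """Keep boundary points of two planes that lie within dist_threshold
    of the other plane's boundary (threshold value is assumed)."""
    # Pairwise Euclidean distances between the two boundary point sets.
    d = np.linalg.norm(bnd_a[:, None, :] - bnd_b[None, :, :], axis=-1)
    keep_a = bnd_a[d.min(axis=1) < dist_threshold]   # points of A near B's boundary
    keep_b = bnd_b[d.min(axis=0) < dist_threshold]   # points of B near A's boundary
    return np.vstack([keep_a, keep_b])
```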
4. The method of claim 1, wherein searching the three-dimensional point cloud data for the target point closest to the end point based on the end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line comprises:
determining the distance between every two feet of perpendiculars on the intersecting straight line;
comparing the distances between every two feet of perpendiculars on the intersecting straight line to obtain the maximum perpendicular-foot distance;
acquiring the end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line;
calculating the Euclidean distance between each point in the three-dimensional point cloud data and the end point to obtain the distance between each point and the end point;
and taking the point in the three-dimensional point cloud data with the minimum distance from the end point as the target point.
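A sketch of the claim-4 search, continuing from the feet of perpendiculars computed for claim 1 (all names are illustrative, and this is an assumed realization rather than the claimed implementation):

```python
import numpy as np

def find_target_point(cloud, feet):
    """Pick an end point of the largest foot-to-foot span on the intersecting
    line, then return the cloud point nearest to that end point."""
    diffs = feet[:, None, :] - feet[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)             # pairwise foot distances
    i, j = np.unravel_index(np.argmax(dists), dists.shape)
    end_point = feet[i]                                # one end of the maximum span
    nearest = np.argmin(np.linalg.norm(cloud - end_point, axis=1))
    return cloud[nearest]                              # target point closest to the end point
```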
5. The method of claim 1, wherein acquiring three-dimensional point cloud data of a right-angled workpiece comprises:
acquiring original three-dimensional point cloud data for the right-angle workpiece;
performing coordinate conversion on the original three-dimensional point cloud data to obtain actual three-dimensional point cloud data under a welding robot base coordinate system;
and clustering the actual three-dimensional point cloud data to obtain the three-dimensional point cloud data.
6. The method of claim 5, wherein clustering the actual three-dimensional point cloud data to obtain the three-dimensional point cloud data comprises:
acquiring a value range of the actual three-dimensional point cloud data on a preset dimension;
traversing each point in the actual three-dimensional point cloud data, and determining the value of the point on the preset dimension; the preset dimension is a preset dimension used for indicating the direction towards the ground;
acquiring all points of which the values are not in the value range;
filtering all points with values not in the value range to obtain real three-dimensional point cloud data;
sampling the real three-dimensional point cloud data to obtain sampled three-dimensional point cloud data;
and clustering the sampled three-dimensional point cloud data to obtain a spatial noise point and target three-dimensional point cloud data, and taking the target three-dimensional point cloud data as the three-dimensional point cloud data.
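The value-range filtering and sampling of claim 6 could, for example, be sketched as follows in plain NumPy. The ground-facing dimension is assumed to be the Z axis and the voxel size is an arbitrary illustrative value; the final clustering step that separates spatial noise from the target cloud is not shown and would typically use a Euclidean or density-based clustering routine.

```python
import numpy as np

def passthrough_filter(cloud, lo, hi, axis=2):
    """Drop points whose value on the ground-facing dimension
    (assumed here to be Z) lies outside the value range [lo, hi]."""
    vals = cloud[:, axis]
    return cloud[(vals >= lo) & (vals <= hi)]

def voxel_downsample(cloud, voxel_size=5.0):
    """Simple voxel-grid downsampling: one centroid per occupied voxel."""
    keys = np.floor(cloud / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse).astype(float)
    centroids = np.stack(
        [np.bincount(inverse, weights=cloud[:, k]) / counts for k in range(3)],
        axis=1,
    )
    return centroids
```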
7. The method of claim 5, wherein performing coordinate conversion on the original three-dimensional point cloud data to obtain the actual three-dimensional point cloud data under the welding robot base coordinate system comprises:
and performing coordinate conversion on the original three-dimensional point cloud data based on the pose matrix of the tool center point at the end of the welding robot and the hand-eye matrix, to obtain the actual three-dimensional point cloud data under the welding robot base coordinate system.
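A compact illustration of the coordinate conversion of claim 7, assuming the tool-center-point pose matrix and the hand-eye matrix are both available as 4x4 homogeneous transforms (the matrix names and the eye-in-hand arrangement are assumptions of this sketch):

```python
import numpy as np

def camera_to_base(points_cam, T_base_tcp, T_tcp_cam):
    """Transform camera-frame points into the welding robot base frame:
    base <- TCP via the pose matrix, TCP <- camera via the hand-eye matrix."""
    T_base_cam = T_base_tcp @ T_tcp_cam                       # compose 4x4 transforms
    homogeneous = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_base_cam @ homogeneous.T).T[:, :3]              # back to Nx3 points
```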
8. A weld joint recognition device, comprising:
an acquisition module, configured to acquire three-dimensional point cloud data of a right-angle workpiece;
a boundary point extraction module, configured to, if it is detected based on the three-dimensional point cloud data that an intersecting straight line exists between a plurality of planes, extract, for the planes having the intersecting straight line, a plurality of boundary points having an adjacent relationship from the three-dimensional point cloud data; the plurality of boundary points having the adjacent relationship are boundary points of the two planes that are adjacent to each other;
a perpendicular-foot determination module, configured to determine, for the plurality of boundary points having the adjacent relationship, the feet of perpendiculars from the boundary points to the corresponding intersecting straight line;
a target point searching module, configured to search the three-dimensional point cloud data for a target point closest to the end point, based on the end point corresponding to the maximum perpendicular-foot distance on the intersecting straight line; the maximum perpendicular-foot distance is the maximum value of the distances between every two feet of perpendiculars on the intersecting straight line;
a weld joint recognition module, configured to correct the end point based on the target point, and identify the corrected point obtained by the correction as a weld joint feature point for the right-angle workpiece; the weld joint feature point is a feature point at the weld joint of the right-angle workpiece.
9. A welding robot, comprising:
one or more processors;
storage means for storing one or more programs that, when executed by the one or more processors, cause the welding robot to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202211063959.0A 2022-08-31 2022-08-31 Weld joint recognition method and device, welding robot and storage medium Pending CN115409808A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211063959.0A CN115409808A (en) 2022-08-31 2022-08-31 Weld joint recognition method and device, welding robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211063959.0A CN115409808A (en) 2022-08-31 2022-08-31 Weld joint recognition method and device, welding robot and storage medium

Publications (1)

Publication Number Publication Date
CN115409808A true CN115409808A (en) 2022-11-29

Family

ID=84162973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211063959.0A Pending CN115409808A (en) 2022-08-31 2022-08-31 Weld joint recognition method and device, welding robot and storage medium

Country Status (1)

Country Link
CN (1) CN115409808A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115810133A (en) * 2023-02-09 2023-03-17 中建科技集团有限公司 Welding control method based on image processing and point cloud processing and related equipment
CN115810133B (en) * 2023-02-09 2023-05-23 中建科技集团有限公司 Welding control method based on image processing and point cloud processing and related equipment
CN117830297A (en) * 2024-03-01 2024-04-05 法奥意威(苏州)机器人***有限公司 Weld joint identification method, welding device and electronic equipment
CN117830297B (en) * 2024-03-01 2024-05-28 法奥意威(苏州)机器人***有限公司 Weld joint identification method, welding device and electronic equipment

Similar Documents

Publication Publication Date Title
CN109242903B (en) Three-dimensional data generation method, device, equipment and storage medium
CN115409808A (en) Weld joint recognition method and device, welding robot and storage medium
WO2020206666A1 (en) Depth estimation method and apparatus employing speckle image and face recognition system
JP2020042816A (en) Object detection method, device, apparatus, storage media, and vehicle
CN111612753B (en) Three-dimensional object detection method and device, electronic equipment and readable storage medium
CN112016638A (en) Method, device and equipment for identifying steel bar cluster and storage medium
CN110930442B (en) Method and device for determining positions of key points in robot hand-eye calibration based on calibration block
CN115781673A (en) Part grabbing method, device, equipment and medium
CN109255801B (en) Method, device and equipment for tracking edges of three-dimensional object in video and storage medium
CN110909713A (en) Method, system and medium for extracting point cloud data track
CN111368927A (en) Method, device and equipment for processing labeling result and storage medium
CN110530375B (en) Robot adaptive positioning method, positioning device, robot and storage medium
CN115409809A (en) Weld joint recognition method and device, welding robot and storage medium
Saadi et al. Optimizing rgb-d fusion for accurate 6dof pose estimation
CN109300322B (en) Guideline drawing method, apparatus, device, and medium
CN114757878A (en) Welding teaching method, device, terminal equipment and computer readable storage medium
CN112598784A (en) Method and device for adjusting contour scanning result, electronic equipment and storage medium
CN112085786B (en) Pose information determining method and device
CN113009908A (en) Motion control method, device, equipment and storage medium for unmanned equipment
CN112365500B (en) Contour data completion method and device, electronic equipment and storage medium
CN114202689A (en) Point location marking method and device, electronic equipment and storage medium
CN113009884A (en) Method, device, equipment and storage medium for controlling operation of unmanned equipment
CN112836558B (en) Mechanical arm tail end adjusting method, device, system, equipment and medium
CN115294374A (en) Method for butt joint of digital-analog and actual workpiece weld joint
CN113721618B (en) Plane determination method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination