CN115781673A - Part grabbing method, device, equipment and medium - Google Patents

Part grabbing method, device, equipment and medium Download PDF

Info

Publication number
CN115781673A
Authority
CN
China
Prior art keywords
candidate
grabbed
circle
determining
attribute information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211449195.9A
Other languages
Chinese (zh)
Inventor
云鹏辉
杨帆
刘博峰
许雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jieka Robot Co ltd
Original Assignee
Jieka Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jieka Robot Co ltd filed Critical Jieka Robot Co ltd
Priority to CN202211449195.9A
Publication of CN115781673A
Pending legal-status Current

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses a part grabbing method, a part grabbing device, equipment and a medium. The method comprises the following steps: responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image; determining attribute information of candidate circles in the target image according to the edge point cloud information; grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles; and determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed. According to the technical scheme, the circular area suitable for grabbing the part can be accurately determined, so that the accurate pose of the part can be determined, accurate grabbing of the part is achieved, and grabbing efficiency of the industrial part is improved.

Description

Part grabbing method, device, equipment and medium
Technical Field
The invention relates to the field of robots, in particular to a part grabbing method, a part grabbing device, part grabbing equipment and a part grabbing medium.
Background
In manufacturing production activities, pose estimation of industrial parts is an important task: only when the accurate positions and orientations of the parts have been obtained through calculation can subsequent automated operations (such as grabbing) be carried out on the parts by equipment such as a mechanical arm, thereby greatly improving productivity.
Therefore, how to derive the accurate pose of industrial parts from features that such parts commonly share, and thereby realize accurate grabbing of the parts, is a problem that urgently needs to be solved.
Disclosure of Invention
The invention provides a part grabbing method, a part grabbing device, equipment and a medium, which can determine the accurate pose of a part, realize the accurate grabbing of the part and improve the grabbing efficiency of industrial parts.
According to an aspect of the present invention, there is provided a part gripping method including:
responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image;
determining attribute information of candidate circles in the target image according to the edge point cloud information; the candidate circle is a circular detection area on the part to be grabbed in the target image;
grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles;
and determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
According to another aspect of the present invention, there is provided a parts gripping apparatus comprising:
the point cloud information determining module is used for responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed and determining a target image and edge point cloud information of the target image;
the attribute information determining module is used for determining the attribute information of the candidate circle in the target image according to the edge point cloud information; the candidate circle is a circular detection area positioned on the part to be grabbed in the target image;
the target circle determining module is used for grouping the candidate circles according to the attribute information and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles;
and the grabbing module is used for determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform a part picking method according to any embodiment of the invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement the part grabbing method according to any one of the embodiments of the present invention when executed.
According to the technical scheme, a camera is controlled to acquire images of the parts to be grabbed in response to a part grabbing request, a target image and edge point cloud information of the target image are determined, attribute information of candidate circles in the target image is determined according to the edge point cloud information, the candidate circles are grouped according to the attribute information, the target circle associated with the corresponding part to be grabbed is determined from each group of candidate circles, the accurate pose of each part to be grabbed is determined according to the attribute information of its target circle, and a mechanical arm clamping jaw is controlled to grab the part to be grabbed. By determining the target circle associated with the part to be grabbed from the candidate circles in the target image, the circular area on the part suitable for grabbing can be accurately determined, which makes it easy to determine the accurate pose of the part, achieves accurate grabbing of the part, and improves the grabbing efficiency of industrial parts.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a part grabbing method according to an embodiment of the present invention;
fig. 2 is a flowchart of a part grabbing method according to a second embodiment of the present invention;
fig. 3 is a flowchart of a part grabbing method according to a third embodiment of the present invention;
fig. 4 is a block diagram illustrating a part grabbing apparatus according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," "target," "candidate," and the like in the description and claims of the invention and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the related art, a complete three-dimensional model of the part needs to be registered: before the part is grabbed, its pose is estimated based on the pre-stored three-dimensional model, and grabbing is then performed. However, this approach is rather cumbersome and limits the working efficiency of the part grabbing system. Therefore, the present invention recovers the pose of the whole part from information such as the positions and normal directions of the circles contained in the part, achieving accurate grabbing of the part. At the same time, since a complete three-dimensional model of the part does not need to be registered, more deployment time can be saved in use, making the approach simpler and more effective. The specific part grabbing scheme will be described in detail in the following embodiments.
Example one
Fig. 1 is a flowchart of a part grabbing method according to an embodiment of the present invention. This embodiment is applicable to the situation where a robot is controlled to grab parts placed in a work area. The method may be executed by a part grabbing apparatus, which may be implemented in software and/or hardware and may be integrated in an electronic device having the part grabbing function. As shown in fig. 1, the method includes:
s101, responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image.
The parts may be industrial parts containing circular holes, such as triangular iron parts, hub parts and the like. The part grabbing request is a request for controlling the mechanical arm to grab the parts placed in the working area. The target image is a point cloud image containing the parts to be grabbed. The edge point cloud information refers to the point cloud information of the edge points of the parts in the target image, where the point cloud information of each point represents the geometric position coordinates of that point in the target image.
Optionally, when it is detected that the mechanical arm of the part grabbing system has been installed, the charging basket is within the preset working area, and the camera has been calibrated, a part grabbing request for grabbing the parts in the charging basket is generated, i.e. the part grabbing request is detected; the part grabbing request may also be considered detected when a part grabbing command sent by the relevant personnel is received.
Optionally, the camera may be controlled to acquire an image of the part to be grabbed in response to the part grabbing request, and the acquired image is directly determined as the target image; alternatively, a preprocessing operation may first be performed on the acquired image, and the preprocessed image is determined as the target image.
Optionally, the preprocessing operation may include at least one of: down-sampling, background rejection, noise removal and smoothing. Specifically, relatively unimportant points in the image may be discarded on the premise that the point cloud information forming the main shape of the part is not lost, which improves the working efficiency of the part grabbing system; the background plane unrelated to the part to be grabbed may be removed using a preset plane fitting method, which screens out invalid points in the image, reduces the number of points in the cloud, and speeds up subsequent computation.
Optionally, after the target image is determined, point cloud information of all points included in the target image may be analyzed to determine whether a preset detection condition is met, if not, the point cloud quality of the points included in the target image is considered to be poor, and point cloud smoothing may be performed.
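As an illustration only, a possible preprocessing sketch is given below. It assumes the captured data is available as an Open3D point cloud; the voxel size, plane distance threshold and outlier parameters are assumed example values, not figures specified by this scheme.

```python
# Illustrative preprocessing sketch (not the patent's exact implementation),
# assuming the captured data is available as an Open3D point cloud.
import open3d as o3d

def preprocess(pcd: o3d.geometry.PointCloud) -> o3d.geometry.PointCloud:
    # Down-sample: discard less important points while keeping the part's main shape.
    pcd = pcd.voxel_down_sample(voxel_size=0.002)  # 2 mm voxels, illustrative value

    # Background rejection: fit the dominant plane (basket bottom / table) with RANSAC
    # and keep only the points that do not belong to it.
    _, plane_inliers = pcd.segment_plane(distance_threshold=0.003,
                                         ransac_n=3,
                                         num_iterations=1000)
    pcd = pcd.select_by_index(plane_inliers, invert=True)

    # Noise removal / smoothing stand-in: drop statistical outliers when point
    # cloud quality is judged to be poor.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    return pcd
```

Removing the dominant fitted plane corresponds to the background rejection described above; the statistical outlier removal stands in for the noise removal and smoothing step.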
Optionally, after the target image is determined, edge points of the part to be captured in the target image may be determined based on a preset edge extraction algorithm, so as to obtain edge point cloud information of the target image.
And S102, determining the attribute information of the candidate circle in the target image according to the edge point cloud information.
The candidate circle is a circular detection area on the part to be grabbed in the target image. The number of parts to be grabbed can be at least one, and each part to be grabbed can contain at least one candidate circle. The attribute information refers to information characterizing the attributes of a candidate circle and includes: radius, circle center coordinate, circle center normal direction, and number of interior points. The number of interior points (inliers) refers to the number of points that fall within the candidate circle's detection region.
Optionally, after extracting edge points of the part to be captured and acquiring edge point cloud information, performing circle fitting on the edge points according to the edge point cloud information based on a preset circle fitting algorithm or a global circle search algorithm, and determining candidate circles in the target image and attribute information of the candidate circles, that is, determining the attribute information of the candidate circles in the target image; or inputting the edge point cloud information into a pre-trained model, and outputting the attribute information of the candidate circle in the target image, namely determining the attribute information of the candidate circle in the target image.
Optionally, if the parts to be grabbed are not stacked in disorder but are laid out flat in a regular manner, the global circle search may be performed using a Hough circle detection method, searching for all candidate circles in the target image by combining 2D and 3D information.
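For illustration, a minimal 2D Hough circle search sketch is shown below. This is an assumed OpenCV-based realization of the global circle search; the parameter values are illustrative, and the mapping of detected pixel circles back to 3D centers and normals from the depth data is only indicated in the comment.

```python
# Assumed OpenCV sketch of the 2D stage of the global circle search.
import cv2
import numpy as np

def find_candidate_circles_2d(gray: np.ndarray) -> np.ndarray:
    blurred = cv2.GaussianBlur(gray, (9, 9), 2)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT,
                               dp=1.2, minDist=30,
                               param1=100, param2=40,
                               minRadius=10, maxRadius=80)
    # Each row is (x, y, r) in pixels; the 3D circle center and normal can then
    # be recovered from the depth image / point cloud at these pixel locations.
    return np.empty((0, 3)) if circles is None else circles[0]
```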
And S103, grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles.
The target circle refers to a circle detection area satisfying a preset specification condition in each group of candidate circles. If there are a plurality of parts to be grasped, a target circle for subsequent grasping may be determined for each part to be grasped.
Optionally, the positional relationship between the candidate circles may be analyzed according to their attribute information, and candidate circles whose mutual distance is smaller than a preset distance threshold are divided into one group, i.e. the candidate circles are grouped according to the attribute information; or the attribute information of each candidate circle may be input into a pre-trained model and the grouping result output, i.e. the candidate circles are grouped according to the attribute information.
Optionally, for each group of candidate circles, at least one of the radius, the circle center coordinate, the circle center normal line and the number of inner points in the attribute information of the candidate circles may be analyzed, and a circle whose attribute information satisfies a preset screening rule is determined from the candidate circles as a target circle, that is, a target circle associated with the corresponding part to be grabbed is determined from each group of candidate circles.
Optionally, if the number of the candidate circles is 1, it may be determined that the number of the parts to be grabbed is also 1, and the candidate circles are directly determined as the target circles associated with the corresponding parts to be grabbed.
And S104, determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
The accurate pose refers to pose information that represents the pose of the part to be grabbed with high precision. The accurate pose may include the attribute information of the final circle on the part to be grabbed that is used for the final grab, such as the circle center coordinate and circle center normal direction of the final circle.
Optionally, the attribute information of the target circle may be iteratively updated based on a preset part pose estimation algorithm, and a final circle, that is, a circular detection area on the part to be grabbed for final grabbing is determined, so as to determine an accurate pose of the part to be grabbed, that is, an accurate pose corresponding to the part to be grabbed is determined according to the attribute information of the target circle; and the attribute information of the target circle can be input into a pre-trained model, and the accurate pose corresponding to the part to be grabbed is output.
Optionally, after the accurate pose of the part to be grabbed is determined, the circle center coordinate and the circle center normal direction of the final circle associated with the accurate pose can be respectively used as the grabbing point coordinate and the grabbing pose of the mechanical arm clamping jaw, the mechanical arm clamping jaw is controlled, and the part to be grabbed is grabbed based on the grabbing point coordinate and the grabbing pose.
In general, for industrial parts featuring circles, the grabbing point of the mechanical arm clamping jaw is determined using the target circle as a reference circle. For example, for a part with a round hole, grabbing can be performed directly by controlling the two-finger clamping jaw of the mechanical arm to expand outward against the inside of the hole.
According to the technical scheme, a camera is controlled to acquire images of the parts to be grabbed in response to a part grabbing request, a target image and edge point cloud information of the target image are determined, attribute information of candidate circles in the target image is determined according to the edge point cloud information, the candidate circles are grouped according to the attribute information, the target circle associated with the corresponding part to be grabbed is determined from each group of candidate circles, the accurate pose of each part to be grabbed is determined according to the attribute information of its target circle, and a mechanical arm clamping jaw is controlled to grab the part to be grabbed. By determining the target circle associated with the part to be grabbed from the candidate circles in the target image, the circular area on the part suitable for grabbing can be accurately determined, which makes it easy to determine the accurate pose of the part, achieves accurate grabbing of the part, and improves the grabbing efficiency of industrial parts.
Optionally, after determining attribute information of the candidate circle in the target image according to the edge point cloud information, the method further includes: and screening candidate circles meeting the grabbing requirement from the candidate circles according to the radius and the number of the inner points in the attribute information of the candidate circles.
The grabbing requirements may include preset actual size requirements and fitting accuracy requirements.
Optionally, the candidate circles in the target image may be screened based on the radius in the candidate circle attribute information, and candidate circles whose radius falls within a preset radius range are determined to be candidate circles meeting the grabbing requirement. Specifically, the preset radius range may be determined empirically from the radius of the circle on the actual part and the allowable error range; if a candidate circle's radius falls within the preset radius range, the candidate circle is considered to meet the actual size requirement, i.e. candidate circles meeting the grabbing requirement are screened out of the candidate circles.
Optionally, the candidate circle in the target image may be screened based on the number of interior points in the attribute of the candidate circle, and the candidate circle containing interior points whose number exceeds the preset interior point number threshold is determined as the candidate circle meeting the fitting accuracy requirement, that is, the candidate circle meeting the grabbing requirement is screened from the candidate circles.
It should be noted that, in this way, candidate circles that do not meet the grabbing requirement may be discarded in advance, which is helpful to improve the efficiency of subsequently determining the target circle from the candidate circles.
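A minimal screening sketch is given below; the radius range and interior point threshold are assumed values that would in practice be chosen from the real part dimensions and the desired fitting accuracy, and the dictionary representation of a candidate circle is likewise an assumption.

```python
# Simple screening sketch: keep only candidate circles whose radius and inlier
# count meet the grabbing requirement (thresholds are assumed example values).
def screen_candidates(circles, r_min=0.018, r_max=0.022, min_inliers=80):
    """circles: list of dicts with keys 'center', 'normal', 'radius', 'inliers'."""
    return [c for c in circles
            if r_min <= c["radius"] <= r_max and c["inliers"] >= min_inliers]
```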
Preferably, the technical solution of the present invention can be implemented based on an identification and grabbing system, wherein the identification and grabbing system at least includes: a mechanical arm, a charging basket, a camera and the parts to be grabbed. At least one part to be grabbed is placed in the charging basket, and the charging basket is placed in the working area of the mechanical arm. In actual operation, the charging basket is placed at a suitable position close to the mechanical arm, the parts to be grabbed are placed in the charging basket, and the camera is fixed within the overall working space in an eye-to-hand arrangement (i.e. mounted off the arm). When the identification and grabbing system runs, the camera captures an image including the charging basket and the workpieces to be grabbed, and after the accurate pose of a workpiece to be grabbed is determined, a specific grabbing pose and path can be planned for the mechanical arm, so that all parts to be grabbed in the charging basket are eventually grabbed.
Example two
Fig. 2 is a flowchart of a part grabbing method according to a second embodiment of the present invention, and this embodiment further explains in detail "determining an accurate pose of a corresponding part to be grabbed according to attribute information of a target circle" based on the above embodiment, and as shown in fig. 2, the method includes:
s201, responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image.
S202, determining attribute information of the candidate circle in the target image according to the edge point cloud information.
And S203, grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles.
Optionally, the candidate circles may be grouped, and the target circle determined from each group, based on the circle center coordinates and the number of interior points in the attribute information. Specifically, grouping the candidate circles according to the attribute information and determining the target circle associated with the corresponding part to be grabbed from each group of candidate circles includes: grouping the candidate circles according to the circle center coordinates in the attribute information of the candidate circles based on a preset clustering algorithm; and, for each group of candidate circles, determining the target circle associated with the corresponding part to be grabbed from the group according to the number of interior points in the attribute information of the candidate circles.
The preset clustering algorithm may be a circle center clustering algorithm for clustering based on circle center coordinates.
Optionally, each group of candidate circles is associated with one part to be grabbed; the circles in each group are considered to lie on the same part to be grabbed. One target circle is determined in each group of candidate circles, so the number of target circles is the same as the number of parts to be grabbed.
Optionally, the candidate circles may be clustered based on a preset clustering algorithm according to coordinates of centers of the candidate circles, and the candidate circles clustered into one group are determined as one group according to a clustering result, that is, grouping of the candidate circles is achieved.
It should be noted that the distances between the candidate circles on the same part to be grabbed are relatively close, so that the distribution of the candidate circles in the target image can be effectively determined based on the clustering algorithm, and the number of the parts to be grabbed can be determined according to the number of categories of the final clustering.
Optionally, the more the number of interior points in the candidate circle is, the higher the precision of circle fitting is, therefore, for each group of candidate circles, the candidate circle with the largest number of interior points in the candidate circle may be determined as the target circle, that is, the target circle associated with the corresponding part to be grabbed is determined from the candidate circles.
It should be noted that, if the number of the candidate circles is 1, the candidate circle may be directly determined as the target circle associated with the corresponding part to be grabbed.
For example, if there are three parts in the target image, the candidate circles are finally grouped into three categories, i.e., three groups, based on a preset clustering algorithm. Then, a circle with the largest number of inner points is selected in each class to serve as a circle with the highest precision, namely a target circle, so that a target circle can be determined on each part to be grabbed.
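For illustration, a possible grouping sketch is shown below. The scheme above only requires "a preset clustering algorithm"; the use of DBSCAN, the eps value, and the dictionary representation of a candidate circle are assumptions.

```python
# Assumed sketch: cluster candidate circles by their center coordinates, then
# keep, per group (per part), the circle with the most inliers.
import numpy as np
from sklearn.cluster import DBSCAN

def select_target_circles(circles, eps=0.06):
    centers = np.array([c["center"] for c in circles])
    labels = DBSCAN(eps=eps, min_samples=1).fit(centers).labels_
    targets = []
    for label in set(labels):
        group = [c for c, l in zip(circles, labels) if l == label]
        # Within each group, the circle with the most interior points is the
        # most precisely fitted one and becomes the target circle.
        targets.append(max(group, key=lambda c: c["inliers"]))
    return targets  # one target circle per part to be grabbed
```

The number of clusters returned by the algorithm also indicates the number of parts to be grabbed, consistent with the description above.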
And S204, determining the initial pose of the corresponding part to be grabbed according to the attribute information of the target circle.
The initial pose refers to the preliminarily determined pose information of the part to be grabbed.
Optionally, the initial pose of the corresponding part to be grabbed may be determined based on the circle center coordinate and the circle center normal direction in the attribute information of the target circle, and specifically, the determining the initial pose of the corresponding part to be grabbed according to the attribute information of the target circle includes: and calculating the initial pose corresponding to the part to be grabbed according to the circle center coordinate and the circle center normal direction in the attribute information of the target circle.
For example, based on a preset calculation rule, a target circle attitude matrix, i.e. the initial pose corresponding to the part to be grabbed, may be determined from the circle center coordinate and circle center normal direction of the target circle. The target circle attitude matrix may be a 4×4 matrix.
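One possible way to assemble the 4×4 target circle attitude matrix from the circle center coordinate and circle center normal direction is sketched below; since a single circle does not constrain the in-plane axes, the choice of x and y axes here is an assumed convention rather than a rule given by this scheme.

```python
# Assumed construction of the 4x4 target circle attitude matrix (initial pose).
import numpy as np

def circle_pose(center: np.ndarray, normal: np.ndarray) -> np.ndarray:
    z = normal / np.linalg.norm(normal)          # approach axis = circle center normal
    ref = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(ref, z)) > 0.9:                # avoid a near-parallel reference vector
        ref = np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z); x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, center
    return T                                     # 4x4 attitude matrix
```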
And S205, based on an iterative closest point algorithm, iteratively updating the initial pose, determining the accurate pose corresponding to the part to be grabbed, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
The Iterative Closest Point (ICP) algorithm is a point cloud registration algorithm, used here to accurately estimate the pose of the part to be grabbed.
Optionally, the attitude matrix associated with the initial pose may be iteratively updated based on an iterative closest point algorithm in combination with point cloud information of relevant points around the target circle in the target image, so as to obtain an attitude matrix with higher precision, that is, the accurate pose corresponding to the part to be grabbed is determined, and further, according to the accurate circle center coordinate associated with the attitude matrix and the direction of the normal line of the circle center, the clamping jaw of the mechanical arm is controlled to grab the part to be grabbed.
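As an illustration of this step, the sketch below refines the initial pose with Open3D's ICP by registering a synthetic ring of points sampled on the fitted circle against the measured points around the target circle; the ring model, point count and correspondence distance are assumptions rather than details specified by this scheme.

```python
# Assumed ICP refinement sketch: register a synthetic circle ring against the
# measured scene points, starting from the initial pose matrix.
import numpy as np
import open3d as o3d

def refine_pose(scene_pcd, radius, init_pose, n_model_pts=180):
    angles = np.linspace(0.0, 2.0 * np.pi, n_model_pts, endpoint=False)
    ring = np.stack([radius * np.cos(angles),
                     radius * np.sin(angles),
                     np.zeros_like(angles)], axis=1)   # circle in its own frame
    model = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(ring))
    result = o3d.pipelines.registration.registration_icp(
        model, scene_pcd, max_correspondence_distance=0.005, init=init_pose,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation                       # refined 4x4 pose (accurate pose)
```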
According to the technical scheme, a camera is controlled to acquire an image of a part to be grabbed in response to a part grabbing request, a target image and edge point cloud information of the target image are determined, attribute information of candidate circles in the target image is determined according to the edge point cloud information, the candidate circles are grouped according to the attribute information, after a target circle associated with the part to be grabbed is determined from each group of candidate circles, an initial pose of the part to be grabbed is determined according to the attribute information of the target circle, the initial pose is updated iteratively based on an iterative closest point algorithm, an accurate pose of the part to be grabbed is determined, and a mechanical arm clamping jaw is controlled to grab the part to be grabbed. The initial pose of the part to be grabbed is determined according to the target circle, iterative updating is further carried out, the accurate pose of the part to be grabbed can be determined, follow-up accurate grabbing of the part to be grabbed is conveniently achieved, and work efficiency is improved.
EXAMPLE III
Fig. 3 is a flowchart of a part grabbing method according to a third embodiment of the present invention, which further explains "determining attribute information of a candidate circle in a target image according to edge point cloud information" in detail based on the above embodiments, and as shown in fig. 3, the method includes:
s301, responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image.
Optionally, determining the edge point cloud information of the target image includes: determining the curvature and/or normal direction of candidate points in the target image; and screening edge points from the candidate points according to the curvature and/or the normal direction of the candidate points, and determining the edge point cloud information of the target image.
The candidate points may be all pixel points in the target image. The normal direction of a point refers to the direction of a line perpendicular to the tangent plane of the point.
Optionally, a plane may be fitted according to the point cloud information of all points within a certain distance range of the candidate point, and the direction of the straight line perpendicular to that plane is taken as the normal direction of the candidate point. For example, the plane fitting may be performed by the least squares method, thereby determining the normal direction of the candidate points in the target image.
Optionally, the eigenvalues of the covariance matrix formed by the candidate point and all points within a certain distance range may be computed, and the ratio of the smallest eigenvalue to the sum of all eigenvalues is taken as the curvature of the corresponding candidate point, i.e. the curvature of the candidate points in the target image is determined.
Optionally, for each candidate point, based on a preset distance threshold, the region within that distance of the candidate point is taken as the neighborhood of the candidate point; the difference in curvature and/or normal direction between the candidate point and the other candidate points in the neighborhood is then determined, and if the difference is greater than a certain threshold, the candidate point is determined to be an edge point and its point cloud information is taken as edge point cloud information. In other words, edge points are screened out of the candidate points according to the curvature and/or normal direction of the candidate points, and the edge point cloud information of the target image is determined.
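A simplified sketch of curvature-based edge screening is given below. It thresholds the surface-variation curvature (smallest eigenvalue over the sum of eigenvalues of the local covariance matrix) directly rather than comparing differences with neighboring points as described above, and the neighborhood radius and threshold values are assumptions.

```python
# Simplified curvature-based edge extraction sketch (assumed parameters).
import numpy as np
import open3d as o3d

def extract_edge_points(pcd, radius=0.01, curv_thresh=0.08):
    pts = np.asarray(pcd.points)
    tree = o3d.geometry.KDTreeFlann(pcd)
    edge_idx = []
    for i in range(len(pts)):
        _, idx, _ = tree.search_radius_vector_3d(pts[i], radius)
        neigh = pts[np.asarray(idx)]
        centered = neigh - neigh.mean(axis=0)
        eigvals = np.linalg.eigvalsh(centered.T @ centered)   # ascending eigenvalues
        curvature = eigvals[0] / max(eigvals.sum(), 1e-12)    # surface variation measure
        if curvature > curv_thresh:                           # abrupt change -> edge point
            edge_idx.append(i)
    return pcd.select_by_index(edge_idx)                      # edge point cloud
```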
S302, extracting candidate circles in the target image according to the edge point cloud information based on a random sampling consistency algorithm, and determining attribute information of each candidate circle.
Random sample consensus (RANSAC) is a robust model fitting algorithm, used here to perform circle extraction.
Optionally, based on a random sampling consistency algorithm, according to the edge point cloud information, analyzing a position relationship between edge points in the target image to determine edge points satisfying a preset rule, where each group of edge points may form a circular detection area, that is, a candidate circle, and further according to a geometric relationship between edge points included in each candidate circle, determining information such as a circle center coordinate, a radius, a circle center normal direction, and an inner point number of each candidate circle, that is, determining attribute information of each candidate circle.
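For illustration, a minimal RANSAC circle extraction sketch over the edge points is given below (an assumed implementation, not the code of this scheme; thresholds and iteration count are illustrative). Each iteration samples three edge points, computes their circumcircle in 3D, and counts inliers lying close to both the circle's plane and its radius; in practice the inliers of an accepted circle would be removed and the procedure repeated to extract all candidate circles and their attribute information.

```python
# Assumed 3D circle RANSAC sketch: returns center, normal, radius and inlier
# count of the best-supported circle among the edge points.
import numpy as np

def ransac_circle(points: np.ndarray, iters: int = 500, dist_thresh: float = 0.0015):
    rng = np.random.default_rng(0)
    best = None
    for _ in range(iters):
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        u, v = a - c, b - c
        w = np.cross(u, v)
        if np.linalg.norm(w) < 1e-9:                      # nearly collinear sample, skip
            continue
        normal = w / np.linalg.norm(w)
        # Circumcenter of the three sampled points in 3D.
        center = c + np.cross(np.dot(u, u) * v - np.dot(v, v) * u, w) / (2.0 * np.dot(w, w))
        radius = np.linalg.norm(a - center)
        r_vec = points - center
        d_plane = r_vec @ normal                          # signed distance to circle plane
        radial = np.linalg.norm(r_vec - np.outer(d_plane, normal), axis=1)
        inliers = int(np.sum((np.abs(d_plane) < dist_thresh) &
                             (np.abs(radial - radius) < dist_thresh)))
        if best is None or inliers > best["inliers"]:
            best = {"center": center, "normal": normal, "radius": radius, "inliers": inliers}
    return best  # attribute information of one candidate circle
```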
And S303, grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles.
S304, determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
According to the technical scheme, a camera is controlled to acquire an image of a part to be grabbed in response to a part grabbing request, a target image and edge point cloud information of the target image are determined, further, based on a random sampling consistency algorithm, candidate circles in the target image are extracted according to the edge point cloud information, attribute information of each candidate circle is determined, the candidate circles are grouped according to the attribute information, a target circle associated with the corresponding part to be grabbed is determined from each group of candidate circles, finally, according to the attribute information of the target circle, the accurate pose of the corresponding part to be grabbed is determined, and a mechanical arm clamping jaw is controlled to grab the part to be grabbed. By means of the random sampling consistency algorithm, the determined edge points can be effectively divided, candidate circles contained in the target image are determined, follow-up accurate grabbing of parts to be grabbed is facilitated, and working efficiency is improved.
Example four
Fig. 4 is a block diagram of a part grabbing apparatus according to a fourth embodiment of the present invention. The part grabbing apparatus provided in this embodiment can execute the part grabbing method provided in any embodiment of the present invention, and has the corresponding functional modules and beneficial effects of the executed method.
As shown in fig. 4, the apparatus includes:
the point cloud information determining module 401 is configured to, in response to a part grabbing request, control a camera to perform image acquisition on a part to be grabbed, and determine a target image and edge point cloud information of the target image;
an attribute information determining module 402, configured to determine attribute information of a candidate circle in the target image according to the edge point cloud information; the candidate circle is a circular detection area on the part to be grabbed in the target image;
a target circle determining module 403, configured to group the candidate circles according to the attribute information, and determine a target circle associated with a corresponding part to be grabbed from each group of candidate circles;
and the grabbing module 404 is configured to determine an accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and control the mechanical arm clamping jaw to grab the part to be grabbed.
According to the technical scheme, a camera is controlled to acquire images of the parts to be grabbed in response to a part grabbing request, a target image and edge point cloud information of the target image are determined, attribute information of candidate circles in the target image is determined according to the edge point cloud information, the candidate circles are grouped according to the attribute information, the target circle associated with the corresponding part to be grabbed is determined from each group of candidate circles, the accurate pose of each part to be grabbed is determined according to the attribute information of its target circle, and a mechanical arm clamping jaw is controlled to grab the part to be grabbed. By determining the target circle associated with the part to be grabbed from the candidate circles in the target image, the circular area on the part suitable for grabbing can be accurately determined, which makes it easy to determine the accurate pose of the part, achieves accurate grabbing of the part, and improves the grabbing efficiency of industrial parts.
Further, the grabbing module 404 may include:
the initial pose determining unit is used for determining an initial pose of the corresponding part to be grabbed according to the attribute information of the target circle;
and the accurate pose determining unit is used for iteratively updating the initial pose based on an iterative closest point algorithm and determining the accurate pose of the corresponding part to be grabbed.
Further, the initial pose determining unit is specifically configured to:
and calculating the initial pose corresponding to the part to be grabbed according to the circle center coordinate and the circle center normal direction in the attribute information of the target circle.
Further, the target circle determining module 403 is specifically configured to:
grouping the candidate circles according to the coordinates of the circle centers in the attribute information of the candidate circles based on a preset clustering algorithm; wherein each group of candidate circles is associated with a part to be grabbed;
for each group of candidate circles, determining a target circle associated with the corresponding part to be grabbed from the candidate circles according to the number of inner points in the attribute information of the candidate circles; the number of the target circles is the same as the number of the parts to be grabbed.
Further, the point cloud information determining module 401 is specifically configured to:
determining the curvature and/or normal direction of candidate points in the target image;
and screening edge points from the candidate points according to the curvatures and/or normal directions of the candidate points, and determining the edge point cloud information of the target image.
Further, the attribute information determining module 402 is specifically configured to:
and based on a random sampling consistency algorithm, extracting candidate circles in the target image according to the edge point cloud information, and determining attribute information of each candidate circle.
Further, the above apparatus is further configured to:
and screening candidate circles meeting the grabbing requirements from the candidate circles according to the radius and the number of the inner points in the attribute information of the candidate circles.
EXAMPLE five
Fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. FIG. 5 illustrates a schematic diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the various methods and processes described above, such as the part capture method.
In some embodiments, the part grabbing method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When loaded into RAM 13 and executed by processor 11, the computer program may perform one or more of the steps of the part grabbing method described above. Alternatively, in other embodiments, the processor 11 may be configured to perform the part grabbing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in the cloud computing service system and overcomes the defects of high management difficulty and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
It should be understood that various forms of the flows shown above, reordering, adding or deleting steps, may be used. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A part picking method, comprising:
responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed, and determining a target image and edge point cloud information of the target image;
determining attribute information of candidate circles in the target image according to the edge point cloud information; the candidate circle is a circular detection area on the part to be grabbed in the target image;
grouping the candidate circles according to the attribute information, and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles;
and determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
2. The method according to claim 1, wherein the determining of the accurate pose of the corresponding part to be grabbed according to the attribute information of the target circle comprises:
determining an initial pose corresponding to the part to be grabbed according to the attribute information of the target circle;
and based on an iterative closest point algorithm, performing iterative update on the initial pose, and determining the accurate pose of the corresponding part to be grabbed.
3. The method according to claim 2, wherein the determining an initial pose corresponding to the part to be grabbed according to the attribute information of the target circle comprises:
and calculating the initial pose corresponding to the part to be grabbed according to the circle center coordinate and the circle center normal direction in the attribute information of the target circle.
4. The method according to claim 1, wherein the grouping the candidate circles according to the attribute information and determining a target circle associated with a corresponding part to be grabbed from each group of candidate circles comprises:
grouping the candidate circles according to the coordinates of the circle centers in the attribute information of the candidate circles based on a preset clustering algorithm; each group of candidate circles is associated with a part to be grabbed;
for each group of candidate circles, determining a target circle associated with the corresponding part to be grabbed from the candidate circles according to the number of interior points in the attribute information of the candidate circles; the number of the target circles is the same as the number of the parts to be grabbed.
5. The method of claim 1, wherein determining edge point cloud information for a target image comprises:
determining the curvature and/or normal direction of candidate points in the target image;
and screening out edge points from the candidate points according to the curvatures and/or normal directions of the candidate points, and determining the edge point cloud information of the target image.
6. The method of claim 1, wherein the determining the attribute information of the candidate circle in the target image according to the edge point cloud information comprises:
and based on a random sampling consistency algorithm, extracting candidate circles in the target image according to the edge point cloud information, and determining attribute information of each candidate circle.
7. The method according to any one of claims 1-6, wherein after determining the attribute information of the candidate circle in the target image according to the edge point cloud information, the method further comprises:
and screening candidate circles meeting the grabbing requirements from the candidate circles according to the radius and the number of the inner points in the attribute information of the candidate circles.
8. A part grabbing device, comprising:
the point cloud information determining module is used for responding to a part grabbing request, controlling a camera to acquire an image of a part to be grabbed and determining a target image and edge point cloud information of the target image;
the attribute information determining module is used for determining the attribute information of the candidate circle in the target image according to the edge point cloud information; the candidate circle is a circular detection area on the part to be grabbed in the target image;
the target circle determining module is used for grouping the candidate circles according to the attribute information and determining a target circle associated with the corresponding part to be grabbed from each group of candidate circles;
and the grabbing module is used for determining the accurate pose corresponding to the part to be grabbed according to the attribute information of the target circle, and controlling a mechanical arm clamping jaw to grab the part to be grabbed.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the part picking method of any of claims 1-7.
10. A computer-readable storage medium storing computer instructions for causing a processor to perform the part grabbing method according to any one of claims 1-7 when executed.
CN202211449195.9A 2022-11-18 2022-11-18 Part grabbing method, device, equipment and medium Pending CN115781673A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211449195.9A CN115781673A (en) 2022-11-18 2022-11-18 Part grabbing method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211449195.9A CN115781673A (en) 2022-11-18 2022-11-18 Part grabbing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN115781673A true CN115781673A (en) 2023-03-14

Family

ID=85438982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211449195.9A Pending CN115781673A (en) 2022-11-18 2022-11-18 Part grabbing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN115781673A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428731A (en) * 2019-04-04 2020-07-17 深圳市联合视觉创新科技有限公司 Multi-class target identification and positioning method, device and equipment based on machine vision
CN113191174A (en) * 2020-01-14 2021-07-30 北京京东乾石科技有限公司 Article positioning method and device, robot and computer readable storage medium
CN111687839A (en) * 2020-06-03 2020-09-22 北京如影智能科技有限公司 Method and device for clamping articles
CN111775152A (en) * 2020-06-29 2020-10-16 深圳大学 Method and system for guiding mechanical arm to grab scattered stacked workpieces based on three-dimensional measurement
CN112356019A (en) * 2020-08-06 2021-02-12 武汉科技大学 Method and device for analyzing body of target object grabbed by dexterous hand
US20220193894A1 (en) * 2020-12-21 2022-06-23 Boston Dynamics, Inc. Supervised Autonomous Grasping
CN112720487A (en) * 2020-12-23 2021-04-30 东北大学 Mechanical arm grabbing method and system based on self-adaptive dynamic force balance
CN114102593A (en) * 2021-11-24 2022-03-01 航天晨光股份有限公司 Method for grabbing regular materials by robot based on two-dimensional low-definition image
CN113894799A (en) * 2021-12-08 2022-01-07 北京云迹科技有限公司 Robot and marker identification method and device for assisting environment positioning
CN114347015A (en) * 2021-12-09 2022-04-15 华南理工大学 Robot grabbing control method, system, device and medium
CN115213896A (en) * 2022-05-10 2022-10-21 浙江西图盟数字科技有限公司 Object grabbing method, system and equipment based on mechanical arm and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115995013A (en) * 2023-03-21 2023-04-21 江苏金恒信息科技股份有限公司 Covering agent adding method, covering agent adding device, computer equipment and storage medium
CN116883488A (en) * 2023-07-21 2023-10-13 捷安特(中国)有限公司 Method, device, equipment and medium for determining center position of circular pipe
CN116883488B (en) * 2023-07-21 2024-03-26 捷安特(中国)有限公司 Method, device, equipment and medium for determining center position of circular pipe
CN116985141A (en) * 2023-09-22 2023-11-03 深圳市协和传动器材有限公司 Industrial robot intelligent control method and system based on deep learning
CN116985141B (en) * 2023-09-22 2023-11-24 深圳市协和传动器材有限公司 Industrial robot intelligent control method and system based on deep learning
CN117697768A (en) * 2024-02-05 2024-03-15 季华实验室 Target grabbing method, robot, electronic equipment and storage medium
CN117697768B (en) * 2024-02-05 2024-05-07 季华实验室 Target grabbing method, robot, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN115781673A (en) Part grabbing method, device, equipment and medium
CN111178250A (en) Object identification positioning method and device and terminal equipment
CN108364311A (en) A kind of metal parts automatic positioning method and terminal device
CN110065068B (en) Robot assembly operation demonstration programming method and device based on reverse engineering
CN111428731A (en) Multi-class target identification and positioning method, device and equipment based on machine vision
CN111598172B (en) Dynamic target grabbing gesture rapid detection method based on heterogeneous depth network fusion
CN115273071A (en) Object identification method and device, electronic equipment and storage medium
CN115321090B (en) Method, device, equipment, system and medium for automatically receiving and taking luggage in airport
CN115213896A (en) Object grabbing method, system and equipment based on mechanical arm and storage medium
CN115937101A (en) Quality detection method, device, equipment and storage medium
CN116091727A Complex curved surface point cloud registration method based on multi-scale feature description, electronic equipment and storage medium
CN116342585A (en) Product defect detection method, device, equipment and storage medium
CN114202526A (en) Quality detection method, system, apparatus, electronic device, and medium
CN116000966A (en) Workpiece grabbing method, device, equipment and storage medium
CN116071429B (en) Method and device for identifying outline of sub-pattern, electronic equipment and storage medium
CN117270832B (en) Machine instruction generation method and device, electronic equipment and storage medium
CN117272425B (en) Assembly method, assembly device, electronic equipment and storage medium
CN114694138B (en) Road surface detection method, device and equipment applied to intelligent driving
CN115131435A (en) Material container positioning method and device, electronic equipment and storage medium
CN116883488B (en) Method, device, equipment and medium for determining center position of circular pipe
CN117444970A (en) Mechanical arm movement control method, device, equipment and storage medium
WO2022137509A1 (en) Object recognition device, object recognition method, non-transitory computer-readable medium, and object recognition system
CN116125422A (en) Equipment detection configuration method and device, electronic equipment and storage medium
CN115741696A (en) Object grabbing method, device and equipment and storage medium
CN115661211A (en) Object detection method, device, equipment and medium based on point cloud

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination