CN109373898B - Complex part pose estimation system and method based on three-dimensional measurement point cloud - Google Patents


Info

Publication number
CN109373898B
CN109373898B (application CN201811428771.5A)
Authority
CN
China
Prior art keywords: workpiece, point cloud, measured, degree, pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811428771.5A
Other languages
Chinese (zh)
Other versions
CN109373898A (en)
Inventor
李文龙
胡著
王刚
田亚明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201811428771.5A priority Critical patent/CN109373898B/en
Publication of CN109373898A publication Critical patent/CN109373898A/en
Application granted granted Critical
Publication of CN109373898B publication Critical patent/CN109373898B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention belongs to the field of automatic measurement, and particularly discloses a complex part pose estimation system and method based on three-dimensional measurement point cloud, which realize complex part pose estimation through the following steps: calibrating the conversion matrix between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the grating type area array scanner; the six-degree-of-freedom industrial robot drives the grating type area array scanner to move and scan the workpiece to be measured, obtaining three-dimensional point cloud data of the workpiece; converting the three-dimensional point cloud data of the workpiece into the robot base coordinate system to obtain the converted three-dimensional point cloud data; and matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation of the complex part. The invention has a wide measurement range, can realize accurate multi-region, full-profile measurement of the workpiece to be measured, and can accurately acquire the pose of the workpiece to be measured.

Description

Complex part pose estimation system and method based on three-dimensional measurement point cloud
Technical Field
The invention belongs to the field of automatic measurement, and particularly relates to a complex part pose estimation system and method based on three-dimensional measurement point cloud.
Background
Robot three-dimensional automatic measurement requires planning a complete collision-free path, and the planned path requires knowing the pose of the workpiece with respect to the robot base coordinate system. Traditional workpiece pose estimation methods include manual alignment, which depends on manual operation and is highly random and inaccurate; methods that acquire the workpiece pose by visual positioning are poorly suited to workpieces without distinct features and are computationally complex.
These traditional measurement approaches are cumbersome, and accurate workpiece poses are difficult to acquire, which biases subsequent path planning. Especially when the workpiece is very large, the workpiece coordinate system is difficult to establish, so large deviations occur in subsequent robot measurement point planning, possibly causing unexpected collisions between the robot measurement system and the external environment.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a complex part pose estimation system and method based on three-dimensional measurement point cloud. A six-degree-of-freedom industrial robot drives a grating type area array scanner to scan local features of the measured workpiece and acquire point cloud data of its key parts; the point cloud data are converted into the robot base coordinate system through the hand-eye-calibrated matrix to obtain new point cloud data; the design model of the measured workpiece is then matched with the point cloud to acquire the conversion matrix from the design model to the point cloud data, which is the pose of the measured workpiece relative to the robot base coordinate system.
In order to achieve the above object, according to one aspect of the present invention, a complex part pose estimation system based on three-dimensional measurement point cloud is provided, which includes a six-degree-of-freedom industrial robot, a grating type area array scanner, a measured workpiece, a marker point support frame, and a data processing upper computer, wherein:
the tail end of the six-degree-of-freedom industrial robot clamps the grating type area array scanner and is used for driving the grating type area array scanner to move so as to scan and measure a workpiece to be measured placed on the mark point support frame, blue light emitted by the grating type area array scanner covers the surface of the workpiece to be measured and mark points on the mark point support frame during measurement, and at least 3 public mark points which are not on the same straight line exist in the two-time measurement; the six-degree-of-freedom industrial robot is connected with the data processing upper computer so as to transmit the motion parameters of the six-degree-of-freedom industrial robot to the data processing upper computer in real time;
the grating type area array scanner is arranged at the tail end of the six-degree-of-freedom industrial robot and used for scanning and measuring a workpiece to be measured to obtain three-dimensional point cloud data of the workpiece to be measured, and the grating type area array scanner is connected with the data processing upper computer to transmit the measured data to the processing upper computer in real time;
the data processing upper computer is used for receiving the motion parameters of the six-degree-of-freedom industrial robot and the measurement data of the grating type area array scanner, and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so that the estimation of the pose of the complex part is completed.
As a further preferred, before measurement, the position relationship between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the grating type area array scanner is obtained through hand-eye calibration.
More preferably, the distance between adjacent marker points on the marker point support frame is 3 cm to 4 cm.
Preferably, the six-degree-of-freedom industrial robot is connected with the data processing upper computer through a six-degree-of-freedom industrial robot controller.
Preferably, the grating type area array scanner is connected with a six-degree-of-freedom industrial robot controller to achieve synchronization of signal triggering and data acquisition.
According to another aspect of the invention, a complex part pose estimation method based on three-dimensional measurement point cloud is provided, which comprises the following steps:
S1, calibrating the conversion matrix between the six-degree-of-freedom industrial robot end flange and the grating type area array scanner measurement coordinate system:

$${}^{E}_{S}T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$

wherein R and t are the rotation and translation of the scanner measurement coordinate system relative to the end flange;
S2, the six-degree-of-freedom industrial robot drives the grating type area array scanner to move so as to scan the workpiece to be measured, and then three-dimensional point cloud data $p_i\ (i = 1, 2, \dots, s)$ of the workpiece to be measured are obtained;
S3, converting the three-dimensional point cloud data of the workpiece to be measured to the robot base coordinate system to obtain the converted three-dimensional point cloud data:

$${}^{B}p_i = {}^{B}_{E}T \cdot {}^{E}_{S}T \cdot p_i, \quad i = 1, 2, \dots, s$$

wherein ${}^{B}_{E}T$ is the pose of the robot at the first measurement;
and S4, matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to be measured to obtain the pose of the workpiece to be measured relative to the robot base coordinate system, thereby finishing the estimation of the pose of the complex part.
Further preferably, in step S4 the pose of the workpiece to be measured with respect to the robot base coordinate system is obtained by the following method:

firstly, calculating the transformation matrix $T_k\ (k = 1, 2, \dots, n)$ from the point cloud to the three-dimensional design model coordinate system after each matching iteration, wherein the number of matching iterations is n;

then, according to the transformation matrix $T_k$ corresponding to each matching iteration, calculating the pose of the measured workpiece in the robot base coordinate system as

$$T = \left( T_n \times \dots \times T_2 \times T_1 \right)^{-1},$$

whereby the result is obtained.
Generally, compared with the prior art, the above technical solution conceived by the present invention mainly has the following technical advantages:
1. The invention overcomes the defect that traditional pose estimation methods require manual measurement: a six-degree-of-freedom industrial robot drives the scanner, enabling remote operation, point cloud acquisition and workpiece pose calculation. It is safe and reliable, poses no threat to operators, has a wide measurement range, and can realize accurate multi-region, full-profile measurement of the workpiece to be measured.
2. The grating type area array scanner has an automatic splicing function based on marker points, which avoids point cloud splicing errors caused by the absolute positioning error of the robot and improves the precision of the measured point cloud. Marker-point splicing is realized by pasting marker points around the measured region of the workpiece and ensuring that at least three repeated marker points are observed from any two adjacent viewing angles during measurement, so that complete measured point cloud data can be spliced.
3. The invention can solve the problem that manual measurement is difficult to obtain more accurate position parameters, particularly the acquisition of the rotation matrix R, and can more accurately acquire the pose of the workpiece to be measured by utilizing an accurate hand-eye matrix and a point cloud matching algorithm.
4. The invention has a simple structure, can well solve the problem of determining the pose of a randomly placed workpiece relative to the robot base coordinate system, is applicable to pose estimation of workpieces of the same type, and has strong universality.
Drawings
Fig. 1 is a schematic structural diagram of a complex part pose estimation system based on a three-dimensional measurement point cloud according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a three-dimensional feature of an exemplary workpiece;
FIG. 3 is a schematic diagram of the pose relationship between the scanner and the end flange of the robot and the acquisition of a three-dimensional point cloud, where $O_B XYZ$ is the robot base coordinate system {B}, $O_E XYZ$ is the robot end flange coordinate system {E}, and $O_S XYZ$ is the scanner measurement coordinate system {S};
FIG. 4 is a schematic diagram of the point cloud matching process, where $xyz$ is the coordinate system of the three-dimensional measurement point cloud, $x_0 y_0 z_0$ is the design model coordinate system, and $T$ is a transformation matrix from the coordinate system of the three-dimensional measurement point cloud to the model coordinate system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, a complex part pose estimation system based on three-dimensional measurement point cloud according to an embodiment of the present invention includes a six-degree-of-freedom industrial robot 100, a workpiece 200 to be measured, a grating type area array scanner 300, a marker point support 400, and a data processing upper computer 500.
The end of the six-degree-of-freedom industrial robot 100 clamps the grating type area array scanner 300, the six-degree-of-freedom industrial robot 100 moves in a working space according to a certain track to drive the grating type area array scanner 300 to move, the grating type area array scanner 300 scans and measures a measured workpiece 200 placed on a mark point support frame 400, during measurement, blue light emitted by the grating type area array scanner 300 needs to be ensured to cover the surface of the measured workpiece and a part of mark points on the mark point support frame, and at least 3 public mark points which are not on the same straight line need to be ensured to exist during two times of measurement, so as to be used for splicing point clouds; the six-degree-of-freedom industrial robot 100 is connected to the data processing upper computer 500, so that the motion parameters (including the position, attitude, motion state, and other parameters) of the six-degree-of-freedom industrial robot 100 are transmitted to the data processing upper computer 500 in real time.
The grating type area array scanner 300 is installed at the end of the six-degree-of-freedom industrial robot 100 and, driven by the robot, scans and measures the region to be measured of the workpiece to obtain a point cloud of the workpiece's surface; the scanner 300 is also connected to the data processing upper computer 500 to transmit the measured three-dimensional point cloud data to it in real time. The scanner works on the principle of binocular stereo vision three-dimensional measurement based on triangulation: reference gratings with different phase shifts are projected onto the measured object, dual cameras capture from two viewing angles the deformed grating images modulated by the object's surface, the phase value of each pixel is computed from the deformed gratings, and the three-dimensional point cloud coordinates of the object are then calculated from those phase values; the grating type area array scanner is thus used to obtain the three-dimensional point cloud data of the measured workpiece. Specifically, the grating type area array scanner communicates with the upper computer data processing software through the switch 700 to complete the collection and transmission of point cloud data; it may be, for example, an ATOS Compact Scan from GOM (Germany) or a PowerScan scanner.
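The phase-shifted-grating measurement described above ultimately reduces to stereo triangulation: once the dual cameras agree, via the projected phase, on which pixel pair sees the same surface point, depth follows from the disparity. A minimal sketch of that last step, assuming a rectified stereo pair (the function name and the rectified-geometry simplification are illustrative, not from the patent):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified-stereo triangulation: Z = f * b / d.

    focal_px:     camera focal length, in pixels
    baseline_m:   distance between the two camera centers, in meters
    disparity_px: horizontal pixel offset of the same surface point between
                  the two views (in the scanner, correspondence is resolved
                  from the grating phase rather than from image features)
    """
    return focal_px * baseline_m / disparity_px
```

For example, a 1200 px focal length, 0.2 m baseline and 120 px disparity place the point 2.0 m from the cameras; larger disparities mean closer points.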
The data processing upper computer 500 is used for receiving the motion parameters of the six-degree-of-freedom industrial robot 100 and the measurement data of the grating type area array scanner 300, and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so as to complete the estimation of the pose of the complex part. The upper computer data processing software in the data processing upper computer 500 has the functions of point cloud data acquisition, point cloud splicing, point cloud simplification, point cloud position conversion, point cloud-design model matching and the like, and can realize coordinate conversion (conversion to a six-degree-of-freedom industrial robot base coordinate system) and point cloud matching functions based on measured point cloud, so that the relative pose of the measured workpiece relative to the six-degree-of-freedom industrial robot base coordinate system is output, namely the pose of the measured workpiece under the robot base coordinate system is calculated according to the received motion parameters of the six-degree-of-freedom industrial robot 100 and the measurement data of the grating type area array scanner 300, and the pose is obtained.
Wherein, point cloud data acquisition obtains, by the stereo vision three-dimensional measurement technique, the coordinates $p_i\ (i = 1, 2, \dots, r, \dots, s)$ of points on the measured workpiece in the measurement coordinate system of the grating type area array scanner. Point cloud splicing uses the marker points shared by successive measurements to splice the 2nd, 3rd, ..., s-th point clouds into the 1st, so that the total point cloud is expressed in the scanner measurement coordinate system of the first robot measurement. Point cloud simplification uniformly samples the large-scale measured point cloud initially acquired by the robot optical measurement system, reducing the data scale and improving the efficiency of data processing and calculation. Point cloud position conversion takes the relative pose ${}^{E}_{S}T$ between the grating type area array scanner and the robot end flange, the pose ${}^{B}_{E}T$ of the end flange relative to the robot base coordinate system at the first measurement, and the measured point cloud $p_i\ (i = 1, 2, \dots, r, \dots, s)$, and converts the points into the six-degree-of-freedom industrial robot base coordinate system by the formula

$${}^{B}p_i = {}^{B}_{E}T \cdot {}^{E}_{S}T \cdot p_i.$$

Point cloud-design model matching aligns the point cloud, once converted into the robot base coordinate system, with the design model in the same coordinate system to obtain a conversion matrix, which is the pose of the workpiece to be measured in the robot base coordinate system.
The positional relationship between the end flange of the six-degree-of-freedom industrial robot 100 and the measurement coordinate system of the grating type area array scanner 300 before measurement is obtained by hand-eye calibration, which is the prior art and is not described herein again. In the process of acquiring the point cloud of the workpiece to be measured 200, the blue light of the grating type area array scanner 300 covers the workpiece and part of the mark points, so that the subsequent point cloud can be spliced.
Specifically, the six-degree-of-freedom industrial robot 100 is connected with the data processing upper computer 500 through the six-degree-of-freedom industrial robot controller 600, the grating type area array scanner 300 is connected with the six-degree-of-freedom industrial robot controller 600, and signal triggering of upper computer software and synchronization of part point cloud data acquisition are achieved through control of the six-degree-of-freedom industrial robot controller 600.
The six-degree-of-freedom industrial robot 100 and the grating type area array scanner 300 are connected to the same data processing upper computer 500, pose parameters including angles of six axes and end positions of the robot during first measurement are recorded in the measurement process, and measurement data are synchronized to the data processing upper computer 500.
When assembled, the height of the marker point support frame must not exceed the lowest point of the measured workpiece 200, so that the point cloud of the measured workpiece is unaffected and the subsequent point cloud matching process is not disturbed; meanwhile, the distance between marker points should be 3 cm to 4 cm, which satisfies both requirements: that two successive measurements can be spliced, and that one measurement range does not contain too many marker points.
In actual work, first ensure that the six-degree-of-freedom industrial robot, the area array scanner and the upper-computer processing software are all running. The robot teach pendant is used to control the actual measurement posture of the robot; the measurement points must meet the measurement requirements, and any two successive measurements must share at least 3 marker points that are not on the same straight line, so that point cloud splicing can be completed. After acquisition of the measured point cloud data is finished, the measured point cloud is simplified with the upper-computer data processing software to improve point cloud quality and reduce the data volume; the hand-eye calibration parameters and the first robot measurement pose are input, and the measured point cloud is converted into the robot base coordinate system. Point cloud-design model matching is then carried out to match the point cloud into the design model coordinate system, recording the transformation matrix $T_k$ at each matching iteration. After matching is completed,

$$T = \left( T_n \times \dots \times T_2 \times T_1 \right)^{-1}$$

is calculated, which is the sought pose of the measured part relative to the six-degree-of-freedom industrial robot base coordinate system.
The invention discloses a complex part pose estimation system based on three-dimensional measurement point cloud for estimating the pose of a complex part. The basic idea is to obtain the conversion matrix between the grating type area array scanner and the six-degree-of-freedom industrial robot through hand-eye calibration, convert the workpiece measurement point cloud into the base coordinate system of the six-degree-of-freedom industrial robot, and then match the point cloud against the three-dimensional design model of the workpiece to obtain the conversion matrix. The method specifically comprises the following steps:
S1: calibrate the conversion matrix between the six-degree-of-freedom industrial robot end flange and the grating type area array scanner measurement coordinate system:

$${}^{E}_{S}T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$
Wherein, R is a rotation matrix of the measurement coordinate system of the grating type area array scanner relative to the end flange of the six-degree-of-freedom industrial robot, t is a translation matrix of the measurement coordinate system of the grating type area array scanner relative to the end flange of the six-degree-of-freedom industrial robot, and the calibration is specifically performed by adopting hand-eye calibration, which is the prior art and is not repeated herein;
s2 driving grating by six-degree-of-freedom industrial robotThe surface array scanner moves to scan the workpiece to be measured, and then three-dimensional point cloud data p of the workpiece to be measured is obtainedi(i 1, 2., r., s), that is, three-dimensional point cloud data of the workpiece to be measured is obtained by a grating type area array scanner, which is the prior art and is not described herein again;
S3: converting the three-dimensional point cloud data of the workpiece to be measured to the robot base coordinate system to obtain the converted three-dimensional point cloud data:

$${}^{B}p_i = {}^{B}_{E}T \cdot {}^{E}_{S}T \cdot p_i$$

wherein ${}^{B}_{E}T$ is the pose of the robot at the first measurement, i.e. the conversion pose from the robot end flange to the robot base coordinate system at the first measurement; it can be calculated from the motion parameters of the robot as the product ${}^{B}_{E}T = {}^{0}_{1}T \, {}^{1}_{2}T \cdots {}^{5}_{6}T$, where ${}^{j-1}_{j}T$ is the conversion matrix from robot link j-1 to link j;
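The flange pose ${}^{B}_{E}T$ is itself a product of the successive link transforms, as the passage notes. A sketch of that composition (the numeric link transforms in the test are hypothetical; a real robot would build each from its joint angle and kinematic parameters):

```python
import numpy as np

def flange_pose(link_transforms):
    """Compose per-link 4x4 transforms T(j-1 -> j) into the end-flange pose
    relative to the base: T_BE = T_01 @ T_12 @ ... @ T_56."""
    T = np.eye(4)
    for T_link in link_transforms:
        T = T @ T_link        # each link transform is expressed in the previous link's frame
    return T
```

The right-multiplication order matters: each successive link frame is defined relative to the one before it, so transforms accumulate base-outward.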
S4: matching the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to be measured to obtain the pose of the workpiece relative to the robot base coordinate system, thereby completing the pose estimation of the complex part. Specifically, after each run of the ICP matching algorithm, the transformation matrix $T_k\ (k = 1, 2, \dots, n)$ from the point cloud to the three-dimensional design model coordinate system is calculated, with n the number of matching iterations; then, according to the transformation matrix $T_k$ corresponding to each matching iteration, the pose of the measured workpiece in the robot base coordinate system is calculated as

$$T = \left( T_n \times \dots \times T_2 \times T_1 \right)^{-1},$$

whereby the result is obtained.
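Under the reading of the patent's image-only pose formula adopted in this text (the accumulated point-cloud-to-model transform is inverted to give the workpiece pose in the base frame, an editorial reconstruction), the accumulation can be sketched as follows; the function name is illustrative:

```python
import numpy as np

def workpiece_pose(T_list):
    """Accumulate the per-iteration ICP matrices T_1..T_n (each mapping the
    current point cloud toward the design model) and invert the product,
    yielding the workpiece pose in the robot base coordinate system:
    T = (T_n ... T_2 T_1)^(-1)."""
    T_total = np.eye(4)
    for T_k in T_list:
        T_total = T_k @ T_total      # later iterations compose on the left
    return np.linalg.inv(T_total)
```

Composing on the left reflects that each $T_k$ acts on the cloud already moved by all previous iterations.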
Assume the point cloud of the three-dimensional design model of the workpiece to be measured is Q, containing the points $q_i\ (i = 1, 2, \dots, l)$, and let P denote the converted three-dimensional point cloud data ${}^{B}p_i\ (i = 1, 2, \dots, r, \dots, s)$. The solution of $T_k$ is as follows:

S41: for every point ${}^{B}p_i$ in P, search Q for the corresponding closest point $q_i$, and calculate the centroids $\mu_P$, $\mu_Q$ and the centered coordinates:

$$\mu_P = \frac{1}{s} \sum_{i=1}^{s} {}^{B}p_i, \qquad \mu_Q = \frac{1}{s} \sum_{i=1}^{s} q_i,$$

$$\tilde{p}_i = {}^{B}p_i - \mu_P, \qquad \tilde{q}_i = q_i - \mu_Q.$$
S42: calculate the 3×3 covariance matrix H from the point sets P, Q:

$$H = \sum_{i=1}^{s} \tilde{p}_i \, \tilde{q}_i^{\,T}$$

wherein $H_{ij}$ denotes the element in the i-th row and j-th column of H;
s43 constructs a 4 × 4 order symmetric matrix W from H:
Figure BDA0001882231300000095
S44: calculate the eigenvalues of W and extract the unit eigenvector $q = [q_0\ q_1\ q_2\ q_3]^T$ corresponding to the largest eigenvalue, and further solve the rotation matrix R and the translation t:

$$R = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$

$$t = \mu_Q - R \, \mu_P$$
further, the following is obtained:

$$T_k = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$
After the transformation matrix $T_k$ is obtained, it is used to compute the matched positions $p_i' = T_k \times p_i$ of the points $p_i\ (i = 1, 2, \dots, r, \dots, s)$; at the next matching iteration, $p_i$ is updated to $p_i'$, and steps S41-S44 are repeated for n iterations to improve the matching precision. Then, according to the transformation matrix $T_k$ of each iteration, the pose of the measured workpiece in the robot base coordinate system is calculated as

$$T = \left( T_n \times \dots \times T_2 \times T_1 \right)^{-1},$$

wherein $T_1$ is the transformation matrix obtained at the first matching iteration, $T_2$ at the second, and so on up to $T_n$ at the n-th.
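Steps S41-S44 are one iteration of point-to-point alignment via the quaternion eigendecomposition (Horn's closed-form absolute orientation, as used in classic ICP). A self-contained numpy sketch of a single step, assuming already-paired points (the nearest-neighbour search of S41 is omitted for brevity; the function name is illustrative):

```python
import numpy as np

def horn_step(P, Q):
    """One S41-S44 alignment step: find R, t such that R @ p + t ~ q.

    P: (n, 3) measured points; Q: (n, 3) corresponding model points (same order).
    """
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)   # S41: centroids
    H = (P - mu_p).T @ (Q - mu_q)                 # S42: 3x3 covariance matrix
    delta = np.array([H[1, 2] - H[2, 1],          # S43: build the symmetric 4x4 W
                      H[2, 0] - H[0, 2],
                      H[0, 1] - H[1, 0]])
    W = np.empty((4, 4))
    W[0, 0] = np.trace(H)
    W[0, 1:] = W[1:, 0] = delta
    W[1:, 1:] = H + H.T - np.trace(H) * np.eye(3)
    vals, vecs = np.linalg.eigh(W)                # S44: eigenvector of the largest
    q0, q1, q2, q3 = vecs[:, np.argmax(vals)]     # eigenvalue is a unit quaternion
    R = np.array([
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3],
    ])
    t = mu_q - R @ mu_p
    return R, t
```

With exact correspondences a single step recovers the rigid motion; in the full algorithm the step is repeated after re-pairing points, exactly as the n-iteration loop above describes.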
The method has been used to estimate the pose of a sealing-surface flange sample of a certain model: after a series of coordinate transformations of the measured three-dimensional point cloud data, the pose of the workpiece in the robot base coordinate system is obtained, after which subsequent robot path planning and simulation can be carried out.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (4)

1. A complex part pose estimation system based on three-dimensional measurement point cloud, characterized by comprising a six-degree-of-freedom industrial robot (100), a grating type area array scanner (300), a measured workpiece (200), a marker point support frame (400) and a data processing upper computer (500), wherein:
the tail end of the six-degree-of-freedom industrial robot (100) clamps the grating type area array scanner (300) and is used for driving the grating type area array scanner (300) to move to scan and measure the workpiece (200) to be measured placed on the marker point support frame (400); during measurement, blue light emitted by the grating type area array scanner (300) covers the surface of the workpiece to be measured and marker points on the marker point support frame, at least 3 common marker points not on the same straight line exist between any two successive measurements, and the distance between adjacent marker points on the marker point support frame (400) is 3-4 cm; the six-degree-of-freedom industrial robot (100) is connected with the data processing upper computer (500) through a six-degree-of-freedom industrial robot controller (600) so as to transmit the motion parameters of the six-degree-of-freedom industrial robot (100) to the data processing upper computer (500) in real time;
the grating type area array scanner (300) is arranged at the tail end of the six-degree-of-freedom industrial robot (100) and used for scanning and measuring a workpiece to be measured (200) to obtain three-dimensional point cloud data of the workpiece to be measured, and the grating type area array scanner (300) is connected with the data processing upper computer (500) to transmit the measurement data to the processing upper computer (500) in real time; the grating type area array scanner (300) is connected with a six-degree-of-freedom industrial robot controller (600) so as to realize the synchronization of the signal triggering of a data processing upper computer and the data acquisition of the grating type area array scanner;
the data processing upper computer (500) is used for receiving the motion parameters of the six-degree-of-freedom industrial robot (100) and the measurement data of the grating type area array scanner (300), and calculating to obtain the pose of the measured workpiece under the robot base coordinate system, so that the estimation of the pose of the complex part is completed.
2. The complex part pose estimation system based on three-dimensional measurement point cloud as claimed in claim 1, characterized in that the position relation between the end flange of the six-degree-of-freedom industrial robot (100) and the measurement coordinate system of the grating type area array scanner (300) is obtained by hand-eye calibration before measurement.
3. A complex part pose estimation method based on three-dimensional measurement point cloud is characterized by being carried out by the system of claim 1 or 2, and comprising the following steps:
S1, calibrate the transformation matrix between the end flange of the six-degree-of-freedom industrial robot and the measurement coordinate system of the grating type area array scanner:

$$T = \begin{bmatrix} R & t \\ \mathbf{0}^{\mathrm{T}} & 1 \end{bmatrix}$$

wherein R is the rotation matrix of the scanner measurement coordinate system relative to the end flange of the six-degree-of-freedom industrial robot, and t is the translation vector of the scanner measurement coordinate system relative to the end flange of the six-degree-of-freedom industrial robot;
S2, the six-degree-of-freedom industrial robot drives the grating type area array scanner to scan the workpiece to be measured, obtaining the three-dimensional point cloud data of the workpiece to be measured, $p_i$, $i = 1, 2, \dots, s$;
S3, convert the three-dimensional point cloud data of the workpiece to be measured into the robot base coordinate system, obtaining the converted three-dimensional point cloud data:

$$p_i' = T_{\mathrm{rob}}\, T\, p_i, \qquad i = 1, 2, \dots, s$$

wherein $T_{\mathrm{rob}}$ is the pose of the robot (the end-flange frame expressed in the base coordinate system) at the time of the measurement;
S4, match the converted three-dimensional point cloud data with the three-dimensional design model of the workpiece to be measured to obtain the pose of the workpiece to be measured relative to the robot base coordinate system, thereby completing the pose estimation of the complex part.
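Steps S1-S3 amount to composing two homogeneous transforms and applying the result to every scanner-frame point. A minimal numpy sketch (illustrative only; the matrix and function names are assumptions, not from the patent):

```python
import numpy as np

def points_to_base_frame(points, T_base_flange, T_flange_scanner):
    """Map scanner-frame points (s, 3) into the robot base coordinate system.

    T_base_flange:    robot pose (end-flange frame in the base frame) at measurement,
    T_flange_scanner: hand-eye calibration result from step S1 (both 4x4)."""
    T = T_base_flange @ T_flange_scanner                     # scanner -> base
    pts_h = np.hstack([points, np.ones((len(points), 1))])   # homogeneous (s, 4)
    return (pts_h @ T.T)[:, :3]                              # back to (s, 3)
```

The same composition is applied per scan station, with `T_base_flange` read from the robot controller at the moment the scanner is triggered.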
4. The method for estimating the pose of a complex part based on three-dimensional measurement point cloud according to claim 3, characterized in that the pose of the measured workpiece relative to the robot base coordinate system is obtained in step S4 as follows:
first, after each matching iteration, calculate the transformation matrix from the point cloud to the coordinate system of the three-dimensional design model,

$$T_k, \qquad k = 1, 2, \dots, n$$

wherein n is the number of matching iterations;

then, from the transformation matrices $T_k$ of the individual matches, calculate the pose of the measured workpiece in the robot base coordinate system,

$$T_w = \left( T_n T_{n-1} \cdots T_1 \right)^{-1}$$

which is the required result.
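Reading claim 4 as an iterative registration (e.g. ICP-style) that produces one incremental cloud-to-model transform per iteration, the workpiece pose in the base frame is the inverse of the accumulated product. A sketch under that assumption (function name illustrative):

```python
import numpy as np

def workpiece_pose_from_matches(T_list):
    """Accumulate the per-iteration matching transforms T_1..T_n (each maps
    the base-frame point cloud one step closer to the design-model frame)
    and invert the product; the result places the design model, i.e. the
    workpiece, in the robot base coordinate system."""
    total = np.eye(4)
    for T_k in T_list:             # total = T_n @ ... @ T_1
        total = T_k @ total
    return np.linalg.inv(total)    # pose of the workpiece in the base frame
```

The inversion reflects the change of viewpoint: the matching moves the cloud into the model frame, while the pose sought expresses the model frame in the base frame.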
CN201811428771.5A 2018-11-27 2018-11-27 Complex part pose estimation system and method based on three-dimensional measurement point cloud Active CN109373898B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811428771.5A CN109373898B (en) 2018-11-27 2018-11-27 Complex part pose estimation system and method based on three-dimensional measurement point cloud

Publications (2)

Publication Number Publication Date
CN109373898A CN109373898A (en) 2019-02-22
CN109373898B true CN109373898B (en) 2020-07-10

Family

ID=65377364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811428771.5A Active CN109373898B (en) 2018-11-27 2018-11-27 Complex part pose estimation system and method based on three-dimensional measurement point cloud

Country Status (1)

Country Link
CN (1) CN109373898B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110310331B (en) * 2019-06-18 2023-04-14 哈尔滨工程大学 Pose estimation method based on combination of linear features and point cloud features
CN110434679B (en) * 2019-07-25 2020-12-04 王东 Intelligent machining method for workpiece with random size error
CN110434671B (en) * 2019-07-25 2020-04-24 王东 Cast member surface machining track calibration method based on characteristic measurement
CN110634185A (en) * 2019-07-31 2019-12-31 众宏(上海)自动化股份有限公司 Visual algorithm for quickly forming point cloud for gear repair
CN110634161B (en) * 2019-08-30 2023-05-05 哈尔滨工业大学(深圳) Rapid high-precision estimation method and device for workpiece pose based on point cloud data
CN110553584A (en) * 2019-08-30 2019-12-10 长春理工大学 Measuring tool, automatic measuring system and measuring method for small-sized complex parts
CN110640585A (en) * 2019-10-25 2020-01-03 华中科技大学 Three-dimensional non-contact measuring device and method for blade grinding and polishing
CN112828878B (en) * 2019-11-22 2022-10-25 中国科学院沈阳自动化研究所 Three-dimensional measurement and tracking method for large-scale equipment in butt joint process
CN111043963A (en) * 2019-12-31 2020-04-21 芜湖哈特机器人产业技术研究院有限公司 Three-dimensional scanning system measuring method of carriage container based on two-dimensional laser radar
CN111551111B (en) * 2020-05-13 2021-02-05 华中科技大学 Part feature robot rapid visual positioning method based on standard ball array
CN112161619B (en) * 2020-09-16 2022-11-15 思看科技(杭州)股份有限公司 Pose detection method, three-dimensional scanning path planning method and detection system
CN112307562B (en) * 2020-10-30 2022-03-01 泉州装备制造研究所 Method for assembling complex parts on large-scale airplane by combining thermal deformation and gravity deformation
CN112577447B (en) * 2020-12-07 2022-03-22 新拓三维技术(深圳)有限公司 Three-dimensional full-automatic scanning system and method
CN112828552B (en) * 2021-01-29 2022-05-20 华中科技大学 Intelligent butt joint method and system for flange parts
CN113386136B (en) * 2021-06-30 2022-05-20 华中科技大学 Robot posture correction method and system based on standard spherical array target estimation
CN113686268A (en) * 2021-07-13 2021-11-23 北京航天计量测试技术研究所 Automatic measuring system and method for exhaust area of turbine guider
CN113894785B (en) * 2021-10-27 2023-06-09 华中科技大学无锡研究院 Control method, device and system for in-situ measurement and processing of turbine blades
TWI806294B (en) 2021-12-17 2023-06-21 財團法人工業技術研究院 3d measuring equipment and 3d measuring method
CN114279326B (en) * 2021-12-22 2024-05-28 易思维(天津)科技有限公司 Global positioning method of three-dimensional scanning equipment
CN114417616A (en) * 2022-01-20 2022-04-29 青岛理工大学 Digital twin modeling method and system for assembly robot teleoperation environment
CN114459377A (en) * 2022-02-10 2022-05-10 中国航发沈阳发动机研究所 Device and method for measuring blade profile of aircraft engine
CN115049730B (en) * 2022-05-31 2024-04-26 北京有竹居网络技术有限公司 Component mounting method, component mounting device, electronic apparatus, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101097131A (en) * 2006-06-30 2008-01-02 廊坊智通机器人***有限公司 Method for marking workpieces coordinate system
CN101566461A (en) * 2009-05-18 2009-10-28 西安交通大学 Method for quickly measuring blade of large-sized water turbine
CN106370106A (en) * 2016-09-30 2017-02-01 上海航天精密机械研究所 Industrial robot and linear guide rail-combined linear laser scanning measurement system and method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4508252B2 (en) * 2008-03-12 2010-07-21 株式会社デンソーウェーブ Robot teaching device
JP4649554B1 (en) * 2010-02-26 2011-03-09 株式会社三次元メディア Robot controller
CN106959080B (en) * 2017-04-10 2019-04-05 上海交通大学 A kind of large complicated carved components three-dimensional pattern optical measuring system and method
CN107270833A (en) * 2017-08-09 2017-10-20 武汉智诺维科技有限公司 A kind of complex curved surface parts three-dimension measuring system and method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Automated measurement methods and technologies for complex curved surfaces of large components; Yang Shourui; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2018-08-15 (No. 08); I138-26 *


Similar Documents

Publication Publication Date Title
CN109373898B (en) Complex part pose estimation system and method based on three-dimensional measurement point cloud
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN110238845B (en) Automatic hand-eye calibration method and device for optimal calibration point selection and error self-measurement
CN109927036A (en) A kind of method and system of 3D vision guidance manipulator crawl
CN111089569B (en) Large box body measuring method based on monocular vision
US8520067B2 (en) Method for calibrating a measuring system
CN112325796A (en) Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing
CN110202573B (en) Full-automatic hand-eye calibration and working plane calibration method and device
JP5371927B2 (en) Coordinate system calibration method and robot system
US10310054B2 (en) Relative object localization process for local positioning system
CN108227929B (en) Augmented reality lofting system based on BIM technology and implementation method
CN103020952A (en) Information processing apparatus and information processing method
JP2022516852A (en) Robot visual guidance method and device by integrating overview vision and local vision
Mi et al. A vision-based displacement measurement system for foundation pit
CN108180834A (en) A kind of industrial robot is the same as three-dimensional imaging instrument position orientation relation scene real-time calibration method
CN112766328A (en) Intelligent robot depth image construction method fusing laser radar, binocular camera and ToF depth camera data
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN114459345B (en) Aircraft fuselage position and posture detection system and method based on visual space positioning
CN109773589B (en) Method, device and equipment for online measurement and machining guidance of workpiece surface
CN112508933B (en) Flexible mechanical arm movement obstacle avoidance method based on complex space obstacle positioning
Chen et al. Heterogeneous multi-sensor calibration based on graph optimization
Li et al. Extrinsic calibration of non-overlapping multi-camera system with high precision using circular encoded point ruler
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
CN115847491A (en) Space on-orbit maintenance robot target autonomous measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190222

Assignee: Wuhan Power3D Technology Ltd.

Assignor: Huazhong University of Science and Technology

Contract record no.: X2022420000110

Denomination of invention: A pose estimation system and method for complex parts based on 3D measurement point cloud

Granted publication date: 20200710

License type: Common License

Record date: 20220930