WO2020133873A1 - Three-dimensional vision-based production method by automatically calculating robot glue coating trajectory - Google Patents

Three-dimensional vision-based production method by automatically calculating robot glue coating trajectory

Info

Publication number
WO2020133873A1
WO2020133873A1 (PCT/CN2019/086537)
Authority
WO
WIPO (PCT)
Prior art keywords
point
frame
robot
product
points
Prior art date
Application number
PCT/CN2019/086537
Other languages
French (fr)
Chinese (zh)
Inventor
章悦晨
王杰高
严律
王明松
Original Assignee
南京埃克里得视觉技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南京埃克里得视觉技术有限公司 filed Critical 南京埃克里得视觉技术有限公司
Publication of WO2020133873A1 publication Critical patent/WO2020133873A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0075 Manipulators for painting or coating
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Definitions

  • The invention relates to a robot trajectory teaching method for industrial automation, and in particular to a three-dimensional vision-based method for automatically generating robot gluing trajectories.
  • At present, shoe priming and gluing is automated as follows: the gluing station of the automated production line relies on manually teaching the robot its gluing trajectory. Because there are many shoe brands and styles, this approach adapts poorly: for every style the production line must be halted so that the robot can be taught all sizes and the corresponding left and right shoes, which is extremely time-consuming and labor-intensive and cannot meet the requirements of high capacity, high production efficiency and short production cycles. In addition, the approach places extreme demands on shoe placement: since the robot's trajectory is fixed, it cannot adapt to the actual position of each shoe, so placement deviations easily lead to misapplied glue. Robotic gluing is also a common process in the production of other products, so a method for automatically generating robot gluing trajectories is needed.
  • The purpose of the present invention is to overcome the shortcomings of the prior art by proposing a three-dimensional vision-based method for automatically generating robot gluing trajectories, which eliminates robot teaching and computes the priming/gluing trajectory in real time.
  • A three-dimensional camera is first used to acquire a three-dimensional point cloud of the product to be glued for which a trajectory is needed; the acquired data are then processed with a set of parameters and corresponding calculations to recognize the contour of the product; finally, the robot gluing points and the pose of each point are computed automatically from the trajectory and pose settings required by the gluing process of the product. During automated production, the industrial computer communicates with the robot and sends it the gluing points and the pose of each point, and the robot applies the glue to the product using the received points and poses.
  • The three-dimensional vision-based method of the present invention for automatically generating a robot gluing trajectory comprises the following steps:
  • Step 1. Use a three-dimensional camera to acquire real-time three-dimensional point cloud model data of the product to be glued at the station.
  • Step 2. Set the three-dimensional image processing parameters and, according to these parameters, perform noise processing, point cloud clipping and filtering on the three-dimensional point cloud model data.
  • In this step, camera parameters such as exposure time and confidence need to be adjusted.
  • Noise reduction: noise points are removed by statistical filtering, e.g. by discarding points whose number of neighbors within a specified neighborhood is below a set threshold.
  • Point cloud clipping: to reduce the point cloud of irrelevant environmental objects and thereby lower the computational load and improve overall efficiency, points outside the working area, i.e. the content of the corresponding region of the two-dimensional image, are removed.
  • Filtering: since the point cloud data contain the normal vector of each point and only the contour of the product to be glued is of interest, only points whose normal makes an angle with the vertical vector smaller than a set threshold are kept, further reducing the point count and the computational load and improving overall efficiency; a sketch of the noise-reduction and filtering steps follows this item.
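As an illustration only, the following sketch shows how the noise-reduction and normal-angle filtering described above could be implemented with NumPy and SciPy; the array layout (N x 3 points and normals) and the threshold values are assumptions, not parameters taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def denoise_and_filter(points, normals, radius=2.0, min_neighbors=8, max_angle_deg=30.0):
    """Drop sparse (noisy) points, then keep only points whose normal is close to
    vertical, mirroring the noise-reduction and filtering steps described above."""
    tree = cKDTree(points)
    # Noise reduction: discard points with too few neighbors inside the radius
    # (the query includes the point itself, hence the -1).
    neighbor_counts = np.array(
        [len(tree.query_ball_point(p, radius)) - 1 for p in points])
    keep = neighbor_counts >= min_neighbors

    # Filtering: keep points whose normal deviates from the vertical (z) axis
    # by less than the angle threshold.
    vertical = np.array([0.0, 0.0, 1.0])
    cos_angle = np.abs(normals @ vertical) / np.linalg.norm(normals, axis=1)
    keep &= cos_angle >= np.cos(np.radians(max_angle_deg))
    return points[keep], normals[keep]
```

Point cloud clipping would simply be a bounding-box mask over the working area applied before this function.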
  • Step 3. Extract and compute the required contour data Up, Un, Ut, Ur of the product to be glued, where:
  • Up: the set of extracted contour points of the product to be glued;
  • Un: the set of normals corresponding to the contour points of the product to be glued;
  • Ut: the set of tangents corresponding to the contour points of the product to be glued;
  • Ur: the set of radial directions corresponding to the contour points of the product to be glued.
  • Up, Un, Ut and Ur are the point set and its corresponding normal, tangential and radial sets, all obtained by computing on the three-dimensional model data of the product to be glued.
  • Normal calculation: fit a plane through the points within a specified neighborhood or a specified number of nearest points and compute the normal vector of the fitted plane; this plane normal is the normal vector of the point.
  • Tangential calculation: fit a spatial straight line through the point in question and the points within a specified neighborhood or a specified number of nearest points; the components of the fitted line in the x, y and z directions form the tangent vector of the point.
  • Radial calculation: project the tangent vector onto the two-dimensional plane; the radial vector in that plane is perpendicular to the tangent vector; when projected back into three-dimensional space, its x and y components are the same as in the two-dimensional plane and its z component is set to 0. A sketch of these three calculations follows the point notation below.
  • Each point in the sets is represented as A(x_a, y_a, z_a), where in Up, x_a, y_a and z_a are the spatial x, y and z coordinates of point A, and in Un, Ut and Ur, (x_a, y_a, z_a) represents a space vector.
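The normal, tangential and radial calculations above can be sketched as follows; using k nearest neighbors and SVD-based fitting is one possible realization (the patent allows either a neighborhood radius or a nearest-point count), and treating the dominant direction of the neighborhood as the contour tangent is our assumption, not the patent's.

```python
import numpy as np
from scipy.spatial import cKDTree

def contour_frames(points, k=15):
    """For each contour point return its normal (plane fit), tangent (line fit)
    and radial direction (tangent projected to the x-y plane and rotated 90 deg,
    z component set to 0)."""
    tree = cKDTree(points)
    normals, tangents, radials = [], [], []
    for p in points:
        _, idx = tree.query(p, k=k)
        nbrs = points[idx] - points[idx].mean(axis=0)
        # SVD of the centered neighborhood: the last right-singular vector is the
        # fitted plane normal, the first is the dominant (assumed tangential) direction.
        _, _, vt = np.linalg.svd(nbrs)
        n = vt[2] / np.linalg.norm(vt[2])
        t = vt[0] / np.linalg.norm(vt[0])
        # Radial: perpendicular to the tangent in the x-y plane, z set to 0.
        r = np.array([-t[1], t[0], 0.0])
        nr = np.linalg.norm(r)
        if nr > 0:
            r = r / nr
        normals.append(n)
        tangents.append(t)
        radials.append(r)
    return np.array(normals), np.array(tangents), np.array(radials)
```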
  • Step 4. Set the robot pose offsets according to the gluing process of the product to be glued.
  • The robot pose offsets include: Zoffset, the position offset in the spatial Z direction; AngleReal, the angular rotation about the tangential direction of the contour point of the product to be glued; Roffset, the position offset along the radial direction of the contour point; TCP_COffsetReal, the angular rotation of the TCP about the contour point of the product to be glued.
  • Step 5 Calculate the position and posture of the glue trajectory actually required by the robot.
  • Step 5.1. Obtain a 3x3 rotation matrix frame from Un, Ut and Ur:
    frame = [ Ut_x,  Ut_y,  Ut_z ]
            [ -Ur_x, -Ur_y, -Ur_z ]
            [ -Un_x, -Un_y, -Un_z ]
  • where Ut_x, Ut_y, Ut_z are the x, y, z values of the tangential data corresponding to the contour points of the product to be glued;
  • Ur_x, Ur_y, Ur_z are the x, y, z values of the radial data corresponding to the contour points of the product to be glued;
  • Un_x, Un_y, Un_z are the x, y, z values of the normal data corresponding to the contour points of the product to be glued.
  • Step 5.2. Translate the robot's preliminary gluing trajectory points (not including the gluing process settings) under the 3x3 rotation matrix frame; the translation direction is the negative z axis and the translation distance is Zoffset, giving the new set of trajectory points Pt.
  • Step 5.3. Rotate the rotation matrix frame by 180° about the z axis and then by the angle AngleReal about the x axis to obtain a new rotation matrix frame1.
  • Step 5.4. From the rotation matrix frame1 and the trajectory point Pt, obtain the quaternion PtOnCam of the spatial point in the camera coordinate system:
    PtOnCam = [ frame_11, frame_12, frame_13, Pt_x ]
              [ frame_21, frame_22, frame_23, Pt_y ]
              [ frame_31, frame_32, frame_33, Pt_z ]
              [ 0,        0,        0,        1    ]
  • where [frame_11, frame_12, frame_13], [frame_21, frame_22, frame_23] and [frame_31, frame_32, frame_33] are the rows of the rotation matrix and [Pt_x, Pt_y, Pt_z]^T is the translation vector. A sketch of steps 5.1 to 5.4 follows this item.
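A minimal sketch of steps 5.1 to 5.4 for a single contour point; `tangent`, `radial`, `normal` stand for Ut, Ur, Un and `p` for the corresponding point of Up. Whether the two rotations of step 5.3 act about the frame's own axes (right multiplication, as assumed here) and how the negative-z translation of step 5.2 is interpreted are not spelled out in the patent, so this is an illustration under those assumptions.

```python
import numpy as np

def pose_on_camera(p, normal, tangent, radial, zoffset, angle_real_deg):
    """Steps 5.1-5.4: build the frame, offset the point along the frame's -z axis
    by Zoffset, rotate the frame (180 deg about z, then AngleReal about x) and
    assemble the 4x4 pose PtOnCam in the camera coordinate system."""
    # Step 5.1: rows of frame are [Ut; -Ur; -Un].
    frame = np.vstack([tangent, -radial, -normal])

    # Step 5.2: translate along the frame's negative z axis by Zoffset
    # (frame[2] is the frame's z row, i.e. -Un); this interpretation is assumed.
    pt = p - zoffset * frame[2]

    # Step 5.3: 180 deg about z, then AngleReal about x (intrinsic rotations assumed).
    rz = np.array([[-1.0, 0.0, 0.0],
                   [0.0, -1.0, 0.0],
                   [0.0, 0.0, 1.0]])
    a = np.radians(angle_real_deg)
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(a), -np.sin(a)],
                   [0.0, np.sin(a), np.cos(a)]])
    frame1 = frame @ rz @ rx

    # Step 5.4: homogeneous 4x4 pose [R | t; 0 0 0 1].
    pt_on_cam = np.eye(4)
    pt_on_cam[:3, :3] = frame1
    pt_on_cam[:3, 3] = pt
    return pt_on_cam
```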
  • Step 5.5. Transform the quaternion PtOnCam into the coordinate system calibrated for the robot (also called the user coordinate system) to obtain the quaternion PtOnRef.
  • Step 5.6. Transform the quaternion PtOnRef into the coordinate system of the robot in use (different robot manufacturers use different coordinate systems) to obtain the actually required orientation a, b, c of the gluing trajectory point.
  • Step 5.7. The gluing trajectory point information actually required by the robot is:
    PosPt = [a + TCP_COffsetReal, b, c, PtOnCam_14, PtOnCam_24, PtOnCam_34]
  • where a, b, c are the orientation values required by the robot;
  • PtOnCam_14, PtOnCam_24, PtOnCam_34 correspond to Pt_x, Pt_y, Pt_z in step 5.4, i.e. the spatial coordinates x, y, z required by the robot;
  • TCP_COffsetReal is the compensation value for angle a. A sketch of steps 5.5 to 5.7 follows this item.
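Steps 5.5 to 5.7 can be sketched as below; `cam_to_ref` is a 4x4 camera-to-user-frame calibration matrix that the patent presupposes but does not give, and the Z-Y-X Euler extraction is only one of the manufacturer-specific orientation conventions that step 5.6 alludes to, so both are assumptions for illustration.

```python
import numpy as np

def pos_pt(pt_on_cam, cam_to_ref, tcp_c_offset_real_deg):
    """Steps 5.5-5.7: move the pose into the robot user frame, extract the
    orientation a, b, c, and assemble the trajectory record PosPt."""
    # Step 5.5: camera frame -> user (calibrated) frame.
    pt_on_ref = cam_to_ref @ pt_on_cam

    # Step 5.6: extract a, b, c as intrinsic Z-Y-X Euler angles (assumed convention;
    # real controllers differ by manufacturer).
    r = pt_on_ref[:3, :3]
    a = np.degrees(np.arctan2(r[1, 0], r[0, 0]))
    b = np.degrees(np.arcsin(-r[2, 0]))
    c = np.degrees(np.arctan2(r[2, 1], r[2, 2]))

    # Step 5.7: the position is taken from PtOnCam as in the patent's formula
    # (camera and user frames are made to coincide during calibration, FIG. 3).
    x, y, z = pt_on_cam[:3, 3]
    return [a + tcp_c_offset_real_deg, b, c, x, y, z]
```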
  • Through the communication link between the industrial computer and the robot/master control system, the sequence of gluing trajectory points PosPt is sent to the robot/master control system.
  • The data format is the same as in step 5.7.
  • The robot applies the glue to the product according to the received data, thereby eliminating manual teaching of the robot gluing trajectory and adapting automatically to the placement of the product to be glued.
  • The method of the invention computes the gluing trajectory automatically without manual robot teaching, saving labor costs, shortening the overall production cycle, meeting production efficiency requirements, and making process changes quick and convenient. Since the robot gluing trajectory does not have to be taught manually, preparation time before production is saved.
  • The three-dimensional point cloud model data of the product to be glued are acquired in real time, and their accuracy can be guaranteed by a high-precision three-dimensional camera.
  • The method of the present invention can accurately recognize the contour and position of the product to be glued, whereas the methods currently used in this industry cannot adapt to the placement of the product, which increases the defective parts caused by misplaced glue due to placement deviations, along with the rework of those parts, the associated losses and the wasted time.
  • FIG. 1 Schematic diagram of glue application system.
  • FIG. 2 is a schematic block diagram of the automatic production method of the robot glue trajectory based on three-dimensional vision of the present invention.
  • Figure 3 Schematic diagram of the establishment of the coordinate system.
  • FIG. 4 Schematic diagram of robot attitude offset parameters.
  • The embodiment is an example in which a robot applies primer glue to shoes.
  • FIG. 1 shows a device for implementing the method of the present invention, comprising a three-dimensional camera 1, a shoe to be primed 2, an industrial robot 3, an industrial computer 4 and a communication network cable 5.
  • The robot 3 is a general-purpose six-joint serial industrial robot.
  • The robot has standard industrial robot functions.
  • Its coordinate systems include a joint coordinate system, a Cartesian coordinate system, a tool coordinate system and an external coordinate system.
  • An external coordinate system can be established and set, and the user can establish a tool coordinate system with the four-point method.
  • The three-dimensional camera 1 takes pictures in real time and acquires three-dimensional point cloud data.
  • The three-dimensional point cloud data output by the camera 1 are transmitted in real time to the industrial computer 4 through the shielded communication network cable 5.
  • An origin O, a point OX along the X direction and a point XY in the plane are selected on the same calibration sheet to establish the camera coordinate system and the robot coordinate system and make the two coincide; a sketch of this three-point construction follows this item.
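The three-point construction mentioned above (origin O, a point OX on the X axis, a point XY in the plane) can be sketched as follows; the cross-product orthonormalization is a standard construction and an assumption on our part, since the patent only names the three points.

```python
import numpy as np

def frame_from_three_points(o, ox, xy):
    """Build a right-handed coordinate frame from an origin O, a point OX on the
    X axis and a point XY in the X-Y plane (three-point calibration)."""
    o, ox, xy = (np.asarray(v, dtype=float) for v in (o, ox, xy))
    x = ox - o
    x /= np.linalg.norm(x)
    z = np.cross(x, xy - o)          # normal of the X-Y plane
    z /= np.linalg.norm(z)
    y = np.cross(z, x)               # completes the right-handed frame
    t = np.eye(4)                    # homogeneous transform: axes as columns,
    t[:3, 0], t[:3, 1], t[:3, 2] = x, y, z
    t[:3, 3] = o                     # origin as the last column
    return t
```

Computing this frame from the same calibration sheet in both the camera data and the robot's measurements is what allows the two coordinate systems to be made to coincide.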
  • The gluing software applies the configured three-dimensional image processing parameters and performs noise reduction, point cloud clipping and filtering.
  • Noise reduction: noise points are removed by statistical filtering, e.g. by discarding points whose number of neighbors within a specified neighborhood is below a set threshold.
  • Point cloud clipping: to reduce the point cloud of irrelevant environmental objects and thereby lower the computational load and improve overall efficiency, points outside the working area, i.e. the content of the corresponding region of the two-dimensional image, are removed.
  • Filtering: since the point cloud data contain the normal vector of each point and only the contour of the shoe is of interest, only points whose normal makes an angle with the vertical vector smaller than a set threshold are kept, further reducing the point count and the computational load and improving overall efficiency.
  • Normal calculation: fit a plane through the points within a specified neighborhood or a specified number of nearest points and compute the normal vector of the fitted plane; this plane normal is the normal vector of the point.
  • Tangential calculation: fit a spatial straight line through the point in question and the points within a specified neighborhood or a specified number of nearest points; the components of the fitted line in the x, y and z directions form the tangent vector of the point.
  • Radial calculation: project the tangent vector onto the two-dimensional plane; the radial vector in that plane is perpendicular to the tangent vector; when projected back into three-dimensional space, its x and y components are the same as in the two-dimensional plane and its z component is set to 0.
  • The preliminary gluing trajectory point information is obtained from the three-dimensional contour data Up, Un, Ut, Ur.
  • The robot's preliminary gluing trajectory points (not including the gluing process settings) are translated under the 3x3 rotation matrix frame; the translation direction is the negative z axis and the translation distance is Zoffset, giving the new set of trajectory points Pt.
  • PosPt = [a + TCP_COffsetReal, b, c, PtOnCam_14, PtOnCam_24, PtOnCam_34]
  • The gluing trajectory points PosPt are sent to the robot over the communication link, and the robot completes the priming and gluing of the shoe using the received trajectory points.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

A three-dimensional vision-based production method that automatically calculates a robot gluing trajectory, comprising: first, using a three-dimensional camera to acquire a three-dimensional point cloud of a product to be glued for which a trajectory needs to be set; then, processing the acquired data by means of parameters and corresponding calculations to recognize the contour of said product; and finally, automatically calculating the robot gluing points and the pose of each point according to the trajectory and pose settings required by the gluing process of said product. Compared with setting a robot gluing trajectory by manual teaching, the present method saves labor costs, shortens the overall production cycle, improves production efficiency, and makes process changes quick and convenient.

Description

Automatic production method of a robot gluing trajectory based on three-dimensional vision
Technical Field
The invention relates to a robot trajectory teaching method for industrial automation, and in particular to a three-dimensional vision-based method for automatically generating robot gluing trajectories.
Background Art
With rising living standards and evolving consumer aesthetics, products in the footwear industry are being replaced at an ever-increasing pace. Widely varying customer demands for comfort, appearance and fit, together with globalized market competition, keep shortening development and delivery cycles. Customers in the footwear industry update and add orders according to seasons, market aesthetics and material changes, and demand short delivery cycles in order to seize market opportunities. The footwear industry therefore has to change over production quickly while maintaining high efficiency and capacity, but rising labor costs and declining worker efficiency over long shifts are pushing the traditional footwear industry toward automation.
At present, shoe priming and gluing is automated as follows: the gluing station of the automated production line relies on manually teaching the robot its gluing trajectory. Because there are many shoe brands and styles, this approach adapts poorly: for every style the production line must be halted so that the robot can be taught all sizes and the corresponding left and right shoes. This is extremely time-consuming and labor-intensive and cannot meet the industry's requirements for high capacity, high production efficiency and short production cycles. In addition, the approach places extreme demands on shoe placement: since the robot's trajectory is fixed, it cannot adapt to the actual position of each shoe, so placement deviations easily lead to misapplied glue. Robotic gluing is also a common process in the production of other products. There is therefore a need for a method of automatically generating robot gluing trajectories.
Summary of the Invention
The purpose of the present invention is to overcome the shortcomings of the prior art by proposing a three-dimensional vision-based method for automatically generating robot gluing trajectories, which eliminates robot teaching and computes the priming/gluing trajectory in real time.
The basic technical idea of the method of the present invention is as follows: a three-dimensional camera first acquires a three-dimensional point cloud of the product to be glued for which a trajectory is needed; the acquired data are then processed with a set of parameters and corresponding calculations to recognize the contour of the product; finally, the robot gluing points and the pose of each point are computed automatically from the trajectory and pose settings required by the gluing process of the product. During automated production, the industrial computer communicates with the robot and sends it the gluing points and the pose of each point, and the robot applies the glue to the product using the received points and poses.
The three-dimensional vision-based method of the present invention for automatically generating a robot gluing trajectory comprises the following steps:
Step 1. Use a three-dimensional camera to acquire real-time three-dimensional point cloud model data of the product to be glued at the station.
Step 2. Set the three-dimensional image processing parameters and, according to these parameters, perform noise processing, point cloud clipping and filtering of unneeded point cloud data on the three-dimensional point cloud model data. In this step, camera parameters such as exposure time and confidence need to be adjusted.
Noise reduction: noise points are removed by statistical filtering, e.g. by discarding points whose number of neighbors within a specified neighborhood is below a set threshold.
Point cloud clipping: to reduce the point cloud of irrelevant environmental objects and thereby lower the computational load and improve overall efficiency, points outside the working area, i.e. the content of the corresponding region of the two-dimensional image, are removed.
Filtering: since the point cloud data contain the normal vector of each point and only the contour of the product to be glued is of interest, only points whose normal makes an angle with the vertical vector smaller than a set threshold are kept, further reducing the point count and the computational load and improving overall efficiency.
Step 3. Extract and compute the required contour data Up, Un, Ut, Ur of the product to be glued,
where Up is the set of extracted contour points of the product to be glued; Un is the set of normals corresponding to the contour points; Ut is the set of tangents corresponding to the contour points; Ur is the set of radial directions corresponding to the contour points.
Up, Un, Ut and Ur are the point set and its corresponding normal, tangential and radial sets, all obtained by computing on the three-dimensional model data of the product to be glued.
Normal calculation: fit a plane through the points within a specified neighborhood or a specified number of nearest points and compute the normal vector of the fitted plane; this plane normal is the normal vector of the point.
Tangential calculation: fit a spatial straight line through the point in question and the points within a specified neighborhood or a specified number of nearest points; the components of the fitted line in the x, y and z directions form the tangent vector of the point.
Radial calculation: project the tangent vector onto the two-dimensional plane; the radial vector in that plane is perpendicular to the tangent vector; when projected back into three-dimensional space, its x and y components are the same as in the two-dimensional plane and its z component is set to 0.
Each point in the sets is represented as A(x_a, y_a, z_a), where in Up, x_a, y_a and z_a are the spatial x, y and z coordinates of point A, and in Un, Ut and Ur, (x_a, y_a, z_a) represents a space vector.
The preliminary robot gluing trajectory point information is obtained from the three-dimensional contour data Up, Un, Ut, Ur.
Step 4. Set the robot pose offsets according to the gluing process of the product to be glued.
The robot pose offsets (pose offset parameters) are: Zoffset, the position offset in the spatial Z direction; AngleReal, the angular rotation about the tangential direction of the contour point of the product to be glued; Roffset, the position offset along the radial direction of the contour point; TCP_COffsetReal, the angular rotation of the TCP about the contour point of the product to be glued.
Step 5. Compute the gluing trajectory points and poses actually required by the robot.
The transformation between spatial coordinate systems is performed through matrix operations, as follows.
Step 5.1. Obtain a 3x3 rotation matrix frame from Un, Ut and Ur:
frame = [ Ut_x,  Ut_y,  Ut_z ]
        [ -Ur_x, -Ur_y, -Ur_z ]
        [ -Un_x, -Un_y, -Un_z ]
where Ut_x, Ut_y, Ut_z are the x, y, z values of the tangential data corresponding to the contour points of the product to be glued; Ur_x, Ur_y, Ur_z are the x, y, z values of the radial data; and Un_x, Un_y, Un_z are the x, y, z values of the normal data.
Step 5.2. Translate the robot's preliminary gluing trajectory points (not including the gluing process settings) under the 3x3 rotation matrix frame; the translation direction is the negative z axis and the translation distance is Zoffset, giving the new set of trajectory points Pt.
Step 5.3. Rotate the rotation matrix frame by 180° about the z axis and then by the angle AngleReal about the x axis to obtain a new rotation matrix frame1.
Step 5.4. From the rotation matrix frame1 and the trajectory point Pt, obtain the quaternion PtOnCam of the spatial point in the camera coordinate system:
PtOnCam = [ frame_11, frame_12, frame_13, Pt_x ]
          [ frame_21, frame_22, frame_23, Pt_y ]
          [ frame_31, frame_32, frame_33, Pt_z ]
          [ 0,        0,        0,        1    ]
where [frame_11, frame_12, frame_13], [frame_21, frame_22, frame_23] and [frame_31, frame_32, frame_33] are the rows of the rotation matrix and [Pt_x, Pt_y, Pt_z]^T is the translation vector.
Step 5.5. Transform the quaternion PtOnCam into the coordinate system calibrated for the robot (also called the user coordinate system) to obtain the quaternion PtOnRef.
Step 5.6. Transform the quaternion PtOnRef into the coordinate system of the robot in use (different robot manufacturers use different coordinate systems) to obtain the actually required orientation a, b, c of the gluing trajectory point.
Step 5.7. The gluing trajectory point information actually required by the robot is:
PosPt = [a + TCP_COffsetReal, b, c, PtOnCam_14, PtOnCam_24, PtOnCam_34]
where a, b, c are the orientation values required by the robot; PtOnCam_14, PtOnCam_24, PtOnCam_34 correspond to Pt_x, Pt_y, Pt_z in step 5.4, i.e. the spatial coordinates x, y, z required by the robot; and TCP_COffsetReal is the compensation value for angle a.
Through the communication link between the industrial computer and the robot/master control system, the sequence of gluing trajectory points PosPt is sent to the robot/master control system; the data format is the same as in step 5.7. The robot applies the glue to the product according to the received data, thereby eliminating manual teaching of the robot gluing trajectory and adapting automatically to the placement of the product to be glued.
The method of the invention computes the gluing trajectory automatically without manual robot teaching, saving labor costs, shortening the overall production cycle, meeting production efficiency requirements, and making process changes quick and convenient. Since the robot gluing trajectory does not have to be taught manually, preparation time before production is saved. The three-dimensional point cloud model data of the product to be glued are acquired in real time and their accuracy can be guaranteed by a high-precision three-dimensional camera, so the method accurately recognizes the contour and position of the product, whereas the methods currently used in this industry cannot adapt to the placement of the product, which increases the defective parts caused by misplaced glue due to placement deviations, along with rework, losses and wasted time.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the gluing system.
FIG. 2 is a schematic block diagram of the three-dimensional vision-based method of the present invention for automatically generating a robot gluing trajectory.
FIG. 3 is a schematic diagram of the establishment of the coordinate systems.
FIG. 4 is a schematic diagram of the robot pose offset parameters.
Detailed Description
The method of the present invention is described in further detail below with reference to the embodiment and the drawings.
Embodiment: this embodiment is an example in which a robot applies primer glue to shoes. FIG. 1 shows a device for implementing the method of the present invention, comprising a three-dimensional camera 1, a shoe to be primed 2, an industrial robot 3, an industrial computer 4 and a communication network cable 5.
The robot 3 is a general-purpose six-joint serial industrial robot with standard industrial robot functions: its coordinate systems include a joint coordinate system, a Cartesian coordinate system, a tool coordinate system and an external coordinate system; an external coordinate system can be established and set, and the user can establish a tool coordinate system with the four-point method.
The three-dimensional camera 1 takes pictures in real time and acquires three-dimensional point cloud data, which are transmitted in real time to the industrial computer 4 through the shielded communication network cable 5.
As shown in FIG. 3, an origin O, a point OX along the X direction and a point XY in the plane are selected on the same calibration sheet to establish the camera coordinate system and the robot coordinate system and make the two coincide.
When a shoe reaches the photographing position, a light curtain sensor sends an in-position signal to the industrial computer; on receiving the signal, the software triggers the camera and acquires real-time three-dimensional point cloud data of the shoe.
The gluing software applies the configured three-dimensional image processing parameters and performs noise reduction, point cloud clipping and filtering.
Noise reduction: noise points are removed by statistical filtering, e.g. by discarding points whose number of neighbors within a specified neighborhood is below a set threshold.
Point cloud clipping: to reduce the point cloud of irrelevant environmental objects and thereby lower the computational load and improve overall efficiency, points outside the working area, i.e. the content of the corresponding region of the two-dimensional image, are removed.
Filtering: since the point cloud data contain the normal vector of each point and only the contour of the shoe is of interest, only points whose normal makes an angle with the vertical vector smaller than a set threshold are kept, further reducing the point count and the computational load and improving overall efficiency.
The three-dimensional information of the shoe contour is extracted and computed: points Up, normals Un, tangents Ut and radial directions Ur.
Normal calculation: fit a plane through the points within a specified neighborhood or a specified number of nearest points and compute the normal vector of the fitted plane; this plane normal is the normal vector of the point.
Tangential calculation: fit a spatial straight line through the point in question and the points within a specified neighborhood or a specified number of nearest points; the components of the fitted line in the x, y and z directions form the tangent vector of the point.
Radial calculation: project the tangent vector onto the two-dimensional plane; the radial vector in that plane is perpendicular to the tangent vector; when projected back into three-dimensional space, its x and y components are the same as in the two-dimensional plane and its z component is set to 0.
The preliminary gluing trajectory point information is obtained from the three-dimensional contour data Up, Un, Ut, Ur.
The robot pose offsets are set according to the gluing process of this shoe style: Zoffset, AngleReal, Roffset, TCP_COffsetReal.
The gluing trajectory points and poses actually required by the robot are computed as follows.
A 3x3 rotation matrix frame is obtained from Un, Ut and Ur:
frame = [ Ut_x,  Ut_y,  Ut_z ]
        [ -Ur_x, -Ur_y, -Ur_z ]
        [ -Un_x, -Un_y, -Un_z ]
The robot's preliminary gluing trajectory points (not including the gluing process settings) are translated under the 3x3 rotation matrix frame; the translation direction is the negative z axis and the translation distance is Zoffset, giving the new set of trajectory points Pt.
The rotation matrix frame is rotated by 180° about the z axis and then by the angle AngleReal about the x axis to obtain a new rotation matrix frame1.
From the rotation matrix frame1 and the trajectory point Pt, the quaternion PtOnCam of the spatial point in the camera coordinate system is obtained:
PtOnCam = [ frame_11, frame_12, frame_13, Pt_x ]
          [ frame_21, frame_22, frame_23, Pt_y ]
          [ frame_31, frame_32, frame_33, Pt_z ]
          [ 0,        0,        0,        1    ]
The quaternion PtOnCam is transformed into the coordinate system calibrated for the robot (also called the user coordinate system) to obtain the quaternion PtOnRef.
The quaternion PtOnRef is transformed into the coordinate system of the robot in use (different robot manufacturers use different coordinate systems) to obtain the actually required orientation a, b, c of the gluing trajectory point.
The gluing trajectory point information actually required by the robot is:
PosPt = [a + TCP_COffsetReal, b, c, PtOnCam_14, PtOnCam_24, PtOnCam_34]
The gluing trajectory points PosPt are sent to the robot over the communication link, and the robot completes the priming and gluing of the shoe using the received trajectory points.

Claims (1)

  1. A three-dimensional vision-based method for automatically generating a robot gluing trajectory, comprising the following steps:
    Step 1. Use a three-dimensional camera to acquire real-time three-dimensional point cloud model data of the product to be glued at the treatment-agent/glue spraying station on the production line;
    Step 2. Set the three-dimensional image processing parameters and, according to these parameters, perform noise processing, point cloud clipping and filtering of unneeded point cloud data on the three-dimensional point cloud model data; in this step, camera parameters such as exposure time and confidence need to be adjusted;
    Noise reduction: noise points are removed by statistical filtering, e.g. by discarding points whose number of neighbors within a specified neighborhood is below a set threshold;
    Point cloud clipping: to reduce the point cloud of irrelevant environmental objects and thereby lower the computational load and improve overall efficiency, points outside the working area, i.e. the content of the corresponding region of the two-dimensional image, are removed;
    Filtering: since the point cloud data contain the normal vector of each point and only the contour of the product to be glued is of interest, only points whose normal makes an angle with the vertical vector smaller than a set threshold are kept, further reducing the point count and the computational load and improving overall efficiency;
    Step 3. Extract and compute the required contour data Up, Un, Ut, Ur of the product to be glued,
    where Up is the set of extracted contour points of the product to be glued; Un is the set of normals corresponding to the contour points; Ut is the set of tangents corresponding to the contour points; Ur is the set of radial directions corresponding to the contour points;
    Up, Un, Ut and Ur are the point set and its corresponding normal, tangential and radial sets, all obtained by computing on the three-dimensional model data of the product to be glued:
    Normal calculation: fit a plane through the points within a specified neighborhood or a specified number of nearest points and compute the normal vector of the fitted plane; this plane normal is the normal vector of the point;
    Tangential calculation: fit a spatial straight line through the point in question and the points within a specified neighborhood or a specified number of nearest points; the components of the fitted line in the x, y and z directions form the tangent vector of the point;
    Radial calculation: project the tangent vector onto the two-dimensional plane; the radial vector in that plane is perpendicular to the tangent vector; when projected back into three-dimensional space, its x and y components are the same as in the two-dimensional plane and its z component is set to 0;
    Each point in the sets is represented as A(x_a, y_a, z_a), where in Up, x_a, y_a and z_a are the spatial x, y and z coordinates of point A, and in Un, Ut and Ur, (x_a, y_a, z_a) represents a space vector;
    The gluing trajectory point data set is fitted with a space curve; the spraying start direction/position of the treatment agent/glue is then chosen according to the actual production line requirements as the first point of the robot trajectory, and the remaining points are ordered clockwise or counter-clockwise; the ordered set of spatial points is the preliminary gluing trajectory point data set (a sketch of this ordering step follows the claims);
    Step 4. Set the robot pose offsets according to the gluing process of the product to be glued;
    The robot pose offsets include: Zoffset, the position offset in the spatial Z direction; AngleReal, the angular rotation about the tangential direction of the contour point of the product to be glued; Roffset, the position offset along the radial direction of the contour point; TCP_COffsetReal, the angular rotation of the TCP about the contour point of the product to be glued;
    Step 5. Compute the gluing trajectory points and poses actually required by the robot;
    Step 5.1. Obtain a 3x3 rotation matrix frame from Un, Ut and Ur:
    frame = [ Ut_x,  Ut_y,  Ut_z ]
            [ -Ur_x, -Ur_y, -Ur_z ]
            [ -Un_x, -Un_y, -Un_z ]
    where Ut_x, Ut_y, Ut_z are the x, y, z values of the tangential data corresponding to the contour points of the product to be glued;
    Ur_x, Ur_y, Ur_z are the x, y, z values of the radial data corresponding to the contour points of the product to be glued;
    Un_x, Un_y, Un_z are the x, y, z values of the normal data corresponding to the contour points of the product to be glued;
    Step 5.2. Translate the robot's preliminary gluing trajectory points under the 3x3 rotation matrix frame; the translation direction is the negative z axis and the translation distance is Zoffset, giving the new set of trajectory points Pt;
    Step 5.3. Rotate the rotation matrix frame by 180° about the z axis and then by the angle AngleReal about the x axis to obtain a new rotation matrix frame1;
    Step 5.4. From the rotation matrix frame1 and the trajectory point Pt, obtain the quaternion PtOnCam of the spatial point in the camera coordinate system:
    PtOnCam = [ frame_11, frame_12, frame_13, Pt_x ]
              [ frame_21, frame_22, frame_23, Pt_y ]
              [ frame_31, frame_32, frame_33, Pt_z ]
              [ 0,        0,        0,        1    ]
    where [frame_11, frame_12, frame_13], [frame_21, frame_22, frame_23] and [frame_31, frame_32, frame_33] are the rows of the rotation matrix and [Pt_x, Pt_y, Pt_z]^T is the translation vector;
    Step 5.5. Transform the quaternion PtOnCam into the coordinate system calibrated for the robot to obtain the quaternion PtOnRef;
    Step 5.6. Transform the quaternion PtOnRef into the coordinate system of the robot in use to obtain the actually required orientation a, b, c of the gluing trajectory point;
    Step 5.7. The gluing trajectory point information actually required by the robot is:
    PosPt = [a + TCP_COffsetReal, b, c, PtOnCam_14, PtOnCam_24, PtOnCam_34]
    where a, b, c are the orientation values required by the robot; PtOnCam_14, PtOnCam_24, PtOnCam_34 correspond to Pt_x, Pt_y, Pt_z in step 5.4, i.e. the spatial coordinates x, y, z required by the robot; and TCP_COffsetReal is the compensation value for angle a.
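The trajectory-ordering step in the claim (fit a curve through the contour points, pick a spraying start direction, then order the remaining points clockwise or counter-clockwise) can be sketched as below; ordering by polar angle about the centroid is an illustrative assumption, since the claim states the result rather than the computation.

```python
import numpy as np

def order_trajectory(points, start_direction=(1.0, 0.0), clockwise=True):
    """Order contour points around their centroid, starting nearest to a chosen
    start direction, clockwise or counter-clockwise."""
    centroid = points[:, :2].mean(axis=0)
    angles = np.arctan2(points[:, 1] - centroid[1], points[:, 0] - centroid[0])
    start_angle = np.arctan2(start_direction[1], start_direction[0])
    ccw = (angles - start_angle) % (2.0 * np.pi)   # counter-clockwise angle from start
    key = (-ccw) % (2.0 * np.pi) if clockwise else ccw
    return points[np.argsort(key)]
```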
PCT/CN2019/086537 2018-12-27 2019-05-13 Three-dimensional vision-based production method by automatically calculating robot glue coating trajectory WO2020133873A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811609355.5A CN109454642B (en) 2018-12-27 2018-12-27 Robot gluing track automatic production method based on three-dimensional vision
CN201811609355.5 2018-12-27

Publications (1)

Publication Number Publication Date
WO2020133873A1 true WO2020133873A1 (en) 2020-07-02

Family

ID=65614993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/086537 WO2020133873A1 (en) 2018-12-27 2019-05-13 Three-dimensional vision-based production method by automatically calculating robot glue coating trajectory

Country Status (2)

Country Link
CN (1) CN109454642B (en)
WO (1) WO2020133873A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862704A (en) * 2021-01-22 2021-05-28 北京科技大学 Glue spraying and glue spraying quality detection system based on 3D vision
CN114193460A (en) * 2022-02-16 2022-03-18 常州铭赛机器人科技股份有限公司 Rubber road guiding and positioning method based on three-dimensional vision and Mark self-compensation
CN114373012A (en) * 2021-12-21 2022-04-19 中科新松有限公司 Method for generating special-shaped plane spraying operation track

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109454642B (en) * 2018-12-27 2021-08-17 南京埃克里得视觉技术有限公司 Robot gluing track automatic production method based on three-dimensional vision
CN110226806B (en) * 2019-05-07 2022-04-01 深圳市皕像科技有限公司 Sole gluing track generation method and device
CN110355759A (en) * 2019-07-05 2019-10-22 保定科海自动化科技有限公司 A kind of industrial robot gluing control system of view-based access control model
CN110434671B (en) * 2019-07-25 2020-04-24 王东 Cast member surface machining track calibration method based on characteristic measurement
CN112296999B (en) * 2019-11-12 2022-07-08 太原科技大学 Irregular workpiece machining path generation method based on machine vision
CN110876512B (en) * 2019-11-13 2024-01-12 广东工业大学 Control method of high-precision automatic gluing system for soles
CN111192189A (en) * 2019-12-27 2020-05-22 中铭谷智能机器人(广东)有限公司 Three-dimensional automatic detection method and system for automobile appearance
CN111230862B (en) * 2020-01-10 2021-05-04 上海发那科机器人有限公司 Handheld workpiece deburring method and system based on visual recognition function
CN111055286B (en) * 2020-01-13 2021-08-03 广州启帆工业机器人有限公司 Industrial robot track generation method, system, device and storage medium
CN111369593B (en) * 2020-03-16 2024-01-09 梅卡曼德(北京)机器人科技有限公司 Glass gluing method, device, electronic equipment and storage medium
CN111546337B (en) * 2020-04-30 2022-02-11 重庆见芒信息技术咨询服务有限公司 Industrial robot full-coverage path generation method and system based on free-form surface
CN111702772B (en) * 2020-06-04 2022-07-12 浙江和生荣智能科技有限公司 Automatic upper surface guiding and gluing method and system
CN114274139B (en) * 2020-09-27 2024-04-19 西门子股份公司 Automatic spraying method, device, system and storage medium
CN112297007B (en) * 2020-10-22 2021-10-26 南京埃斯顿自动化股份有限公司 Linear motion planning method under external reference coordinate system of robot
CN112415949A (en) * 2020-10-29 2021-02-26 深圳群宾精密工业有限公司 Method for automatically adjusting operation track through three-dimensional shape information of real object
CN112604901B (en) * 2020-10-30 2022-04-01 江苏天艾美自动化科技有限公司 Subway shielding door gluing robot and gluing method thereof
CN112465767A (en) * 2020-11-25 2021-03-09 南京熊猫电子股份有限公司 Industrial robot sole gluing track extraction method
CN112767237B (en) * 2020-12-30 2024-06-25 无锡祥生医疗科技股份有限公司 Annular pose control method and device based on point cloud data and ultrasonic equipment
CN113303564A (en) * 2021-04-30 2021-08-27 泉州华中科技大学智能制造研究院 Dynamic following glue spraying method and system for soles
CN113189934A (en) * 2021-05-11 2021-07-30 梅卡曼德(北京)机器人科技有限公司 Trajectory generation method and apparatus, electronic device, storage medium, and 3D camera
CN113910226A (en) * 2021-10-11 2022-01-11 深圳大学 Method and system for processing shoe body based on guiding robot execution of vision system
CN114178832B (en) * 2021-11-27 2023-03-24 南京埃斯顿机器人工程有限公司 Robot guide assembly robot method based on vision
CN114170314B (en) * 2021-12-07 2023-05-26 群滨智造科技(苏州)有限公司 Intelligent 3D vision processing-based 3D glasses process track execution method
CN114794668B (en) * 2022-03-31 2023-05-05 深圳市如本科技有限公司 Vamp gluing method, vamp gluing system, computer equipment and computer readable storage medium
CN114794669A (en) * 2022-03-31 2022-07-29 深圳市如本科技有限公司 Vamp gluing track generation method and system, computer equipment and storage medium
CN114747840B (en) * 2022-05-07 2023-10-20 东华大学 Method for adjusting sole gluing posture, storage device and sole gluing robot
CN114670352B (en) * 2022-05-26 2022-08-12 广东高景太阳能科技有限公司 Real-time automatic control silicon wafer production method, system, medium and equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201783437U (en) * 2010-06-29 2011-04-06 吴华 Integrated circuit loading machine gluing module
CN103272739A (en) * 2013-05-27 2013-09-04 武汉华卓奔腾科技有限公司 Three-dimensional positioning device based on visual guidance and dispensing equipment
CN106625713A (en) * 2017-01-11 2017-05-10 长春工业大学 Method of improving gumming accuracy of gumming industrial robot
WO2017133929A1 (en) * 2016-02-02 2017-08-10 Eisenmann Se Multi-axis robot and method for controlling the same for painting objects
CN108297097A (en) * 2018-01-19 2018-07-20 汽-大众汽车有限公司 A kind of body of a motor car paint spraying system and method
JP2018192551A (en) * 2017-05-16 2018-12-06 セイコーエプソン株式会社 Control device, robot, and robot system
US20180348730A1 (en) * 2017-06-01 2018-12-06 X Development Llc Automatic Generation of Toolpaths
CN109454642A (en) * 2018-12-27 2019-03-12 南京埃克里得视觉技术有限公司 Robot coating track automatic manufacturing method based on 3D vision

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463851B (en) * 2014-11-19 2018-05-22 哈尔滨工业大学深圳研究生院 A kind of sole edge line automatic tracking method based on robot
CN105894120B (en) * 2016-04-08 2021-09-24 泉州装备制造研究所 Sole glue spraying path planning method based on attitude control
IT201600132357A1 (en) * 2016-12-29 2018-06-29 Univ Degli Studi Padova METHOD AND THREE-DIMENSIONAL MAPPING EQUIPMENT OF A PORTION OF THE SKIN OF A PATIENT
CN107127755B (en) * 2017-05-12 2023-12-08 华南理工大学 Real-time acquisition device of three-dimensional point cloud and robot polishing track planning method
CN107908152A (en) * 2017-12-26 2018-04-13 苏州瀚华智造智能技术有限公司 A kind of movable robot automatic spray apparatus, control system and method
CN108982546B (en) * 2018-08-29 2020-06-23 燕山大学 Intelligent robot gluing quality detection system and method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201783437U (en) * 2010-06-29 2011-04-06 吴华 Integrated circuit loading machine gluing module
CN103272739A (en) * 2013-05-27 2013-09-04 武汉华卓奔腾科技有限公司 Three-dimensional positioning device based on visual guidance and dispensing equipment
WO2017133929A1 (en) * 2016-02-02 2017-08-10 Eisenmann Se Multi-axis robot and method for controlling the same for painting objects
CN106625713A (en) * 2017-01-11 2017-05-10 长春工业大学 Method of improving gumming accuracy of gumming industrial robot
JP2018192551A (en) * 2017-05-16 2018-12-06 セイコーエプソン株式会社 Control device, robot, and robot system
US20180348730A1 (en) * 2017-06-01 2018-12-06 X Development Llc Automatic Generation of Toolpaths
CN108297097A (en) * 2018-01-19 2018-07-20 汽-大众汽车有限公司 A kind of body of a motor car paint spraying system and method
CN109454642A (en) * 2018-12-27 2019-03-12 南京埃克里得视觉技术有限公司 Robot coating track automatic manufacturing method based on 3D vision

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862704A (en) * 2021-01-22 2021-05-28 北京科技大学 Glue spraying and glue spraying quality detection system based on 3D vision
CN112862704B (en) * 2021-01-22 2023-08-11 北京科技大学 Glue spraying and glue spraying quality detection system based on 3D vision
CN114373012A (en) * 2021-12-21 2022-04-19 中科新松有限公司 Method for generating special-shaped plane spraying operation track
CN114193460A (en) * 2022-02-16 2022-03-18 常州铭赛机器人科技股份有限公司 Rubber road guiding and positioning method based on three-dimensional vision and Mark self-compensation
CN114193460B (en) * 2022-02-16 2022-05-17 常州铭赛机器人科技股份有限公司 Rubber road guiding and positioning method based on three-dimensional vision and Mark self-compensation

Also Published As

Publication number Publication date
CN109454642A (en) 2019-03-12
CN109454642B (en) 2021-08-17

Similar Documents

Publication Publication Date Title
WO2020133873A1 (en) Three-dimensional vision-based production method by automatically calculating robot glue coating trajectory
CN110102490B (en) Assembly line parcel sorting device based on vision technology and electronic equipment
WO2017128865A1 (en) Multiple lens-based smart mechanical arm and positioning and assembly method
CN107192331A (en) A kind of workpiece grabbing method based on binocular vision
CN109719734B (en) Robot vision-guided mobile phone flashlight assembling system and assembling method
CN110065068B (en) Robot assembly operation demonstration programming method and device based on reverse engineering
US20150202776A1 (en) Data generation device for vision sensor and detection simulation system
CN105965519A (en) Vision-guided discharging positioning method of clutch
CN111721259A (en) Underwater robot recovery positioning method based on binocular vision
WO2022000713A1 (en) Augmented reality self-positioning method based on aviation assembly
JP7449241B2 (en) Shoe processing system and control method for shoe processing system
CN115578376B (en) Robot vamp glue spraying track extraction method and device based on 3D vision
JP2016078195A (en) Robot system, robot, control device and control method of robot
CN110385889A (en) A kind of device and method of dynamic attachment vision-based detection and correction
CN110749290A (en) Three-dimensional projection-based characteristic information rapid positioning method
CN110743735A (en) Intelligent spraying system based on robot platform
CN111702772A (en) Automatic upper surface guiding and gluing method and system
CN203519505U (en) Combined non-shadow non-blind online vision detection system for robot gumming quality
CN113070876A (en) Manipulator dispensing path guiding and deviation rectifying method based on 3D vision
CN112356073A (en) Online calibration device and method for three-dimensional camera pose of industrial robot
Prezas et al. AI-enhanced vision system for dispensing process monitoring and quality control in manufacturing of large parts
CN114463244A (en) Vision robot grabbing system and control method thereof
CN108582037B (en) Method for realizing precise fitting by matching two cameras with robot
CN114833038A (en) Gluing path planning method and system
CN110142748A (en) A kind of quick teaching system of robot suitable for spraying welding profession and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19901748

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19901748

Country of ref document: EP

Kind code of ref document: A1