CN103817699A - Quick hand-eye coordination method for industrial robot - Google Patents

Quick hand-eye coordination method for industrial robot

Info

Publication number
CN103817699A
CN103817699A (application CN201310449467.XA)
Authority
CN
China
Prior art keywords
camera
coordinate system
epsilon
robot
industrial robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310449467.XA
Other languages
Chinese (zh)
Inventor
胡峰俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shuren University
Original Assignee
Zhejiang Shuren University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shuren University filed Critical Zhejiang Shuren University
Priority to CN201310449467.XA priority Critical patent/CN103817699A/en
Publication of CN103817699A publication Critical patent/CN103817699A/en
Pending legal-status Critical Current

Links

Landscapes

  • Measurement Of Optical Distance (AREA)
  • Image Analysis (AREA)

Abstract

A quick hand-eye coordination method for an industrial robot comprises the following steps: (1) calibrating a depth camera with Zhang Zhengyou's calibration method to obtain its intrinsic and extrinsic parameters; (2) acquiring the depth of a detected target point by the triangulation principle of the depth camera; (3) obtaining the coordinates of the target point in the camera coordinate system through the camera imaging model; (4) describing the relation between the camera coordinate system and the robot world coordinate system with the Bursa model; and (5) solving the Bursa model parameters with an indirect adjustment model to obtain the hand-eye coordination parameters of the industrial robot. The method has the advantages that the intrinsic and extrinsic parameters of the depth camera are calculated quickly, the three-dimensional coordinates from the depth camera are extracted effectively, the hand-eye relation of the industrial robot is calculated quickly, and the operating method is simple and convenient.

Description

Quick hand-eye coordination method for industrial robot
Technical field
The present invention relates to the field of robot vision, and specifically to a robot hand-eye coordination method.
Background technology
Computer vision uses cameras in place of human eyes to identify, track, and measure targets. As one of the most active current research frontiers, robotics based on computer vision is among its key technologies. Vision provides a robot with rich environment and target information, and supplies the basis for the robot's judgment and decision-making. Robot hand-eye systems currently fall into two broad classes: Eye-to-Hand and Eye-in-Hand. In the former, the camera is fixed somewhere in the workspace, independent of the robot; in the latter, the camera is generally fixed on the robot end-effector.
For Eye-in-Hand coordination there are many different hand-eye calibration methods. Some researchers model the camera and the robot end-effector as a single whole, which makes the robot's calibration errors impossible to separate; others adopt the normal-flow method of the optical flow field, but computing the translation vector from two rotational motions greatly reduces the computational accuracy. In an Eye-to-Hand system there is a fixed homogeneous transformation between the camera coordinate system and the industrial robot coordinate system, and the imaging of the target object on the camera is independent of the robot's motion. Traditional Eye-to-Hand coordination computes depth from the disparity between left and right cameras; binocular stereo matching algorithms are still immature, the computation is heavy, and the depth error is large.
The defects of existing robot hand-eye coordination methods are therefore: the operating procedures are complex, the accuracy is generally low, and the computation is overly complicated.
Summary of the invention
To overcome the deficiencies of existing industrial robot hand-eye coordination methods, the present invention proposes a robot hand-eye coordination technique based on a depth camera and the Bursa model. Experiments show that this method is simple and convenient, and can effectively perform vision guidance tasks.
The technical solution adopted by the present invention comprises the following steps:
1) First, calibrate the depth camera with Zhang Zhengyou's calibration method to obtain its intrinsic and extrinsic parameters.
2) Obtain the depth of the detected target point by the triangulation principle of the depth camera, according to the formula:

D = b * f / d    (1)

where D is the depth; the depth camera comprises an "infrared camera" and an "infrared projector", which are placed horizontally; b is the length of the horizontal baseline between the "infrared camera" and the "infrared projector"; f is the focal length of the "infrared camera"; and d is the disparity between the two cameras.
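Formula (1) can be illustrated numerically as follows; the baseline, focal length, and disparity values below are illustrative assumptions, not values taken from the patent.

```python
# Formula (1): depth from the triangulation principle of the depth camera.
# D = b * f / d, with b the baseline, f the focal length (in pixels),
# and d the disparity between the two views.
def depth_from_disparity(b_mm: float, f_px: float, d_px: float) -> float:
    """Return depth D in the same units as the baseline b."""
    if d_px <= 0:
        raise ValueError("disparity must be positive")
    return b_mm * f_px / d_px

# e.g. a 75 mm baseline, 580 px focal length, and 29 px disparity
# give a depth of 1500 mm
```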
3) Obtain the coordinates of the detected target point in the camera coordinate system through the camera imaging model. A point with coordinates (X_C, Y_C, Z_C) in the camera coordinate system can be computed by the triangulation principle according to the formulas:

X_C = (a - a_0) * Z_C / f_x = (a - a_0) * D / f_x
Y_C = (b - b_0) * Z_C / f_y = (b - b_0) * D / f_y
Z_C = D    (2)

For a point (X_C, Y_C, Z_C) in the camera coordinate system, (a, b) is its projection in the imaging-plane pixel coordinate system and (x, y) is the corresponding point in the imaging-plane physical coordinate system; dx and dy are the physical distances covered by one pixel in the x and y directions, f_x = f/dx, f_y = f/dy, and (a_0, b_0) is the intersection of the camera optical axis with the imaging-plane pixel coordinate system.
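The back-projection in formula (2) can be sketched as below; the pixel coordinates and intrinsics in the example are illustrative assumptions, not values from the patent.

```python
# Formula (2): back-project a pixel (a, b) with measured depth D into the
# camera coordinate system, given focal lengths f_x, f_y (in pixels) and
# the principal point (a0, b0).
def pixel_to_camera(a, b, D, fx, fy, a0, b0):
    Xc = (a - a0) * D / fx
    Yc = (b - b0) * D / fy
    Zc = D
    return (Xc, Yc, Zc)

# e.g. pixel (420, 280) at depth 1000 mm, with fx = fy = 500 px and
# principal point (320, 240), maps to camera point (200, 80, 1000) mm
```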
4) Describe the relation between the camera coordinate system and the robot world coordinate system with the Bursa model:

    [X_W]           [X_C]   [  0    ε_z  -ε_y ] [X_C]   [Δx]
    [Y_W] = (1 + m) [Y_C] + [ -ε_z   0    ε_x ] [Y_C] + [Δy]    (3)
    [Z_W]           [Z_C]   [  ε_y  -ε_x   0  ] [Z_C]   [Δz]

where Δx, Δy, Δz are the translations between the two spatial rectangular coordinate systems, ε_x, ε_y, ε_z are the rotation parameters, m is the scale parameter, (X_C, Y_C, Z_C) is any camera-coordinate point obtained from formula (2), and (X_W, Y_W, Z_W) is the corresponding target coordinate in the robot coordinate system, obtained by manual teaching with the robot teach pendant.
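Applying formula (3) to a single point can be sketched as follows, assuming the small-angle (linearized) rotation form written above; all parameter values in the example are illustrative.

```python
import numpy as np

# Formula (3): the seven-parameter Bursa (Bursa-Wolf) transform mapping a
# camera-coordinate point to robot world coordinates. ex, ey, ez are the
# small rotation parameters, m the scale, (dx, dy, dz) the translation.
def bursa_transform(p_cam, m, ex, ey, ez, dx, dy, dz):
    p = np.asarray(p_cam, dtype=float)
    R = np.array([[0.0,  ez, -ey],
                  [-ez, 0.0,  ex],
                  [ey, -ex,  0.0]])   # skew-symmetric small-angle rotation part
    return (1.0 + m) * p + R @ p + np.array([dx, dy, dz])
```

With zero rotation and zero scale the transform reduces to a pure translation, which gives a quick sanity check.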
5) Solve the Bursa model parameters with an indirect adjustment model, thereby obtaining the hand-eye coordination parameters (Δx, Δy, Δz, ε_x, ε_y, ε_z, m) of the industrial robot.
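Because formula (3) is linear in the seven parameters, the indirect adjustment of step 5 reduces to ordinary linear least squares over the observed point pairs. The sketch below is an assumption about one reasonable implementation, not the patent's exact procedure:

```python
import numpy as np

# Step 5 (sketch): solve the Bursa parameters (dx, dy, dz, ex, ey, ez, m)
# from corresponding camera/robot point pairs by linear least squares.
# Each pair contributes the three equations of formula (3); three
# non-collinear points (9 equations, 7 unknowns) already overdetermine
# the system.
def solve_bursa(cam_pts, world_pts):
    A, L = [], []
    for (Xc, Yc, Zc), (Xw, Yw, Zw) in zip(cam_pts, world_pts):
        # unknown ordering: [dx, dy, dz, ex, ey, ez, m]
        A.append([1, 0, 0,   0, -Zc,  Yc, Xc]); L.append(Xw - Xc)
        A.append([0, 1, 0,  Zc,   0, -Xc, Yc]); L.append(Yw - Yc)
        A.append([0, 0, 1, -Yc,  Xc,   0, Zc]); L.append(Zw - Zc)
    params, *_ = np.linalg.lstsq(np.asarray(A, float),
                                 np.asarray(L, float), rcond=None)
    return params  # [dx, dy, dz, ex, ey, ez, m]
```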
The advantages of the present invention are: the intrinsic and extrinsic parameters of the depth camera are computed quickly; the three-dimensional coordinates from the depth camera are extracted effectively; the hand-eye relation of the industrial robot is computed quickly; and the operating method is simple and convenient.
Brief description of the drawings
Fig. 1 is a schematic diagram of the quick hand-eye coordination method for an industrial robot of the present invention.
In the drawing: 1 point one, 2 point two, 3 point three, 4 depth camera, 5 industrial robot, 6 operating platform.
The specific embodiment
The invention is further described below with reference to the accompanying drawing. As shown in Fig. 1, depth camera 4 is mounted about 1 meter above operating platform 6. Zhang Zhengyou's calibration method is used to obtain the intrinsic and extrinsic parameters of the depth camera, and the coordinates of point one, point two, and point three on the platform in the depth-camera coordinate system are computed through the triangulation principle and the camera imaging model. The coordinates of point one, point two, and point three in the robot coordinate system are obtained by manual teaching with the teach pendant of industrial robot 5. The camera-coordinate and robot-coordinate values of the three corresponding points are then substituted into the Bursa model, and the indirect adjustment model is used to solve the Bursa parameters, yielding the hand-eye coordination parameters (Δx, Δy, Δz, ε_x, ε_y, ε_z, m) of the industrial robot; the depth camera and the industrial robot can then work in coordination. To verify whether the proposed Bursa model is suitable for quick hand-eye calibration of an industrial robot, three points in the robot coordinate system and the corresponding three points in the camera coordinate system were selected, as shown in Table 1:
Table 1: three corresponding point coordinates in the robot coordinate system and the vision coordinate system
(Table 1 appears only as an image in the original publication.)
Error analysis over multiple selected points shows that the larger errors are concentrated in the x and y coordinate values. Averaging over the points, the mean point-to-point distance error d_average (mm) is:

d_average = (1.557 + 2.220 + 1.493 + 1.786 + 1.449 + 1.674 + 2.383 + 1.662) / 8 = 1.78
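The quoted mean can be checked directly; this is just the arithmetic over the eight per-point distances listed above.

```python
# Mean point-distance error d_average over the eight measured points (mm).
errors_mm = [1.557, 2.220, 1.493, 1.786, 1.449, 1.674, 2.383, 1.662]
d_average = sum(errors_mm) / len(errors_mm)  # about 1.78 mm
```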
The Bursa model therefore gives good results, and fully meets the requirements of robot functions such as welding and spraying.

Claims (1)

1. A quick hand-eye coordination method for an industrial robot, characterized in that the hand-eye coordination method comprises:
1) First, calibrate the depth camera with Zhang Zhengyou's calibration method to obtain its intrinsic and extrinsic parameters.
2) Obtain the depth of the detected target point by the triangulation principle of the depth camera, according to the formula:

D = b * f / d    (1)

where D is the depth; the depth camera comprises an "infrared camera" and an "infrared projector", which are placed horizontally; b is the length of the horizontal baseline between the "infrared camera" and the "infrared projector"; f is the focal length of the "infrared camera"; and d is the disparity between the two cameras.
3) Obtain the coordinates of the detected target point in the camera coordinate system through the camera imaging model. A point with coordinates (X_C, Y_C, Z_C) in the camera coordinate system can be computed by the triangulation principle according to the formulas:

X_C = (a - a_0) * Z_C / f_x = (a - a_0) * D / f_x
Y_C = (b - b_0) * Z_C / f_y = (b - b_0) * D / f_y
Z_C = D    (2)

For a point (X_C, Y_C, Z_C) in the camera coordinate system, (a, b) is its projection in the imaging-plane pixel coordinate system and (x, y) is the corresponding point in the imaging-plane physical coordinate system; dx and dy are the physical distances covered by one pixel in the x and y directions, f_x = f/dx, f_y = f/dy, and (a_0, b_0) is the intersection of the camera optical axis with the imaging-plane pixel coordinate system.
4) Describe the relation between the camera coordinate system and the robot world coordinate system with the Bursa model:

    [X_W]           [X_C]   [  0    ε_z  -ε_y ] [X_C]   [Δx]
    [Y_W] = (1 + m) [Y_C] + [ -ε_z   0    ε_x ] [Y_C] + [Δy]    (3)
    [Z_W]           [Z_C]   [  ε_y  -ε_x   0  ] [Z_C]   [Δz]

where Δx, Δy, Δz are the translations between the two spatial rectangular coordinate systems, ε_x, ε_y, ε_z are the rotation parameters, m is the scale parameter, (X_C, Y_C, Z_C) is any camera-coordinate point obtained from formula (2), and (X_W, Y_W, Z_W) is the corresponding target coordinate in the robot coordinate system, obtained by manual teaching with the robot teach pendant.
5) Solve the Bursa model parameters with an indirect adjustment model, thereby obtaining the hand-eye coordination parameters (Δx, Δy, Δz, ε_x, ε_y, ε_z, m) of the industrial robot.
CN201310449467.XA 2013-09-25 2013-09-25 Quick hand-eye coordination method for industrial robot Pending CN103817699A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310449467.XA CN103817699A (en) 2013-09-25 2013-09-25 Quick hand-eye coordination method for industrial robot


Publications (1)

Publication Number Publication Date
CN103817699A true CN103817699A (en) 2014-05-28

Family

ID=50753127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310449467.XA Pending CN103817699A (en) 2013-09-25 2013-09-25 Quick hand-eye coordination method for industrial robot

Country Status (1)

Country Link
CN (1) CN103817699A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101053953A (en) * 2004-07-15 2007-10-17 上海交通大学 Method for rapid calibrating hand-eye relationship of single eye vision sensor of welding robot
CN101186038A (en) * 2007-12-07 2008-05-28 北京航空航天大学 Method for demarcating robot stretching hand and eye
JP2009006452A (en) * 2007-06-29 2009-01-15 Nissan Motor Co Ltd Method for calibrating between camera and robot, and device therefor
JP2011093014A (en) * 2009-10-27 2011-05-12 Ihi Corp Control device of hand-eye bin picking robot
CN103175485A (en) * 2013-02-20 2013-06-26 天津工业大学 Method for visually calibrating aircraft turbine engine blade repair robot


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
刘小力: "Image Processing and Calibration Technology of Robot Visual Servo ***", Intelligent Manufacturing and Control Technology *
吕小莲: "Research on the Vision System of a Four-Degree-of-Freedom Tomato Picking Robot", China Doctoral Dissertations Full-text Database, Information Science and Technology *
潘国荣: "Comparison of Two Calculation Methods for Coordinate System Transformation", Journal of Geodesy and Geodynamics *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104626142A (en) * 2014-12-24 2015-05-20 镇江市计量检定测试中心 Method for automatically locating and moving binocular vision mechanical arm for weight testing
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
US10742961B2 (en) 2015-09-02 2020-08-11 Industrial Technology Research Institute Depth sensing apparatus with self-calibration and self-calibration method thereof
CN106488204A (en) * 2015-09-02 2017-03-08 财团法人工业技术研究院 Possess depth photographic attachment and the self-aligning method of self-aligning
CN106488204B (en) * 2015-09-02 2018-06-15 财团法人工业技术研究院 Have the depth camera of self-aligning and self-aligning method
CN105411681A (en) * 2015-12-22 2016-03-23 哈尔滨工业大学 Hand-eye coordination control system and method of split type minimally invasive surgery robot
CN105411681B (en) * 2015-12-22 2018-07-03 哈尔滨工业大学 The hand eye coordination control system and method for split type micro-wound operation robot
CN106248028A (en) * 2016-08-08 2016-12-21 苏州天准科技股份有限公司 Depth transducer scaling method based on linear movement platform and the device of correspondence
CN108010074A (en) * 2017-10-19 2018-05-08 宁波蓝圣智能科技有限公司 A kind of workpiece inspection method and system based on machine vision
CN108051002A (en) * 2017-12-04 2018-05-18 上海文什数据科技有限公司 Transport vehicle space-location method and system based on inertia measurement auxiliary vision
CN108381549A (en) * 2018-01-26 2018-08-10 广东三三智能科技有限公司 A kind of quick grasping means of binocular vision guided robot, device and storage medium
CN108381549B (en) * 2018-01-26 2021-12-14 广东三三智能科技有限公司 Binocular vision guide robot rapid grabbing method and device and storage medium
CN108942927A (en) * 2018-06-29 2018-12-07 齐鲁工业大学 A method of pixel coordinate and mechanical arm coordinate unification based on machine vision
CN108942934A (en) * 2018-07-23 2018-12-07 珠海格力电器股份有限公司 Determine the method and device of hand and eye calibrating
CN109318234A (en) * 2018-11-09 2019-02-12 哈尔滨工业大学 A kind of scaling method suitable for visual servo plug operation
CN110193826A (en) * 2019-02-22 2019-09-03 浙江树人学院(浙江树人大学) Industrial robot track following and motion planning method
CN110193826B (en) * 2019-02-22 2021-06-04 浙江树人学院(浙江树人大学) Industrial robot trajectory tracking and motion planning method
CN109938841A (en) * 2019-04-11 2019-06-28 哈尔滨理工大学 A kind of surgical instrument navigation system based on the fusion of more mesh camera coordinates
CN110342252A (en) * 2019-07-01 2019-10-18 芜湖启迪睿视信息技术有限公司 A kind of article automatically grabs method and automatic grabbing device
CN110342252B (en) * 2019-07-01 2024-06-04 河南启迪睿视智能科技有限公司 Automatic article grabbing method and automatic grabbing device
CN111452043A (en) * 2020-03-27 2020-07-28 陕西丝路机器人智能制造研究院有限公司 Method for calibrating hands and eyes of robot and industrial camera
CN111452043B (en) * 2020-03-27 2023-02-17 陕西丝路机器人智能制造研究院有限公司 Method for calibrating hands and eyes of robot and industrial camera

Similar Documents

Publication Publication Date Title
CN103817699A (en) Quick hand-eye coordination method for industrial robot
CN104626206B (en) The posture information measuring method of robot manipulating task under a kind of non-structure environment
CN103115613B (en) Three-dimensional space positioning method
CN105234943A (en) Industrial robot demonstration device and method based on visual recognition
CN1971206A (en) Calibration method for binocular vision sensor based on one-dimension target
Gratal et al. Visual servoing on unknown objects
CN102253057B (en) Endoscope system and measurement method using endoscope system
Li et al. 3D triangulation based extrinsic calibration between a stereo vision system and a LIDAR
Xin et al. 3D augmented reality teleoperated robot system based on dual vision
CN110415286A (en) A kind of outer ginseng scaling method of more flight time depth camera systems
CN114750154A (en) Dynamic target identification, positioning and grabbing method for distribution network live working robot
Yang et al. Visual servoing control of baxter robot arms with obstacle avoidance using kinematic redundancy
Li et al. Workpiece intelligent identification and positioning system based on binocular machine vision
Lu et al. Binocular stereo vision based on OpenCV
Ueno et al. An efficient method for human pointing estimation for robot interaction
Xu et al. A flexible 3D point reconstruction with homologous laser point array and monocular vision
Kitayama et al. 3D map construction based on structure from motion using stereo vision
Krishnan et al. Intelligent indoor mobile robot navigation using stereo vision
Meng et al. Extrinsic calibration of a camera with dual 2D laser range sensors for a mobile robot
Liu et al. A Robot 3D Grasping Application Based on Binocular Vision System
Mosnier et al. A New Method for Projector Calibration Based on Visual Servoing.
Jia et al. Depth Information Extraction of Seam Structure based on Visual Method
Zhu et al. Target Measurement Method Based on Sparse Disparity for Live Power Lines Maintaining Robot
Yang et al. An Automatic Laser Scanning System for Objects with Unknown Model
Castejón et al. Friendly interface to learn stereovision theory

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140528