CN1672881A - On-line robot hand and eye calibrating method based on motion selection


Info

Publication number
CN1672881A
CN1672881A
Authority
CN
China
Prior art keywords
motion
hand-eye
hand
time
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200510025252
Other languages
Chinese (zh)
Other versions
CN1330466C (en)
Inventor
石繁槐 (Shi Fanhuai)
王建华 (Wang Jianhua)
刘允才 (Liu Yuncai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CNB2005100252520A
Publication of CN1672881A
Application granted
Publication of CN1330466C
Expired - Fee Related
Anticipated expiration

Landscapes

  • Manipulator (AREA)

Abstract

The on-line robot hand-eye calibration method based on motion selection includes: first setting the minimum threshold of the angle between the rotation axes of two successive motions of the robot hand, the minimum threshold of the rotation angle of each motion of the robot hand, and the maximum threshold of the modulus of the translation component of each motion of the robot hand; then selecting two qualifying successive hand-eye motion pairs, starting from the first sampled hand-eye motion; and finally calculating the hand-eye transformation matrix with the Andreff linear algorithm from the selected motion pairs to complete one hand-eye calibration, with subsequent calibrations carried out continuously in the same way. The present invention can be widely used in robot three-dimensional vision measurement, visual servoing, tactile sensing, and similar applications.

Description

On-line robot hand-eye calibration method based on motion selection
Technical field
The present invention relates to an on-line robot hand-eye calibration method based on motion selection, which can be widely used in robot three-dimensional vision measurement, visual servoing, tactile perception, and similar applications. It belongs to the field of advanced manufacturing and automation.
Background technology
When computer vision is applied to robotics, the camera is often fixed on the end effector (robot hand) of the robot arm. Robot hand-eye calibration is the measurement of the relative position and orientation between the camera and the robot hand, and it is a fundamental problem in robotics. Most earlier methods solve this problem by iterative optimization of the homogeneous transformation equation AX=XB (Y.C. Shiu and S. Ahmad, "Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX=XB," IEEE Trans. Robot. Automat., vol. 5, pp. 16-29, Feb. 1989), where A denotes the motion of the robot hand, B the corresponding motion of the camera, and X the spatial transformation between the camera and the robot hand to be calibrated. Because iterative optimization is not a real-time computation, hand-eye calibration in this case can only be performed off-line. Angeles et al. (J. Angeles, G. Soucy and F.P. Ferrie, "The online solution of the hand-eye problem", IEEE Trans. Robot. Automat., vol. 16, pp. 720-731, Dec. 2000) and Andreff et al. (N. Andreff, R. Horaud, and B. Espiau, "Robot hand-eye calibration using structure-from-motion", Int. J. Robot. Res., 20(3): 228-248, 2001) proposed on-line hand-eye calibration techniques almost simultaneously, overcoming the inability of the conventional methods to calibrate in real time. The method of Angeles is based on the linear invariants of the rotation matrix and is implemented with recursive linear least squares. Andreff, inspired by the Sylvester equation, converts the hand-eye motion equation directly into linear form and solves it.
Whichever method is used for hand-eye calibration, computing the hand-eye relation requires at least two independent robot motions, and the rotation axes of the two motions must not be parallel. Therefore, when the collected hand-eye motion data contain degenerate cases, such as pure rotations or pure translations, the complete solution of the hand-eye transformation cannot be obtained. The shortcoming of the existing methods is that when the robot performs on-line hand-eye calibration in the middle of its working process, the robot's motion is determined by the specific application and is not designed for hand-eye calibration, so the collected motion data used for calibration are likely to contain degenerate cases. In addition, when the rotation angle of a sampled motion is very small, its translation is large, or the angle between the rotation axes of two motions is very small, large calibration errors are produced. Any of these situations can make the existing algorithms work incorrectly or fail outright.
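The degenerate cases above can be checked numerically. In the linear formulation of Andreff referred to later, the rotation equation Ra·Rx = Rx·Rb contributes the coefficient block (I3 ⊗ Ra − Rb^T ⊗ I3) acting on the column-stacked vec(Rx). The following sketch (illustrative only, assuming NumPy; not part of the patent text) shows how the rank of this block collapses in exactly the situations this section warns about:

```python
import numpy as np

def rot_constraint(Ra, Rb):
    """Coefficient block of the rotation equation Ra @ Rx = Rx @ Rb,
    i.e. (I3 kron Ra - Rb^T kron I3) acting on the column-stacked vec(Rx)."""
    I3 = np.eye(3)
    return np.kron(I3, Ra) - np.kron(Rb.T, I3)

I3 = np.eye(3)
Rz90 = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])   # 90 deg about z
Rx90 = np.array([[1., 0., 0.], [0., 0., -1.], [0., 1., 0.]])   # 90 deg about x

# Pure translation: both rotation parts are the identity, no constraint survives.
print(np.linalg.matrix_rank(rot_constraint(I3, I3)))            # 0
# One proper rotation: 6 constraints on the 9 entries of Rx, 3-dim ambiguity.
print(np.linalg.matrix_rank(rot_constraint(Rz90, Rz90)))        # 6
# Two rotations about non-parallel axes: only the scale of vec(Rx) stays free.
print(np.linalg.matrix_rank(np.vstack([rot_constraint(Rz90, Rz90),
                                       rot_constraint(Rx90, Rx90)])))  # 8
```

A pure translation contributes no rotation constraint at all, a single proper rotation still leaves a 3-dimensional ambiguity, and only two motions with non-parallel axes reduce the ambiguity to the overall scale of vec(Rx), which the translation part of the equations then fixes.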
Summary of the invention
The object of the present invention is to address the deficiencies and shortcomings of the prior art. Making full use of the hand-eye motion data already sampled, an on-line hand-eye calibration method based on motion selection is proposed that removes the influence of degenerate hand-eye motions on the normal computation of the hand-eye transformation and, at the same time, reduces the computational error and improves the robot hand-eye calibration accuracy through measures such as avoiding small-angle rotations.
Technical scheme of the present invention: first, set the three motion-selection thresholds, namely the minimum threshold α of the angle between the rotation axes of two successive motions of the robot hand, the minimum threshold β of the rotation angle of each motion of the robot hand, and the maximum threshold d of the modulus of the translation component of each motion of the robot hand. Then, starting from the first sampled hand-eye motion, select two qualifying hand-eye motion pairs in turn, and finally compute the hand-eye transformation with the Andreff linear algorithm from the two selected pairs. For the next calibration, the second motion pair used in the previous calibration becomes the first motion pair of the new calibration, another pair is selected by searching the subsequently sampled motion data, and a new hand-eye calibration is then computed. Repeating this cycle, on-line hand-eye calibration of the robot can be carried out continuously.
The on-line hand-eye calibration method of the present invention mainly comprises the following steps:
1. Set the three motion-selection thresholds: the minimum threshold α of the angle between the rotation axes of two successive motions of the robot hand, the minimum threshold β of the rotation angle of each motion of the robot hand, and the maximum threshold d of the modulus of the translation component of each motion of the robot hand.
2. Search for the first hand-eye motion pair (A′, B′) satisfying the threshold conditions. First, a hand-eye motion pair (A′, B′) is computed from the first and second sampled hand-eye poses. If the rotation angle of A′ is greater than or equal to β and the modulus of the translation component of A′ is less than or equal to d, the first qualifying motion pair (A′, B′) is obtained. Otherwise, a new pair (A′, B′) is computed from the first and third sampled hand-eye poses, and the rotation angle and translation modulus of A′ are tested again. This cycle continues until a qualifying first pair (A′, B′) is found; suppose it is computed from the first and the (i+1)-th sampled hand-eye poses.
3. Search for the second hand-eye motion pair (A″, B″) satisfying the threshold conditions. First, a hand-eye motion pair (A″, B″) is computed from the (i+1)-th and (i+2)-th sampled hand-eye poses. If the rotation angle of A″ is greater than or equal to β, the modulus of the translation component of A″ is less than or equal to d, and the angle between the rotation axes of A′ and A″ is greater than or equal to α, the second qualifying motion pair (A″, B″) is obtained. Otherwise, a new pair (A″, B″) is computed from the (i+1)-th and (i+3)-th sampled poses and the conditions are tested again. This cycle continues until a qualifying second pair (A″, B″) is found.
4. Carry out hand-eye calibration with the two motion pairs (A′, B′) and (A″, B″): write out the Andreff linear equation from (A′, B′) and (A″, B″), solve it to obtain the hand-eye transformation matrix, and so complete one hand-eye calibration.
5. To continue with the next calibration, take the pair (A″, B″) used in the previous calibration as the new (A′, B′), repeat step 3 to search for a new (A″, B″) in the subsequently sampled data, and repeat step 4 to carry out the new hand-eye calibration.
In practical application, motion selection and hand-eye calibration are carried out automatically by software according to the three preset thresholds. The hand-eye calibration method proposed by the invention not only prevents degenerate motions occurring in the sampled data from disturbing the normal solution of the hand-eye calibration, but also avoids the large calibration error caused by motions with small rotation angles, thereby improving calibration accuracy.
The on-line hand-eye calibration method proposed by the present invention can be widely used in robot three-dimensional vision measurement, visual servoing, tactile perception, and similar applications, and has considerable practical value.
Description of drawings
Fig. 1 is a schematic diagram of the robot hand-eye calibration model of the present invention.
Fig. 2 is a schematic diagram of the motion selection algorithm used for on-line hand-eye calibration in the present invention.
The specific embodiment
To explain the technical scheme of the present invention better, it is described in further detail below with reference to the drawings and an embodiment.
1. Set the three motion-selection thresholds: the minimum threshold α of the angle between the rotation axes of two successive motions of the robot hand, the minimum threshold β of the rotation angle of each motion of the robot hand, and the maximum threshold d of the modulus of the translation component of each motion of the robot hand. In general, α = β = 30° and d = 100 are chosen. Let P_i and Q_i be the homogeneous matrices of the camera pose and the gripper pose at the i-th sampling (i a natural number), and let X be the homogeneous matrix of the hand-eye transformation to be solved.
2. Search for the first qualifying hand-eye motion pair (A′, B′). First, the homogeneous matrix A_1 of the gripper motion and the homogeneous matrix B_1 of the camera motion are obtained from the first two sampled hand-eye poses. As shown in Fig. 1, P_1 and P_2 are the homogeneous matrices of the camera pose relative to the calibration reference at times 1 and 2, and Q_1 and Q_2 are the homogeneous matrices of the gripper pose in the robot base frame at times 1 and 2; then:

A_1 = Q_1^(-1) Q_2,  B_1 = P_1^(-1) P_2

Let (A′, B′) = (A_1, B_1). If the rotation angle of A′ is greater than or equal to β and the modulus of the translation component of A′ is less than or equal to d, the current pair (A′, B′) satisfies the given threshold conditions; otherwise a new pair (A′, B′) is computed from the first and third sampled hand-eye poses, and the rotation angle and translation modulus of A′ are tested again. This cycle continues until a qualifying pair is found as (A′, B′); suppose that at this point (A′, B′) is computed from the first and the (i+1)-th sampled hand-eye poses (i.e. after i hand-eye motions). Fig. 2 shows the complete procedure for finding the first qualifying hand-eye motion pair.
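The quantities tested in step 2 can be computed directly from the sampled homogeneous pose matrices. A minimal sketch (helper names are illustrative, assuming NumPy; angles in degrees to match α = β = 30°):

```python
import numpy as np

def relative_motion(T1, T2):
    """Relative motion between two sampled poses, e.g. A_1 = Q_1^(-1) Q_2."""
    return np.linalg.inv(T1) @ T2

def rotation_angle_deg(T):
    """Rotation angle (degrees) of the rotation part of a homogeneous matrix,
    using trace(R) = 1 + 2 cos(theta)."""
    R = T[:3, :3]
    c = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return np.degrees(np.arccos(c))

def translation_norm(T):
    """Modulus of the translation component of a homogeneous matrix."""
    return np.linalg.norm(T[:3, 3])

def first_pair_ok(A, beta=30.0, d=100.0):
    """Step-2 test: rotation angle of A' >= beta and translation modulus <= d."""
    return rotation_angle_deg(A) >= beta and translation_norm(A) <= d
```

The translation modulus is the Euclidean norm of the last column, so the threshold d is expressed in the same length units as the sampled robot poses (d = 100 in the embodiment above).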
3. Starting from the (i+1)-th sampled hand-eye pose (i.e. after the i-th motion), search for the second qualifying hand-eye motion pair (A″, B″), as shown in Fig. 2. First, the homogeneous matrix A_{i+1} of the gripper motion and the homogeneous matrix B_{i+1} of the camera motion are obtained from the (i+1)-th and (i+2)-th hand-eye poses:

A_{i+1} = Q_{i+1}^(-1) Q_{i+2},  B_{i+1} = P_{i+1}^(-1) P_{i+2}

Let (A″, B″) = (A_{i+1}, B_{i+1}). If the rotation angle of A″ is greater than or equal to β, the modulus of the translation component of A″ is less than or equal to d, and the angle between the rotation axes of A′ and A″ is greater than or equal to α, the current pair (A″, B″) satisfies the given threshold conditions; otherwise a new pair (A″, B″) is computed from the (i+1)-th and (i+3)-th sampled poses and the conditions are tested again. This cycle continues until a qualifying pair (A″, B″) is found.
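The additional condition of step 3, the angle between the rotation axes of A′ and A″, can be read off the skew-symmetric part of each rotation matrix. A minimal sketch (illustrative helper names, assuming NumPy; valid only for non-degenerate rotation angles, which the β threshold already guarantees):

```python
import numpy as np

def rotation_axis(T):
    """Unit rotation axis from the skew-symmetric part of the rotation matrix.
    Undefined for rotation angles of 0 or 180 degrees (excluded here by beta)."""
    R = np.asarray(T)[:3, :3]
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def axis_angle_deg(A1, A2):
    """Angle (degrees) between the rotation axes of two motions. The absolute
    value treats antiparallel axes as parallel, which is the degenerate case."""
    c = np.clip(abs(np.dot(rotation_axis(A1), rotation_axis(A2))), 0.0, 1.0)
    return np.degrees(np.arccos(c))
```

The step-3 test then requires axis_angle_deg(A′, A″) ≥ α in addition to the rotation-angle and translation-modulus tests of step 2.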
4. Carry out hand-eye calibration with the two motion pairs (A′, B′) and (A″, B″) selected in steps 2 and 3. Write out the Andreff hand-eye calibration linear equation from (A′, B′) and (A″, B″), solve this equation to obtain the hand-eye transformation matrix X, and so complete one hand-eye calibration.
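The Andreff linear equation of step 4 can be assembled and solved as follows. This is a sketch under stated assumptions (column-stacked vec(·), NumPy least-squares solve, SVD re-orthonormalization of the estimated rotation), not the patent's verbatim implementation:

```python
import numpy as np

def andreff_solve(pairs):
    """Linear hand-eye solve in the spirit of Andreff et al. (2001).
    AX = XB gives  Ra Rx = Rx Rb  and  Ra tx + ta = Rx tb + tx,
    rewritten as one linear system in vec(Rx) (column-stacked) and tx."""
    I3 = np.eye(3)
    blocks, rhs = [], []
    for A, B in pairs:
        Ra, ta = A[:3, :3], A[:3, 3]
        Rb, tb = B[:3, :3], B[:3, 3]
        # Rotation rows: (I3 kron Ra - Rb^T kron I3) vec(Rx) = 0
        rot = np.hstack([np.kron(I3, Ra) - np.kron(Rb.T, I3), np.zeros((9, 3))])
        # Translation rows: (tb^T kron I3) vec(Rx) + (I3 - Ra) tx = ta
        tra = np.hstack([np.kron(tb.reshape(1, 3), I3), I3 - Ra])
        blocks += [rot, tra]
        rhs += [np.zeros(9), ta]
    M, b = np.vstack(blocks), np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(M, b, rcond=None)
    Rx = x[:9].reshape(3, 3, order="F")   # undo column-stacking
    tx = x[9:]
    # Project the estimated rotation back onto SO(3)
    U, _, Vt = np.linalg.svd(Rx)
    Rx = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

With the two selected pairs, X = andreff_solve([(A1, B1), (A2, B2)]). Two pairs whose rotation axes are not parallel make the system uniquely solvable, which is precisely why the threshold α is imposed in step 3.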
5. If calibration is to continue, take the pair (A″, B″) used in the previous calibration as the new (A′, B′), repeat step 3 to search the subsequently sampled hand-eye pose data for a new pair (A″, B″), and repeat step 4 to carry out the new hand-eye calibration.

Claims (1)

1. An on-line robot hand-eye calibration method based on motion selection, characterized by comprising the following concrete steps:
1) setting the three motion-selection thresholds: the minimum threshold α of the angle between the rotation axes of two successive motions of the robot hand, the minimum threshold β of the rotation angle of each motion of the robot hand, and the maximum threshold d of the modulus of the translation component of each motion of the robot hand;
2) searching for the first qualifying hand-eye motion pair (A′, B′): computing a hand-eye motion pair (A′, B′) from the first and second sampled hand-eye poses; if the rotation angle of A′ is greater than or equal to β and the modulus of the translation component of A′ is less than or equal to d, (A′, B′) is considered found; otherwise computing a new pair (A′, B′) from the first and third sampled hand-eye poses and testing whether the rotation angle and translation modulus of A′ satisfy the set conditions; repeating this cycle until a qualifying pair is found as (A′, B′), which is assumed at this point to be computed from the first and the (i+1)-th sampled hand-eye poses;
3) searching for the second qualifying hand-eye motion pair (A″, B″): computing a hand-eye motion pair (A″, B″) from the (i+1)-th and (i+2)-th sampled hand-eye poses; if the rotation angle of A″ is greater than or equal to β, the modulus of the translation component of A″ is less than or equal to d, and the angle between the rotation axes of A′ and A″ is greater than or equal to α, (A″, B″) is considered found; otherwise computing a new pair (A″, B″) from the (i+1)-th and (i+3)-th sampled poses and testing the conditions again; repeating this cycle until a qualifying pair (A″, B″) is found;
4) carrying out hand-eye calibration with the two motion pairs (A′, B′) and (A″, B″): writing out the Andreff linear equation from (A′, B′) and (A″, B″), solving this equation to obtain the hand-eye transformation matrix, and completing one hand-eye calibration;
5) to continue calibration, taking the pair (A″, B″) used in the previous calibration as the new (A′, B′), repeating step 3) to search the subsequently sampled data for a new pair (A″, B″), and repeating step 4) to carry out the new hand-eye calibration.
CNB2005100252520A 2005-04-21 2005-04-21 On-line robot hand and eye calibrating method based on motion selection Expired - Fee Related CN1330466C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2005100252520A CN1330466C (en) 2005-04-21 2005-04-21 On-line robot hand and eye calibrating method based on motion selection

Publications (2)

Publication Number Publication Date
CN1672881A (en) 2005-09-28
CN1330466C CN1330466C (en) 2007-08-08

Family

ID=35045785

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005100252520A Expired - Fee Related CN1330466C (en) 2005-04-21 2005-04-21 On-line robot hand and eye calibrating method based on motion selection

Country Status (1)

Country Link
CN (1) CN1330466C (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100408279C (en) * 2006-06-26 2008-08-06 北京航空航天大学 Robot foot-eye calibration method and device
CN103925879A (en) * 2014-04-24 2014-07-16 中国科学院合肥物质科学研究院 Indoor robot vision hand-eye relation calibration method based on 3D image sensor
CN107993227A (en) * 2017-12-15 2018-05-04 深圳先进技术研究院 A kind of method and apparatus of acquisition 3D laparoscope trick matrixes
CN108413896A (en) * 2018-02-27 2018-08-17 博众精工科技股份有限公司 A kind of manipulator demarcating method
CN108942934A (en) * 2018-07-23 2018-12-07 珠海格力电器股份有限公司 Determine the method and device of hand and eye calibrating

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62214403A (en) * 1986-03-17 1987-09-21 Yaskawa Electric Mfg Co Ltd Calibration method of robot system with visual sensor
KR100468857B1 (en) * 2002-11-21 2005-01-29 삼성전자주식회사 Method for calibrating hand/eye using projective invariant shape descriptor for 2-dimensional shape
CN1292878C (en) * 2003-09-03 2007-01-03 中国科学院自动化研究所 Pickup camera self calibration method based on robot motion


Also Published As

Publication number Publication date
CN1330466C (en) 2007-08-08


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070808

Termination date: 20100421