CN110634164A - Quick calibration method for vision sensor - Google Patents

Quick calibration method for vision sensor

Info

Publication number
CN110634164A
CN110634164A
Authority
CN
China
Prior art keywords
robot
coordinate system
points
calibration
flange
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910981795.1A
Other languages
Chinese (zh)
Other versions
CN110634164B (en)
Inventor
刘海庆
赵素雷
郭寅
郭磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yi Si Si Hangzhou Technology Co ltd
Original Assignee
Isvision Hangzhou Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isvision Hangzhou Technology Co Ltd filed Critical Isvision Hangzhou Technology Co Ltd
Priority to CN201910981795.1A priority Critical patent/CN110634164B/en
Publication of CN110634164A publication Critical patent/CN110634164A/en
Application granted granted Critical
Publication of CN110634164B publication Critical patent/CN110634164B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a quick calibration method for a vision sensor whose position is fixed during use. The feature points used in the calibration are salient points on the robot flange or gripper, or manual marker points arranged on the robot flange or gripper. The calibration method comprises the following steps: the vision sensor acquires images of the feature point under n robot poses; the transformation matrix from the robot flange coordinate system to the base coordinate system is obtained from each robot pose; and the camera imaging equations of the feature point under the n poses are solved iteratively by a nonlinear least-squares method to obtain the coordinate transformation matrix from the robot base coordinate system to the camera coordinate system. The method needs no calibration plate, is suitable for various complex field environments, can complete calibration within half an hour, and reduces calibration time by at least 50%.

Description

Quick calibration method for vision sensor
Technical Field
The invention relates to the field of visual inspection, and in particular to a quick calibration method for a vision sensor.
Background
In industrial sites, vision sensors are increasingly introduced into robotic automation. A vision sensor mounted at a fixed position photographs an incoming workpiece whose pose varies within the sensor field of view, solves for its pose, and guides the robot to perform tasks such as accurate grasping. In this situation the vision sensor first needs eye-to-hand calibration, i.e., obtaining the relative pose between the vision sensor coordinate system and the robot base coordinate system. The traditional hand-eye calibration method is as follows: a checkerboard calibration plate is fixed to the end of the industrial robot with a dedicated fixture, the robot pose is adjusted so that the calibration plate takes different poses within the vision sensor's field of view, and the pose relation between the vision sensor and the robot base is then determined. The end of a field robot is usually fitted with an end effector (gripper, welding gun, glue gun, etc.), and mounting a calibration plate requires adapting a different fixture to each end effector on site; this process is time-consuming and seriously affects the production takt. Therefore, a simple and practical method for quickly calibrating the camera is needed in industrial field applications.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a quick calibration method for a vision sensor which requires no calibration plate, is suitable for various complex field environments, can complete calibration within half an hour, and reduces calibration time by at least 50%. The technical solution of the invention is as follows:
a visual sensor rapid calibration method, the position of the visual sensor is fixed in the using process; the characteristic points used in the calibration process are salient points on a robot flange or a gripper or manual marking points arranged on the robot flange or the gripper;
the calibration method comprises the following steps:
1) Adjust the robot to n different poses; at each pose, acquire an image of the feature point with the vision sensor and record the corresponding image coordinates (u_i, v_i), where i indicates that the robot is in the i-th pose, i is a natural number in the interval [1, n], and n ≥ 5 (a parameter count motivating this requirement is given below, after step 3); b denotes the robot base coordinate system;
2) From the robot pose readout at the i-th pose, obtain the transformation matrix ^bT_f^i from the robot flange coordinate system to the robot base coordinate system, and establish the camera imaging equation of the feature point for the i-th robot pose using the following formula, so that n camera imaging equations are obtained in total:

    s · [u_i, v_i, 1]^T = M · ^cT_b · ^bT_f^i · [x_f, y_f, z_f, 1]^T
where: s is a scale factor;
(u_i, v_i) are the coordinates of the feature point in the camera image coordinate system;
M is the camera intrinsic matrix, calibrated before the camera leaves the factory;
^cT_b is the coordinate transformation matrix from the robot base coordinate system to the camera coordinate system, i.e., the hand-eye relation to be solved;
(x_f, y_f, z_f) are the coordinates of the fixed feature point on the robot flange, expressed in the flange coordinate system;
3) Solve the n camera imaging equations established in step 2) iteratively by a nonlinear least-squares method to obtain ^cT_b.
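The requirement n ≥ 5 can be motivated by a simple parameter count, under the assumption (made here only for illustration) that the flange-frame coordinates (x_f, y_f, z_f) of the feature point are unknown and are estimated together with ^cT_b: after the scale factor s is eliminated by the perspective division, each pose contributes two equations, while the unknowns are the six parameters of ^cT_b plus the three coordinates of the feature point, so that

    2n ≥ 6 + 3 = 9,  hence n ≥ 5.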
To improve the accuracy of the result, when the image coordinates of the feature point recorded by the vision sensor are drawn on the same picture, the n feature points should be dispersed at different positions of the picture.
Further, n is 7, 8 or 9.
Further, the salient points on the robot flange or the gripper are screws, bulges, edges and corners, holes or grooves.
Furthermore, the manual mark points arranged on the robot flange or the gripper are light-reflecting mark points.
The quick calibration method for a vision sensor described above needs no calibration equipment such as a calibration plate or fixture; quick calibration of a fixed camera on an industrial site is achieved using a feature point fixed at the robot end, the method works in complex field environments, and the calibration time is kept within 30 min. Existing hand-eye calibration methods for a fixed camera need an externally arranged calibration plate or a tracker, take about 1 h, and require a matching tool fixture to mount the calibration plate. The method provided by the invention is therefore clearly superior to the existing methods.
Drawings
FIG. 1 is a distribution diagram of the feature point images collected by the method of the present invention, plotted on the same image.
Detailed Description
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
A quick calibration method for a vision sensor whose position is fixed during use. The vision sensor may be fixed anywhere other than on the robot carrying the feature point: it may be fixed on a work station or on another robot, as long as its position remains fixed during calibration. The feature points used in the calibration process are salient points on the robot flange or gripper, or manual marker points arranged on the robot flange or gripper; specifically, the salient points on the robot flange or gripper are screws, bulges, edges and corners, holes or grooves, and the manual marker points arranged on the robot flange or gripper are reflective marker points;
the calibration method comprises the following steps:
1) Adjust the robot to n different poses; at each pose, acquire an image of the feature point with the vision sensor and record the corresponding image coordinates (u_i, v_i), where i indicates that the robot is in the i-th pose, i is a natural number in the interval [1, n], and n ≥ 5, preferably 7, 8 or 9; b denotes the robot base coordinate system;
Preferably, when the image coordinates of the feature point recorded by the vision sensor are drawn on the same picture, the n feature points are dispersed at different positions of the picture; for example, the distribution of 7 feature points is shown in FIG. 1. This improves the accuracy of the result; a minimal visualization of such a check is sketched below.
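As an illustrative aid only (not part of the patented method), the recorded image coordinates can be overlaid on one picture to verify their dispersion; the file name and variable names in this Python sketch are assumptions:

```python
# Illustrative check of feature-point dispersion (file/variable names assumed).
import numpy as np
import matplotlib.pyplot as plt

uv = np.loadtxt("feature_points.txt")   # hypothetical n x 2 array of recorded (u_i, v_i)
plt.scatter(uv[:, 0], uv[:, 1], marker="+")
plt.gca().invert_yaxis()                # image convention: v increases downward
plt.xlabel("u (pixel)")
plt.ylabel("v (pixel)")
plt.title("Distribution of the recorded feature points")
plt.show()
```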
2) From the robot pose readout at the i-th pose, obtain the transformation matrix ^bT_f^i from the robot flange coordinate system to the robot base coordinate system, and establish the camera imaging equation of the feature point for the i-th robot pose using the following formula, so that n camera imaging equations are obtained in total:

    s · [u_i, v_i, 1]^T = M · ^cT_b · ^bT_f^i · [x_f, y_f, z_f, 1]^T
where: s is a scale factor;
(u_i, v_i) are the coordinates of the feature point in the camera image coordinate system;
M is the camera intrinsic matrix, calibrated before the camera leaves the factory;
^cT_b is the coordinate transformation matrix from the robot base coordinate system to the camera coordinate system, i.e., the hand-eye relation to be solved;
(x_f, y_f, z_f) are the coordinates of the fixed feature point on the robot flange, expressed in the flange coordinate system;
3) Solve the n camera imaging equations established in step 2) iteratively by a nonlinear least-squares method to obtain ^cT_b.
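By way of illustration only (not the patent's reference implementation), the following Python sketch shows one way to carry out step 3) with a generic nonlinear least-squares solver. It assumes the robot controller supplies the flange-to-base transforms ^bT_f^i as 4x4 matrices, that the camera intrinsic matrix M is known, and that the flange-frame coordinates (x_f, y_f, z_f) of the feature point are treated as additional unknowns estimated together with ^cT_b; the function names, the use of SciPy, and the initial guess are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, uv, T_bf, M):
    # params: rotation vector (3) and translation (3) of cTb, followed by the
    # feature point coordinates (x_f, y_f, z_f) in the flange frame.
    rvec, t, p_f = params[:3], params[3:6], params[6:9]
    T_cb = np.eye(4)
    T_cb[:3, :3] = Rotation.from_rotvec(rvec).as_matrix()
    T_cb[:3, 3] = t
    p_fh = np.append(p_f, 1.0)                # homogeneous feature point
    res = []
    for (u, v), T in zip(uv, T_bf):
        p_c = (T_cb @ T @ p_fh)[:3]           # feature point in the camera frame
        proj = M @ p_c                        # pinhole projection; scale s = proj[2]
        res.extend([proj[0] / proj[2] - u, proj[1] / proj[2] - v])
    return np.asarray(res)

def calibrate(uv, T_bf, M, x0=None):
    # uv: (n, 2) recorded image coordinates; T_bf: list of n 4x4 flange-to-base
    # transforms; M: 3x3 camera intrinsic matrix.
    if x0 is None:
        x0 = np.zeros(9)
        x0[5] = 1.0                           # crude initial guess for the translation
    sol = least_squares(residuals, x0, args=(uv, T_bf, M))
    T_cb = np.eye(4)
    T_cb[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T_cb[:3, 3] = sol.x[3:6]
    return T_cb, sol.x[6:9]                   # estimated cTb and flange-frame point
```

In this parameterization ^cT_b is represented by a rotation vector and a translation, so the nine unknowns match the 2n residuals (n ≥ 5) noted above; any other well-conditioned rotation parameterization could be used instead.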
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable others skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications thereof. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (5)

1. A quick calibration method for a vision sensor whose position is fixed during use, characterized in that: the feature points used in the calibration process are salient points on a robot flange or a gripper, or manual marker points arranged on the robot flange or the gripper;
the calibration method comprises the following steps:
1) Adjust the robot to n different poses; at each pose, acquire an image of the feature point with the vision sensor and record the corresponding image coordinates (u_i, v_i), where i indicates that the robot is in the i-th pose, i is a natural number in the interval [1, n], and n ≥ 5; b denotes the robot base coordinate system;
2) From the robot pose readout at the i-th pose, obtain the transformation matrix ^bT_f^i from the robot flange coordinate system to the robot base coordinate system, and establish the camera imaging equation of the feature point for the i-th robot pose using the following formula, so that n camera imaging equations are obtained in total:

    s · [u_i, v_i, 1]^T = M · ^cT_b · ^bT_f^i · [x_f, y_f, z_f, 1]^T
where: s is a scale factor;
(u_i, v_i) are the coordinates of the feature point in the camera image coordinate system;
M is the camera intrinsic matrix;
^cT_b is a coordinate transformation matrix from the robot base coordinate system to the camera coordinate system;
(x_f, y_f, z_f) are the coordinates of the fixed feature point on the robot flange, expressed in the flange coordinate system;
3) Solve the n camera imaging equations established in step 2) iteratively by a nonlinear least-squares method to obtain ^cT_b.
2. The method for rapidly calibrating a vision sensor according to claim 1, wherein: when the image coordinates of the feature points on the image acquired by the visual detection sensor are drawn on the same picture, the n feature points are dispersed at different positions of the picture.
3. The method for rapidly calibrating a vision sensor according to claim 1, wherein: n is 7, 8 or 9.
4. The method for rapidly calibrating a vision sensor according to claim 1, wherein: the salient points on the robot flange or the gripper are screws, bulges, edges and corners, holes or grooves.
5. The method for rapidly calibrating a vision sensor according to claim 1, wherein: the manual mark points arranged on the robot flange or the gripper are light-reflecting mark points.
CN201910981795.1A 2019-10-16 2019-10-16 Quick calibration method for vision sensor Active CN110634164B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910981795.1A CN110634164B (en) 2019-10-16 2019-10-16 Quick calibration method for vision sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910981795.1A CN110634164B (en) 2019-10-16 2019-10-16 Quick calibration method for vision sensor

Publications (2)

Publication Number Publication Date
CN110634164A true CN110634164A (en) 2019-12-31
CN110634164B CN110634164B (en) 2022-06-14

Family

ID=68975156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910981795.1A Active CN110634164B (en) 2019-10-16 2019-10-16 Quick calibration method for vision sensor

Country Status (1)

Country Link
CN (1) CN110634164B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111912337A (en) * 2020-07-24 2020-11-10 上海擎朗智能科技有限公司 Method, device, equipment and medium for determining robot posture information
CN112598752A (en) * 2020-12-24 2021-04-02 东莞市李群自动化技术有限公司 Calibration method based on visual identification and operation method
WO2021158773A1 (en) * 2020-02-06 2021-08-12 Berkshire Grey, Inc. Systems and methods for camera calibration with a fiducial of unknown position on an articulated arm of a programmable motion device
CN114918926A (en) * 2022-07-22 2022-08-19 杭州柳叶刀机器人有限公司 Mechanical arm visual registration method and device, control terminal and storage medium
CN116359891A (en) * 2023-06-01 2023-06-30 季华实验室 Multi-sensor rapid calibration method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108733082A (en) * 2017-04-25 2018-11-02 深圳市裕展精密科技有限公司 The calibration method of robot tooling center points
CN108748146A (en) * 2018-05-30 2018-11-06 武汉库柏特科技有限公司 A kind of Robotic Hand-Eye Calibration method and system
US20190084160A1 (en) * 2010-05-14 2019-03-21 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
CN109737871A (en) * 2018-12-29 2019-05-10 南方科技大学 A kind of scaling method of the relative position of three-dimension sensor and mechanical arm

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190084160A1 (en) * 2010-05-14 2019-03-21 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
CN108733082A (en) * 2017-04-25 2018-11-02 深圳市裕展精密科技有限公司 The calibration method of robot tooling center points
CN108748146A (en) * 2018-05-30 2018-11-06 武汉库柏特科技有限公司 A kind of Robotic Hand-Eye Calibration method and system
CN109737871A (en) * 2018-12-29 2019-05-10 南方科技大学 A kind of scaling method of the relative position of three-dimension sensor and mechanical arm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
谢则晓 et al.: "Industrial robot positioning *** based on structured-light vision guidance", 《光学学报》 (Acta Optica Sinica) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021158773A1 (en) * 2020-02-06 2021-08-12 Berkshire Grey, Inc. Systems and methods for camera calibration with a fiducial of unknown position on an articulated arm of a programmable motion device
US11826918B2 (en) 2020-02-06 2023-11-28 Berkshire Grey Operating Company, Inc. Systems and methods for camera calibration with a fiducial of unknown position on an articulated arm of a programmable motion device
CN111912337A (en) * 2020-07-24 2020-11-10 上海擎朗智能科技有限公司 Method, device, equipment and medium for determining robot posture information
US11644302B2 (en) 2020-07-24 2023-05-09 Keenon Robotics Co., Ltd. Method and apparatus for determining pose information of a robot, device and medium
CN112598752A (en) * 2020-12-24 2021-04-02 东莞市李群自动化技术有限公司 Calibration method based on visual identification and operation method
CN112598752B (en) * 2020-12-24 2024-02-27 东莞市李群自动化技术有限公司 Calibration method and operation method based on visual recognition
CN114918926A (en) * 2022-07-22 2022-08-19 杭州柳叶刀机器人有限公司 Mechanical arm visual registration method and device, control terminal and storage medium
CN114918926B (en) * 2022-07-22 2022-10-25 杭州柳叶刀机器人有限公司 Mechanical arm visual registration method and device, control terminal and storage medium
CN116359891A (en) * 2023-06-01 2023-06-30 季华实验室 Multi-sensor rapid calibration method and system
CN116359891B (en) * 2023-06-01 2023-09-12 季华实验室 Multi-sensor rapid calibration method and system

Also Published As

Publication number Publication date
CN110634164B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN110634164B (en) Quick calibration method for vision sensor
CN110103217B (en) Industrial robot hand-eye calibration method
CN108326850B (en) Method and system for robot to accurately move mechanical arm to reach specified position
CN109671122A (en) Trick camera calibration method and device
CN107942949B (en) A kind of lathe vision positioning method and system, lathe
CN106767393B (en) Hand-eye calibration device and method for robot
CN110238849B (en) Robot hand-eye calibration method and device
CN112894823B (en) Robot high-precision assembling method based on visual servo
CN110238820A (en) Hand and eye calibrating method based on characteristic point
CN110666805A (en) Industrial robot sorting method based on active vision
CN111452048B (en) Calibration method and device for relative spatial position relation of multiple robots
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN108748149B (en) Non-calibration mechanical arm grabbing method based on deep learning in complex environment
CN110148187A (en) A kind of the high-precision hand and eye calibrating method and system of SCARA manipulator Eye-in-Hand
EP1607194A3 (en) Robot system comprising a plurality of robots provided with means for calibrating their relative position
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN110136204B (en) Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera
CN107225882A (en) A kind of laser marking method based on CCD navigator fixs
CN111941425A (en) Rapid workpiece positioning method of robot milling system based on laser tracker and binocular camera
CN113601158A (en) Bolt feeding and pre-tightening system based on visual positioning and control method
CN109737871B (en) Calibration method for relative position of three-dimensional sensor and mechanical arm
Yang et al. Visual servoing of humanoid dual-arm robot with neural learning enhanced skill transferring control
CN111251189B (en) Visual positioning method for casting polishing
CN111815718B (en) Method for switching stations of industrial screw robot based on vision
CN110533727B (en) Robot self-positioning method based on single industrial camera

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee after: Yi Si Si (Hangzhou) Technology Co.,Ltd.

Address before: Room 495, building 3, 1197 Bin'an Road, Binjiang District, Hangzhou City, Zhejiang Province 310051

Patentee before: ISVISION (HANGZHOU) TECHNOLOGY Co.,Ltd.