CN109465830B - Robot monocular stereoscopic vision calibration system and method - Google Patents


Info

Publication number
CN109465830B
Authority
CN
China
Prior art keywords
module
camera
axis
coordinate
calibration
Prior art date
Legal status
Active
Application number
CN201811515700.9A
Other languages
Chinese (zh)
Other versions
CN109465830A (en)
Inventor
张建国
候慧敏
齐家坤
李颖
季甜甜
刘隽
陈维光
Current Assignee
Shanghai Institute of Technology
Original Assignee
Shanghai Institute of Technology
Priority date
Filing date
Publication date
Application filed by Shanghai Institute of Technology filed Critical Shanghai Institute of Technology
Priority to CN201811515700.9A priority Critical patent/CN109465830B/en
Publication of CN109465830A publication Critical patent/CN109465830A/en
Application granted granted Critical
Publication of CN109465830B publication Critical patent/CN109465830B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/1605 Simulation of manipulator lay-out, design, modelling of manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a robot monocular stereoscopic vision calibration system and method. The system comprises: an information acquisition module, a camera modeling module, a robot joint axis calibration module, a calibration feedback and fitting module, and a robot model establishing module. The calibration feedback and fitting module includes an actual deviation rate judging unit, an axis re-fitting unit and an Euler angle solving unit. The actual deviation rate judging unit calculates the deviation between the data collected by the high-definition camera and the data collected by the infrared remote sensor; the axis re-fitting unit corrects the data acquired by the high-definition camera; and the Euler angle solving unit calculates the projections of the actual value measured by the infrared remote sensor onto the three coordinate axes of the three-dimensional world coordinate system and computes the deviation rate. The dependence on the definition of the high-definition camera is thereby reduced, lowering equipment cost, improving calibration precision, and facilitating wide-scale deployment of the robot.

Description

Robot monocular stereoscopic vision calibration system and method
Technical Field
The invention relates to the technical field of intelligent robot positioning, in particular to a robot monocular stereoscopic vision calibration system and method.
Background
With the development of science and technology, industries increasingly adopt intelligent robots to carry out production activities in place of traditional manual operation, which saves human resources and greatly improves work efficiency. In the field of intelligent robots, an important link is the robot's vision calibration, whose accuracy directly determines the working precision and efficiency of the robot.
At present, robot vision calibration systems are divided into monocular and binocular types. Monocular vision calibration mainly adopts the kinematic loop method and the axis measurement method. The kinematic loop method obtains the pose of the robot end through measurement and then solves the kinematic equations of the robot to obtain its joint parameters; the axis measurement method abstracts each joint axis of the robot as a straight line in space and uses the geometric relationships among the joint axes to obtain the kinematic parameters of the model.
However, monocular vision calibration systems depend heavily on the definition of the camera, which greatly increases the cost of the equipment and hinders the wide-scale deployment of robots.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a robot monocular stereoscopic vision calibration system and method.
In a first aspect, an embodiment of the present invention provides a robot monocular stereoscopic vision calibration system, including: the robot joint axis calibration system comprises an information acquisition module, a camera modeling module, a robot joint axis calibration module, a calibration feedback and fitting module and a robot model establishing module; wherein:
the output end of the information acquisition module is in communication connection with the camera modeling module and the calibration feedback and fitting module, and is used for acquiring a target image and the linear distance from the target to the robot joint, and respectively transmitting the target image and the linear distance from the target to the robot joint to the camera modeling module and the calibration feedback and fitting module;
the output end of the camera modeling module is in communication connection with the robot joint axis calibration module and is used for establishing a conversion model of a three-dimensional world coordinate system and a computer image coordinate system and outputting image information;
the output end and the input end of the robot joint axis calibration module are respectively in communication connection with the input end and the output end of the calibration feedback and fitting module; the output end of the robot joint axis calibration module is in communication connection with the input end of the robot model building module and is used for fitting and calibrating the robot joint axis according to the image information and the actual deviation information fed back by the calibration feedback and fitting module and outputting the joint axis data of the robot;
the output end of the calibration feedback and fitting module is in communication connection with the input ends of the robot model establishing module and the robot joint axis calibration module and is used for calculating the actual deviation of the target image and the linear distance from the target to the robot joint;
the input end of the robot model establishing module is in communication connection with the calibration feedback and fitting module and the output end of the robot joint axis calibration module, and is used for establishing a robot model according to joint axis data and actual deviation of the robot.
Optionally, the information acquisition module includes: a high-definition camera and an infrared remote sensor, respectively used for collecting the target image and the linear distance from the target to the robot joint; the high-definition camera and the infrared remote sensor are both fixedly mounted at the end of the robot and are arranged vertically.
Optionally, the camera modeling module is specifically configured to:
establishing a conversion model of a three-dimensional world coordinate system and a computer image coordinate system;
and outputting image information corresponding to the target image according to the conversion model.
Optionally, the establishing a conversion model of a three-dimensional world coordinate system and a computer image coordinate system includes:
establishing a transformation formula from a three-dimensional world coordinate system to a camera coordinate system:
Figure GDA0003337008830000021
wherein R represents a rotation orthogonal matrix, T represents a translation vector, (r1, r4, r7), (r2, r5, r8), (r3, r6, r9) respectively represent the unit direction vector components of the xw, yw, zw coordinate axes in the camera coordinate system, tx, ty, tz respectively represent the components of the translation vector in the camera coordinate system, xw, yw, zw respectively represent the projections of the target on each coordinate axis of the three-dimensional world coordinate system, and x, y and z respectively represent the projections of the target on each coordinate axis of the camera coordinate system;
establishing a coordinate perspective transformation relation formula from coordinates in a three-dimensional world coordinate system to a computer image coordinate system:
Figure GDA0003337008830000031
wherein (Nx, Ny) are the image plane coordinates in the camera image plane coordinate system, f is the focal length of the lens, (u0, v0) is the principal point, ρ is a scale factor, and u, v and l each represent the projection of the target on each coordinate axis of the computer image coordinate system.
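To make the conversion model concrete, the following Python sketch applies a rigid world-to-camera transform followed by a simple pinhole-style projection to image coordinates. The matrix layout of R (columns (r1, r4, r7), (r2, r5, r8), (r3, r6, r9)), the projection form and every numerical value are illustrative assumptions, not parameters taken from the patent.

```python
import numpy as np

# Rotation orthogonal matrix R and translation vector T (illustrative values only).
# Columns of R are assumed to be the unit direction vectors of the xw, yw, zw
# axes expressed in the camera coordinate system, i.e. (r1,r4,r7), (r2,r5,r8), (r3,r6,r9).
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
T = np.array([0.05, -0.10, 0.80])          # tx, ty, tz in metres (assumed)

def world_to_camera(p_w):
    """Apply the rigid transform [x, y, z]^T = R [xw, yw, zw]^T + T."""
    return R @ p_w + T

def camera_to_image(p_c, f=0.008, u0=320.0, v0=240.0, Nx=80000.0, Ny=80000.0):
    """Project a camera-frame point to computer image coordinates with an
    assumed pinhole form: u = u0 + Nx*f*x/z, v = v0 + Ny*f*y/z."""
    x, y, z = p_c
    u = u0 + Nx * f * x / z
    v = v0 + Ny * f * y / z
    return u, v

p_w = np.array([0.10, 0.20, 0.00])         # a target point in world coordinates
p_c = world_to_camera(p_w)
print("camera frame:", p_c, "image:", camera_to_image(p_c))
```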
Optionally, the robot joint axis calibration module includes: the device comprises a calibration point acquisition unit, a camera optical center coordinate conversion unit and an axis equation fitting unit; the output end of the calibration point acquisition unit is in communication connection with the input end of the camera optical center coordinate conversion unit; the input end of the camera optical center coordinate conversion unit is in communication connection with the output end of the camera modeling module, and the output end of the camera optical center coordinate conversion unit is in communication connection with the input end of the axis equation fitting unit; the input end and the output end of the axis equation fitting unit are respectively in communication connection with the output end and the input end of the calibration feedback and fitting module, and the output end of the axis equation fitting unit is in communication connection with the input end of the robot model building module.
Optionally, the robot joint axis calibration module is specifically configured to:
the calibration point acquisition unit takes a point on the target as an initial value, takes five pixels in each of the up, down, left and right directions with that point as the center, and obtains the image plane coordinates u and v by column and by row respectively, with the calculation formulas:
Figure GDA0003337008830000032
Figure GDA0003337008830000033
wherein f(ui, vi) is the gray value of the pixel point (ui, vi), and n ≤ 120;
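One plausible reading of this calibration point acquisition step is a gray-value-weighted centroid computed over a small window around the chosen point. The sketch below assumes that interpretation; the window size and the exact weighting stand in for the formulas referenced by the figures above.

```python
import numpy as np

def refine_calibration_point(gray, u_init, v_init, half_window=5):
    """Refine (u, v) as a gray-value-weighted centroid of the pixels within
    +/- half_window of the initial point (assumed interpretation of the
    'five pixels up, down, left and right' rule)."""
    rows, cols = gray.shape
    u_lo, u_hi = max(0, u_init - half_window), min(cols - 1, u_init + half_window)
    v_lo, v_hi = max(0, v_init - half_window), min(rows - 1, v_init + half_window)
    num_u = num_v = den = 0.0
    for vi in range(v_lo, v_hi + 1):
        for ui in range(u_lo, u_hi + 1):
            w = float(gray[vi, ui])        # f(ui, vi): gray value of the pixel
            num_u += ui * w
            num_v += vi * w
            den += w
    return (num_u / den, num_v / den) if den > 0 else (float(u_init), float(v_init))

# Example with a synthetic image containing a bright blob.
img = np.zeros((60, 80), dtype=np.float64)
img[28:33, 38:43] = 255.0
print(refine_calibration_point(img, u_init=40, v_init=30))
```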
the camera optical center coordinate conversion unit converts the optical center coordinate of the high-definition camera in a camera coordinate system into a coordinate in a three-dimensional world coordinate system, and the calculation formula is as follows:
Figure GDA0003337008830000034
and the axis equation fitting unit is used for fitting a circle with the multiple groups of camera optical center coordinate data and fitting a straight line which passes through the circle center and is vertical to the plane of the circle to serve as an axis equation.
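For the axis fitting step, a workable numerical recipe (an assumption, since the patent leaves the fitting procedure to the figures) is to fit a plane to the optical-centre positions, fit a circle in that plane by least squares, and take the line through the circle centre along the plane normal as the joint axis:

```python
import numpy as np

def fit_joint_axis(points):
    """Fit a circle to 3-D optical-centre positions and return (centre, axis direction).
    Sketch only: plane fit by SVD, then an algebraic (Kasa) circle fit in the plane."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(P - centroid)
    normal = vt[2]
    e1, e2 = vt[0], vt[1]                      # in-plane basis
    # 2-D coordinates of each point in the fitted plane.
    q = np.column_stack(((P - centroid) @ e1, (P - centroid) @ e2))
    # Kasa fit: solve [2x 2y 1][a b c]^T = x^2 + y^2 for the circle centre (a, b).
    A = np.column_stack((2 * q[:, 0], 2 * q[:, 1], np.ones(len(q))))
    b = (q ** 2).sum(axis=1)
    (a, bb, _), *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = centroid + a * e1 + bb * e2       # circle centre back in 3-D
    return centre, normal                      # axis: line through centre along normal

# Example: noisy optical-centre samples on a circle about the z axis.
t = np.linspace(0, 2 * np.pi, 12, endpoint=False)
pts = np.column_stack((0.3 * np.cos(t), 0.3 * np.sin(t), np.full_like(t, 0.5)))
pts += 1e-3 * np.random.default_rng(0).standard_normal(pts.shape)
print(fit_joint_axis(pts))
```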
Optionally, the calibration feedback and fitting module includes: the device comprises an Euler angle solving unit, an actual deviation rate judging unit and an axis re-fitting unit; the input end of the Euler angle solving unit is in communication connection with the output end of the axis equation fitting unit; the output end of the Euler angle solving unit is respectively in communication connection with the input ends of the camera optical center coordinate conversion unit and the actual deviation rate judging unit; the output end of the actual deviation rate judging unit is respectively in communication connection with the axis re-fitting unit and the input end of the robot model building system; and the output end of the axis re-fitting unit is electrically connected with the input end of the axis equation fitting unit.
Optionally, the calibration feedback and fitting module is specifically configured to:
the Euler angle calculating unit is used for calculating the Euler rotation angle, and the calculation formula is as follows:
θx=atan2(r8,r9)
Figure GDA0003337008830000041
θz=atan2(r4,r1)
wherein atan2 is the two-argument arctangent function, and θx, θy, θz are respectively the rotation angles of the three-dimensional world coordinate system relative to the camera coordinate system about the three coordinate axes;
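The Euler angle extraction can be sketched as follows, assuming R has (r1, r4, r7), (r2, r5, r8), (r3, r6, r9) as its columns so that r7, r8, r9 form its bottom row; the expression used for θy is the conventional counterpart of the two formulas given above and is an assumption where the figure reference hides the exact form.

```python
import math

def euler_from_rotation(r1, r2, r3, r4, r5, r6, r7, r8, r9):
    """Recover rotation angles from the orthogonal matrix entries.
    theta_x and theta_z follow the formulas in the text; theta_y uses the
    conventional atan2(-r7, sqrt(r8^2 + r9^2)) form (assumed)."""
    theta_x = math.atan2(r8, r9)
    theta_y = math.atan2(-r7, math.hypot(r8, r9))
    theta_z = math.atan2(r4, r1)
    return theta_x, theta_y, theta_z

# Identity rotation: all three angles should be zero.
print(euler_from_rotation(1, 0, 0, 0, 1, 0, 0, 0, 1))
```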
the actual deviation rate judging unit is used for calculating the deviation rate of the coordinate values under the camera coordinate system relative to the coordinate values actually measured by the infrared remote sensor, and the calculation formula is as follows:
Figure GDA0003337008830000042
in the formula, η represents the deviation rate of the coordinate value in the camera coordinate system relative to the coordinate value actually measured by the infrared remote sensor, ABS denotes the absolute-value function, d denotes the transverse moving axis coordinate value actually measured by the infrared remote sensor, X denotes the transverse moving axis coordinate value of the camera coordinate system, and m denotes a preset, manually input reference value;
an axis re-fitting unit for fitting an axis equation, the calculation formula is as follows:
Figure GDA0003337008830000043
wherein x'w denotes the corrected coordinate value of the camera optical center in the three-dimensional world coordinate system.
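The feedback logic can be pictured with the short sketch below. The exact forms of the deviation rate and of the re-fitting correction sit behind figure references, so the formulas used here (a relative difference compared against the preset reference m, followed by a proportional correction of the camera-derived value) are assumptions for illustration only.

```python
def deviation_rate(d, x_cam):
    """Assumed form: relative deviation of the camera-derived traverse-axis value
    x_cam from the value d measured by the infrared remote sensor."""
    return abs(d - x_cam) / abs(d)

def calibrate_with_feedback(x_cam, d, m):
    """Compare the deviation rate with the preset reference m; correct the
    camera-derived value toward the infrared measurement when the deviation
    is too large (the blending factor is an illustrative choice)."""
    eta = deviation_rate(d, x_cam)
    if eta < m:                               # within tolerance: accept camera data
        return x_cam, False
    x_corrected = x_cam + (d - x_cam) * 0.5   # assumed correction before re-fitting
    return x_corrected, True                  # True -> axis equation must be re-fitted

print(calibrate_with_feedback(x_cam=0.412, d=0.400, m=0.02))
```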
Optionally, the robot model building module builds a link coordinate system using a D-H model.
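For reference, the D-H convention mentioned here expresses each link transform as a rotation and translation about z followed by a translation and rotation about x; the classic textbook form of the resulting homogeneous matrix (not a formula quoted from the patent) is:

```latex
A_i =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}
```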
In a second aspect, an embodiment of the present invention provides a calibration method for robot monocular stereoscopic vision, which is applied to the calibration system for robot monocular stereoscopic vision described in any one of the first aspect, and executes a calibration operation for robot monocular stereoscopic vision.
Compared with the prior art, the invention has the following beneficial effects:
according to the robot monocular stereoscopic vision calibration system provided by the invention, through the arrangement of the calibration feedback and fitting system, the actual deviation rate judgment unit calculates the deviation between the data acquired by the current high-definition camera and the data acquired by the infrared remote sensor, judges the deviation height, and then determines whether the axis re-fitting unit re-fits the data or not, so that the correction of the data acquired by the high-definition camera can be realized, the high-definition camera and the infrared remote sensor are enabled to work in a double-matching manner, the excessive dependence on the definition of the high-definition camera can be reduced, the equipment cost is reduced, and the large-area popularization of the robot is facilitated.
On the other hand, by providing the Euler angle solving unit and using the mutual conversion between the Euler angles and the rotation orthogonal matrix, the rotation angles between the three-dimensional world coordinate system and the camera coordinate system can be conveniently calculated; the projections of the actual value measured by the infrared remote sensor onto the three coordinate axes of the three-dimensional world coordinate system are then computed from these rotation angles. This facilitates the calculation of the deviation rate, further streamlines the procedure by which the calibration feedback and fitting module corrects the data collected by the high-definition camera, and makes the system architecture more reasonable.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
fig. 1 is a schematic structural diagram of a monocular stereoscopic vision calibration system of a robot provided in an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an information acquisition module according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a camera modeling module according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a robot joint axis calibration module according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a calibration feedback and fitting module according to an embodiment of the present invention;
fig. 6 is a schematic connection diagram of a robot joint axis calibration module and a calibration feedback and fitting module according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
Fig. 1 is a schematic structural diagram of a monocular stereoscopic vision calibration system of a robot according to an embodiment of the present invention, and as shown in fig. 1, the system according to the present invention includes: the robot joint axis calibration system comprises an information acquisition module 10, a camera modeling module 20, a robot joint axis calibration module 30, a calibration feedback and fitting module 40 and a robot model establishing module 50. The output end of the information acquisition module 10 is electrically connected with the input ends of the camera modeling module 20 and the calibration feedback and fitting module 40, respectively.
Fig. 2 is a schematic structural diagram of an information acquisition module according to an embodiment of the present invention, and as shown in fig. 2, the information acquisition module 10 includes a high definition camera 11 and an infrared remote sensor 12, output ends of the high definition camera 11 and the infrared remote sensor 12 are respectively electrically connected to an input end of a camera modeling module 20 and an input end of a calibration feedback and fitting module 40, the high definition camera 11 and the infrared remote sensor 12 are respectively used for acquiring a target image and a linear distance from the target to a joint of a robot, and the high definition camera 11 and the infrared remote sensor 12 are both fixedly mounted at a terminal of the robot and are distributed vertically.
Fig. 3 is a schematic structural diagram of a camera modeling module according to an embodiment of the present invention, and as shown in fig. 3, the camera modeling module 20 includes a three-dimensional world coordinate system 21, a camera coordinate system 22, a camera image plane coordinate system 23, and a computer image coordinate system 24, and the three-dimensional world coordinate system 21, the camera coordinate system 22, the camera image plane coordinate system 23, and the computer image coordinate system 24 are converted by using the following conversion formulas:
transformation from the three-dimensional world coordinate system 21 to the camera coordinate system 22:
Figure GDA0003337008830000061
wherein R represents a rotation orthogonal matrix, T represents a translation vector, (r1, r4, r7), (r2, r5, r8), (r3, r6, r9) respectively represent the unit direction vector components of the xw, yw, zw coordinate axes in the camera coordinate system, tx, ty, tz respectively represent the components of the translation vector in the camera coordinate system, xw, yw, zw respectively represent the projections of the target on each coordinate axis of the three-dimensional world coordinate system, and x, y and z respectively represent the projections of the target on each coordinate axis of the camera coordinate system.
The coordinate perspective transformation relationship from the points in the three-dimensional world coordinate system 21 to the computer image coordinate system 24 is as follows:
Figure GDA0003337008830000062
In the formula, (Nx, Ny) are the image plane coordinates in the camera image plane coordinate system 23 and are known parameters; the rotation orthogonal matrix R, the translation vector T, the lens focal length f and the principal point (u0, v0) are calculated by the radial arrangement constraint calibration method, and the result is sent to the robot joint axis calibration module 30. The radial arrangement constraint calibration method solves an overdetermined linear equation system by the least square method to obtain the external parameters; the internal parameters are then determined according to whether the camera has lens distortion. When the camera has no lens distortion, the internal parameters can be solved from an overdetermined linear equation system; when the camera has radial distortion, the internal parameters must be solved in combination with a nonlinear optimization method. The output of the camera modeling module 20 is electrically connected to the input of the robot joint axis calibration module 30.
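The least-squares step of the radial arrangement constraint calibration can be illustrated generically as solving an overdetermined linear system; the design matrix and right-hand side below are synthetic stand-ins for the actual constraint equations of the method.

```python
import numpy as np

# Generic illustration of solving an overdetermined linear system A x = b in the
# least-squares sense, as used to recover the external parameters; the entries of
# A and b here are synthetic and stand in for the constraint equations.
rng = np.random.default_rng(1)
x_true = np.array([0.3, -1.2, 0.7])
A = rng.standard_normal((40, 3))             # 40 constraint rows, 3 unknowns
b = A @ x_true + 1e-3 * rng.standard_normal(40)

x_ls, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print("recovered parameters:", x_ls)
```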
Fig. 4 is a schematic structural diagram of a robot joint axis calibration module according to an embodiment of the present invention. As shown in fig. 4, the robot joint axis calibration module 30 includes a calibration point obtaining unit 31, a camera optical center coordinate conversion unit 32 and an axis equation fitting unit 33. The output end of the calibration point obtaining unit 31 is electrically connected to the input end of the camera optical center coordinate conversion unit 32, the input end of the camera optical center coordinate conversion unit 32 is electrically connected to the output end of the camera modeling module 20, the output end of the camera optical center coordinate conversion unit 32 is electrically connected to the input end of the axis equation fitting unit 33, the input end and the output end of the axis equation fitting unit 33 are respectively electrically connected to the output end and the input end of the calibration feedback and fitting module 40, and the output end of the axis equation fitting unit 33 is electrically connected to the input end of the robot model building module 50. The calibration point obtaining unit 31, the camera optical center coordinate conversion unit 32 and the axis equation fitting unit 33 respectively use the following formulas.
The calibration point obtaining unit 31 uses a point on the target as an initial value, and takes five pixels up, down, left, and right around the point as a center, and obtains image plane coordinates u and v by column and row:
Figure GDA0003337008830000071
in the formula, f (ui, vi) is the gray value of the pixel point (ui, vi);
the camera optical center coordinate conversion unit 32 converts the optical center coordinate of the high definition camera 11 in the camera coordinate system 22 into the coordinate in the three-dimensional world coordinate system 21 by using the following formula:
Figure GDA0003337008830000072
the axis equation fitting unit 33 fits a circle by a plurality of sets of camera optical center coordinate data converted by the calibration point obtaining unit 31 and the camera optical center coordinate conversion unit 32, a linear equation passing through the circle center and perpendicular to the plane of the circle is an axis equation, by arranging the calibration point obtaining unit 31, the calibration point obtaining unit 31 freely and randomly obtains the calibration point on the target, and then the selected calibration point is vertically and horizontally symmetrical to obtain the adjacent calibration point.
The output end and the input end of the robot joint axis calibration module 30 are respectively electrically connected with the input end and the output end of the calibration feedback and fitting module 40. Fig. 5 is a schematic structural diagram of a calibration feedback and fitting module according to an embodiment of the present invention; as shown in fig. 5, the calibration feedback and fitting module 40 includes an Euler angle obtaining unit 41, an actual deviation rate judging unit 42, and an axis re-fitting unit 43. Fig. 6 is a schematic connection diagram of the robot joint axis calibration module and the calibration feedback and fitting module according to an embodiment of the present invention. The input end of the Euler angle obtaining unit 41 is electrically connected with the output end of the axis equation fitting unit 33, the output end of the Euler angle obtaining unit 41 is electrically connected with the input ends of the camera optical center coordinate conversion unit 32 and the actual deviation rate judging unit 42, the output end of the actual deviation rate judging unit 42 is electrically connected with the input ends of the axis re-fitting unit 43 and the robot model building module 50, and the output end of the axis re-fitting unit 43 is electrically connected with the input end of the axis equation fitting unit 33.
The calculation formula of the Euler angle obtaining unit 41 is as follows:
θx=atan2(r8,r9)
Figure GDA0003337008830000081
θz=atan2(r4,r1)
in the formula, θx, θy and θz are the rotation angles of the three-dimensional world coordinate system 21 relative to the camera coordinate system 22 about the three coordinate axes;
the calculation formula of the actual deviation ratio judgment unit 42 is as follows:
Figure GDA0003337008830000082
in the formula, η represents the deviation rate of the coordinate value in the camera coordinate system 22 relative to the coordinate value actually measured by the infrared remote sensor 12, ABS denotes the absolute-value function, d denotes the transverse moving axis coordinate value actually measured by the infrared remote sensor 12, X denotes the transverse moving axis coordinate value of the camera coordinate system 22, and m denotes a preset, manually input reference value;
wherein the calculation formula of the axis re-fitting unit 43 is as follows:
Figure GDA0003337008830000083
where x'w denotes the corrected coordinate value of the camera optical center in the three-dimensional world coordinate system 21. The output end of the robot joint axis calibration module 30 is electrically connected with the input end of the robot model building module 50; the robot model building module 50 adopts a D-H model to establish the link coordinate system, and the output end of the calibration feedback and fitting module 40 is electrically connected with the input end of the robot model building module 50.
A robot monocular stereoscopic vision calibration method comprises the following steps:
and S1, respectively acquiring the image of the target and the distance from the infrared remote sensor 12 to the target by the high-definition camera 11 and the infrared remote sensor 12, and respectively transmitting the data to the camera modeling module 20 and the actual deviation ratio judgment unit 42.
S2: the data transmitted by the high-definition camera 11 are converted and stored according to the coordinate system conversion formulas and then transmitted to the calibration point obtaining unit 31.
S3: the calibration point obtaining unit 31 randomly obtains a point on the target, calculates the calibration point parameters, and transmits them to the camera optical center coordinate conversion unit 32.
S4: the camera optical center coordinate conversion unit 32 converts the coordinate value of the camera optical center into the three-dimensional world coordinate system 21, the axis equation fitting unit 33 fits the axis equation, and the data are transmitted to the Euler angle obtaining unit 41.
S5: the Euler angle obtaining unit 41 calculates the Euler rotation angles, and the actual deviation rate judging unit 42 calculates the relative deviation between the image acquired by the high-definition camera 11 and the distance measured by the infrared remote sensor 12 and compares it with m. If the deviation is less than m, the data are output to the robot model building module 50; if it is greater than m, the data are output to the axis re-fitting unit 43, the axis equation fitting unit 33 re-fits the axis equation after the data have been corrected by the axis re-fitting unit 43, and the corrected axis equation is sent directly to the robot model building module 50 by the axis equation fitting unit 33.
S6: the finally fitted axis equations are divided into two cases, approximately perpendicular axes and approximately parallel axes, and a link coordinate system is established for each case.
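Read as a whole, steps S1 to S6 amount to a measure, fit, check and (when needed) re-fit loop. The outline below is only a schematic of that control flow with placeholder function names and values; it is not an implementation of the patented method.

```python
def acquire_image_and_distance():
    # S1: stand-ins for the high-definition camera frame and the infrared distance reading.
    return "image", 0.400

def fit_axis_from_image(image):
    # S2-S4: stand-in for coordinate conversion, calibration point extraction,
    # optical-centre conversion and axis-equation fitting.
    return {"traverse_axis": 0.412}

def build_robot_model(axis):
    # S6: stand-in for establishing the D-H link coordinate systems.
    return {"links": "D-H model built around", "axis": axis}

def run_calibration(m=0.02):
    image, d = acquire_image_and_distance()
    axis = fit_axis_from_image(image)
    # S5: feedback check of the camera-derived value against the infrared value,
    # using the preset reference m (deviation form as assumed earlier).
    if abs(d - axis["traverse_axis"]) / abs(d) >= m:
        axis["traverse_axis"] += (d - axis["traverse_axis"]) * 0.5  # assumed correction, then re-fit
    return build_robot_model(axis)

print(run_calibration())
```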
In this embodiment, by providing the calibration feedback and fitting module 40, the actual deviation rate judging unit 42 calculates the deviation between the data collected by the high-definition camera 11 and the data collected by the infrared remote sensor 12 and judges whether the deviation is high, which then determines whether the axis re-fitting unit 43 re-fits the data. In this way the data collected by the high-definition camera 11 can be corrected, and the high-definition camera 11 and the infrared remote sensor 12 work in a mutually matched manner. This reduces the dependence on the definition of the high-definition camera 11, lowers the cost of the equipment, and facilitates wide-scale deployment of the robot. By providing the Euler angle obtaining unit 41 and using the mutual conversion between the Euler angles and the rotation orthogonal matrix, the rotation angles between the three-dimensional world coordinate system 21 and the camera coordinate system 22 can be conveniently calculated, and the projections of the actual value measured by the infrared remote sensor 12 onto the three coordinate axes of the three-dimensional world coordinate system 21 can then be computed from these rotation angles. This facilitates the calculation of the deviation rate, further streamlines the procedure by which the calibration feedback and fitting module 40 corrects the data collected by the high-definition camera 11, and makes the system architecture more reasonable. The problem of the traditional axis measurement method, namely that excessive dependence on camera definition greatly increases equipment cost and hinders the wide-scale deployment of robots, is thereby solved.
It should be noted that the steps of the robot monocular stereoscopic vision calibration method provided by the present invention may be implemented by the corresponding modules, devices and units in the robot monocular stereoscopic vision calibration system; those skilled in the art may implement the method flow by referring to the technical scheme of the system, that is, the embodiments of the system may be understood as preferred examples for implementing the method, and details are not repeated herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (9)

1. A robot monocular stereoscopic vision calibration system is characterized by comprising: the robot joint axis calibration system comprises an information acquisition module, a camera modeling module, a robot joint axis calibration module, a calibration feedback and fitting module and a robot model establishing module; wherein:
the output end of the information acquisition module is in communication connection with the camera modeling module and the calibration feedback and fitting module, and is used for acquiring a target image and the linear distance from the target to the robot joint and respectively transmitting the target image and the linear distance from the target to the robot joint to the camera modeling module and the calibration feedback and fitting module;
the output end of the camera modeling module is in communication connection with the robot joint axis calibration module and is used for establishing a conversion model of a three-dimensional world coordinate system and a computer image coordinate system and outputting image information;
the output end and the input end of the robot joint axis calibration module are respectively in communication connection with the input end and the output end of the calibration feedback and fitting module; the output end of the robot joint axis calibration module is in communication connection with the input end of the robot model building module and is used for fitting and calibrating the robot joint axis according to the image information and the actual deviation fed back by the calibration feedback and fitting module and outputting the joint axis data of the robot;
the output end of the calibration feedback and fitting module is in communication connection with the input ends of the robot model establishing module and the robot joint axis calibration module and is used for calculating an actual deviation of a target image and a linear distance from the target to a robot joint, and the actual deviation represents a deviation of a coordinate value under a camera coordinate system relative to an actually measured coordinate value of the infrared remote sensor; wherein, the calibration feedback and fitting module comprises: the device comprises an Euler angle solving unit, an actual deviation rate judging unit and an axis re-fitting unit; the input end of the Euler angle solving unit is in communication connection with the output end of the axis equation fitting unit; the output end of the Euler angle solving unit is respectively in communication connection with the input ends of the camera optical center coordinate conversion unit and the actual deviation rate judging unit; the output end of the actual deviation rate judging unit is respectively in communication connection with the axis re-fitting unit and the input end of the robot model building system; the output end of the axis re-fitting unit is electrically connected with the input end of the axis equation fitting unit;
the input end of the robot model establishing module is in communication connection with the calibration feedback and fitting module and the output end of the robot joint axis calibration module, and is used for establishing a robot kinematic model according to joint axis data and actual deviation of the robot.
2. The system of claim 1, wherein the information collection module comprises: the high-definition camera and the infrared remote sensor are respectively used for collecting a target image and the linear distance from the target to the robot joint; the high-definition camera and the infrared remote sensor are fixedly arranged at the tail end of the robot and are distributed vertically.
3. The system of claim 1, wherein the camera modeling module is specifically configured to:
establishing a conversion model of a three-dimensional world coordinate system and a computer image coordinate system;
and outputting image information corresponding to the target image according to the conversion model.
4. The system of claim 3, wherein the establishing a conversion model of the three-dimensional world coordinate system and the computer image coordinate system comprises:
establishing a transformation formula from a three-dimensional world coordinate system to a camera coordinate system:
Figure FDA0003344253920000021
wherein R represents a rotation orthogonal matrix, T represents a translation vector, (r1, r4, r7), (r2, r5, r8), (r3, r6, r9) respectively represent the unit direction vector components of the xw, yw, zw coordinate axes in the camera coordinate system, tx, ty, tz respectively represent the components of the translation vector in the camera coordinate system, xw, yw, zw respectively represent the projections of the target on each coordinate axis of the three-dimensional world coordinate system, and x, y and z respectively represent the projections of the target on each coordinate axis of the camera coordinate system;
establishing a coordinate perspective transformation relation formula from coordinates in a three-dimensional world coordinate system to a computer image coordinate system:
Figure FDA0003344253920000022
wherein (Nx, Ny) are the image plane coordinates in the camera image plane coordinate system, f is the focal length of the lens, (u0, v0) is the principal point, ρ is a scale factor, and u, v and l each represent the projection of the target on each coordinate axis of the computer image coordinate system.
5. The system of claim 1, wherein the robot joint axis calibration module comprises: the device comprises a calibration point acquisition unit, a camera optical center coordinate conversion unit and an axis equation fitting unit; the output end of the calibration point acquisition unit is in communication connection with the input end of the camera optical center coordinate conversion unit; the input end of the camera optical center coordinate conversion unit is in communication connection with the output end of the camera modeling module, and the output end of the camera optical center coordinate conversion unit is in communication connection with the input end of the axis equation fitting unit; the input end and the output end of the axis equation fitting unit are respectively in communication connection with the output end and the input end of the calibration feedback and fitting module, and the output end of the axis equation fitting unit is in communication connection with the input end of the robot model building module.
6. The system of claim 5, wherein the robot joint axis calibration module is specifically configured to:
the calibration point acquisition unit takes a point on the target as an initial value, takes five pixels respectively from top to bottom and from left to right by taking the point as a center, and respectively obtains image plane coordinates u and v according to columns and rows, wherein u represents an image plane horizontal coordinate, v represents an image plane vertical coordinate, and the calculation formula is as follows:
Figure FDA0003344253920000031
Figure FDA0003344253920000032
wherein f(ui, vi) is the gray value of the pixel point (ui, vi), n represents the maximum value of the subscript i, i takes the values 1, 2, 3, …, n, and n ≤ 120;
the camera optical center coordinate conversion unit converts the optical center coordinate of the high-definition camera in a camera coordinate system into a coordinate in a three-dimensional world coordinate system, and the calculation formula is as follows:
Figure FDA0003344253920000033
an axis equation fitting unit, configured to fit a circle to multiple groups of camera optical center coordinate data and to fit a straight line passing through the center of the circle and perpendicular to the plane of the circle as the axis equation; wherein R represents a rotation orthogonal matrix, T represents a translation vector, (r1, r4, r7), (r2, r5, r8), (r3, r6, r9) respectively represent the unit direction vector components of the xw, yw, zw coordinate axes in the camera coordinate system, tx, ty, tz respectively represent the components of the translation vector in the camera coordinate system, xw, yw, zw respectively represent the projections of the target on each coordinate axis of the three-dimensional world coordinate system, and x, y and z respectively represent the projections of the target on each coordinate axis of the camera coordinate system.
7. The system of claim 6, wherein the calibration feedback and fitting module is specifically configured to:
the Euler angle calculating unit is used for calculating the Euler rotation angle, and the calculation formula is as follows:
θx=atan2(r8,r9)
Figure FDA0003344253920000034
θz=atan2(r4,r1)
wherein atan2 is the two-argument arctangent function, and θx, θy, θz are respectively the rotation angles of the three-dimensional world coordinate system relative to the camera coordinate system about the three coordinate axes;
the actual deviation rate judging unit is used for calculating the deviation rate of the coordinate values under the camera coordinate system relative to the coordinate values actually measured by the infrared remote sensor, and the calculation formula is as follows:
Figure FDA0003344253920000041
in the formula, η represents the deviation rate of the coordinate value in the camera coordinate system relative to the coordinate value actually measured by the infrared remote sensor, ABS denotes the absolute-value function, d denotes the transverse moving axis coordinate value actually measured by the infrared remote sensor, X denotes the transverse moving axis coordinate value of the camera coordinate system, and m denotes a preset, manually input reference value;
an axis re-fitting unit for fitting an axis equation, the calculation formula is as follows:
Figure FDA0003344253920000042
wherein x'w denotes the corrected coordinate value of the camera optical center in the three-dimensional world coordinate system.
8. The system of claim 1, wherein the robot model building module builds a link coordinate system using a D-H model.
9. A calibration method for robot monocular stereoscopic vision, which is characterized in that the calibration method is applied to the robot monocular stereoscopic vision calibration system of any one of claims 1 to 8 to execute the calibration operation of the robot monocular stereoscopic vision.
CN201811515700.9A 2018-12-11 2018-12-11 Robot monocular stereoscopic vision calibration system and method Active CN109465830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811515700.9A CN109465830B (en) 2018-12-11 2018-12-11 Robot monocular stereoscopic vision calibration system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811515700.9A CN109465830B (en) 2018-12-11 2018-12-11 Robot monocular stereoscopic vision calibration system and method

Publications (2)

Publication Number Publication Date
CN109465830A CN109465830A (en) 2019-03-15
CN109465830B (en) 2021-12-28

Family

ID=65676144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811515700.9A Active CN109465830B (en) 2018-12-11 2018-12-11 Robot monocular stereoscopic vision calibration system and method

Country Status (1)

Country Link
CN (1) CN109465830B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109990705B (en) * 2019-03-21 2020-07-14 上海交通大学 Robot tail end temperature measuring gun coordinate system calibration method and system based on vision
CN110179468B (en) * 2019-05-22 2022-04-05 福建双驰智能信息技术有限公司 Foot measuring device, multi-dimensional foot feature analysis system and method
CN112904883B (en) * 2021-01-26 2022-08-05 德鲁动力科技(成都)有限公司 Terrain perception method, motion control method and system for quadruped robot
CN112743524B (en) * 2021-01-27 2022-11-25 上海应用技术大学 Target device, and pose detection system and method based on binocular vision measurement
CN115014398B (en) * 2022-07-27 2023-01-24 湖南科天健光电技术有限公司 Monocular stereoscopic vision measurement system position and attitude calibration method, device and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6101455A (en) * 1998-05-14 2000-08-08 Davis; Michael S. Automatic calibration of cameras and structured light sources
CN100480004C (en) * 2004-07-15 2009-04-22 上海交通大学 Method for rapid calibrating hand-eye relationship of single eye vision sensor of welding robot
US20100246899A1 (en) * 2009-03-26 2010-09-30 Rifai Khalid El Method and Apparatus for Dynamic Estimation of Feature Depth Using Calibrated Moving Camera
CN102980528B (en) * 2012-11-21 2015-07-08 上海交通大学 Calibration method of pose position-free constraint line laser monocular vision three-dimensional measurement sensor parameters
CN105678783B (en) * 2016-01-25 2018-10-19 西安科技大学 Refractive and reflective panorama camera merges scaling method with laser radar data

Also Published As

Publication number Publication date
CN109465830A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109465830B (en) Robot monocular stereoscopic vision calibration system and method
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN110276806B (en) Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
CN107214703B (en) Robot self-calibration method based on vision-assisted positioning
CN103278138B (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN100470590C (en) Camera calibration method and calibration apparatus thereof
CN105066884A (en) Robot tail end positioning deviation correction method and system
CN106457562A (en) Method for calibrating a robot and a robot system
CN109934878B (en) Linear calibration system and method based on camera coordinate system
CN107300382B (en) Monocular vision positioning method for underwater robot
CN107516326A (en) Merge monocular vision and the robot localization method and system of encoder information
CN111091076B (en) Tunnel limit data measuring method based on stereoscopic vision
CN111260720A (en) Target height measuring system based on deep learning method
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
CN106737859A (en) The method for calibrating external parameters of sensor and robot based on invariable plane
CN111105467B (en) Image calibration method and device and electronic equipment
CN112017248A (en) 2D laser radar camera multi-frame single-step calibration method based on dotted line characteristics
CN114998447A (en) Multi-view vision calibration method and system
CN112837314B (en) Fruit tree canopy parameter detection system and method based on 2D-LiDAR and Kinect
CN108627103A (en) A kind of 2D laser measurement methods of parts height dimension
CN109773589A (en) Method and device, the equipment of on-line measurement and processing guiding are carried out to workpiece surface
CN111710002B (en) Camera external parameter calibration method based on Optitrack system
CN111145267B (en) 360-degree panoramic view multi-camera calibration method based on IMU assistance
CN106840137B (en) Automatic positioning and orienting method of four-point type heading machine

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant