CN110788863B - Machine vision calibration method and mechanical arm positioning and grabbing method - Google Patents
- Publication number
- CN110788863B CN110788863B CN201911152788.7A CN201911152788A CN110788863B CN 110788863 B CN110788863 B CN 110788863B CN 201911152788 A CN201911152788 A CN 201911152788A CN 110788863 B CN110788863 B CN 110788863B
- Authority
- CN
- China
- Prior art keywords
- calibration
- mechanical arm
- opx
- opy
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0095—Means or methods for testing manipulators
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a machine vision calibration method that directly computes the projection relationship between world coordinates and image coordinates from a series of measurements and calibrates the relative position of the CCD (charge-coupled device) camera and the mechanical arm. The relative displacement the mechanical arm must travel can then be computed directly from the image coordinates of a target object, so that vision-based positioning and grabbing are realized without manually calibrating the coordinate values of key points. In addition, the invention discloses a mechanical arm positioning and grabbing method that automatically positions and grabs the target object using the result of the machine vision calibration method.
Description
Technical Field
The invention relates to a calibration method, in particular to a machine vision calibration method.
Background
A mechanical arm equipped with a machine vision recognition system can automatically position and grab a product. Before positioning, the vision system must be calibrated to establish the conversion between the world coordinate system and the image coordinate system and to confirm the coordinate values of key points; accurate calibration is therefore one of the keys to accurate identification, positioning, and grabbing of the product. In the current mainstream machine vision calibration method, the projection relationship between world coordinates and image coordinates is obtained through a series of data acquisitions, while the coordinate values of key points are confirmed manually, so calibration efficiency is low.
Disclosure of Invention
In order to solve the above technical problem, the invention aims to provide a machine vision calibration method that requires no manual confirmation and offers high calibration efficiency. In addition, the invention provides a mechanical arm positioning and grabbing method that automatically positions and grabs a target object using the result of the machine vision calibration method.
The invention relates to a machine vision calibration method, which comprises the following steps:
S1, the mechanical arm grabs the calibration object and places it in the calibration area, and the world coordinates (RWX1, RWY1) of the mechanical arm when it places the calibration object are recorded;
S2, the mechanical arm is moved until the calibration object is clearly imaged within the field of view of the machine vision;
S3, the world coordinates (RWX2, RWY2) of the mechanical arm after the calibration object is clearly imaged are recorded; the machine vision photographs the calibration object, and the image coordinates (OPX1, OPY1) of the calibration object within the field of view are identified and recorded;
S4, the mechanical arm is moved a distance D1 along the X direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX2, OPY2) are identified and recorded;
S5, the mechanical arm is moved back to the position (RWX1, RWY1);
S6, the mechanical arm is moved a distance D2 along the Y direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX3, OPY3) are identified and recorded;
S7, the scales between world coordinates and image coordinates are calculated as:
P2mmX = SQRT((OPX1-OPX2)^2 + (OPY1-OPY2)^2)/D1;
P2mmY = SQRT((OPX1-OPX3)^2 + (OPY1-OPY3)^2)/D2;
where SQRT denotes the square root;
the angular difference α between the world coordinates and the image coordinates is calculated as:
α = arctan((OPY2-OPY1)/(OPX2-OPX1)), or
α = π/2 - arctan((OPY3-OPY1)/(OPX3-OPX1));
where arctan is the arc tangent function and π is the ratio of a circle's circumference to its diameter;
the relationship between the position (RWX2, RWY2) of the mechanical arm in step S3 and the grabbing point (RWX1, RWY1) in step S1 is calculated as:
DX = RWX2 - RWX1;
DY = RWY2 - RWY1;
the data set comprising DX, DY, P2mmX, P2mmY, α, OPX1 and OPY1 is the calibration result of the machine vision.
Through the above scheme, the invention has at least the following advantages: the machine vision calibration method computes the projection relationship between world coordinates and image coordinates from a series of data and calibrates the relative position of the CCD and the mechanical arm, so that the relative displacement the mechanical arm must travel can be computed directly from the image coordinates of the target object. Positioning and grabbing by machine vision are thereby realized without manually calibrating the coordinate values of key points.
In summary, the machine vision calibration method of the present invention does not require manual confirmation of key-point coordinate values and has higher calibration efficiency than conventional calibration methods.
In the mechanical arm positioning and grabbing method of the invention, the calibration result (DX, DY, P2mmX, P2mmY, α, OPX1, OPY1) is obtained by the machine vision calibration method described above;
the mechanical arm positioning and grabbing method comprises the following steps:
A1, the mechanical arm is moved to the target object; after the target object is clearly imaged within the field of view of the machine vision, the target object is photographed, its image coordinates (CPX1, CPY1) are identified and calculated, and the world coordinates (CWX1, CWY1) of the mechanical arm are recorded;
A2, the image coordinate difference between the target object and the calibration object is calculated:
DPX1 = CPX1 - OPX1;
DPY1 = CPY1 - OPY1;
A3, the image coordinate system is rotated by α, and the new image coordinate difference (DPX, DPY) is calculated as:
L = SQRT(DPX1^2 + DPY1^2);
α2 = arctan(DPY1/DPX1);
DPX = L*sin(α+α2);
DPY = L*cos(α+α2);
where L is the image distance between the target object and the calibration object;
α2 is the included angle between the vector corresponding to L and the X axis of the image coordinate system, the vector being the vector from the calibration object image coordinates to the target object image coordinates;
α+α2 indicates that the image coordinate system has been rotated by α so as to be parallel to the world coordinate system, i.e. the X axis of the image coordinate system is parallel to the X axis of the world coordinate system and the Y axis of the image coordinate system is parallel to the Y axis of the world coordinate system;
DPY is the modulus of the Y-direction component of the vector in the rotated image coordinate system;
A4, the displacement (DWX, DWY) that the mechanical arm must move in the world coordinate system is calculated as:
DWX = DPX/P2mmX - DX;
DWY = DPY/P2mmY - DY;
where DPX/P2mmX is the X-direction distance between the target object and the calibration object, i.e. the actual X-direction distance of the target object in world coordinates;
DPY/P2mmY is the Y-direction distance between the target object and the calibration object, i.e. the actual Y-direction distance of the target object in world coordinates;
DX and DY are the initial offsets of the mechanical arm relative to the calibration object;
A5, the mechanical arm is moved by the displacement (DWX, DWY) and grabs the target object.
Further, in the mechanical arm positioning and grabbing method of the present invention, step A1 includes the step of recording the world coordinates (CWX1, CWY1) of the mechanical arm.
In the mechanical arm positioning and grabbing method of the invention, the calibration result of the machine vision calibration method is combined with the image coordinates of the target object, and the displacement the mechanical arm must move is obtained through a series of operations. No manual calibration is needed, so calibration efficiency is high and the grabbing efficiency of the target object is further improved.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings.
Drawings
FIG. 1 is a schematic diagram of a calibration object and a target object located within an image coordinate system;
wherein 1 is the imaging surface of the vision sensor; 2 is an image coordinate axis; 3 is the image coordinate axis after rotation; 4 is the target object image; and 5 is the calibration object image.
Detailed Description
The following detailed description of embodiments of the present invention is provided in connection with the accompanying drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Referring to fig. 1, a machine vision calibration method according to a preferred embodiment of the present invention includes the following steps:
S1, the mechanical arm grabs the calibration object and places it in the calibration area, and the world coordinates (RWX1, RWY1) of the mechanical arm when it places the calibration object are recorded;
The calibration area is set manually. Within it, the mechanical arm can pick up and put down the calibration object with the grabbing device (gripper) mounted on it, and the machine vision mounted on the arm can photograph the calibration object. The machine vision is a vision sensing system based on an image sensor; the image sensor is preferably a CCD sensor, since a CCD sensor has better imaging quality than a CMOS sensor. During installation, the imaging surface of the image sensor is set parallel to the XY plane of the mechanical arm, i.e. the Z axis of the mechanical arm is perpendicular to the imaging surface of the vision sensor, so that the mechanical arm and the vision sensor are parallel along the Z axis, which reduces the complexity of calibration.
S2, the mechanical arm is moved until the calibration object is clearly imaged within the field of view of the machine vision;
S3, the world coordinates (RWX2, RWY2) of the mechanical arm after the calibration object is clearly imaged are recorded; the machine vision photographs the calibration object, and the image coordinates (OPX1, OPY1) of the calibration object within the field of view are identified and recorded;
S4, the mechanical arm is moved a distance D1 along the X direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX2, OPY2) are identified and recorded;
S5, the mechanical arm is moved back to the position (RWX1, RWY1);
S6, the mechanical arm is moved a distance D2 along the Y direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX3, OPY3) are identified and recorded;
In the above steps, the execution order of step S4 and step S6 may be reversed.
S7, the scales between world coordinates and image coordinates are calculated as:
P2mmX = SQRT((OPX1-OPX2)^2 + (OPY1-OPY2)^2)/D1;
P2mmY = SQRT((OPX1-OPX3)^2 + (OPY1-OPY3)^2)/D2;
where SQRT denotes the square root;
SQRT((OPX1-OPX2)^2 + (OPY1-OPY2)^2) is the distance moved by the calibration object image in the image coordinate system between step S3 and step S4; its ratio to the distance D1 moved by the mechanical arm in the world coordinate system, P2mmX, is the scale between image coordinates and world coordinates in the X direction. Similarly, P2mmY is the scale between image coordinates and world coordinates in the Y direction. By combining these scales with the image coordinates of the target object and of the calibration object, the image coordinates can be converted into the corresponding world coordinates and the mechanical arm moved accordingly, realizing positioning and grabbing of the target object.
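As an illustration, the scale computation of step S7 can be sketched in Python (the function name, argument names, and sample values are illustrative assumptions, not part of the patent):

```python
import math

def pixel_scale(op_before, op_after, moved_mm):
    # Scale between image and world coordinates along one axis:
    # the image-space distance travelled by the calibration object,
    # divided by the known world-space distance the arm moved.
    dx = op_before[0] - op_after[0]
    dy = op_before[1] - op_after[1]
    return math.sqrt(dx * dx + dy * dy) / moved_mm

# Example: the arm moves D1 = 10 mm along world X and the calibration
# object image shifts from (100, 200) to (150, 200), i.e. 50 pixels.
p2mm_x = pixel_scale((100, 200), (150, 200), 10.0)  # 5.0 pixels per mm
```

Dividing an image distance by this scale then recovers a world distance in millimetres, which is exactly how P2mmX and P2mmY are used in step A4.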
The angular difference α between the world coordinates and the image coordinates is calculated as:
α = arctan((OPY2-OPY1)/(OPX2-OPX1)), or
α = π/2 - arctan((OPY3-OPY1)/(OPX3-OPX1));
where arctan is the arc tangent function and π is the ratio of a circle's circumference to its diameter;
Since the image coordinates (OPX2, OPY2) are those of the calibration object photographed after the mechanical arm moved the distance D1 along the world X direction in step S4, the movement track of the calibration object image forms a vector in the image coordinate system that corresponds to D1. Because this vector is parallel to the X axis of the world coordinate system, its included angle α with the X axis of the image coordinate system is the included angle between the X axis of the image coordinate system and the X axis of the world coordinate system, i.e. the angle between the image coordinate system and the world coordinate system. Alternatively, the included angle between the world Y axis and the image X axis can be calculated from the image coordinates of step S6 and then converted into the included angle between the two coordinate systems, which is not repeated here.
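The angle computation can likewise be sketched as follows. As an assumed refinement (not from the patent), `atan2` is used instead of the arctan of a quotient, since it resolves the quadrant and tolerates a vertical track; all names are illustrative:

```python
import math

def angle_difference(op_before, op_after):
    # Angle between the image X axis and the world X axis, taken from
    # the image-space track of the calibration object while the arm
    # moves along world X (step S4). atan2 handles the quadrant and a
    # zero X-displacement, unlike a plain arctan of the quotient.
    return math.atan2(op_after[1] - op_before[1], op_after[0] - op_before[0])

# Example: if the calibration object image moves from (100, 200) to
# (150, 250), the image frame is rotated pi/4 from the world frame.
alpha = angle_difference((100, 200), (150, 250))
```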
The relationship between the position (RWX2, RWY2) of the mechanical arm in step S3 and the grabbing point (RWX1, RWY1) in step S1 is calculated as:
DX = RWX2 - RWX1;
DY = RWY2 - RWY1;
where DX is the X-direction distance of the mechanical arm between step S3 and step S1, i.e. the X-direction distance between the calibration object shooting point and the calibration object grabbing point;
DY is the corresponding Y-direction distance of the mechanical arm between the shooting point and the grabbing point;
the data set comprising DX, DY, P2mmX, P2mmY, α, OPX1 and OPY1 is the calibration result of the machine vision.
In the mechanical arm positioning and grabbing method, the calibration result (DX, DY, P2mmX, P2mmY, α, OPX1, OPY1) is obtained by the machine vision calibration method described above;
the mechanical arm positioning and grabbing method comprises the following steps:
A1, the mechanical arm is moved to the target object; after the target object is clearly imaged within the field of view of the machine vision, the target object is photographed, and its image coordinates (CPX1, CPY1) are identified and calculated;
A2, the image coordinate difference between the target object and the calibration object is calculated:
DPX1 = CPX1 - OPX1;
DPY1 = CPY1 - OPY1;
where DPX1 is the X-direction distance between the image coordinates of the target object and those of the calibration object;
DPY1 is the Y-direction distance between the image coordinates of the target object and those of the calibration object;
A3, the image coordinate system is rotated by α, and the new image coordinate difference (DPX, DPY) is calculated as:
L = SQRT(DPX1^2 + DPY1^2);
α2 = arctan(DPY1/DPX1);
DPX = L*sin(α+α2);
DPY = L*cos(α+α2);
where L is the image distance between the target object and the calibration object;
α2 is the included angle between the vector corresponding to L and the X axis of the image coordinate system, the vector being the vector from the calibration object image coordinates to the target object image coordinates;
α+α2 indicates that the image coordinate system has been rotated by α so as to be parallel to the world coordinate system, i.e. the X axis of the image coordinate system is parallel to the X axis of the world coordinate system and the Y axis of the image coordinate system is parallel to the Y axis of the world coordinate system;
DPY is the modulus of the Y-direction component of the vector in the rotated image coordinate system;
A4, the displacement (DWX, DWY) that the mechanical arm must move in the world coordinate system is calculated as:
DWX = DPX/P2mmX - DX;
DWY = DPY/P2mmY - DY;
where DPX/P2mmX is the X-direction distance between the target object and the calibration object, i.e. the actual X-direction distance of the target object in world coordinates;
DPY/P2mmY is the Y-direction distance between the target object and the calibration object, i.e. the actual Y-direction distance of the target object in world coordinates;
DX and DY are the initial offsets of the mechanical arm relative to the calibration object;
A5, the mechanical arm is moved by the displacement (DWX, DWY) and grabs the target object.
Preferably, step A1 includes the step of recording the world coordinates (CWX1, CWY1) of the mechanical arm.
In practice, if step A5 only requires the relative displacement by which to adjust the mechanical arm, the point coordinates (CWX1, CWY1) need not be recorded; if step A5 must return the absolute coordinates of the mechanical arm, then this world coordinate point needs to be recorded.
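Putting steps A2 to A4 together, the displacement computation can be sketched as follows. The dictionary layout of the calibration result and all names are illustrative assumptions; the sin/cos convention follows the formulas given in step A3:

```python
import math

def grab_displacement(cal, target_img):
    # A2: image coordinate difference between target and calibration object.
    dpx1 = target_img[0] - cal["OPX1"]
    dpy1 = target_img[1] - cal["OPY1"]
    # A3: rotate the image coordinate system by alpha.
    length = math.hypot(dpx1, dpy1)      # image distance L
    alpha2 = math.atan2(dpy1, dpx1)      # angle of the L vector
    dpx = length * math.sin(cal["alpha"] + alpha2)
    dpy = length * math.cos(cal["alpha"] + alpha2)
    # A4: convert pixels to world units and subtract the initial offset.
    return dpx / cal["P2mmX"] - cal["DX"], dpy / cal["P2mmY"] - cal["DY"]

# Example with a trivial calibration: no rotation, 5 px/mm, zero offset.
cal = {"DX": 0.0, "DY": 0.0, "P2mmX": 5.0, "P2mmY": 5.0,
       "alpha": 0.0, "OPX1": 0.0, "OPY1": 0.0}
dwx, dwy = grab_displacement(cal, (0.0, 50.0))
```

With these sample values the 50-pixel image offset maps to a 10 mm world displacement, i.e. (dwx, dwy) ≈ (10.0, 0.0) under the patent's sin/cos convention.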
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description.
In addition, the above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention. Also, it should be understood that although the present description refers to embodiments, not every embodiment may contain only a single embodiment, and such description is for clarity only, and those skilled in the art should integrate the description, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.
Claims (3)
1. A machine vision calibration method is characterized in that: the method comprises the following steps:
S1, the mechanical arm grabs the calibration object and places it in the calibration area, and the world coordinates (RWX1, RWY1) of the mechanical arm when it places the calibration object are recorded;
S2, the mechanical arm is moved until the calibration object is clearly imaged within the field of view of the machine vision;
S3, the world coordinates (RWX2, RWY2) of the mechanical arm after the calibration object is clearly imaged are recorded; the machine vision photographs the calibration object, and the image coordinates (OPX1, OPY1) of the calibration object within the field of view are identified and recorded;
S4, the mechanical arm is moved a distance D1 along the X direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX2, OPY2) are identified and recorded;
S5, the mechanical arm is moved back to the position (RWX1, RWY1);
S6, the mechanical arm is moved a distance D2 along the Y direction of the world coordinate system, the calibration object is photographed again, and its image coordinates (OPX3, OPY3) are identified and recorded;
S7, the scales between world coordinates and image coordinates are calculated as:
P2mmX = SQRT((OPX1-OPX2)^2 + (OPY1-OPY2)^2)/D1;
P2mmY = SQRT((OPX1-OPX3)^2 + (OPY1-OPY3)^2)/D2;
where SQRT denotes the square root;
the angular difference α between the world coordinates and the image coordinates is calculated as:
α = arctan((OPY2-OPY1)/(OPX2-OPX1)), or
α = π/2 - arctan((OPY3-OPY1)/(OPX3-OPX1));
where arctan is the arc tangent function and π is the ratio of a circle's circumference to its diameter;
the positional relationship between the position (RWX2, RWY2) of the mechanical arm in step S3 and the grabbing point (RWX1, RWY1) in step S1 is calculated as:
DX = RWX2 - RWX1;
DY = RWY2 - RWY1;
the data set comprising DX, DY, P2mmX, P2mmY, α, OPX1 and OPY1 is the calibration result of the machine vision.
2. A mechanical arm positioning and grabbing method, characterized in that the calibration result (DX, DY, P2mmX, P2mmY, α, OPX1, OPY1) is obtained by the machine vision calibration method of claim 1;
the mechanical arm positioning and grabbing method comprises the following steps:
A1, the mechanical arm is moved to the target object; after the target object is clearly imaged within the field of view of the machine vision, the target object is photographed, and its image coordinates (CPX1, CPY1) are identified and calculated;
A2, the image coordinate difference between the target object and the calibration object is calculated:
DPX1 = CPX1 - OPX1;
DPY1 = CPY1 - OPY1;
A3, the image coordinate system is rotated by α, and the new image coordinate difference (DPX, DPY) is calculated as:
L = SQRT(DPX1^2 + DPY1^2);
α2 = arctan(DPY1/DPX1);
DPX = L*sin(α+α2);
DPY = L*cos(α+α2);
where L is the image distance between the target object and the calibration object;
α2 is the included angle between the vector corresponding to L and the X axis of the image coordinate system, the vector being the vector from the calibration object image coordinates to the target object image coordinates;
α+α2 indicates that the image coordinate system has been rotated by α so as to be parallel to the world coordinate system, i.e. the X axis of the image coordinate system is parallel to the X axis of the world coordinate system and the Y axis of the image coordinate system is parallel to the Y axis of the world coordinate system;
DPY is the modulus of the Y-direction component of the vector in the rotated image coordinate system;
A4, the displacement (DWX, DWY) that the mechanical arm must move in the world coordinate system is calculated as:
DWX = DPX/P2mmX - DX;
DWY = DPY/P2mmY - DY;
where DPX/P2mmX is the X-direction distance between the target object and the calibration object, i.e. the actual X-direction distance of the target object in world coordinates;
DPY/P2mmY is the Y-direction distance between the target object and the calibration object, i.e. the actual Y-direction distance of the target object in world coordinates;
DX and DY are the initial offsets of the mechanical arm relative to the calibration object;
A5, the mechanical arm is moved by the displacement (DWX, DWY) and grabs the target object.
3. The mechanical arm positioning and grabbing method of claim 2, characterized in that step A1 includes the step of recording the world coordinates (CWX1, CWY1) of the mechanical arm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911152788.7A CN110788863B (en) | 2019-11-22 | 2019-11-22 | Machine vision calibration method and mechanical arm positioning and grabbing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911152788.7A CN110788863B (en) | 2019-11-22 | 2019-11-22 | Machine vision calibration method and mechanical arm positioning and grabbing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110788863A CN110788863A (en) | 2020-02-14 |
CN110788863B true CN110788863B (en) | 2020-11-10 |
Family
ID=69445943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911152788.7A Active CN110788863B (en) | 2019-11-22 | 2019-11-22 | Machine vision calibration method and mechanical arm positioning and grabbing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110788863B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111390910A (en) * | 2020-03-31 | 2020-07-10 | 广州富港万嘉智能科技有限公司 | Manipulator target grabbing and positioning method, computer readable storage medium and manipulator |
CN112001967A (en) * | 2020-08-14 | 2020-11-27 | 苏州华兴源创科技股份有限公司 | Method and device for guiding manipulator to carry object by camera |
CN114170246B (en) * | 2021-12-08 | 2024-05-17 | 广东奥普特科技股份有限公司 | Positioning method for precision displacement platform |
CN115026828B (en) * | 2022-06-23 | 2023-07-28 | 池州市安安新材科技有限公司 | Robot arm grabbing control method and system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104217441B (en) * | 2013-08-28 | 2017-05-10 | 北京嘉恒中自图像技术有限公司 | Mechanical arm positioning fetching method based on machine vision |
US9193073B1 (en) * | 2014-10-15 | 2015-11-24 | Quanta Storage Inc. | Robot calibration apparatus for calibrating a robot arm |
CN105234943B (en) * | 2015-09-09 | 2018-08-14 | 大族激光科技产业集团股份有限公司 | A kind of industrial robot teaching device and method of view-based access control model identification |
WO2017133756A1 (en) * | 2016-02-02 | 2017-08-10 | Abb Schweiz Ag | Robot system calibration |
CN109633612B (en) * | 2018-10-18 | 2020-06-16 | 浙江大学 | Single-line laser radar and camera external reference calibration method without common observation |
CN110335310B (en) * | 2019-07-09 | 2021-07-02 | 中国大恒(集团)有限公司北京图像视觉技术分公司 | Calibration method under non-common vision field |
- 2019-11-22: application CN201911152788.7A filed in China; granted as patent CN110788863B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN110788863A (en) | 2020-02-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |