CN117260712A - Method, system, device and medium for automatically calibrating coordinates of end assembly of robot - Google Patents

Method, system, device and medium for automatically calibrating coordinates of end assembly of robot

Info

Publication number
CN117260712A
CN117260712A (application CN202311149043.1A)
Authority
CN
China
Prior art keywords
tool
arm
coordinate
target
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311149043.1A
Other languages
Chinese (zh)
Inventor
冯杨广
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Haifeng Robot System Co ltd
Original Assignee
Zhuhai Haifeng Robot System Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Haifeng Robot System Co ltd filed Critical Zhuhai Haifeng Robot System Co ltd
Priority to CN202311149043.1A priority Critical patent/CN117260712A/en
Publication of CN117260712A publication Critical patent/CN117260712A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1653Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method, a system, a device and a medium for automatically calibrating coordinates of a robot tail end assembly, wherein the method comprises the following steps: acquiring image data of an extract at the tail end of a first tool of a robot; identifying the image data to obtain a first characteristic point; acquiring a second characteristic point of reference image data; determining a pixel difference according to the first characteristic point and the second characteristic point, and from it obtaining a first coordinate offset of the tail end of the first tool; acquiring data corresponding to the first tool and calculating a center coordinate; acquiring target data; determining a first target coordinate according to the tail end coordinate, the center coordinate, the target data and the first coordinate offset; and moving the tail end of the first tool to the first target coordinate. The method can quickly acquire the offset of the hole coordinates and correct the hole coordinates before the robot moves to the hole, reduces the takt time of production, increases productivity, and can be widely applied to the technical field of robots.

Description

Method, system, device and medium for automatically calibrating coordinates of end assembly of robot
Technical Field
The invention relates to the technical field of robots, in particular to a method, a system, a device and a medium for automatically calibrating coordinates of a tail end assembly of a robot.
Background
A large number of automated industrial-robot production lines are already in use in industry, including the automotive, electronics and engineering-machinery industries. Industrial robots integrate advanced manufacturing technologies such as precision machinery, flexibility, intelligence and software application development, and by detecting, controlling, optimizing, scheduling, managing and making decisions about the production process they raise the level of industrial automation, thereby increasing yield, improving quality, reducing cost, and cutting resource consumption and environmental pollution.
A SCARA robot is a robot arm used for assembly work; it has three rotary joints and is best suited to planar positioning. Using a SCARA arm instead of a vertical six-joint robot to carry plate-shaped parts reduces cost and keeps the control simple.
However, when a current SCARA robot corrects a hole coordinate, the robot tail end assembly must first move to the screw-hole position, the C-axis coordinate fed back by the robot at that moment is read, a correction movement is performed according to that C-axis coordinate, and only then is the screw locked into the screw hole. This wastes motion during production, lengthens the takt time and reduces productivity.
Disclosure of Invention
In view of the above, an object of the embodiments of the present invention is to provide a method, a system, a device and a medium for automatically calibrating coordinates of a robot end assembly, which can quickly acquire an offset of a hole coordinate and correct the hole coordinate before the robot moves to the hole.
In a first aspect, an embodiment of the present invention provides a method for automatically calibrating coordinates of a robot end assembly, including the steps of:
acquiring image data of an extract at the tail end of the first tool, the image data being obtained by moving the extract to a preset position and photographing it after the extract is sucked up by the tail end of the first tool;
performing identification processing on the image data to obtain first characteristic points of the extract;
acquiring a second characteristic point of the reference image data;
determining a pixel difference according to the first characteristic point and the second characteristic point, and determining a first coordinate offset of the tail end of the first tool according to the pixel difference and a preset conversion rule, wherein the pixel difference represents the difference between the pixel coordinates of the first characteristic point and the pixel coordinates of the second characteristic point;
acquiring a first tool length, a first tool angle, a first body arm length and a second body arm length of the tail end assembly;
Determining the tail end coordinates of the tail end of a first tool according to the first tool length, the first tool angle, the first body arm length and the second body arm length, wherein the first tool angle represents an included angle between the first tool and the second body arm;
acquiring target data of a target point location, wherein the target point location represents a hole site to be aligned after the extract at the tail end of the first tool moves;
acquiring a central coordinate of a connecting central point of the first arm of the body and the second arm of the body, and determining a first target coordinate according to the terminal coordinate, the central coordinate, the target data and the first coordinate offset, wherein the first target coordinate represents a coordinate of the terminal of the first tool when the extract is aligned to the target point;
the first tool tip is moved to the first target coordinate.
Optionally, the determining a pixel difference according to the first feature point and the second feature point, and determining a first coordinate offset of the first tool end according to the pixel difference and a preset conversion rule specifically includes:
acquiring first pixel coordinates of the first feature points in the image data;
Acquiring second pixel coordinates of the second feature points in the reference image data;
determining the pixel difference according to the first pixel coordinate and the second pixel coordinate;
acquiring scale data between a pixel coordinate system and a robot coordinate system;
and calculating the first coordinate offset according to the scale data and the pixel difference.
Optionally, the determining the first target coordinate according to the end coordinate, the center coordinate, the target data and the first coordinate offset specifically includes:
determining a first C-axis angle from the tip coordinates, the center coordinates, and the target data, the first C-axis angle characterizing a C-axis angle of the first tool tip when the extract is aligned with the target point;
and determining a first target coordinate according to the first coordinate offset and the first C-axis angle.
Optionally, the determining the first C-axis angle based on the end coordinates, the center coordinates and the target data specifically includes:
calculating the length of a tool two-arm according to the tail end coordinates and the center coordinates, wherein the tool two-arm represents a connecting straight line from the first tool tail end to the connecting center point;
Calculating to obtain a tool two-arm angle according to the length of the tool two-arm, the tail end coordinates and the center coordinates, wherein the tool two-arm angle represents an included angle between the tool two-arm and the body two-arm;
and calculating the first C-axis angle according to the target data, the tool two-arm angle, the length of the first body arm and the length of the tool two-arm.
Optionally, the calculating the first C-axis angle according to the target data, the tool two-arm angle, the lengths of the body one-arm and the tool two-arm specifically includes:
calculating a tool one-arm joint angle and a tool two-arm joint angle according to the tool two-arm angle, the tool one-arm length and the tool two-arm length, wherein the tool one-arm joint angle represents an included angle between the body one-arm after movement and the body one-arm before non-movement when the extract at the tail end of the first tool is aligned with the target point, and the tool two-arm joint angle represents an included angle between the body one-arm after movement and the tool two-arm;
and calculating to obtain a first C-axis angle according to the first arm joint angle of the tool and the second arm joint angle of the tool.
Optionally, the tip assembly further includes a second tool rotatably connected to the second body arm, the first tool and the second tool having a non-zero included angle therebetween, the method further comprising:
respectively calculating the first target coordinate of the first tool tail end and the second target coordinate of the second tool tail end;
after moving the first tool tip to the first target coordinate, the second tool tip is moved to the second target coordinate.
Optionally, before the acquiring the image data of the extract of the first tool end, the method specifically includes:
calibrating a theoretical origin of the tail end assembly;
and calibrating and acquiring and storing the reference image data of the first tool tail end according to the theoretical origin.
In a second aspect, embodiments of the present invention provide a system for automatically calibrating coordinates of a robotic end assembly, comprising:
the first module is used for acquiring image data of the extract at the tail end of the first tool, and the image data is obtained by taking pictures by moving the extract to a preset position after the extract is sucked by the tail end of the first tool;
The second module is used for carrying out identification processing on the image data to obtain a first characteristic point of the extract;
a third module for acquiring a second feature point of the reference image data;
a fourth module, configured to determine a pixel difference according to the first feature point and the second feature point, and determine a first coordinate offset of the first tool end according to the pixel difference and a preset conversion rule, where the pixel difference represents a difference between a pixel coordinate of the first feature point and a pixel coordinate of the second feature point;
a fifth module for acquiring a first tool length, a first tool angle, a body first arm length, and a body second arm length of the tip assembly;
a sixth module, configured to determine an end coordinate of an end of a first tool according to the first tool length, the first tool angle, the first body arm length, and the second body arm length, where the first tool angle represents an included angle between the first tool and the second body arm;
a seventh module, configured to obtain target data of a target point, where the target point represents a hole site to be aligned after the extract at the end of the first tool moves;
An eighth module, configured to obtain a center coordinate of a connection center point of the first arm of the body and the second arm of the body, and determine a first target coordinate according to the end coordinate, the center coordinate, the target data, and the first coordinate offset, where the first target coordinate represents a coordinate of the end of the first tool when the extract is aligned to the target point;
a ninth module for moving the first tool tip to the first target coordinates.
In a third aspect, an embodiment of the present invention provides an apparatus for automatically calibrating coordinates of a robot end assembly, including:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method as described above.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium having stored therein a processor executable program for performing the method as described above when executed by a processor.
The embodiment of the invention has the following beneficial effects: the embodiment of the invention provides a method for automatically calibrating coordinates of a robot tail end assembly, which comprises the following steps: acquiring image data of an extract at the tail end of the first tool, the image data being obtained by moving the extract to a preset position and photographing it after the extract is sucked up by the tail end of the first tool; performing identification processing on the image data to obtain a first characteristic point of the extract; acquiring a second characteristic point of the reference image data; determining a pixel difference according to the first characteristic point and the second characteristic point, and determining a first coordinate offset of the tail end of the first tool according to the pixel difference and a preset conversion rule, wherein the pixel difference represents the difference between the pixel coordinates of the first characteristic point and the pixel coordinates of the second characteristic point; acquiring a first tool length, a first tool angle, a first body arm length and a second body arm length of the tail end assembly; determining the tail end coordinates of the tail end of the first tool according to the first tool length, the first tool angle, the first body arm length and the second body arm length, wherein the first tool angle represents an included angle between the first tool and the second body arm; acquiring target data of a target point location, wherein the target point location represents a hole site to be aligned after the extract at the tail end of the first tool moves; acquiring a central coordinate of a connecting central point of the first arm of the body and the second arm of the body, and determining a first target coordinate according to the terminal coordinate, the central coordinate, the target data and the first coordinate offset, wherein the first target coordinate represents a coordinate of the terminal of the first tool when the extract is aligned to the target point; and moving the tail end of the first tool to the first target coordinate. The reference image data are established in advance and the acquired image data are compared with them to obtain the coordinate offset of the tail end of the first tool; the pre-stored specific information of the tail end assembly is then obtained, and the first C-axis angle of the tail end of the first tool at the target point is solved according to the target data of the target point. The offset of the hole coordinates can therefore be quickly acquired and the hole coordinates corrected before the robot moves to the hole, which reduces the takt time of production and increases productivity.
Drawings
FIG. 1 is a flowchart illustrating steps of a method for automatically calibrating coordinates of a robot end assembly according to an embodiment of the present invention;
FIG. 2 is a schematic view of a tip assembly according to an embodiment of the present invention;
FIG. 3 is a schematic view of a terminal assembly according to an embodiment of the present invention at a target site;
FIG. 4 is a block diagram of a system for automatically calibrating coordinates of a robotic end assembly according to an embodiment of the present invention;
FIG. 5 is a block diagram of an apparatus for automatically calibrating coordinates of a robot end assembly according to an embodiment of the present invention;
reference numerals: body one arm 1, body two arms 2, first instrument 3, instrument two arms 4, instrument two arms angle 5, first instrument angle 6, body one arm and body two arm's connection structure 7, first instrument end 8, instrument two arm joint angle 9, instrument one arm joint angle 10, body one arm length 11, first C axle angle 12, first instrument and body two arm's connection center 13, body one arm and body two arm's connection center 14, body one arm and robot's connection center 15, body two arm length 21.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
In the description of the present invention, it should be understood that references to orientation descriptions such as upper, lower, front, rear, left, right, etc. are based on the orientation or positional relationship shown in the drawings, are merely for convenience of description of the present invention and to simplify the description, and do not indicate or imply that the apparatus or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the present invention.
In the description of the present invention, "several" means one or more and "a plurality of" means two or more; greater than, less than, exceeding, etc. are understood as not including the stated number, while above, below, within, etc. are understood as including the stated number. The terms first and second are used only to distinguish technical features and should not be construed as indicating or implying relative importance, implicitly indicating the number of technical features indicated, or implicitly indicating the precedence of the technical features indicated.
In the description of the present invention, unless explicitly defined otherwise, terms such as arrangement, installation, connection, etc. should be construed broadly and the specific meaning of the terms in the present invention can be reasonably determined by a person skilled in the art in combination with the specific contents of the technical scheme.
As shown in fig. 1, an embodiment of the present invention provides a method for automatically calibrating coordinates of a robot end assembly, which includes the following steps.
S100, acquiring image data of the extract of the first tool end 8, and moving the extract to a preset position for photographing after the extract is sucked by the first tool end 8.
The extract includes articles to be aligned, such as screws, studs, nuts, etc., and the specific articles may be set according to the needs, which are not limited herein.
Specifically, the first tool end 8 of the end assembly of the robot moves to the screw position to suck the screw, and then moves to a preset photographing position to photograph, so that image data is obtained.
In a specific embodiment, the tail end assembly of the robot comprises a plurality of tools, a first body arm 1 and a second body arm 2; each tool is rotatably connected with the second body arm 2, the second body arm 2 is rotatably connected with the first body arm 1, and the first body arm 1 is rotatably connected with the robot arm of the robot. After a tool picks up the screw, a camera captures image data of the screw. The camera arrangement specifically comprises an upper camera and a lower camera: the upper camera is responsible for photographing and acquiring the position of the screw in the vibration disc, while the lower camera is responsible for photographing the screw at the tool tail end and acquiring the offset, and position correction of the screw hole is realized in cooperation with the robot. The image data may comprise one image or several images, the shooting direction may be any direction, or any of the six standard views, and the required image data is obtained by combining the captured images.
And S200, performing identification processing on the image data to obtain first characteristic points of the extract.
Specifically, after the image data is obtained, visual recognition and extraction are performed on the image data, and then a first characteristic point of the required extract is extracted from the image data, wherein the first characteristic point comprises the exact center of the extract or a certain preset point, and the first characteristic point is specifically set according to the requirement and is not limited herein.
S300, acquiring second characteristic points of the reference image data.
Specifically, the pixel coordinates of a second feature point in the reference image data are obtained, and the second feature point corresponds to the first feature point, namely, the coordinate difference between the second feature point and the first feature point can be obtained by comparing the second feature point with the first feature point.
S400, determining a pixel difference according to the first characteristic point and the second characteristic point, and determining a first coordinate offset of the tail end of the first tool according to the pixel difference and a preset conversion rule, wherein the pixel difference represents the difference between the pixel coordinates of the first characteristic point and the pixel coordinates of the second characteristic point.
Specifically, the reference image data is preset, stored image data. The image data of the photographed extract is compared with the reference image data: the pixel coordinates of the second feature point in the reference image data and the pixel coordinates of the first feature point in the acquired image data are identified, and the pixel difference between the first feature point and the second feature point is calculated. In a specific embodiment, the feature point can be set as the center point of the photographed screw top view, or another point located near that center point; the difference between the pixel coordinates of the feature point in the reference image and in the photographed image is obtained, and the offset of the tool tail end carrying the screw is then obtained by further processing.
Optionally, determining a pixel difference according to the first feature point and the second feature point, and determining a first coordinate offset of the first tool end according to the pixel difference and a preset conversion rule specifically includes:
s310, acquiring first pixel coordinates of the first feature points in the image data;
s320, acquiring second pixel coordinates of the second feature points in the reference image data;
s330, determining the pixel difference according to the first pixel coordinate and the second pixel coordinate;
s340, acquiring scale data between a pixel coordinate system and a robot coordinate system;
and S340, calculating the first coordinate offset according to the scale data and the pixel difference.
Specifically, after the end of the tool absorbs the extract, the camera arranged on the robot shoots to obtain image data, and a first pixel coordinate of a preset pixel point (namely a characteristic point and arranged according to the requirement) is obtained according to the shot image data; acquiring a second pixel coordinate of the stored preset pixel point in the reference image data, and subtracting the second pixel coordinate from the first pixel coordinate to obtain a pixel coordinate difference; acquiring scale data of pixel coordinates and robot coordinates, namely the scale data of the pixel coordinates and the robot coordinates mutually converted; and calculating and converting according to the scale data to obtain a first coordinate offset under the robot coordinates.
In one particular embodiment, the preparation is as follows: a reference image is registered, and the pixel coordinates of the center of the screw bottom surface in the reference image are (100, 100). During production, the robot sucks up a new screw and a new image is shot at the camera; the pixel coordinates of the center of the screw bottom surface in the new image are (130, 125), so there is a pixel coordinate difference of (30, 25) between the new image and the reference image. This (30, 25) difference is a coordinate difference in the vision system's own pixel coordinate system. Before the robot starts work, the vision system and the robot are calibrated, i.e. the vision pixel coordinate system and the robot coordinate system are bound; for example, 1 pixel equals 0.5 mm and the included angle between the pixel coordinate system and the robot geodetic coordinate system is 30 degrees. The visual coordinate conversion algorithm converts the pixel coordinates (30, 25) into the geodetic coordinates (14.231, 11.824) through a homogeneous transformation matrix and sends them to the robot. The geodetic coordinates (14.231, 11.824) are the first coordinate offset, where 14.231 is the X-axis offset X_Run_Dev and 11.824 is the Y-axis offset Y_Run_Dev.
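As an illustration of steps S330-S350, the pixel-difference-to-offset conversion can be sketched as a scale-plus-rotation mapping. This is only a simplified sketch: the embodiment above converts through a full homogeneous transformation matrix obtained from the 9-point calibration, so the parameter names (mm_per_pixel, frame_angle_deg) are assumptions added here and the result will not exactly reproduce the (14.231, 11.824) value quoted above.

import math

def pixel_diff_to_offset(ref_px, new_px, mm_per_pixel, frame_angle_deg):
    """Convert a pixel difference into a robot-frame offset (X_Run_Dev, Y_Run_Dev).

    Simplified scale-plus-rotation sketch; the embodiment uses the full
    homogeneous transform from the 9-point hand-eye calibration, so the
    numbers produced here are only indicative.
    """
    du = new_px[0] - ref_px[0]            # pixel difference along u
    dv = new_px[1] - ref_px[1]            # pixel difference along v
    a = math.radians(frame_angle_deg)     # assumed angle between pixel and robot frames
    dx_mm = du * mm_per_pixel
    dy_mm = dv * mm_per_pixel
    x_run_dev = dx_mm * math.cos(a) - dy_mm * math.sin(a)
    y_run_dev = dx_mm * math.sin(a) + dy_mm * math.cos(a)
    return x_run_dev, y_run_dev

# Reference feature at (100, 100), new image at (130, 125),
# 1 pixel = 0.5 mm, 30 degrees between the two frames.
print(pixel_diff_to_offset((100, 100), (130, 125), 0.5, 30.0))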
S500, acquiring the length of a first tool 3, a first tool angle 6, the length of one arm 11 and the length of two arms 21 of the body of the tail end assembly;
S600, determining the end coordinates of the first tool end 8 according to the length of the first tool 3, the first tool angle 6, the first body arm length 11 and the second body arm length 21, wherein the first tool angle 6 represents the included angle between the first tool 3 and the second body arm 2.
Specifically, referring to fig. 2, the first tool 3 length, the first tool angle 6, the first body arm length 11 and the second body arm length 21 are stored in the robot in advance; when a tool number is input, the tool length corresponding to that number is automatically acquired. For example, when the number of the first tool 3 is input, the first tool 3 length, the first tool angle 6, the first body arm length 11 and the second body arm length 21 can all be acquired. The end coordinates of the first tool end 8 are then calculated with trigonometric functions from the body first arm length 11, the body second arm length 21, the first tool 3 length and the first tool angle 6. The first tool angle 6 is the included angle between the first tool 3 and the two arms 2 of the body.
Optionally, the acquiring the first tool 3 length, the first tool angle 6, the first body arm length 11 and the second body arm length 21 of the tip assembly determines the tip coordinates of the first tool tip 8 according to the first tool 3 length, the first tool angle 6, the first body arm length 11 and the second body arm length 21, and specifically includes:
S610, acquiring a first tool 3 number of the first tool 3;
s620, acquiring a pre-stored length of the first tool 3, the first tool angle 6, the body one-arm length 11 and the body two-arm length 21 according to the number of the first tool 3;
s630, calculating the end coordinates according to the length of the first tool 3, the first tool angle 6, the length of the first arm 11 of the body, the length of the second arm 21 of the body and the trigonometric function.
Specifically, the tool number of the first tool 3 is input into the robot, and the fixed tool number may be obtained through a preset fixed position (i.e., a fixed position on the two arms 2 of the body), or may be automatically obtained through vision, and the specific obtaining mode is not limited herein. The corresponding relation table of the tool number, the tool length, the tool included angle, the first arm length 11 of the body and the second arm length 21 of the body is preset, and the required first tool 3 length, the first tool angle 6, the first arm length 11 of the body and the second arm length 21 of the body can be obtained by reading the relation table. And calculating the end coordinates of the first tool end 8 by using the acquired first tool 3 length, first tool angle 6, body one-arm length 11 and body two-arm length 21 through trigonometric functions.
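A minimal sketch of the end-coordinate calculation in S630 is given below. The joint angles j1_deg and j2_deg and the zero-angle conventions are assumptions added for illustration; the text above only states that the stored lengths, the first tool angle and trigonometric functions are used.

import math

def first_tool_tip(j1_deg, j2_deg, body_arm1_len, body_arm2_len,
                   tool_len, tool_angle_deg):
    """Planar forward-kinematics sketch for the first tool tail end.

    Angles are measured counter-clockwise; j1_deg and j2_deg are the
    current joint angles of body arm 1 and body arm 2 (assumed inputs).
    tool_angle_deg is the fixed angle between the first tool and body arm 2.
    """
    j1 = math.radians(j1_deg)
    j12 = j1 + math.radians(j2_deg)
    # connection centre of body arm 1 and body arm 2
    cx = body_arm1_len * math.cos(j1)
    cy = body_arm1_len * math.sin(j1)
    # connection centre of the first tool and body arm 2
    wx = cx + body_arm2_len * math.cos(j12)
    wy = cy + body_arm2_len * math.sin(j12)
    # first tool tail end, offset by the tool length at the tool angle
    t = j12 + math.radians(tool_angle_deg)
    return wx + tool_len * math.cos(t), wy + tool_len * math.sin(t)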
S700, acquiring target data of a target point position, wherein the target point position represents a hole position to be aligned after the extract of the first tool tail end 8 moves.
Specifically, the target point is the hole site to be aligned; the first tool end 8 moves the extract and aligns it with the target point. The target point includes data such as a point number or a point name, which is not limited herein; the target data of the target point is stored in the robot in advance and includes information such as the coordinates of the target point and the coordinate hand system (a left-hand system and/or a right-hand system).
S800, obtaining a center coordinate of a connection center point of the body first arm 1 and the body second arm 2, and determining a first target coordinate based on the terminal coordinate, the center coordinate, the target data and the first coordinate offset, wherein the first target coordinate represents a coordinate of the first tool terminal when the extract is aligned to the target point.
Specifically, referring to fig. 2, the center coordinate of the connection center point of the first body arm 1 and the second body arm 2 is the connection center 14 of the first body arm 1 and the second body arm 2, the center coordinate is calibrated according to the robot coordinate system, the first C-axis angle 12 is calculated according to the end coordinate, the center coordinate and the target data through inverse kinematics, and the first C-axis angle 12 is the C-axis angle when the first tool end 8 is aligned with the target point.
Optionally, the determining the first target coordinate according to the end coordinate, the center coordinate, the target data and the first coordinate offset specifically includes:
s810, determining a first C-axis angle according to the tail end coordinates, the center coordinates and the target data, wherein the first C-axis angle represents the C-axis angle when the extract of the first tool tail end is aligned with the target point;
s820, determining a first target coordinate according to the first coordinate offset and the first C-axis angle.
In a specific embodiment, a 3-axis Scara robot has no C-axis in hardware, so the way its C-axis coordinate is represented differs from that of a 4-axis Scara robot. When a 4-axis Scara robot moves from point A to point B, the X and Y coordinates change, and because the 4-axis Scara has a C-axis in its hardware structure, the C-axis motor performs a coupled motion during the move so that the C-axis coordinate is always held at its initial angle. Therefore, without human intervention, the C-axis angle at point B remains the same as the C-axis angle at point A. In the same situation, when a 3-axis Scara robot moves from point A to point B, there is no C-axis in the hardware structure and no C-axis motor to perform the coupled motion, so without human intervention of the C-axis, the C-axis simply follows the motion of the robot. The direction of the C-axis coordinate of the 3-axis Scara robot is always parallel to the line from the second joint of the robot tail end assembly towards the robot tail end (the X-axis direction of the Cartesian coordinate system), and the rotation center of the C-axis lies on the axis of the second joint.
Optionally, the determining the first C-axis angle 12 based on the end coordinates, the center coordinates, and the target data specifically includes:
s811, calculating the length of the tool two arms 4 according to the tail end coordinates and the center coordinates, wherein the tool two arms 4 represent a connecting straight line from the first tool tail end 8 to the connecting center point;
s812, calculating a tool two-arm angle 5 according to the length of the tool two-arm 4, the tail end coordinates and the center coordinates, wherein the tool two-arm angle 5 represents an included angle between the tool two-arm 4 and the body two-arm 2;
and S813, calculating the first C-axis angle 12 according to the target data, the tool two-arm angle 5, the body one-arm length 11 and the length of the tool two-arm 4.
Referring to fig. 2, specifically, a connection straight line from the first tool end 8 to the connection center point is calculated according to the end coordinates and the center coordinates, the length of the connection straight line is the length of the tool two arms 4, the tool two arm angle 5 is obtained by calculating the included angle between the tool two arms 4 and the body two arms 2, and specifically, the calculation is performed according to the first tool angle 6, the body two arm length 21, the length of the tool two arms 4 and the inverse trigonometric function. Then, the length of the tool arm 4, the target data and the length 11 of the body arm are used for solving the real C-axis angle when the first tool tail end 8 is positioned at the input target point, namely the first C-axis angle 12.
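The geometry of steps S811 and S812 can be sketched as follows. The law-of-cosines form used for the tool two-arm angle is an assumption consistent with the triangle formed by the body two arms, the first tool and the tool two arms; it is not stated explicitly in the text above.

import math

def tool_two_arm(end_xy, center_xy, body_arm2_len, tool_len):
    """Sketch of S811/S812: length of the tool two arms and the tool two-arm angle.

    end_xy    - coordinates of the first tool tail end
    center_xy - coordinates of the connection centre of body arm 1 and body arm 2
    The law-of-cosines step is an illustrative assumption.
    """
    # S811: tool two-arm length = distance from the connection centre to the tool tail end
    dx = end_xy[0] - center_xy[0]
    dy = end_xy[1] - center_xy[1]
    tool_arm2_len = math.hypot(dx, dy)
    # S812: angle between the tool two arms and the body two arms
    # (triangle sides: body_arm2_len, tool_len, tool_arm2_len)
    cos_a = (body_arm2_len ** 2 + tool_arm2_len ** 2 - tool_len ** 2) / (
        2.0 * body_arm2_len * tool_arm2_len)
    tool_arm2_angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return tool_arm2_len, tool_arm2_angle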
Optionally, the calculating the first C-axis angle 12 according to the target data, the tool two-arm angle 5, the length of the body one-arm 1 and the tool two-arm 4 specifically includes:
s814, calculating a tool one-arm joint angle 10 and a tool two-arm joint angle 9 according to the tool two-arm angle 5, the tool one-arm length 11 and the length of the tool two-arm 4, wherein the tool one-arm joint angle 10 represents an included angle between the body one-arm 1 after movement and the body one-arm 1 before non-movement when the first tool tail end 8 is aligned with the target point, and the tool two-arm joint angle 9 represents an included angle between the body one-arm 1 after movement and the tool two-arm 4;
s815, calculating a first C-axis angle 12 according to the tool one-arm joint angle 10 and the tool two-arm joint angle 9.
Referring to fig. 3, specifically, the tool one-arm joint angle 10 and the tool two-arm joint angle 9 are solved from the tool two-arm angle 5, the body one-arm length 11 and the length of the tool two arms 4 by robot inverse kinematics, specifically by solving directly with an analytic method. The tool one-arm joint angle 10 is the included angle between the body one arm 1 after movement and the body one arm 1 before movement when the first tool tail end 8 is aligned with the target point, and the tool two-arm joint angle 9 is the included angle between the body one arm 1 after movement and the tool two arms 4. The obtained tool one-arm joint angle 10 and tool two-arm joint angle 9 are then combined using the opposite (vertical) angle relationship to obtain the actual C-axis angle when the first tool tail end 8 is located at the target point, namely the first C-axis angle 12.
The calculation formula can be expressed as: True_C = J1_Angle + J2_Angle - Tool_J2_Angle, wherein True_C represents the first C-axis angle, J1_Angle represents the tool one-arm joint angle, and J2_Angle represents the tool two-arm joint angle.
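A hedged sketch of S814/S815 is given below: a standard analytic two-link inverse-kinematics solution for the virtual linkage (body one arm plus the tool two arms) reaching the target point, followed by the True_C formula above. Reading Tool_J2_Angle as the tool two-arm angle 5, measuring the joint angles from the robot base axis rather than from the pre-movement pose, and choosing the elbow-down configuration are all assumptions made for illustration.

import math

def first_c_axis_angle(target_xy, body_arm1_len, tool_arm2_len, tool_arm2_angle_deg):
    """Sketch of S814/S815: analytic 2R inverse kinematics plus the True_C formula.

    The two links are body arm 1 and the virtual tool two arms; the
    elbow configuration (sign of j2) and the angle references are assumptions.
    """
    x, y = target_xy
    r2 = x * x + y * y
    # J2_Angle: tool two-arm joint angle (law of cosines)
    cos_j2 = (r2 - body_arm1_len ** 2 - tool_arm2_len ** 2) / (
        2.0 * body_arm1_len * tool_arm2_len)
    j2 = math.acos(max(-1.0, min(1.0, cos_j2)))          # elbow-down assumed
    # J1_Angle: tool one-arm joint angle
    j1 = math.atan2(y, x) - math.atan2(
        tool_arm2_len * math.sin(j2), body_arm1_len + tool_arm2_len * math.cos(j2))
    # True_C = J1_Angle + J2_Angle - Tool_J2_Angle
    # (Tool_J2_Angle taken here as the tool two-arm angle 5; this reading is an assumption)
    return math.degrees(j1) + math.degrees(j2) - tool_arm2_angle_deg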
S820, determining a first target coordinate according to the first coordinate offset and the first C-axis angle 12, wherein the first target coordinate represents the coordinate when the first tool tail end 8 is aligned with the target point.
Specifically, the obtained first C-axis angle 12 and the first coordinate offset are input into a preset calculation model, so that a first target coordinate can be calculated.
In one embodiment, the calculation formula for obtaining the first target coordinate (Dev_Point_X, Dev_Point_Y) from the first coordinate offset (X_Run_Dev, Y_Run_Dev) and the first C-axis angle True_C is as follows:
When True_C is located in the first quadrant (0-90°):
Dev_Point_X=Point_X+X_Run_Dev*cos(True_C)-Y_Run_Dev*sin(True_C)
Dev_Point_Y=Point_Y+X_Run_Dev*sin(True_C)+Y_Run_Dev*cos(True_C)
When True_C is located in the second quadrant (90-180°):
Dev_Point_X=Point_X+X_Run_Dev*cos(True_C-90)-Y_Run_Dev*sin(True_C-90)
Dev_Point_Y=Point_Y+X_Run_Dev*sin(True_C-90)+Y_Run_Dev*cos(True_C-90)
When True_C is located in the third quadrant (180-270°):
Dev_Point_X=Point_X+X_Run_Dev*cos(True_C-180)-Y_Run_Dev*sin(True_C-180)
Dev_Point_Y=Point_Y+X_Run_Dev*sin(True_C-180)+Y_Run_Dev*cos(True_C-180)
When True_C is located in the fourth quadrant (270-360°):
Dev_Point_X=Point_X+X_Run_Dev*cos(True_C-270)-Y_Run_Dev*sin(True_C-270)
Dev_Point_Y=Point_Y+X_Run_Dev*sin(True_C-270)+Y_Run_Dev*cos(True_C-270)
wherein sin() denotes the sine function, cos() denotes the cosine function, X_Run_Dev denotes the X-axis component of the first coordinate offset in the robot coordinate system, Y_Run_Dev denotes the Y-axis component of the first coordinate offset in the robot coordinate system, Point_X denotes the X-axis coordinate of the target point in the robot coordinate system, and Point_Y denotes the Y-axis coordinate of the target point in the robot coordinate system.
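The per-quadrant formulas above can be collected into a single helper; the code below is a direct transcription of those expressions (variable names as in the formulas), with the wrap-around handling of True_C added as an assumption.

import math

def corrected_target(point_x, point_y, x_run_dev, y_run_dev, true_c):
    """Apply the first coordinate offset to the target point, rotated by the
    quadrant-reduced first C-axis angle, as in the formulas above."""
    c = true_c % 360.0                    # wrap-around handling is an added assumption
    if c < 90.0:
        a = math.radians(c)
    elif c < 180.0:
        a = math.radians(c - 90.0)
    elif c < 270.0:
        a = math.radians(c - 180.0)
    else:
        a = math.radians(c - 270.0)
    dev_point_x = point_x + x_run_dev * math.cos(a) - y_run_dev * math.sin(a)
    dev_point_y = point_y + x_run_dev * math.sin(a) + y_run_dev * math.cos(a)
    return dev_point_x, dev_point_y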
S900, moving the first tool end 8 to the first target coordinates.
Specifically, after the first target coordinates are calculated, the robot moves the first tool end 8 to the first target coordinates, and fixes the extract after aligning the extract with the hole site, thereby completing the alignment of the extract.
Optionally, the tip assembly further includes a second tool rotatably connected to the second body arm, the first tool having a non-zero included angle with the second tool, the method further comprising:
s910, respectively calculating to obtain the first target coordinate of the first tool tail end and the second target coordinate of the second tool tail end;
S920, after the first tool end is moved to the first target coordinate, the second tool end is moved to the second target coordinate.
Specifically, a plurality of tools are provided, and the specific number of tools is not limited herein. The tools are all rotatably connected with the two arms 2 of the body, and non-zero included angles are formed between the tools. In some embodiments, the tools may also be arranged in parallel, and the specific arrangement is not limited herein.
When a plurality of tools pick up extracts at the same time, the extract of each tool is photographed separately, for example the extract at the tail end of the first tool and then the extract at the tail end of the second tool, and the first target coordinate of the first tool tail end and the second target coordinate of the second tool tail end are calculated with the calculation method described above. After the extract of the first tool tail end 8 is aligned with the target point, the second C-axis angle of the second tool tail end at the second target coordinate of its target point is calculated directly, and the second tool tail end is then moved to the second target coordinate. The target coordinates of the remaining tool tail ends are calculated in turn with the same calculation method, and details are not repeated here.
Optionally, before the acquiring the image data of the extract of the first tool end 8 of the robot, the method specifically includes:
s930, calibrating a theoretical origin of the tail end assembly;
s940, obtaining and storing the reference image data of the first tool tail end 8 according to the theoretical origin calibration.
In a specific embodiment, the robot teaches the coordinates of the tool tip based on the electric screwdriver bit. Because the assembly error of the electric screwdriver bit is large, the bit is far from the theoretical origin of the robot model, so the tool tail end coordinates must be taught to compensate for this error. The robot and the vision system perform a 9-point calibration. Since the C-axis of the 3-axis Scara changes at all times, to control the variables the TCS coordinate system (tool coordinate system) is calibrated according to the current C-axis angle during the visual calibration, and the directions of the coordinate axes of the Cartesian coordinate system and the TCS coordinate system are determined. The calibrated TCS coordinate system is calibrated according to the theoretical origin of the robot model, not according to the tool coordinates of the screwdriver bit, because the orientation of the coordinate axes of the TCS coordinate system would change whenever the tool coordinates are switched; to control the variables, only the theoretical origin can be calibrated. The reference image is registered by the vision system, and the robot teaches the working point using a reference screw: a screw is stuck on the electric screwdriver bit to serve as the reference screw, and the screw-hole position on the workpiece is taught with this reference screw. The vision system takes an image of the bottom surface of the reference screw as the reference image and stores it as the reference image data.
In a specific embodiment, the method of the present application was validated. Three different points are selected as target points; tool coordinates in different quadrants are taught; an offset is selected at random. Using tools with different tool coordinates (tools No. 1, No. 2, No. 3 and No. 4), the traditional method performs a correction movement based on target points No. 2 and No. 3 according to the offset, and the position of the theoretical origin after the correction movement is observed and recorded. Without moving the robot, the calibration algorithm of the present method then performs the calibration calculation directly, based on target points No. 2 and No. 3, according to the offset and the tool coordinates, and the result is observed and compared with the theoretical origin position obtained by the traditional method.
Table 1 experimental data
From the data in Table 1 it can be seen that the accuracy of the calibration algorithm is kept within ±1 μm in every case. The accuracy of an ordinary Scara robot on the market is 0.02 mm (equal to 20 μm), so the error of the algorithm lies within a range the robot cannot resolve and has no significant influence on the operation accuracy of the robot.
The embodiment of the invention has the following beneficial effects: the embodiment of the invention provides a method for automatically calibrating coordinates of a robot tail end assembly, which comprises the following steps: acquiring image data of an extract of the first tool tail end 8 of the robot; acquiring a second characteristic point of the reference image data, determining a pixel difference according to the first characteristic point and the second characteristic point, and obtaining a first coordinate offset of the first tool tail end 8, wherein the pixel difference represents the difference between the pixel coordinate of the first characteristic point and the pixel coordinate of the second characteristic point; acquiring a first tool 3 length, a first tool angle 6, a first body arm length 11 and a second body arm length 21 of the tail end assembly, and determining the tail end coordinates of the first tool tail end 8 according to the first tool 3 length, the first tool angle 6, the first body arm length 11 and the second body arm length 21, wherein the first tool angle 6 represents the included angle between the first tool 3 and the second body arm 2; acquiring target data of a target point location, wherein the target point location represents the hole site to be aligned after the first tool tail end 8 moves; acquiring the central coordinate of the connecting central point of the body first arm 1 and the body second arm 2, and determining a first C-axis angle 12 based on the terminal coordinate, the central coordinate and the target data, wherein the first C-axis angle 12 represents the C-axis angle when the first tool tail end 8 is aligned with the target point; determining a first target coordinate from the first coordinate offset and the first C-axis angle 12, wherein the first target coordinate characterizes the coordinate of the first tool tail end 8 when aligned with the target point; and moving the first tool tail end 8 to the first target coordinate. The reference image data are established in advance and the acquired image data are compared with them to obtain the offset of the first tool tail end 8; the pre-stored specific information of the tail end assembly is then obtained, and the first C-axis angle 12 of the first tool tail end 8 at the target point is solved according to the target data of the target point. The offset of the hole coordinates can therefore be quickly acquired and the hole coordinates corrected before the robot moves to the hole, which reduces the takt time of production and increases productivity.
As shown in fig. 4, an embodiment of the present invention further provides a system for automatically calibrating coordinates of a robot end assembly, including:
the first module is used for acquiring image data of the extract of the first tool tail end 8, and the image data is obtained by taking a picture by moving the extract to a preset position after the extract is sucked by the first tool tail end 8;
the second module is used for carrying out identification processing on the image data to obtain a first characteristic point of the extract;
a third module for acquiring a second feature point of the reference image data;
a fourth module, configured to determine a pixel difference according to the first feature point and the second feature point, and determine a first coordinate offset of the first tool end 8 according to the pixel difference and a preset conversion rule, where the pixel difference represents a difference between a pixel coordinate of the first feature point and a pixel coordinate of the second feature point;
a fifth module for acquiring a first tool length, a first tool angle, a body one-arm length 11, and a body two-arm length 21 of the tip assembly;
a sixth module, configured to determine an end coordinate of the first tool end 8 according to the first tool length, the first tool angle, the first body first arm length 11, and the second body second arm length 21, where the first tool angle characterizes an included angle between the first tool 3 and the second body second arm 2;
A seventh module, configured to obtain target data of a target point, where the target point represents a hole site to be aligned after the extract of the first tool end 8 moves;
an eighth module, configured to obtain a center coordinate of a connection center point of the first body arm 1 and the second body arm 2, and determine a first target coordinate according to the end coordinate, the center coordinate, the target data, and the first coordinate offset, where the first target coordinate represents a coordinate of the first tool end 8 when the extract is aligned to the target point;
a ninth module for moving the first tool end 8 to the first target coordinates.
It can be seen that the content of the above method embodiment applies to this system embodiment: the functions specifically implemented by the system embodiment are the same as those of the method embodiment, and the beneficial effects achieved are the same as those achieved by the method embodiment.
As shown in fig. 5, the embodiment of the present invention further provides an apparatus for automatically calibrating coordinates of a robot end assembly, including:
at least one processor;
at least one memory for storing at least one program;
The at least one program, when executed by the at least one processor, causes the at least one processor to carry out the method steps described in the method embodiments above.
It can be seen that the content of the above method embodiment applies to this device embodiment: the functions specifically implemented by the device embodiment are the same as those of the method embodiment, and the beneficial effects achieved are the same as those achieved by the method embodiment.
Furthermore, embodiments of the present application disclose a computer program product or a computer program, which is stored in a computer readable storage medium. The computer program may be read from a computer readable storage medium by a processor of a computer device, the processor executing the computer program causing the computer device to perform the method as described above. Similarly, the content in the above method embodiment is applicable to the present storage medium embodiment, and the specific functions of the present storage medium embodiment are the same as those of the above method embodiment, and the achieved beneficial effects are the same as those of the above method embodiment.
It is to be understood that all or some of the steps, systems, and methods disclosed above may be implemented in software, firmware, hardware, and suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, a digital information processor, or a microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as known to those skilled in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data message such as a carrier wave or other transport mechanism and includes any information delivery media.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of one of ordinary skill in the art without departing from the spirit of the present invention.

Claims (10)

1. A method of automatically calibrating coordinates of a robotic end assembly, the end assembly comprising a first tool, a first body arm, and a second body arm, the first body arm and the second body arm being rotatably coupled, the first tool being rotatably coupled to the second body arm, the method comprising:
acquiring image data of an extract at the tail end of the first tool, the image data being obtained by moving the extract to a preset position and photographing it after the extract is sucked up by the tail end of the first tool;
performing identification processing on the image data to obtain first characteristic points of the extract;
acquiring a second characteristic point of the reference image data;
determining a pixel difference according to the first characteristic point and the second characteristic point, and determining a first coordinate offset of the tail end of the first tool according to the pixel difference and a preset conversion rule, wherein the pixel difference represents the difference between the pixel coordinates of the first characteristic point and the pixel coordinates of the second characteristic point;
Acquiring a first tool length, a first tool angle, a first body arm length and a second body arm length of the tail end assembly;
determining the tail end coordinates of the tail end of a first tool according to the first tool length, the first tool angle, the first body arm length and the second body arm length, wherein the first tool angle represents an included angle between the first tool and the second body arm;
acquiring target data of a target point location, wherein the target point location represents a hole site to be aligned after the extract at the tail end of the first tool moves;
acquiring the center coordinates of the connection center point of the first arm of the body and the second arm of the body;
determining a first target coordinate according to the end coordinate, the center coordinate, the target data and the first coordinate offset, wherein the first target coordinate characterizes a coordinate of the first tool end when the extract is aligned with the target point;
the first tool tip is moved to the first target coordinate.
2. The method according to claim 1, wherein the determining the pixel difference according to the first feature point and the second feature point, and determining the first coordinate offset of the tip of the first tool according to the pixel difference and the preset conversion rule specifically comprises:
acquiring a first pixel coordinate of the first feature point in the image data;
acquiring a second pixel coordinate of the second feature point in the reference image data;
determining the pixel difference according to the first pixel coordinate and the second pixel coordinate;
acquiring scale data between a pixel coordinate system and a robot coordinate system;
and calculating the first coordinate offset according to the scale data and the pixel difference.
3. The method according to claim 1, wherein the determining the first target coordinate according to the tip coordinate, the center coordinate, the target data, and the first coordinate offset specifically comprises:
determining a first C-axis angle according to the tip coordinate, the center coordinate, and the target data, wherein the first C-axis angle represents the C-axis angle of the tip of the first tool when the extracted object is aligned with the target point;
and determining the first target coordinate according to the first coordinate offset and the first C-axis angle.
4. The method according to claim 3, wherein the determining the first C-axis angle according to the tip coordinate, the center coordinate, and the target data specifically comprises:
calculating a length of a tool second arm according to the tip coordinate and the center coordinate, wherein the tool second arm represents the straight line connecting the tip of the first tool to the joint center point;
calculating a tool second-arm angle according to the length of the tool second arm, the tip coordinate, and the center coordinate, wherein the tool second-arm angle represents the included angle between the tool second arm and the second body arm;
and calculating the first C-axis angle according to the target data, the tool second-arm angle, the length of the first body arm, and the length of the tool second arm.
5. The method according to claim 4, wherein the calculating the first C-axis angle according to the target data, the tool second-arm angle, the length of the first body arm, and the length of the tool second arm specifically comprises:
calculating a tool first-arm joint angle and a tool second-arm joint angle according to the tool second-arm angle, the length of the tool first arm, and the length of the tool second arm, wherein the tool first-arm joint angle represents the included angle between the first body arm after the movement and the first body arm before the movement when the extracted object at the tip of the first tool is aligned with the target point, and the tool second-arm joint angle represents the included angle between the first body arm after the movement and the tool second arm;
and calculating the first C-axis angle according to the tool first-arm joint angle and the tool second-arm joint angle.
6. The method according to claim 1, wherein the end assembly further comprises a second tool rotatably coupled to the second body arm, the first tool and the second tool having a non-zero included angle therebetween, and the method further comprises:
respectively calculating the first target coordinate of the tip of the first tool and a second target coordinate of the tip of the second tool;
and after moving the tip of the first tool to the first target coordinate, moving the tip of the second tool to the second target coordinate.
7. The method according to claim 1, wherein before the acquiring the image data of the extracted object at the tip of the first tool, the method further comprises:
calibrating a theoretical origin of the end assembly;
and calibrating, acquiring, and storing the reference image data of the tip of the first tool according to the theoretical origin.
8. A system for automatically calibrating coordinates of a robotic end assembly, comprising:
a first module, configured to acquire image data of an extracted object at the tip of a first tool, wherein the image data is obtained by picking up the object by suction with the tip of the first tool, moving it to a preset position, and photographing it;
a second module, configured to perform recognition processing on the image data to obtain a first feature point of the extracted object;
a third module, configured to acquire a second feature point of reference image data;
a fourth module, configured to determine a pixel difference according to the first feature point and the second feature point, and determine a first coordinate offset of the tip of the first tool according to the pixel difference and a preset conversion rule, wherein the pixel difference represents the difference between the pixel coordinates of the first feature point and the pixel coordinates of the second feature point;
a fifth module, configured to acquire a first tool length, a first tool angle, a first body arm length, and a second body arm length of the end assembly;
a sixth module, configured to determine a tip coordinate of the tip of the first tool according to the first tool length, the first tool angle, the first body arm length, and the second body arm length, wherein the first tool angle represents the included angle between the first tool and the second body arm;
a seventh module, configured to acquire target data of a target point, wherein the target point represents a hole position with which the extracted object at the tip of the first tool is to be aligned after moving;
an eighth module, configured to acquire a center coordinate of the joint center point between the first body arm and the second body arm, and to determine a first target coordinate according to the tip coordinate, the center coordinate, the target data, and the first coordinate offset, wherein the first target coordinate represents the coordinate of the tip of the first tool when the extracted object is aligned with the target point;
and a ninth module, configured to move the tip of the first tool to the first target coordinate.
9. An apparatus for automatically calibrating coordinates of a robotic end assembly, comprising:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method of any of claims 1-7.
10. A computer-readable storage medium storing a processor-executable program, wherein the processor-executable program, when executed by a processor, is configured to perform the method according to any one of claims 1 to 7.
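
The following Python sketch is offered for orientation only and is not the patented implementation. Under stated assumptions, it illustrates the kind of computations recited in claims 1 to 5: scaling a pixel difference into a robot-frame offset (claim 2), obtaining a planar tool-tip coordinate from the arm and tool lengths (claim 1), and solving the two-link geometry formed by the first body arm and the virtual tool second arm with the law of cosines (claims 3 to 5). The function names, the planar SCARA-style model with the robot base at the origin, and the joint-angle inputs q1 and q2 are all assumptions introduced here for illustration.

import math

def pixel_offset_to_robot_offset(first_px, reference_px, mm_per_pixel):
    # Claim 2 idea: pixel difference between the captured feature point and the
    # reference feature point, scaled by calibrated pixel-to-robot scale data.
    dx = (first_px[0] - reference_px[0]) * mm_per_pixel
    dy = (first_px[1] - reference_px[1]) * mm_per_pixel
    return dx, dy

def tool_tip_coordinate(body_arm1_len, body_arm2_len, tool_len, q1, q2, tool_angle):
    # Claim 1 idea: planar forward kinematics of the first tool tip from the two
    # body arm lengths, the tool length, and the fixed tool angle relative to the
    # second body arm. The joint angles q1 and q2 (radians) are assumed inputs,
    # since the claim lists only the lengths and the tool angle.
    a = q1 + q2
    x = (body_arm1_len * math.cos(q1) + body_arm2_len * math.cos(a)
         + tool_len * math.cos(a + tool_angle))
    y = (body_arm1_len * math.sin(q1) + body_arm2_len * math.sin(a)
         + tool_len * math.sin(a + tool_angle))
    return x, y

def joint_angles_for_target(tip_xy, joint_center_xy, target_xy, body_arm1_len):
    # Claims 3 to 5 idea: treat the straight line from the arm-1/arm-2 joint
    # centre to the tool tip as a virtual "tool second arm", then solve the
    # resulting two-link problem with the law of cosines so the tip reaches
    # the target hole (robot base assumed at the origin, elbow-down solution).
    tool_arm2_len = math.dist(joint_center_xy, tip_xy)
    tx, ty = target_xy
    d_sq = tx * tx + ty * ty
    cos_rel = (d_sq - body_arm1_len ** 2 - tool_arm2_len ** 2) / (
        2.0 * body_arm1_len * tool_arm2_len)
    # Relative joint angle of the tool second arm with respect to the first body arm.
    rel = math.acos(max(-1.0, min(1.0, cos_rel)))
    # Joint angle of the first body arm, measured from the base x-axis.
    base = math.atan2(ty, tx) - math.atan2(
        tool_arm2_len * math.sin(rel),
        body_arm1_len + tool_arm2_len * math.cos(rel))
    return base, rel

The first C-axis angle recited in claim 5 would then be some function of these joint angles, for example of how far the first body arm rotates from its pre-movement pose; that final mapping depends on details not reproduced in this sketch.
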
CN202311149043.1A 2023-09-06 2023-09-06 Method, system, device and medium for automatically calibrating coordinates of end assembly of robot Pending CN117260712A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311149043.1A CN117260712A (en) 2023-09-06 2023-09-06 Method, system, device and medium for automatically calibrating coordinates of end assembly of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311149043.1A CN117260712A (en) 2023-09-06 2023-09-06 Method, system, device and medium for automatically calibrating coordinates of end assembly of robot

Publications (1)

Publication Number Publication Date
CN117260712A true CN117260712A (en) 2023-12-22

Family

ID=89220574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311149043.1A Pending CN117260712A (en) 2023-09-06 2023-09-06 Method, system, device and medium for automatically calibrating coordinates of end assembly of robot

Country Status (1)

Country Link
CN (1) CN117260712A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118101905A * 2024-04-29 2024-05-28 浙江中煤液压机械有限公司 Automatic machine following method based on image recognition

Similar Documents

Publication Publication Date Title
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN110238849B (en) Robot hand-eye calibration method and device
CN106426172B (en) A kind of scaling method and system of industrial robot tool coordinates system
JP3946711B2 (en) Robot system
CN110125926B (en) Automatic workpiece picking and placing method and system
CN112894823B (en) Robot high-precision assembling method based on visual servo
EP0489919A1 (en) Calibration system of visual sensor
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN110842919B (en) Visual guide method for screwing of robot
CN117260712A (en) Method, system, device and medium for automatically calibrating coordinates of end assembly of robot
CN112720458B (en) System and method for online real-time correction of robot tool coordinate system
CN112907682B (en) Hand-eye calibration method and device for five-axis motion platform and related equipment
CN111791226A (en) Method and device for realizing assembly through robot and robot
US7957834B2 (en) Method for calculating rotation center point and axis of rotation, method for generating program, method for moving manipulator and positioning device, and robotic system
CN114663500A (en) Vision calibration method, computer device and storage medium
CN115008477A (en) Manipulator movement compensation method, manipulator movement compensation device and computer-readable storage medium
CN113393534A (en) Product laminating method, device, equipment and system
CN111383283B (en) Calibration method and system for tool coordinate system of robot
CN111699445B (en) Robot kinematics model optimization method and system and storage device
CN112631200A (en) Machine tool axis measuring method and device
CN110815177A (en) Migration method for 2D visual guidance teaching of composite robot
CN114178832B (en) Robot guide assembly robot method based on vision
CN115847426A (en) Robot motion control method, device, electronic equipment and storage medium
CN115609586A (en) Robot high-precision assembling method based on grabbing pose constraint
CN214323368U (en) Visual positioning and hand-eye calibration device for loading and unloading samples of mechanical arm of feeding turntable

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination