CN114918926B - Mechanical arm visual registration method and device, control terminal and storage medium - Google Patents

Mechanical arm visual registration method and device, control terminal and storage medium

Info

Publication number
CN114918926B
CN114918926B (application CN202210859860.5A)
Authority
CN
China
Prior art keywords
mechanical arm
conversion matrix
offset
matrix
conversion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210859860.5A
Other languages
Chinese (zh)
Other versions
CN114918926A (en)
Inventor
陈鹏
黄志俊
刘金勇
钱坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lancet Robotics Co Ltd
Original Assignee
Lancet Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lancet Robotics Co Ltd filed Critical Lancet Robotics Co Ltd
Priority to CN202210859860.5A priority Critical patent/CN114918926B/en
Publication of CN114918926A publication Critical patent/CN114918926A/en
Application granted granted Critical
Publication of CN114918926B publication Critical patent/CN114918926B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/1607Calculation of inertia, jacobian matrixes and inverses
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the invention discloses a mechanical arm visual registration method, a device, a control terminal and a storage medium. The method comprises the following steps: acquiring tooling data of the end tool of the mechanical arm, and calculating the offset between the end tool and the end flange of the mechanical arm according to the tooling data; acquiring, through a binocular vision sensor, the mark coordinates of mark points arranged on the mechanical arm; obtaining a first conversion matrix from the base of the mechanical arm to the end flange; establishing a conversion equation from the mark coordinates and the first conversion matrix, and obtaining from it a second conversion matrix from the coordinate system of the binocular vision sensor to the base; and registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain the end tool coordinate system of the mechanical arm. The mechanical arm can thus be accurately calibrated with a single binocular vision sensor and then registered.

Description

Mechanical arm visual registration method and device, control terminal and storage medium
Technical Field
The invention relates to the field of mechanical arm control, in particular to a mechanical arm visual registration method, a device, a control terminal and a storage medium.
Background
In procedures such as those performed by a knee-joint surgical robot, the robotic surgery system requires the mechanical arm to adjust among five postures relative to the bone (including the anterior condyle, anterior chamfer, posterior chamfer and distal cuts) to perform planar cutting. The posture of the mechanical arm changes greatly during cutting; if the end TCP (tool center point) of the mechanical arm is worn, or the optical binocular positioning instrument drifts during the procedure, precision is easily lost in the teaching process. If the relationship between the end flange of the mechanical arm and the drilling point is inaccurate, or the binocular optical positioner produces errors under the different postures of these steps, the final registration result of the mechanical arm is severely disturbed.
Disclosure of Invention
In a first aspect, the present application provides a method for visual registration of a robot arm, including:
acquiring tooling data of a tail end tool of the mechanical arm, and calculating the offset of the tail end tool and a tail end flange of the mechanical arm according to the tooling data;
acquiring mark coordinates of mark points arranged on the mechanical arm through a binocular vision sensor;
obtaining a first transformation matrix of a base of the robotic arm to the end flange;
establishing a conversion equation according to the mark coordinates and the first conversion matrix, and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation;
and registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain a terminal tool coordinate system of the mechanical arm.
Further, obtaining a second transformation matrix from the coordinate system of the binocular vision sensor to the base according to the transformation equation includes:
changing the posture of the mechanical arm, collecting observation coordinates of the mark points from the mechanical arm through the binocular vision sensor, collecting a preset number of groups of observation coordinates with every two observation coordinates forming one group, and calculating the rotation component of the second conversion matrix according to each collected group of observation coordinates and the conversion equation;
determining the offset of the marking point relative to the tail end flange according to the position information of the marking point on the mechanical arm and the tooling information of the marking point;
substituting the offset of the mark point relative to the end flange and the rotation component into the conversion equation to obtain a translation component;
and obtaining the second conversion matrix according to the rotation component and the translation component.
Further, registering the robot arm to obtain an end-of-arm-tool coordinate system comprises:
obtaining a third conversion matrix from the binocular vision sensor to the end flange of the mechanical arm according to the second conversion matrix and the first conversion matrix;
obtaining a fourth conversion matrix from the mark point to the end tool according to the offset of the mark point relative to the end flange and the offset of the end tool and the end flange;
and obtaining a terminal tool coordinate system of the mechanical arm according to the third conversion matrix and the fourth conversion matrix.
Further, each set of the observation coordinates is on a different plane and within the visual range of the binocular vision sensor;
the preset number of groups is at least 5 groups.
Further, the expression of the fourth transformation matrix is:

$${}^{P}T_{D}=\begin{bmatrix} I_{3} & d_{D}-d_{P}\\ 0 & 1 \end{bmatrix}$$

where $d_{D}$ is the offset of the end tool relative to the end flange of the mechanical arm, $d_{P}$ is the offset of the marking point relative to the end flange of the mechanical arm, and ${}^{P}T_{D}$ is the fourth transformation matrix.
Further, obtaining a first transformation matrix from a base of the robot arm to an end of the robot arm comprises:
according to the forward kinematics of the robot, obtaining the first conversion matrix through the DH (Denavit-Hartenberg) conversion matrices between every two adjacent motors of the mechanical arm.
Further, the expression of the conversion equation is as follows:

$$P_{m}={}^{C}T_{B}\;{}^{B}T_{E}\;P_{p}$$

where $P_{m}=[x_{m}\ y_{m}\ z_{m}\ 1]^{T}$ is the transposed (homogeneous column) form of the mark coordinates, ${}^{B}T_{E}$ is the first conversion matrix, ${}^{C}T_{B}$ is the second conversion matrix, and $P_{p}=[x_{p}\ y_{p}\ z_{p}\ 1]^{T}$ is the transposed form of the offset of the mark point relative to the end flange.
In a second aspect, the present application further provides a robot arm visual registration apparatus, including:
the offset calculation module is used for acquiring tooling data of a tail end tool of the mechanical arm and calculating the offset of the tail end tool and a tail end flange of the mechanical arm according to the tooling data;
the mark acquisition module is used for acquiring mark coordinates of mark points arranged on the mechanical arm through a binocular vision sensor;
a calculation module for obtaining a first transformation matrix from a base of the robotic arm to the end flange;
the solving module is used for establishing a conversion equation according to the mark coordinates and the first conversion matrix and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation;
and the registering module is used for registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain a terminal tool coordinate system of the mechanical arm.
In a third aspect, the present application further provides a control terminal, which includes a processor and a memory, where the memory stores a computer program, and the computer program executes the robot arm visual registration method when running on the processor.
In a fourth aspect, the present application further provides a readable storage medium storing a computer program which, when run on a processor, performs the robotic arm visual registration method.
The embodiment of the invention discloses a mechanical arm visual registration method, a device, a control terminal and a storage medium. The method comprises the following steps: acquiring tooling data of the end tool of the mechanical arm, and calculating the offset between the end tool and the end flange of the mechanical arm according to the tooling data; acquiring, through a binocular vision sensor, the mark coordinates of mark points arranged on the mechanical arm; obtaining a first conversion matrix from the base to the end flange of the mechanical arm; establishing a conversion equation from the mark coordinates and the first conversion matrix, and calculating from it a second conversion matrix from the coordinate system of the binocular vision sensor to the base; and registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain the end tool coordinate system of the mechanical arm. The mechanical arm can be accurately calibrated with a single binocular vision sensor and then registered. The registered end tool coordinate system contains both a translation variable and a rotation variable, so that the mechanical arm retains a certain adaptability to error when changing among its various postures.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings required in the embodiments will be briefly described below, and it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope of the present invention. Like components are numbered similarly in the various figures.
FIG. 1 illustrates a surgical scene schematic diagram according to an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating a robot visual registration method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a robot arm according to an embodiment of the present disclosure;
fig. 4 shows a schematic structural diagram of a robot visual registration apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Hereinafter, the terms "including", "having", and their derivatives, as used in the various embodiments of the present invention, are intended to indicate only the specific features, numerals, steps, operations, elements, components, or combinations thereof that are mentioned, and should not be construed as excluding the presence or addition of one or more other features, numerals, steps, operations, elements, components, or combinations thereof.
Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another, and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which various embodiments of the present invention belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments of the present invention.
The technical scheme of the present application is applied to the registration of a surgical mechanical arm, and the registration method is applied to a surgical scene equipped with a binocular vision sensor; fig. 1 is a schematic view of such a surgical scene.
The surgical scene includes a binocular vision sensor 100 and a mechanical arm 200, and the visual range of the binocular vision sensor 100 covers the mechanical arm 200 and the area to be operated on.
An end tool 220 for the operation is mounted at the end of the mechanical arm 200, and a marker 210 is mounted near the end of the mechanical arm. The marker 210 may be a reflective ball or a custom marking device that allows the binocular vision sensor 100 to determine the position of the mechanical arm 200 by acquiring the coordinates of the marker 210. The binocular vision sensor 100 may be a binocular camera; by transmitting the captured image data to the control terminal, the control terminal can control the mechanical arm 200 to operate accurately, or register the mechanical arm before the operation.
The coordinates of the mark points 210 acquired by the binocular vision sensor 100, the tooling data of the end tool 220, and the position data of the mark points 210 are used to register the mechanical arm 200, so as to obtain the required end tool coordinate system.
The technical solution of the present application will be described with specific examples.
Example 1
As shown in fig. 2, the mechanical arm visual registration method of the present application includes the following steps:
and S100, acquiring tooling data of a tail end tool of the mechanical arm, and calculating the offset of the tail end tool and a tail end flange of the mechanical arm according to the tooling data.
The type and model of the tool currently fitted to the mechanical arm are known, and the tool has specific tooling data, namely its length, thickness, degree of bending and the like. The origin of the end tool coordinate system to be calculated in the present application is located at the tip of the end tool, away from the mechanical arm, and therefore the offset between the end tool and the end flange of the mechanical arm needs to be calculated from these data.
The end flange of the mechanical arm is the center position at which the end of the mechanical arm mates with the end tool. Since the joint between the end tool and the mechanical arm can be taken as this center, the required quantity may equivalently be regarded as the offset from the joint between the end tool and the end of the mechanical arm.
For example, when the end tool is a drill, the drill is known to be substantially straight, so the offset can be taken as the length of the drill; when the end tool is a pair of forceps, the curvature and shape of the forceps head must be taken into account, so that the offset is obtained in a targeted manner.
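As a minimal sketch of this lookup (not part of the patent; the tool names, dictionary keys and dimensions below are hypothetical), the offset could be derived from tooling data roughly as follows:

```python
import numpy as np

def tool_offset(tool_type: str, tooling: dict) -> np.ndarray:
    """Offset of the tool tip relative to the end flange, from tooling data.

    The keys and tool types used here are illustrative placeholders.
    """
    if tool_type == "drill":
        # A straight drill: the offset is simply its length along the flange z-axis.
        return np.array([0.0, 0.0, tooling["length"]])
    if tool_type == "forceps":
        # A bent tool: the curvature/shape of the head contributes lateral components.
        return np.array([tooling["head_dx"], tooling["head_dy"], tooling["shaft_length"]])
    raise ValueError(f"unknown tool type: {tool_type}")

# Hypothetical 120 mm drill
print(tool_offset("drill", {"length": 120.0}))   # [  0.   0. 120.]
```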
And S200, acquiring the mark coordinates of the mark points arranged on the mechanical arm through a binocular vision sensor.
The binocular vision sensor can capture images of the mechanical arm and the mark point on it, so that the mark coordinates of the mark point in the coordinate system of the binocular vision sensor are obtained; and because the position of the mark point on the arm is known, the offset of the mark point from the end flange can also be obtained.
Step S300, obtaining a first transformation matrix from the base of the robot arm to the end flange.
According to the forward kinematics of the robot and the DH (Denavit-Hartenberg) transformation matrices, the conversion relationship from the base of the mechanical arm to its end flange can be obtained. For example, for a 6-axis mechanical arm:

$${}^{B}T_{E}=A_{1}A_{2}A_{3}A_{4}A_{5}A_{6}$$

where $A_{1}$ to $A_{6}$ are the DH transformation matrices between the adjacent axes of the robot, B denotes the coordinate system of the base of the mechanical arm, E denotes the coordinate system of the end flange, and ${}^{B}T_{E}$ is the conversion from the base of the mechanical arm to the end flange, i.e. the first conversion matrix.
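A minimal numpy sketch of this chaining is given below; it assumes the standard DH convention (the patent does not state which DH variant is used), and the parameter values passed in would come from the arm's own calibration data.

```python
import numpy as np

def dh_matrix(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """Standard Denavit-Hartenberg transform A_i between two adjacent joints."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def base_to_flange(joint_angles, dh_params) -> np.ndarray:
    """First conversion matrix B_T_E = A1 A2 ... A6 for a 6-axis arm.

    dh_params is a list of (d, a, alpha) per joint; joint_angles are the
    current joint readings theta_1 ... theta_6.
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_matrix(theta, d, a, alpha)
    return T
```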
And S400, establishing a conversion equation according to the mark coordinate and the first conversion matrix, and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation.
In the surgical scene of the present application, as long as the corresponding conversion matrices between the coordinate systems can be obtained, coordinates can be converted from one coordinate system to another. For this purpose, a conversion equation can be established from the conversion relationships; its expression is as follows:

$$P_{m}={}^{C}T_{B}\;{}^{B}T_{E}\;P_{p}\qquad(1)$$

where $P_{m}=[x_{m}\ y_{m}\ z_{m}\ 1]^{T}$ is the transposed (homogeneous column) form of the mark coordinates observed in the coordinate system C of the binocular vision sensor, ${}^{B}T_{E}$ is the first conversion matrix, ${}^{C}T_{B}$ is the second conversion matrix relating the coordinate system of the binocular vision sensor to the base, and $P_{p}=[x_{p}\ y_{p}\ z_{p}\ 1]^{T}$ is the transposed form of the offset of the mark point relative to the end flange.
The equation means that the coordinates of the mark point, expressed in the binocular vision sensor coordinate system, can be calculated from the first conversion matrix, the second conversion matrix and the offset of the mark point relative to the end flange. In connection with the above steps, the second conversion matrix in this equation is unknown, while $P_{p}$ can be measured from the position of the marking point and its tooling parameters.

The first conversion matrix can be written in terms of its rotation and translation components as

$$\begin{bmatrix} R & T\\ 0 & 1 \end{bmatrix}$$

and the second conversion matrix likewise as

$$\begin{bmatrix} R_{0} & T_{0}\\ 0 & 1 \end{bmatrix}$$

where R denotes a rotation variable and T a translation variable. Equation (1) can therefore be rewritten as

$$P_{m}=\begin{bmatrix} R_{0} & T_{0}\\ 0 & 1 \end{bmatrix}\begin{bmatrix} R & T\\ 0 & 1 \end{bmatrix}P_{p}\qquad(2)$$

or, in 3-vector form, $p_{m}=R_{0}\,(R\,p+T)+T_{0}$, where $p$ is the offset of the mark point relative to the end flange and $p_{m}$ the observed mark coordinate. In this formula $R_{0}$ and $T_{0}$ are unknown, so solving for the second conversion matrix amounts to solving for $R_{0}$ and $T_{0}$.
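For illustration only (assuming the homogeneous-matrix convention used above), the forward model of equation (2) can be written as a short numpy function; it predicts where the binocular vision sensor should observe the marker for a given second conversion matrix, which is useful for checking a calibration result:

```python
import numpy as np

def predict_marker(R0: np.ndarray, T0: np.ndarray,
                   T_BE: np.ndarray, p_offset: np.ndarray) -> np.ndarray:
    """Marker coordinate in the camera frame predicted by equation (2).

    R0, T0   : rotation / translation components of the second conversion matrix
    T_BE     : first conversion matrix (base -> end flange) for the current posture
    p_offset : offset of the mark point relative to the end flange (3-vector)
    """
    T_CB = np.eye(4)
    T_CB[:3, :3] = R0
    T_CB[:3, 3] = T0
    p_h = np.append(p_offset, 1.0)          # homogeneous form [x_p, y_p, z_p, 1]^T
    return (T_CB @ T_BE @ p_h)[:3]
```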
The posture of the mechanical arm is changed, and the observation coordinates of the mark point are collected from the mechanical arm by the binocular vision sensor; a preset number of groups of observation coordinates is collected, with every two observation coordinates forming one group, and the rotation component of the second conversion matrix is calculated from each collected group of observation coordinates and the conversion equation.
For example, let the observation coordinate acquired with the mechanical arm in a first posture be $p_{1}$ and the observation coordinate acquired in a second posture be $p_{2}$. Substituting each into formula (2) and subtracting, the unknown translation $T_{0}$ cancels, giving

$$p_{1}-p_{2}=R_{0}\left[(R_{1}\,p+T_{1})-(R_{2}\,p+T_{2})\right]$$

where $R_{1},T_{1}$ and $R_{2},T_{2}$ are the rotation and translation components of the first conversion matrix when the mechanical arm is in the first and the second posture respectively. Each group of observation coordinates needs to be within the visual range of the binocular vision sensor, and the points of the different groups must not lie on the same plane.
By analogy, with the position of the binocular vision sensor kept fixed, the coordinates of the mark point under different postures of the mechanical arm can be acquired by changing the posture of the arm. Changing the posture changes only the mark coordinates and the parameters of the first conversion matrix, while the remaining variables are unchanged, so a plurality of equations of the above form are obtained, each of the form $b_{i}=R_{0}\,a_{i}$, where $a_{i}$ is the difference of the flange-side terms computed from the first conversion matrix and the offset, and $b_{i}$ is the corresponding difference of observed mark coordinates. Collecting the pairs into the matrix $H=\sum_{i}a_{i}\,b_{i}^{T}$ and taking its singular value decomposition (SVD) $H=U\Sigma V^{T}$, the rotation component satisfies the matrix equation

$$R_{0}=V\,U^{T}\qquad(3)$$

where $V$ is the right singular matrix of $H$ and $U$ is the left singular matrix of $H$. The multiple measurements are thus converted into the form $H=U\Sigma V^{T}$, and the singular value decomposition then yields $R_{0}$.
In this embodiment, for accuracy, at least 5 groups of the above observation coordinates may be collected, which also ensures sufficient variation among the coordinates captured by the binocular vision sensor over the various postures of the mechanical arm. Calculating $R_{0}$ on this basis allows the rotation error to be carried into the second conversion matrix and ensures the accuracy of the second conversion matrix calculated later.
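The rotation solve described above is, in effect, an orthogonal Procrustes problem; a minimal numpy sketch (assuming the pose pairs are formed from consecutive observations, which is one reasonable reading of the grouping described here) is:

```python
import numpy as np

def solve_rotation(flange_points: np.ndarray, camera_points: np.ndarray) -> np.ndarray:
    """Estimate R0 from paired marker observations across several arm postures.

    flange_points[i] = R_i @ p + T_i, predicted from the first conversion matrix
    and the marker offset for posture i; camera_points[i] is the coordinate the
    binocular vision sensor observed in the same posture.  Differencing the
    pairs cancels T0, and R0 follows from the SVD of the cross-covariance.
    """
    A = np.diff(np.asarray(flange_points), axis=0)   # a_i: differences on the arm side
    B = np.diff(np.asarray(camera_points), axis=0)   # b_i: differences on the camera side
    H = A.T @ B                                      # H = sum_i a_i b_i^T
    U, _, Vt = np.linalg.svd(H)
    R0 = Vt.T @ U.T                                  # equation (3): R0 = V U^T
    if np.linalg.det(R0) < 0:                        # guard against a reflection solution
        Vt[-1, :] *= -1
        R0 = Vt.T @ U.T
    return R0
```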
As shown in fig. 3, which is a partial structural diagram of the end of the mechanical arm, point P represents the position of the marking point 210 and point D represents the position of the tip of the end tool 220. The position of point P is known and fixed, and the offset between point P and the end flange remains constant no matter how the posture of the mechanical arm changes. The offset can therefore be obtained by direct measurement or calculated from the tooling data.
And substituting the offset of the mark point relative to the end flange and the rotation component into the conversion equation to obtain a translation component.
Therefore, after $R_{0}$ is determined, it is substituted together with the offset into equation (1); the only remaining unknown is $T_{0}$, and solving the equation directly gives $T_{0}$, for example as $T_{0}=p_{i}-R_{0}\,(R_{i}\,p+T_{i})$ for any collected posture $i$. The second conversion matrix is thus obtained.
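Continuing the same sketch (same assumptions as above, averaging over all postures rather than using a single one), the translation component and the assembled second conversion matrix could be computed as:

```python
import numpy as np

def solve_translation(R0: np.ndarray,
                      flange_points: np.ndarray,
                      camera_points: np.ndarray) -> np.ndarray:
    """With R0 fixed, equation (2) is linear in T0; average the per-posture estimates."""
    A = np.asarray(flange_points)          # R_i @ p + T_i for each posture
    B = np.asarray(camera_points)          # observed marker coordinates
    return np.mean(B - A @ R0.T, axis=0)   # T0 = p_i - R0 (R_i p + T_i)

def second_conversion_matrix(R0: np.ndarray, T0: np.ndarray) -> np.ndarray:
    """Assemble the homogeneous second conversion matrix from R0 and T0."""
    T_CB = np.eye(4)
    T_CB[:3, :3] = R0
    T_CB[:3, 3] = T0
    return T_CB
```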
Step S500, registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain a terminal tool coordinate system of the mechanical arm.
According to the definitions of the first conversion matrix and the second conversion matrix, a third conversion matrix from the binocular vision sensor to the end flange of the mechanical arm can be obtained from the second conversion matrix and the first conversion matrix.
The computational expression of the third conversion matrix is:

$${}^{C}T_{E}={}^{C}T_{B}\;{}^{B}T_{E}\qquad(4)$$

where ${}^{C}T_{E}$ is the third conversion matrix. Through the third conversion matrix, coordinates captured in images by the binocular vision sensor can be related to coordinates in the end coordinate system of the mechanical arm.
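As a small illustration under the same conventions as the earlier snippets, the composition in equation (4) and its use for mapping a flange-frame point can be sketched as:

```python
import numpy as np

def third_conversion_matrix(T_CB: np.ndarray, T_BE: np.ndarray) -> np.ndarray:
    """Equation (4): compose the second and first conversion matrices."""
    return T_CB @ T_BE

def flange_point_in_camera(T_CE: np.ndarray, point_in_flange: np.ndarray) -> np.ndarray:
    """Express a point given in the end-flange frame in the camera frame;
    use np.linalg.inv(T_CE) to map observed camera coordinates the other way."""
    return (T_CE @ np.append(point_in_flange, 1.0))[:3]
```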
And obtaining a fourth conversion matrix from the mark point to the end tool according to the offset of the mark point relative to the end flange and the offset of the end tool to the end flange.
The fourth conversion matrix is a conversion matrix from the mark point to the end tool, and since the offset from the mark point to the end flange and the offset from the end tool to the end flange can be measured and both are offsets for the end flange, the corresponding fourth conversion matrix can be calculated by the tooling data for the mark point and the tooling data for the end tool.
The expression of the fourth conversion matrix is:

$${}^{P}T_{D}=\begin{bmatrix} I_{3} & d_{D}-d_{P}\\ 0 & 1 \end{bmatrix}\qquad(5)$$

where $d_{D}$ is the offset of the end tool relative to the end flange of the mechanical arm, $d_{P}$ is the offset of the marking point relative to the end flange of the mechanical arm, and ${}^{P}T_{D}$ is the fourth conversion matrix.
The fourth transformation matrix here is used to derive the coordinates of the end tool tip.
And obtaining the terminal tool coordinate system of the mechanical arm according to the third conversion matrix and the fourth conversion matrix.
After the third conversion matrix is obtained, the coordinates of the marking point acquired by the binocular vision sensor can be converted from the coordinate system of the binocular vision sensor into the coordinate system at the end of the mechanical arm; after the fourth conversion matrix is obtained, they can further be converted into the coordinate system of the end tool. Meanwhile, the coordinates of the tip of the end tool can be obtained from the fourth conversion matrix, so that the spatial position of the end tool is determined; the end tool coordinate system can then be established and the corresponding registration carried out.
It can be understood that after the coordinates of the tip of the end tool are obtained, the coordinate system of the end tool is established according to the coordinates of the tip of the end tool, and the conversion between the coordinate system of the end tool and the coordinate system of the binocular vision sensor can be realized through the third conversion matrix and the fourth conversion matrix.
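A short sketch of this final chaining, under the same assumptions as the earlier snippets (in particular, that both offsets are expressed in the end-flange frame so that the fourth matrix reduces to a translation), is:

```python
import numpy as np

def fourth_conversion_matrix(tool_offset: np.ndarray, marker_offset: np.ndarray) -> np.ndarray:
    """Equation (5): translation from the marker point P to the tool tip D,
    both offsets being given relative to the end flange."""
    T_PD = np.eye(4)
    T_PD[:3, 3] = np.asarray(tool_offset) - np.asarray(marker_offset)
    return T_PD

def tool_tip_in_camera(T_CE: np.ndarray, marker_offset: np.ndarray,
                       T_PD: np.ndarray) -> np.ndarray:
    """Locate the tool tip in the camera frame: take the marker offset in the
    flange frame, apply the P -> D translation, then map through T_CE."""
    tip_in_flange = np.append(np.asarray(marker_offset, dtype=float), 1.0)
    tip_in_flange[:3] += T_PD[:3, 3]        # shift from the marker point to the tool tip
    return (T_CE @ tip_in_flange)[:3]
```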
According to the mechanical arm visual registration method of the present application, marking points are arranged on the mechanical arm, and the coordinates of the marking points in different postures are collected and used in the calculation, so that the finally calculated conversion matrix accounts for both a translation error and a rotation error, and the obtained coordinate conversion matrix and end tool coordinate system retain a certain adaptability in each posture. The mechanical arm can thus still be positioned accurately under different postures, and the errors caused by an inaccurate relationship between the end flange and the drilling point, or introduced by the binocular optical positioner under the different postures of these steps, are reduced, so that the surgical process is more accurate and safer.
Example 2
In a second aspect, as shown in fig. 4, the present application further provides a robot visual registration apparatus, including:
the offset calculation module 10 is configured to acquire tool data of a terminal tool of the mechanical arm, and calculate an offset between the terminal tool and a terminal flange of the mechanical arm according to the tool data;
the mark acquisition module 20 is used for acquiring mark coordinates of mark points arranged on the mechanical arm through a binocular vision sensor;
a calculation module 30 for obtaining a first transformation matrix of the base of the robot arm to the end flange;
the solving module 40 is used for establishing a conversion equation according to the mark coordinates and the first conversion matrix and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation;
and the registering module 50 is configured to register the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain an end tool coordinate system of the mechanical arm.
In a third aspect, the present application further provides a control terminal, including a processor and a memory, where the memory stores a computer program, and the computer program, when executed on the processor, executes the robot arm visual registration method.
In a fourth aspect, the present application further provides a readable storage medium storing a computer program, which when executed on a processor performs the robot arm vision registration method.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module or unit in each embodiment of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention or a part of the technical solution that contributes to the prior art in essence can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a smart phone, a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (8)

1. A mechanical arm visual registration method is characterized by comprising the following steps:
acquiring tooling data of a tail end tool of the mechanical arm, and calculating the offset of the tail end tool and a tail end flange of the mechanical arm according to the tooling data;
acquiring mark coordinates of mark points arranged on the mechanical arm through a binocular vision sensor;
obtaining a first transformation matrix of a base of the robotic arm to the end flange;
establishing a conversion equation according to the mark coordinate and the first conversion matrix, and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation;
registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain a terminal tool coordinate system of the mechanical arm;
wherein the obtaining a second transformation matrix from the coordinate system of the binocular vision sensor to the base according to the transformation equation comprises:
changing the posture of the mechanical arm, collecting observation coordinates of the mark points from the mechanical arm through the binocular vision sensor, collecting a preset number of groups of observation coordinates when every two observation coordinates are in one group, and calculating the rotation component of the second conversion matrix according to the collected observation coordinates of each group and the conversion equation;
determining the offset of the marking point relative to the tail end flange according to the position information of the marking point on the mechanical arm and the tooling information of the marking point;
substituting the offset of the mark point relative to the end flange and the rotation component into the conversion equation to obtain a translation component;
obtaining the second conversion matrix according to the rotation component and the translation component;
wherein the registering the robot arm to obtain the terminal tool coordinate system of the robot arm comprises:
obtaining a third conversion matrix from the binocular vision sensor to a tail end flange of the mechanical arm according to the second conversion matrix and the first conversion matrix;
obtaining a fourth conversion matrix from the mark point to the end tool according to the offset of the mark point relative to the end flange and the offset of the end tool and the end flange;
and obtaining a terminal tool coordinate system of the mechanical arm according to the third conversion matrix and the fourth conversion matrix.
2. The robotic arm vision registration method of claim 1, wherein each set of the observation coordinates is on a different plane and within a vision range of the binocular vision sensor;
the preset number of groups is at least 5 groups.
3. The robotic arm visual registration method of claim 1, wherein the fourth transformation matrix is expressed by:

$${}^{P}T_{D}=\begin{bmatrix} I_{3} & d_{D}-d_{P}\\ 0 & 1 \end{bmatrix}$$

where $d_{D}$ is the offset of the end tool relative to the end flange of the mechanical arm, $d_{P}$ is the offset of the marking point relative to the end flange of the mechanical arm, and ${}^{P}T_{D}$ is the fourth transformation matrix.
4. The robotic arm visual registration method of claim 1, wherein obtaining a first transformation matrix of a base of the robotic arm to a tip of the robotic arm comprises:
and according to the positive kinematics of the robot, obtaining the first conversion matrix through a DH conversion matrix between every two adjacent motors of the mechanical arm.
5. The robotic arm vision registration method of claim 1, wherein the conversion equation is expressed as:

$$P_{m}={}^{C}T_{B}\;{}^{B}T_{E}\;P_{p}$$

where $P_{m}=[x_{m}\ y_{m}\ z_{m}\ 1]^{T}$ is the transposed (homogeneous column) form of the mark coordinates, ${}^{B}T_{E}$ is the first conversion matrix, ${}^{C}T_{B}$ is the second conversion matrix, and $P_{p}=[x_{p}\ y_{p}\ z_{p}\ 1]^{T}$ is the transposed form of the offset of the mark point relative to the end flange.
6. A robot visual registration apparatus, comprising:
the offset calculation module is used for acquiring tooling data of a tail end tool of the mechanical arm and calculating the offset of the tail end tool and a tail end flange of the mechanical arm according to the tooling data;
the mark acquisition module is used for acquiring mark coordinates of mark points arranged on the mechanical arm through a binocular vision sensor;
a calculation module for obtaining a first transformation matrix from a base of the robotic arm to the end flange;
the solving module is used for establishing a conversion equation according to the mark coordinates and the first conversion matrix and obtaining a second conversion matrix from the coordinate system of the binocular vision sensor to the base according to the conversion equation; obtaining a second transformation matrix from the coordinate system of the binocular vision sensor to the base according to the transformation equation comprises:
changing the posture of the mechanical arm, collecting observation coordinates of the mark points from the mechanical arm through the binocular vision sensor, collecting a preset number of groups of observation coordinates when every two observation coordinates are in one group, and calculating the rotation component of the second conversion matrix according to the collected observation coordinates of each group and the conversion equation;
determining the offset of the marking point relative to the end flange according to the position information of the marking point on the mechanical arm and the tooling information of the marking point;
substituting the offset of the mark point relative to the end flange and the rotation component into the conversion equation to obtain a translation component;
obtaining the second conversion matrix according to the rotation component and the translation component;
registering the robotic arm to obtain an end-of-tool coordinate system of the robotic arm comprises:
obtaining a third conversion matrix from the binocular vision sensor to a tail end flange of the mechanical arm according to the second conversion matrix and the first conversion matrix;
obtaining a fourth conversion matrix from the mark point to the end tool according to the offset of the mark point relative to the end flange and the offset of the end tool to the end flange;
obtaining a terminal tool coordinate system of the mechanical arm according to the third conversion matrix and the fourth conversion matrix;
and the registering module is used for registering the mechanical arm according to the second conversion matrix, the first conversion matrix and the offset to obtain a terminal tool coordinate system of the mechanical arm.
7. A control terminal, characterized by comprising a processor and a memory, the memory storing a computer program which, when run on the processor, performs the robotic arm visual registration method of any one of claims 1 to 5.
8. A readable storage medium, characterized in that it stores a computer program which, when run on a processor, performs the robot arm visual registration method of any of claims 1 to 5.
CN202210859860.5A 2022-07-22 2022-07-22 Mechanical arm visual registration method and device, control terminal and storage medium Active CN114918926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210859860.5A CN114918926B (en) 2022-07-22 2022-07-22 Mechanical arm visual registration method and device, control terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210859860.5A CN114918926B (en) 2022-07-22 2022-07-22 Mechanical arm visual registration method and device, control terminal and storage medium

Publications (2)

Publication Number Publication Date
CN114918926A CN114918926A (en) 2022-08-19
CN114918926B true CN114918926B (en) 2022-10-25

Family

ID=82815890

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210859860.5A Active CN114918926B (en) 2022-07-22 2022-07-22 Mechanical arm visual registration method and device, control terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114918926B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115227398B (en) * 2022-09-19 2023-03-03 杭州三坛医疗科技有限公司 Automatic positioning method and device for registration plate
CN116423505B (en) * 2023-03-30 2024-04-23 杭州邦杰星医疗科技有限公司 Error calibration method for mechanical arm registration module in mechanical arm navigation operation
CN116277035B (en) * 2023-05-15 2023-09-12 北京壹点灵动科技有限公司 Robot control method and device, processor and electronic equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010172986A (en) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd Robot vision system and automatic calibration method
CN106272437A (en) * 2016-10-12 2017-01-04 Jilin University A kind of device for determining the optimum visual field for parallel robot binocular visual positioning
CN107808401A (en) * 2017-10-30 2018-03-16 大族激光科技产业集团股份有限公司 The hand and eye calibrating method of the one camera of mechanical arm tail end
CN110634164A (en) * 2019-10-16 2019-12-31 易思维(杭州)科技有限公司 Quick calibration method for vision sensor
CN110834333A (en) * 2019-11-14 2020-02-25 中科新松有限公司 Robot hand-eye calibration method and storage medium
CN112140104A (en) * 2019-06-27 2020-12-29 发那科株式会社 Device and method for acquiring tool operation position offset
CN113180828A (en) * 2021-03-25 2021-07-30 北京航空航天大学 Operation robot constrained motion control method based on rotation theory
CN113246135A (en) * 2021-06-03 2021-08-13 季华实验室 Robot hand-eye calibration method and device, electronic equipment and storage medium
CN114750163A (en) * 2022-05-23 2022-07-15 杭州柳叶刀机器人有限公司 Robot end coordinate system switching method and device, robot and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080027580A1 (en) * 2006-07-28 2008-01-31 Hui Zhang Robot programming method and apparatus with both vision and force
JP5744587B2 (en) * 2011-03-24 2015-07-08 キヤノン株式会社 Robot control apparatus, robot control method, program, and recording medium
JP2014180720A (en) * 2013-03-19 2014-09-29 Yaskawa Electric Corp Robot system and calibration method
CN112330749B (en) * 2020-10-22 2024-07-05 深圳众为兴技术股份有限公司 Hand-eye calibration method and hand-eye calibration device with camera mounted on robot arm
CN112873213B (en) * 2021-03-02 2022-06-10 南京达风数控技术有限公司 Method for improving coordinate system calibration precision of six-joint robot tool

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010172986A (en) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd Robot vision system and automatic calibration method
CN106272437A (en) * 2016-10-12 2017-01-04 Jilin University A kind of device for determining the optimum visual field for parallel robot binocular visual positioning
CN107808401A (en) * 2017-10-30 2018-03-16 大族激光科技产业集团股份有限公司 The hand and eye calibrating method of the one camera of mechanical arm tail end
CN112140104A (en) * 2019-06-27 2020-12-29 发那科株式会社 Device and method for acquiring tool operation position offset
CN110634164A (en) * 2019-10-16 2019-12-31 易思维(杭州)科技有限公司 Quick calibration method for vision sensor
CN110834333A (en) * 2019-11-14 2020-02-25 中科新松有限公司 Robot hand-eye calibration method and storage medium
CN113180828A (en) * 2021-03-25 2021-07-30 北京航空航天大学 Operation robot constrained motion control method based on rotation theory
CN113246135A (en) * 2021-06-03 2021-08-13 季华实验室 Robot hand-eye calibration method and device, electronic equipment and storage medium
CN114750163A (en) * 2022-05-23 2022-07-15 杭州柳叶刀机器人有限公司 Robot end coordinate system switching method and device, robot and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Optimization of end tool parameters based on robot hand-eye calibration; Zhang Lilong et al.; Journal of Biomedical Engineering; 2017-04-25; Vol. 34, No. 02; pp. 271-277 *

Also Published As

Publication number Publication date
CN114918926A (en) 2022-08-19

Similar Documents

Publication Publication Date Title
CN114918926B (en) Mechanical arm visual registration method and device, control terminal and storage medium
CN109029257B (en) Large-scale workpiece pose measurement system and method based on stereoscopic vision and structured light vision
KR101645392B1 (en) Tracking system and tracking method using the tracking system
EP3157715B1 (en) Method for calibrating a robot and a robot system
CN103020952B (en) Messaging device and information processing method
CN110084854B (en) System and method for runtime determination of camera calibration errors
CN105278673B (en) The method that the part of object is measured for auxiliary operation person
CN114918928B (en) Method and device for accurately positioning surgical mechanical arm, control terminal and storage medium
CN110136204B (en) Sound film dome assembly system based on calibration of machine tool position of bilateral telecentric lens camera
CN112792814B (en) Mechanical arm zero calibration method based on visual marks
CN113768624B (en) Working face positioning control method, device, computer equipment and readable storage medium
US20220383547A1 (en) Hand-eye calibration of camera-guided apparatuses
CN114445506A (en) Camera calibration processing method, device, equipment and storage medium
CN115311371A (en) Calibration method for automatic measurement and marking system of double robots
CN114886567A (en) Method for calibrating hands and eyes of surgical robot with telecentric motionless point constraint
KR102451791B1 (en) System and method for estimating the position of object in image
CN107993227B (en) Method and device for acquiring hand-eye matrix of 3D laparoscope
CN116942314A (en) Positioning method and system for mixing optical positioning and mechanical positioning
CN116394254A (en) Zero calibration method and device for robot and computer storage medium
CN114209433B (en) Surgical robot navigation positioning device
CN113771096A (en) Method and device for processing pose information of mechanical arm
CN114343848A (en) Length measuring system of marking block and surgical robot system
CN109916352B (en) Method and device for acquiring TCP (Transmission control protocol) coordinates of robot
CN108592838B (en) Calibration method and device of tool coordinate system and computer storage medium
US20230252681A1 (en) Method of medical calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant