CN115708128A - Control method of operation arm and surgical robot system

Publication number: CN115708128A
Application number: CN202110946424.7A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 徐凯 (Xu Kai), 吴百波 (Wu Baibo), 王龙飞 (Wang Longfei), 刘旭 (Liu Xu)
Assignee (original and current): Shurui Shanghai Technology Co., Ltd.
Prior art keywords: pose, coordinate system, angle, identification, determining
Landscapes: Manipulator (AREA)
Abstract

The disclosure relates to the technical field of control, and discloses a control method of an operation arm, a computer device, a computer-readable storage medium, and a surgical robot system. The control method of an operation arm comprises: acquiring a positioning image; identifying, in the positioning image, a plurality of pose identifiers located on the operation arm; identifying an angle identifier located on the operation arm based on the plurality of pose identifiers, the angle identifier having a positional association with a first pose identifier of the plurality of pose identifiers; determining a current relative pose of the operation arm with respect to a reference coordinate system based on the angle identifier and the plurality of pose identifiers; and determining a driving signal of the operation arm based on the current relative pose and a target pose of the operation arm.

Description

Control method of operation arm and surgical robot system
Technical Field
The disclosure belongs to the technical field of control, and particularly relates to a control method of an operating arm and a surgical robot system.
Background
As technology develops, it is increasingly common for machine equipment, whether operated by humans or controlled by computers, to perform desired actions in place of or in assistance to human operators. For example, logistics robots are used to sort packages, and surgical robots are used to assist doctors in performing surgery.
In such applications, control of the operation arm is required in order to control the machine equipment.
Disclosure of Invention
In some embodiments, the present disclosure provides a control method of an operation arm, including:
acquiring a positioning image;
identifying, in the positioning image, a plurality of pose identifiers located on the operation arm;
identifying an angle identifier located on the operating arm based on the plurality of pose identifiers, the angle identifier having a position association relationship with a first pose identifier of the plurality of pose identifiers;
determining a current relative pose of the operating arm with respect to a reference coordinate system based on the angle identifier and the plurality of pose identifiers; and
determining a driving signal of the operation arm based on the current relative pose and a target pose of the operation arm.
In some embodiments, the present disclosure provides a computer device comprising:
a memory for storing at least one instruction; and
a processor, coupled to the memory, to execute the at least one instruction to perform the methods of the present disclosure.
In some embodiments, the present disclosure provides a computer-readable storage medium having at least one instruction stored therein, the at least one instruction being executable by a processor to cause a computer to perform the method of the present disclosure.
In some embodiments, the present disclosure provides a surgical robotic system comprising:
a surgical tool comprising an operation arm, an actuator disposed at the distal end of the operation arm, and at least one angle identifier and a plurality of pose identifiers disposed at the tip of the operation arm;
an image collector for collecting a positioning image of the operation arm; and
a processor, connected with the image collector, configured to perform the method of the present disclosure to determine the driving signal of the operation arm.
Drawings
Fig. 1 illustrates a schematic diagram of an operating arm control system according to some embodiments of the present disclosure;
FIG. 2 illustrates a segmented schematic view of an operating arm according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic structural view of an operating arm according to some embodiments of the present disclosure;
FIG. 4 shows a tag schematic including a plurality of pose identifications and a plurality of angle identifications;
FIG. 5 shows a schematic view of a label disposed on the peripheral side of the distal end of the operating arm and formed into a cylindrical shape;
FIG. 6 illustrates an implementation scenario diagram according to some embodiments of the present disclosure;
fig. 7 illustrates a flow chart of a control method of an operating arm control system according to some embodiments of the present disclosure;
FIG. 8 illustrates a flow diagram of a method for determining a drive signal according to some embodiments of the present disclosure;
FIG. 9 illustrates a flow diagram of a method for determining the pose of the manipulator arm coordinate system with respect to the reference coordinate system, according to some embodiments of the present disclosure;
FIG. 10 illustrates a flow diagram of a method for determining a pose of a manipulator arm coordinate system relative to a reference coordinate system according to further embodiments of the present disclosure;
FIG. 11 shows a schematic diagram of a plurality of pose identifications at a cross-sectional circle, according to some embodiments of the present disclosure;
FIG. 12 illustrates a flow diagram of a method for identifying pose identification, according to some embodiments of the present disclosure;
fig. 13 shows a schematic view of a pose identification pattern according to some embodiments of the present disclosure;
FIG. 14 illustrates a flow diagram of a method for searching for pose identification, according to some embodiments of the present disclosure;
FIG. 15 shows a schematic diagram of search pose identification, according to some embodiments of the present disclosure;
FIG. 16 illustrates a flow diagram of a method of identifying an angle identification, according to some embodiments of the present disclosure;
FIG. 17 shows a schematic block diagram of a computer device, according to some embodiments of the present disclosure;
fig. 18 shows a schematic view of a surgical robotic system according to some embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the drawings, and those skilled in the art will understand that the scope of the present disclosure is not limited to these embodiments. The present disclosure may be susceptible to various modifications and changes based on the following embodiments. Such modifications and variations are intended to be included within the scope of the present disclosure. Like reference numerals refer to like parts throughout the various embodiments shown in the figures of the present disclosure.
In the present disclosure, the term "position" refers to the location of an object or a portion of an object in three-dimensional space (e.g., three translational degrees of freedom, which can be described using changes in Cartesian X, Y, and Z coordinates, such as translations along the Cartesian X, Y, and Z axes, respectively). In the present disclosure, the term "orientation" (or "attitude") refers to the rotational placement of an object or a portion of an object (e.g., three rotational degrees of freedom, which can be described using roll, pitch, and yaw). In the present disclosure, the term "pose" refers to the combination of the position and the orientation of an object or a portion of an object, which may be described, for example, using the six parameters of the six degrees of freedom mentioned above.
In the present disclosure, a reference coordinate system may be understood as a coordinate system capable of describing the pose of an object. According to the actual positioning requirement, the reference coordinate system can be selected to use the origin of the virtual reference object or the origin of the physical reference object as the origin of the coordinate system. In some embodiments, the reference coordinate system may be a world coordinate system or a camera coordinate system or the operator's own perceptual coordinate system, or the like.
In the present disclosure, an object may be understood as an object or target that needs to be positioned, such as a manipulator arm or a manipulator arm tip.
In the present disclosure, the pose of the manipulator arm or a part thereof refers to the pose of the manipulator arm coordinate system defined by the manipulator arm or a part thereof relative to the reference coordinate system.
Fig. 1 illustrates a schematic diagram of a manipulator arm control system 100 according to some embodiments of the present disclosure. As shown in fig. 1, the manipulator arm control system 100 may include an image capture device 110, at least one manipulator arm 140, and a control device 120. The image capture device 110 and the at least one manipulator arm 140 are each communicatively connected to the control device 120. In some embodiments, as shown in fig. 1, the control device 120 may be used to control the movement of the at least one manipulator arm 140, for example to adjust its pose or to coordinate multiple arms with one another. In some embodiments, the at least one manipulator arm 140 may include a manipulator arm tip 130 at its distal end. The control device 120 may control the movement of the at least one manipulator arm 140 to move the manipulator arm tip 130 to a desired position and orientation. It will be appreciated by those skilled in the art that the manipulator arm control system 100 may be applied to a surgical robotic system, such as an endoscopic surgical robotic system. For example, a surgical effector 160 may be disposed at the distal end of the manipulator arm tip 130, as shown in fig. 1. It should be appreciated that the manipulator arm control system 100 may also be applied to dedicated or general-purpose robotic systems in other fields (e.g., manufacturing or machining).
In the present disclosure, the control device 120 may be communicatively connected with the driving unit 150 (e.g., a motor) of the at least one manipulator arm 140 and send a driving signal to the driving unit 150, so that the driving unit 150 controls the at least one manipulator arm 140 to move to the corresponding target pose based on the driving signal. For example, the driving unit 150 that drives the manipulator arm 140 may be a servo motor that receives commands from the control device to control the movement of the manipulator arm 140. The control device 120 may also be communicatively connected, for example through a communication interface, to a sensor coupled to the driving unit 150, so as to receive motion data of the manipulator arm 140 and thereby monitor its motion state. In one example of the present disclosure, the communication interface may be a CAN (Controller Area Network) bus interface, enabling the control device 120 to communicate with the driving unit 150 and the sensor over a CAN bus.
In some embodiments, the manipulator arm 140 may comprise a continuum deformable arm, for example a manipulator arm with multiple joints providing multiple degrees of freedom, such as an arm capable of 6-degree-of-freedom motion. The image capture device 110 may include, but is not limited to, a dual-lens or single-lens image capture device, such as a binocular or monocular camera.
In some embodiments, the image capture device 110 may be used to capture positioning images. A positioning image may include an image of part or all of the manipulator arm 140. In some embodiments, the image capture device 110 may be configured to capture an image of the manipulator arm tip 130, with positioning markers disposed on the manipulator arm tip 130. The positioning markers may include pose identifiers and angle identifiers (described in more detail below). As shown in fig. 1, the manipulator arm tip 130 is within the field of view of the image capture device 110, and the captured positioning image may include an image of the manipulator arm tip 130.
In some embodiments, the control device 120 may receive the positioning image from the image acquisition apparatus 110 and process the positioning image. For example, the control device 120 may identify a plurality of pose identifiers and at least one angle identifier located on the manipulator arm 140 in the positioning image, and determine a current relative pose of the manipulator arm 140 with respect to a reference coordinate system (e.g., a world coordinate system). The control device 120 may also determine the drive signal of the manipulation arm 140 based on the current relative pose of the manipulation arm 140 and the target pose. The driving signal may be transmitted to the driving unit 150 to perform motion control of the manipulation arm 140.
Fig. 2 illustrates a schematic view of a segment 200 of a manipulator arm, according to some embodiments of the present disclosure. The manipulator arm (e.g., the manipulator arm 140) may include at least one deformable segment 200. As shown in fig. 2, the deformable segment 200 includes a fixed disk 210 and a plurality of structural bones 220. The plurality of structural bones 220 are fixedly coupled at a first end to the fixed disk 210 and coupled at a second end to a driving unit (not shown). In some embodiments, the fixed disk 210 may be a ring structure, a disk structure, or the like, but is not limited thereto, and its cross-section may be circular, rectangular, polygonal, or another shape.
The driving unit deforms the segment 200 by driving the structural bones 220. For example, the driving unit drives the structural bones 220 to place the segment 200 in a bent state as shown in fig. 2. In some embodiments, the second ends of the plurality of structural bones 220 are connected to the driving unit through a base disk 230. In some embodiments, similarly to the fixed disk 210, the base disk 230 may be a ring structure, a disk structure, or the like, but is not limited thereto, and its cross-section may be circular, rectangular, polygonal, or another shape. The driving unit may comprise a linear motion mechanism, a driving segment, or a combination of both. A linear motion mechanism may be coupled to the structural bones 220 to push or pull them, thereby driving the segment 200 to bend. A driving segment may include its own fixed disk and a plurality of structural bones, one end of which is fixedly connected to that fixed disk; the other ends of the structural bones of the driving segment are connected to, or integrally formed with, the structural bones 220, so that bending the driving segment drives the segment 200 to bend.
In some embodiments, one or more spacer disks 240 are further included between the fixed disk 210 and the base disk 230, and the plurality of structural bones 220 pass through the spacer disks 240. Similarly, the driving segment may also include spacer disks.
Fig. 3 illustrates a structural schematic of a manipulator arm 300 according to some embodiments of the present disclosure. As shown in fig. 3, the manipulator arm 300 is a deformable manipulator arm and may include a manipulator arm tip 310 and a manipulator arm body 320. The manipulator arm body 320 may include one or more links, such as a first link 3201 and a second link 3202. In some embodiments, the structure of the first link 3201 and the second link 3202 may be similar to the segment 200 shown in fig. 2. In some embodiments, as shown in fig. 3, the manipulator arm body 320 also includes a first straight rod segment 3203 located between the first link 3201 and the second link 3202. A first end of the first straight rod segment 3203 is connected with the base disk of the second link 3202, and a second end is connected with the fixed disk of the first link 3201. In some embodiments, as shown in fig. 3, the manipulator arm body 320 further includes a second straight rod segment 3204, a first end of which is connected with the base disk of the first link 3201.
In some embodiments, a plurality of pose identifiers and at least one angle identifier are distributed on the manipulator arm (e.g., on the manipulator arm body 320 or the manipulator arm tip 310). For example, a plurality of pose identifiers are circumferentially distributed on the manipulator arm tip 310, and a plurality of angle identifiers are circumferentially distributed on the manipulator arm tip 310. The pose identifiers and the angle identifiers are arranged side by side along the axial direction of the manipulator arm tip 310, for example on the outer surface of the columnar portion of the manipulator arm tip 310.
In some embodiments, each angle identifier has a positional association with one of the pose identifiers. Based on this positional association, the region where the angle identifiers may be distributed can be determined from the positions of the pose identifiers; conversely, the region where the pose identifiers may be distributed can be determined from the positions of the angle identifiers. The positional association is determined by the specific arrangement of the pose identifiers and the angle identifiers, and can be designed in advance.
In some embodiments, the position association relationship may include a correspondence relationship between the angle identifier and the pose identifier in an axial direction. For example, the positional relationship may include an offset in the axial direction. Based on the correspondence in the axial direction, with knowledge of the position of one or more pose markers on the tip of the manipulator arm, offsetting a certain distance in the axial direction may determine the region where an angular marker may exist. For example, the positional relationship may also include an oblique alignment in the axial direction, or the like.
In some embodiments, the plurality of pose identifications and the plurality of angle identifications may be provided on a label affixed to the tip circumference side of the operation arm.
In some embodiments, the pose identification may include a pose identification pattern and pose identification pattern corner points, and the angle identification may include angle identification patterns and angle identification pattern corner points. In some embodiments, the pose and angle identification patterns may be provided on a label affixed to the manipulator arm tip, or may be printed on the manipulator arm tip, or may be a pattern formed by the physical configuration of the manipulator arm tip itself, e.g., may include depressions or protrusions, and combinations thereof. In some embodiments, the pose or angle identification pattern may include patterns formed in brightness, grayscale, color, or the like. In some embodiments, the pose identification pattern and the angle identification pattern may include patterns that actively (e.g., self-illuminating) or passively (e.g., reflected light) provide information detected by the image acquisition module. Those skilled in the art will appreciate that in some embodiments, the pose of the pose marker may be represented by the pose of the pose marker pattern corner point coordinate system, and the pose of the angle marker may be represented by the pose of the angle marker pattern corner point coordinate system.
In some embodiments, the pose or angle identification pattern is provided on the distal end of the manipulator arm in an area suitable for image acquisition by the image acquisition device, for example, an area that may be covered by the field of view of the image acquisition device during operation or an area that is not easily disturbed or obstructed during operation.
Fig. 4 illustrates a schematic diagram of a tag 400 including multiple pose identifications and multiple angle identifications, according to some embodiments. Fig. 5 shows a schematic view of a label 500 which is provided on the distal end peripheral side of the operation arm and formed in a cylindrical shape. It will be appreciated that for simplicity, tag 400 may include the same pose identification pattern and angle identification pattern as tag 500.
Referring to fig. 4, a plurality of pose identifiers (pose identification pattern corner points are denoted by "∘" symbols in the present disclosure) and a plurality of angle identifiers (angle identification pattern corner points are denoted by "Δ" symbols in the present disclosure) are arranged side by side. The plurality of pose identification patterns 411 may be the same or similar, and the pose identification pattern corner points are located within the pose identification patterns 411. The plurality of angle identification patterns 421-426 may differ from one another, and the angle identification pattern corner points are located within the angle identification patterns 421-426.
Each angle identifier may have a positional association with one of the pose identifiers. For example, as shown in fig. 4, a pose identifier (e.g., the pose identification pattern 411) and its corresponding angle identifier (e.g., the angle identification pattern 421) are aligned along the direction indicated by the arrow and separated by a distance $d_1$. Referring to fig. 5, when arranged circumferentially, the tag 400 becomes the tag 500, spatially configured as a cylinder, and the positional association of each angle identifier with one of the pose identifiers may include a correspondence in the axial direction (e.g., the positive Z direction in fig. 5). Based on this axial correspondence, with the position of one or more pose identifiers on the manipulator arm tip known, offsetting along the axial direction by a distance (e.g., the distance $d_1$) determines the region where an angle identifier may exist. In some embodiments, the axial correspondence between an angle identifier and a pose identifier may be represented by the axial correspondence between the angle identification pattern corner point and the pose identification pattern corner point. In some embodiments, based on this correspondence, the projection of an angle identification pattern corner point along the Z-axis direction coincides with one of the pose identification pattern corner points.
In some embodiments, the around-axis angle or roll angle of an angle identifier or a pose identifier may be represented by the around-axis angle of its pattern corner point. The angle of an angle identification pattern corner point with respect to the manipulator arm coordinate system (e.g., a coordinate system established at the tip of the manipulator arm, such as the XY coordinate system shown in fig. 5) is known or predetermined; for example, the angle between the angle identification pattern corner point $R_5$ in fig. 5 and the X axis of the XY coordinate system is $\theta$. Based on the positional association, the angle between the associated pose identification pattern corner point $P_5$ and the X axis is also $\theta$. It will be appreciated that the angle $\theta$ corresponding to the corner points $R_5$ and $P_5$ may be referred to as the around-axis angle or roll angle of the angle identifier or the pose identifier about the Z axis. In the present disclosure, the around-axis angle or roll angle refers to an angle about the Z axis. It will be appreciated that, for clarity, the corner points $R_5$ and $P_5$ are shown separated in fig. 5, but they are coincident.
Fig. 6 illustrates a schematic diagram of an implementation scenario 600, according to some embodiments of the present disclosure. As shown in fig. 6, the operation arm 640 includes a tip 630 and a distal end effector 660, and a plurality of pose markers and angle markers may be circumferentially provided on the tip 630. For example, the tag 400 shown in fig. 4 is circumferentially disposed on the arm tip 630, forming a cylindrical angle marking pattern band 610 and a posture marking pattern band 620. A plurality of pose marker pattern corner points are distributed on a cross-sectional circle 621 of the pose marker pattern strip 620 at the arm end 630, and a plurality of angle marker pattern corner points are distributed on a cross-sectional circle 611 of the angle marker pattern strip 610 at the arm end 630.
In some embodiments, the plurality of angular identification patterns are different patterns. Each angle marking pattern is used to indicate or mark a different angle about the axis. In some embodiments, each angle-identified pattern has a one-to-one correspondence with an identified angle-around-axis, and the identified angle-around-axis may be determined based on the angle-identified pattern.
For example, as shown in FIG. 6, a plurality of different angle identification patterns (e.g., the angle identification patterns 421-426 shown in fig. 4) are evenly distributed along the circumference of the cylindrical structure, forming angle identification pattern corner points A-F. The angle identification pattern corresponding to corner point A is set as a reference pattern (for example, identifying the 0° around-axis angle), and a plane coordinate system {wm1} is established. The around-axis angles identified by the corner points of the other angle identification patterns can then be determined from their positional relationship to the pattern corresponding to corner point A. For example, referring to FIG. 6, when the angle identification pattern corresponding to corner point B is recognized, the around-axis angle of corner point B in the two-dimensional plane coordinate system of the cross-sectional circle 611 can be determined to be 60° from the positional relationship between the patterns corresponding to corner points B and A. The origin of the two-dimensional plane coordinate system of the cross-sectional circle 611 is the center of the circle, its X axis points from the origin to corner point A, and its Y axis is perpendicular to the X axis.
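Since each angle identification pattern identifies exactly one around-axis angle, recognition can reduce to a lookup. Below is a minimal sketch, assuming six patterns with hypothetical labels "A"-"F" evenly spaced at 60° as in the FIG. 6 example; the labels and spacing are illustrative assumptions, not values fixed by the patent.

```python
# Hypothetical lookup: recognized angle identification pattern -> around-axis angle
# in the plane coordinate system {wm1}, in degrees. Pattern "A" is the 0-degree reference.
AROUND_AXIS_ANGLE_DEG = {label: 60 * i for i, label in enumerate("ABCDEF")}

assert AROUND_AXIS_ANGLE_DEG["B"] == 60  # matches the FIG. 6 example
```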
In some embodiments, the pose of the actuator 660 may be determined by translating the manipulator arm coordinate system {wm} (e.g., the manipulator arm tip coordinate system) by a predetermined distance. Alternatively, the pose of the actuator 660 may be taken as approximately equal to the pose of the manipulator arm tip coordinate system {wm}.
In some embodiments, the pose of the actuator 660 relative to a reference coordinate system (e.g., the world coordinate system {w}) is determined based on the pose of the manipulator arm coordinate system relative to the reference coordinate system. The specific calculation is as follows:

$${}^{w}R_{tip} = {}^{w}R_{wm}\,{}^{wm}R_{tip}, \qquad {}^{w}P_{tip} = {}^{w}R_{wm}\,{}^{wm}P_{tip} + {}^{w}P_{wm} \tag{1}$$

where ${}^{w}R_{tip}$ is the orientation of the actuator relative to the world coordinate system, ${}^{w}P_{tip}$ is the position of the actuator relative to the world coordinate system, ${}^{wm}R_{tip}$ is the orientation of the actuator relative to the manipulator arm coordinate system, ${}^{wm}P_{tip}$ is the position of the actuator relative to the manipulator arm coordinate system, ${}^{w}R_{wm}$ is the orientation of the manipulator arm coordinate system relative to the world coordinate system, and ${}^{w}P_{wm}$ is the position of the manipulator arm coordinate system relative to the world coordinate system.
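To make equation (1) concrete, the following is a minimal sketch of the frame composition with rotation matrices and translation vectors; numpy and the function name are illustrative choices of this sketch, not anything specified by the patent.

```python
import numpy as np

def actuator_pose_in_world(R_w_wm, P_w_wm, R_wm_tip, P_wm_tip):
    """Compose the actuator pose in the world frame per equation (1).

    R_w_wm, P_w_wm:     orientation/position of the manipulator arm frame {wm}
                        relative to the world frame {w}.
    R_wm_tip, P_wm_tip: orientation/position of the actuator relative to {wm}.
    """
    R_w_tip = R_w_wm @ R_wm_tip           # w_R_tip = w_R_wm @ wm_R_tip
    P_w_tip = R_w_wm @ P_wm_tip + P_w_wm  # w_P_tip = w_R_wm @ wm_P_tip + w_P_wm
    return R_w_tip, P_w_tip

# Example: {wm} rotated 90 degrees about Z and offset along X; the actuator sits
# 10 mm ahead of {wm} along its Z axis (made-up numbers for illustration).
Rz90 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
R, P = actuator_pose_in_world(Rz90, np.array([5.0, 0.0, 0.0]),
                              np.eye(3), np.array([0.0, 0.0, 10.0]))
```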
Some embodiments of the present disclosure provide a control method of an operation arm. Fig. 7 illustrates a flow chart of a control method 700 of a manipulator arm control system (e.g., the manipulator arm control system 100) according to some embodiments of the present disclosure. As shown in fig. 7, some or all of the steps of the method 700 may be performed by a control device (e.g., the control device 120) of the manipulator arm control system 100. The control device 120 may be configured on a computing device. The method 700 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 700 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 7, in step 701, a positioning image is acquired. In some embodiments, the positioning image includes a plurality of pose identifiers and at least one angle identifier on the manipulator arm. In some embodiments, the positioning image may be received from an image acquisition device 110 as shown in FIG. 1. For example, the control device 120 may receive positioning images actively transmitted by the image acquisition apparatus 110. Alternatively, the control device 120 may send an image request instruction to the image capturing apparatus 110, and the image capturing apparatus 110 sends the positioning image to the control device 120 in response to the image request instruction.
With continued reference to fig. 7, in step 703, a plurality of pose identifiers located on the manipulator arm are identified in the positioning image. Exemplary methods of identifying a plurality of pose identifiers located on an operation arm include the methods shown in fig. 12 and fig. 14. In some embodiments, the control device 120 may identify some or all of the pose identifiers in the positioning image through image processing algorithms. In some embodiments, the image processing algorithms may include feature recognition algorithms that extract or recognize features of the pose identifiers. For example, the image processing algorithm may include a corner detection algorithm for detecting pose identification pattern corner points. The corner detection algorithm may be, but is not limited to, one of corner detection based on gray-scale images, corner detection based on binary images, and corner detection based on contour curves. As another example, the image processing algorithm may be a color feature extraction algorithm for detecting color features of the pose identification patterns, or a contour detection algorithm for detecting contour features of the pose identification patterns.
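As one possible realization of the corner detection step, the sketch below uses OpenCV's Shi-Tomasi detector on a grayscale positioning image; the quality and spacing thresholds are assumed placeholder values, and associating detected corners with specific pose identification patterns would still require the pattern-matching logic described in fig. 12 and fig. 14.

```python
import cv2
import numpy as np

def detect_candidate_corners(positioning_image_gray, max_corners=32):
    """Return candidate pattern corner points as an (N, 2) array of pixel coordinates.

    Shi-Tomasi corner detection; thresholds are placeholders that would be
    tuned to the actual marker patterns and imaging conditions.
    """
    corners = cv2.goodFeaturesToTrack(
        positioning_image_gray,
        maxCorners=max_corners,
        qualityLevel=0.05,  # assumed quality threshold
        minDistance=8,      # assumed minimum pixel spacing between corners
    )
    if corners is None:
        return np.empty((0, 2))
    return np.squeeze(corners, axis=1)
```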
In some embodiments, the control device may identify some or all of the pose identifiers in the positioning image by means of a recognition model.
With continued reference to FIG. 7, at step 705, an angle marker located on the manipulator arm is identified based on the plurality of pose markers, the angle marker having a position association with a first pose marker of the plurality of pose markers. In some embodiments, after the plurality of pose identifications are identified, the angle identifications located on the operation arm are identified according to the position association relation. In some embodiments, the position association relationship of the angle identifier and the first posture identifier may be a position association relationship as illustrated in fig. 4 or fig. 5. In some embodiments, the first pose identifier (e.g., the first pose identifier pattern or the first pose identifier pattern corner) refers to a pose identifier having a position association relationship with the angle identifier in the plurality of pose identifiers. Exemplary methods of identifying the angle identification include a method as shown in fig. 16.
With continued reference to fig. 7, at step 707, a current relative pose of the manipulator arm with respect to the reference coordinate system is determined based on the angle identifier and the plurality of pose identifiers. Exemplary methods of determining the relative pose of the manipulator arm with respect to the reference frame include methods as shown in fig. 9 or fig. 10. In some embodiments, the pose of the manipulator arm relative to the reference coordinate system may be determined based on the angle identifier, the first pose identifier, and the plurality of pose identifiers.
In some embodiments, the method 700 further comprises determining the transformation relationship between the manipulator arm coordinate system and the pose identification coordinate system based on the angle identifier and the plurality of pose identifiers. In some embodiments, three-dimensional coordinates in the pose identification coordinate system may be converted to corresponding three-dimensional coordinates in the manipulator arm coordinate system based on this transformation relationship. In some embodiments, the pose of the manipulator arm coordinate system relative to the reference coordinate system is obtained from the transformation relationship between the manipulator arm coordinate system and the pose identification coordinate system together with the pose of the pose identification coordinate system relative to the reference coordinate system.
In some embodiments, the transformation relationship of the manipulator arm coordinate system and the pose identification coordinate system may include a roll angle of the pose identification coordinate system relative to the manipulator arm coordinate system. In some embodiments, a roll angle of the pose identification coordinate system relative to the manipulator arm coordinate system may be determined based on the angle identification and the first pose identification. It should be appreciated that the roll angle of the pose identification coordinate system relative to the manipulator arm coordinate system may be the angle of rotation of the pose identification coordinate system about the Z-axis of the manipulator arm coordinate system.
In some embodiments, the manipulator arm coordinate system may be a fixed coordinate system set on the object based on the plurality of pose identifications or the plurality of angle identifications. In some embodiments, the Z axis of the manipulator arm coordinate system is parallel to the axial direction of the manipulator arm, and the XY plane of the manipulator arm coordinate system is in the same plane with the plurality of pose identification pattern corner points or in the same plane with the plurality of angle identification pattern corner points.
In some embodiments, a pose identification coordinate system may be determined to facilitate determining the location of a plurality of pose identifications. In some embodiments, the position of the pose marker may be represented by the position of a pose marker pattern corner point. In some embodiments, the Z axis of the pose identification coordinate system is parallel to or coincides with the axial direction of the manipulator arm, and the XY plane of the pose identification coordinate system is in the same plane as the corner points of the plurality of pose identification patterns.
Illustratively, referring to FIG. 6, the origin of the manipulator arm coordinate system $\{wm\} = [X_{wm}\ Y_{wm}\ Z_{wm}]^T$ is the center of the cross-sectional circle 621 on which the plurality of pose identification pattern corner points are located; its X axis points from the origin toward one of the pose identification pattern corner points, its Z axis is parallel to the axial direction of the manipulator arm tip 630, and its Y axis is perpendicular to the XZ plane. The X axis of the two-dimensional plane coordinate system $\{wm1\} = [X_{wm1}\ Y_{wm1}]^T$ of the cross-sectional circle 611 is parallel to the X axis of the manipulator arm coordinate system {wm}, and its Y axis is parallel to the Y axis of {wm}. The around-axis angle identified by an angle identification pattern corner point in the two-dimensional plane coordinate system {wm1} of the cross-sectional circle 611 is therefore equal to the around-axis angle it identifies in the manipulator arm coordinate system {wm}. The origin of the pose identification coordinate system $\{wm0\} = [X_{wm0}\ Y_{wm0}\ Z_{wm0}]^T$ is likewise the center of the cross-sectional circle 621; its X axis points from the origin toward one of the pose identification pattern corner points, its Z axis is parallel to the axial direction of the manipulator arm tip 630, and its Y axis is perpendicular to the XZ plane. With continued reference to FIG. 6, the Z axis of the manipulator arm coordinate system {wm} coincides with the Z axis of the pose identification coordinate system {wm0}. The transformation of the manipulator arm coordinate system {wm} relative to the pose identification coordinate system {wm0} can therefore be determined by the roll angle $\alpha_0$ of {wm0} relative to {wm}, i.e., the rotation angle of the pose identification coordinate system {wm0} about the Z axis relative to the manipulator arm coordinate system {wm}.
In some embodiments, referring to FIG. 6, the roll angle $\alpha_0$ is calculated by the following formula:

$$\alpha_0 = \alpha_1 - \alpha_2 \tag{2}$$

where $\alpha_1$ is the first around-axis angle, i.e., the angle identified in the manipulator arm coordinate system by an angle identification pattern corner point (e.g., the angle identification pattern corner point $R_6$), and $\alpha_2$ is the second around-axis angle, i.e., the angle identified in the pose identification coordinate system by the first pose identification pattern corner point (e.g., the pose identification pattern corner point $P_6$).
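A minimal sketch of equation (2) and of the resulting rotation between {wm} and {wm0}; it assumes, as stated above, that the two frames share the Z axis, and numpy is an implementation choice of the sketch.

```python
import numpy as np

def roll_angle(alpha_1, alpha_2):
    """alpha_0 = alpha_1 - alpha_2, per equation (2); angles in radians."""
    return alpha_1 - alpha_2

def rot_z(a):
    """Basic rotation about the shared Z axis by angle a."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Example with made-up angles: alpha_0 = 75 deg - 30 deg = 45 deg.
R_wm_wm0 = rot_z(roll_angle(np.deg2rad(75.0), np.deg2rad(30.0)))
```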
With continued reference to fig. 7, at step 709, a drive signal for the manipulator arm is determined based on the current relative pose and the target pose of the manipulator arm. In some embodiments, the method 700 may further include determining the drive signal of the operation arm at a predetermined period, so as to achieve real-time control over a plurality of motion control cycles.
In some embodiments, method 700 may further include: determining a pose difference based on the current relative pose of the operating arm and the target pose of the operating arm; and determining a drive signal of the operation arm based on the pose difference and the inverse kinematics model of the operation arm. For example, based on the difference between the target pose and the current pose of the tip of the manipulator in the world coordinate system, the drive values of the plurality of joints included in the manipulator (or the drive values of the corresponding plurality of motors controlling the motion of the manipulator) in the current motion control cycle may be determined by an inverse kinematics numerical iteration algorithm of the manipulator kinematics model. It should be understood that the kinematic model may represent a mathematical model of the kinematic relationship of the joint space and the task space of the manipulator arm. For example, the kinematic model can be established by a DH (Denavit-Hartenberg) parametric method, an exponential product representation method, or the like.
In some embodiments, the target pose of the manipulator arm is a target pose of the manipulator arm in the world coordinate system. The method 700 may further include: determining the current pose of the manipulator arm in the world coordinate system based on the current relative pose; and determining the pose difference based on the target pose of the manipulator arm and the current pose of the manipulator arm in the world coordinate system. In some embodiments, the pose difference includes a position difference and an attitude difference.
In the k-th motion control cycle, the pose difference can be expressed as follows:

$$\Delta P^k = P_t^k - P_c^k, \qquad \Delta\theta^k = \angle\!\left((R_c^k)^T R_t^k\right) \tag{3}$$

where $\Delta P^k$ is the position difference of the operation arm in the k-th motion control cycle, $\Delta\theta^k$ is the angle (attitude) difference of the operation arm in the k-th motion control cycle, $P_t^k$ and $R_t^k$ are the target position and target attitude of the operation arm in the k-th motion control cycle, $P_c^k$ and $R_c^k$ are the current position and current attitude of the operation arm in the k-th motion control cycle, and $\angle\!\left((R_c^k)^T R_t^k\right)$ denotes the rotation angle between $R_c^k$ and $R_t^k$.
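One way to evaluate the pose difference of equation (3) is sketched below; representing the attitude difference as a rotation vector (axis times angle) extracted from $(R_c^k)^T R_t^k$ is an implementation choice of this sketch, not a form mandated by the patent.

```python
import numpy as np

def pose_difference(P_t, R_t, P_c, R_c):
    """Position difference and attitude difference (rotation vector), per eq. (3)."""
    dP = P_t - P_c
    R_err = R_c.T @ R_t  # rotation taking the current attitude to the target attitude
    angle = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        return dP, np.zeros(3)
    # Axis from the skew-symmetric part; angles near pi would need special handling.
    axis = np.array([R_err[2, 1] - R_err[1, 2],
                     R_err[0, 2] - R_err[2, 0],
                     R_err[1, 0] - R_err[0, 1]]) / (2.0 * np.sin(angle))
    return dP, angle * axis
```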
In some embodiments, the target pose of the manipulator arm is updated at a predetermined period before or during each motion control cycle. In some embodiments, multiple motion control cycles are executed iteratively, and in each motion control cycle a method according to some embodiments of the present disclosure (e.g., steps 701-709) may be performed to control the movement of the manipulator arm toward the target pose. By iteratively executing a plurality of motion control cycles, real-time closed-loop control of the pose of the manipulator arm tip can be realized, and the pose control accuracy of the manipulator arm can be improved. It will be appreciated that implementing pose control of the manipulator arm via the methods of the present disclosure can reduce the trajectory tracking error of the manipulator arm (e.g., a continuum deformable arm).
In some embodiments, method 700 may further include receiving a control command; and determining the target pose of the operating arm based on the control command. In some embodiments, the target pose of the manipulator arm tip in the world coordinate system may be input by a user via an input device. By comparison calculation, the difference between the target pose and the current pose at the end of the manipulator arm can be determined. In some embodiments, the control commands may be received in a master-slave motion based control scheme. For example, by acquiring the pose of the master operator or joint information of the master operator in each motion control cycle, the target pose of the operation arm is determined. Through a plurality of motion control cycles, real-time master-slave motion control can be performed.
Fig. 8 illustrates a flow diagram of a method 800 for determining a drive signal according to some embodiments of the present disclosure. As shown in fig. 8, some or all of the steps in the method 800 may be performed by a control device (e.g., the control device 120) of the manipulator arm control system 100. The control device 120 may be configured on a computing device. The method 800 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 800 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 8, in step 801, a Cartesian space velocity is determined based on the pose difference. In some embodiments, the Cartesian space velocity includes a Cartesian space linear velocity and a Cartesian space angular velocity. The method 800 may further include: determining the Cartesian space linear velocity based on the position difference, and determining the Cartesian space angular velocity based on the attitude difference. In some embodiments, the Cartesian space angular velocity may be determined from the attitude difference by a proportional-integral-derivative controller or a proportional-derivative controller. In some embodiments, the Cartesian space velocity $\dot{x}^k$ of the k-th motion control cycle is:

$$\dot{x}^k = \begin{bmatrix} v^k \\ \omega^k \end{bmatrix} = \begin{bmatrix} P_v\,\Delta P^k + D_v\left(\Delta P^k - \Delta P^{k-1}\right) \\ P_\omega\,\Delta\theta^k + D_\omega\left(\Delta\theta^k - \Delta\theta^{k-1}\right) \end{bmatrix} \tag{4}$$

where $v^k$ is the Cartesian space linear velocity of the k-th motion control cycle, $\omega^k$ is the Cartesian space angular velocity of the k-th motion control cycle, $P_v$ is the linear velocity proportional coefficient, $D_v$ is the linear velocity derivative coefficient, $P_\omega$ is the angular velocity proportional coefficient, $D_\omega$ is the angular velocity derivative coefficient, $\Delta P^{k-1}$ is the position difference of the operation arm in the (k-1)-th motion control cycle, and $\Delta\theta^{k-1}$ is the angle difference of the operation arm in the (k-1)-th motion control cycle.
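A sketch of the discrete PD law of equation (4); the gains are placeholder scalars, and the inputs are the position and attitude differences of the current and previous cycles as computed above.

```python
import numpy as np

def cartesian_velocity(dP_k, dTheta_k, dP_prev, dTheta_prev,
                       Pv=1.0, Dv=0.1, Pw=1.0, Dw=0.1):
    """Equation (4): PD on the position and attitude differences (placeholder gains)."""
    v = Pv * dP_k + Dv * (dP_k - dP_prev)               # linear velocity
    w = Pw * dTheta_k + Dw * (dTheta_k - dTheta_prev)   # angular velocity
    return np.concatenate([v, w])                        # 6-vector [v; w]
```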
Referring to fig. 8, in step 803, a parameter space velocity is determined based on the Cartesian space velocity. The joint parameter space velocity $\dot{\psi}^k$ of the k-th motion control cycle is:

$$\dot{\psi}^k = J^{+}\,\dot{x}^k \tag{5}$$

where $J^{+}$ is the Moore-Penrose pseudo-inverse of the velocity Jacobian matrix $J$ of the kinematic model of the manipulator arm, which can be determined based on the structure of the manipulator arm.

Referring to fig. 8, in step 805, target joint parameters are determined based on the parameter space velocity and the current joint parameters. The target joint parameters $\psi_t^k$ of the k-th motion control cycle are:

$$\psi_t^k = \psi_c^k + \dot{\psi}^k\,\Delta t \tag{6}$$

where $\psi_c^k$ is the current joint parameter of the k-th motion control cycle and $\Delta t$ is the period of a motion control cycle.
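Equations (5) and (6) together amount to a pseudo-inverse velocity step followed by an Euler update. A minimal sketch, assuming the velocity Jacobian J of the arm's kinematic model is available:

```python
import numpy as np

def joint_update(J, x_dot, psi_current, dt):
    """psi_dot = J^+ x_dot (eq. 5); psi_target = psi_current + psi_dot * dt (eq. 6)."""
    psi_dot = np.linalg.pinv(J) @ x_dot  # Moore-Penrose pseudo-inverse
    return psi_current + psi_dot * dt
```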
It should be understood that when the manipulator arm has a plurality of links (e.g., the manipulator arm 300 shown in fig. 3), the target joint parameter of the manipulator arm may be the target joint parameter of all of the links, or may be the target joint parameter of one or some of the plurality of links.
Referring to fig. 8, in step 807, a drive signal is determined based on the target joint parameters. For example, based on the mapping between the target joint parameters and the driving amounts, the driving amounts of the plurality of joints included in the operation arm in the current motion control cycle may be determined, and the driving signal of the driving unit (e.g., a motor) may then be determined from the driving amounts. In some embodiments, the mapping between the joint parameters of a single segment and the driving amounts may be as shown in equation (15).
In some embodiments, the manipulator arm is exemplified by a deformable arm (e.g., a continuum deformable arm). The continuum deformable arm may be the operation arm 300 shown in fig. 3. As shown in fig. 3, each link (the first link 3201 and the second link 3202) may include a base disk, a fixed disk, and a plurality of structural bones extending through the base disk and the fixed disk; the structural bones may be fixedly connected to the fixed disk and slidably connected to the base disk. The continuum deformable arm and the links it contains can be described by a kinematic model. In some embodiments, the structure of each link may be the segment 200 shown in fig. 2. As shown in fig. 2, the base disk coordinate system $\{tb\} = [X_{tb}\ Y_{tb}\ Z_{tb}]^T$ is attached to the base disk of the t-th (t = 1, 2, 3, ...) continuum segment, with its origin at the center of the base disk and its XY plane coincident with the plane of the base disk; $X_{tb}$ points from the center of the base disk toward the first structural bone (the first structural bone may be understood as any one of the structural bones designated as a reference). The bending plane coordinate system 1, $\{t1\} = [X_{t1}\ Y_{t1}\ Z_{t1}]^T$, has its origin coincident with the origin of the base disk coordinate system, its XZ plane coincident with the bending plane, and $Z_{t1}$ coincident with $Z_{tb}$. The fixed disk coordinate system $\{te\} = [X_{te}\ Y_{te}\ Z_{te}]^T$ is attached to the fixed disk of the t-th continuum segment, with its origin at the center of the fixed disk and its XY plane coincident with the plane of the fixed disk; $X_{te}$ points from the center of the fixed disk toward the first structural bone. The bending plane coordinate system 2, $\{t2\} = [X_{t2}\ Y_{t2}\ Z_{t2}]^T$, has its origin at the center of the fixed disk, its XZ plane coincident with the bending plane, and $Z_{t2}$ coincident with $Z_{te}$.
A single segment 200 as shown in fig. 2 can be represented by a kinematic model. The position ${}^{tb}P_{te}$ and orientation ${}^{tb}R_{te}$ of the tip of the t-th segment (the fixed disk coordinate system {te}) relative to the base disk coordinate system {tb} can be expressed as equations (7) and (8), respectively:

$${}^{tb}P_{te} = \frac{L_t}{\theta_t}\begin{bmatrix} \cos\delta_t\,(1-\cos\theta_t) \\ -\sin\delta_t\,(1-\cos\theta_t) \\ \sin\theta_t \end{bmatrix} \tag{7}$$

$${}^{tb}R_{te} = {}^{tb}R_{t1}\;{}^{t1}R_{t2}\;{}^{t2}R_{te} \tag{8}$$

where $L_t$ is the length of the virtual structural bone of the t-th segment (e.g., the virtual structural bone 221 shown in fig. 2), $\theta_t$ is the bending angle of the t-th segment, i.e., the angle through which $Z_{t1}$ (or $Z_{tb}$) must rotate to reach $Z_{te}$, ${}^{tb}R_{t1}$ is the orientation of the bending plane coordinate system 1 {t1} of the t-th segment relative to the base disk coordinate system {tb}, ${}^{t1}R_{t2}$ is the orientation of the bending plane coordinate system 2 {t2} of the t-th segment relative to the bending plane coordinate system 1 {t1}, and ${}^{t2}R_{te}$ is the orientation of the fixed disk coordinate system {te} of the t-th segment relative to the bending plane coordinate system 2 {t2}.

${}^{tb}R_{t1}$, ${}^{t1}R_{t2}$, and ${}^{t2}R_{te}$ can be expressed as equations (9), (10), and (11), respectively:

$${}^{tb}R_{t1} = \mathrm{Rot}(Z, -\delta_t) \tag{9}$$

$${}^{t1}R_{t2} = \mathrm{Rot}(Y, \theta_t) \tag{10}$$

$${}^{t2}R_{te} = \mathrm{Rot}(Z, \delta_t) \tag{11}$$

where $\mathrm{Rot}(\cdot,\cdot)$ denotes a basic rotation about the indicated axis, and $\delta_t$ is, in the t-th segment, the angle between the bending plane and the $X_{tb}$ axis.
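A sketch of the single-segment forward kinematics of equations (7)-(11). Note that the matrix forms above are reconstructed here under the standard constant-curvature assumption, so the sign conventions in this sketch are assumptions to be checked against the actual frame definitions.

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def segment_fk(L_t, theta_t, delta_t):
    """Pose of the fixed disk {te} relative to the base disk {tb} of one segment."""
    if np.isclose(theta_t, 0.0):
        P = np.array([0.0, 0.0, L_t])  # straight segment: the limit of eq. (7)
    else:
        r = L_t / theta_t              # bending radius of the virtual structural bone
        P = r * np.array([np.cos(delta_t) * (1.0 - np.cos(theta_t)),
                          -np.sin(delta_t) * (1.0 - np.cos(theta_t)),
                          np.sin(theta_t)])
    R = rot_z(-delta_t) @ rot_y(theta_t) @ rot_z(delta_t)  # eqs. (8)-(11)
    return R, P
```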
The joint parameters $\psi_t$ of a single segment 200 as shown in fig. 2 can be expressed as equation (12):

$$\psi_t = [\theta_t\ \ \delta_t]^T \tag{12}$$

The parameter space velocity $\dot{\psi}_t$ of the joint parameters $\psi_t$ can be expressed as equation (13):

$$\dot{\psi}_t = [\dot{\theta}_t\ \ \dot{\delta}_t]^T \tag{13}$$

where $\dot{\theta}_t$ is the first derivative of $\theta_t$ and $\dot{\delta}_t$ is the first derivative of $\delta_t$.

The tip Cartesian space velocity $\dot{x}_t$ of a single segment 200 as shown in fig. 2 can be expressed as equation (14):

$$\dot{x}_t = \begin{bmatrix} v_t \\ \omega_t \end{bmatrix} = J_t\,\dot{\psi}_t = \begin{bmatrix} J_{tv} \\ J_{t\omega} \end{bmatrix}\dot{\psi}_t \tag{14}$$

where $J_t$ is the velocity Jacobian matrix of the kinematic model of the single segment, $J_{tv}$ is the linear velocity Jacobian matrix of the kinematic model of the single segment, $J_{t\omega}$ is the angular velocity Jacobian matrix of the kinematic model of the single segment, $v_t$ is the tip linear velocity of the single segment, and $\omega_t$ is the tip angular velocity of the single segment.

In some embodiments, the driving amounts of the plurality of structural bones have a known mapping relationship with the joint parameters. Based on the target joint parameters and this mapping, the driving amounts of the plurality of structural bones can be determined. The driving amount of a structural bone may be understood as the length by which the structural bone is pushed or pulled when the single segment bends from an initial state (e.g., $\theta_t = 0$) to the target bending angle. In some embodiments, the mapping between the driving amounts of the plurality of structural bones and the joint parameters may be as shown in equation (15):

$$q_i = -r_{ti}\,\theta_t\cos(\delta_t + \beta_{ti}) \tag{15}$$

where $r_{ti}$ is the distance from the i-th structural bone to the virtual structural bone in the t-th segment, $\beta_{ti}$ is the angle between the i-th structural bone and the first structural bone in the t-th segment, and $q_i$ is the driving amount of the i-th structural bone. The driving signal of the driving unit may be determined based on the driving amount of the i-th structural bone.
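Equation (15) can be evaluated for all structural bones of a segment at once. In the sketch below, the bone radii and angular offsets are assumed example values, not dimensions from the patent.

```python
import numpy as np

def driving_amounts(theta_t, delta_t, r, beta):
    """q_i = -r_ti * theta_t * cos(delta_t + beta_ti), eq. (15), vectorized over bones.

    r:    distances from each structural bone to the virtual structural bone.
    beta: angles between each structural bone and the first structural bone.
    """
    return -np.asarray(r) * theta_t * np.cos(delta_t + np.asarray(beta))

# Example: four bones evenly spaced on an assumed 2.5 mm pitch circle.
q = driving_amounts(np.pi / 4, 0.0, r=[2.5] * 4, beta=np.deg2rad([0, 90, 180, 270]))
```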
In some embodiments, the tip Cartesian space velocity of a single segment may be determined based on equations (3) and (4), the parameter space velocity of the single segment based on equation (5), the target joint parameters of the single segment based on equation (6), and the driving amount of each structural bone based on equation (15); the driving signal of the driving unit (e.g., a motor) may then be determined based on the driving amounts.
In some embodiments, the entire deformable arm may be described by a kinematic model. As shown in fig. 3, transformations may be performed among a plurality of coordinate systems at a plurality of positions of the deformable arm. For example, the end effector of a continuum deformable arm may be represented in the world coordinate system {w} as:

$${}^{W}T_{tip} = {}^{W}T_{1b}\;{}^{1b}T_{1e}\;{}^{1e}T_{2b}\;{}^{2b}T_{2e}\;{}^{2e}T_{tip} \tag{16}$$

where ${}^{W}T_{tip}$ is the homogeneous transformation matrix of the end effector of the continuum deformable arm relative to the world coordinate system; ${}^{W}T_{1b}$ is the homogeneous transformation matrix of the base disk of the first continuum segment relative to the world coordinate system; ${}^{1b}T_{1e}$ is the homogeneous transformation matrix of the fixed disk of the first continuum segment relative to the base disk of the first continuum segment; ${}^{1e}T_{2b}$ is the homogeneous transformation matrix of the base disk of the second continuum segment relative to the fixed disk of the first continuum segment; ${}^{2b}T_{2e}$ is the homogeneous transformation matrix of the fixed disk of the second continuum segment relative to the base disk of the second continuum segment; and ${}^{2e}T_{tip}$ is the homogeneous transformation matrix of the end effector relative to the fixed disk of the second continuum segment. In some embodiments, the end effector is fixedly disposed on the fixed disk, and thus ${}^{2e}T_{tip}$ is known or predetermined.
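Equation (16) is a plain chain of homogeneous transforms. A minimal sketch, with each 4x4 matrix assumed to be assembled from the corresponding rotation and translation:

```python
import numpy as np
from functools import reduce

def homogeneous(R, P):
    """Build a 4x4 homogeneous transform from rotation R and translation P."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, P
    return T

def end_effector_in_world(T_w_1b, T_1b_1e, T_1e_2b, T_2b_2e, T_2e_tip):
    """W_T_tip per equation (16): left-to-right composition of the chain."""
    return reduce(np.matmul, [T_w_1b, T_1b_1e, T_1e_2b, T_2b_2e, T_2e_tip])
```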
It will be appreciated that the deformable arm has different joint parameters in different operating conditions. For example, the manipulator arm 300 shown in fig. 3 includes at least four operational states. The four operating states of the operating arm 300 are as follows:
The first working state: only the second link 3202 participates in the pose control of the actuator (for example, only the second link 3202 has entered the workspace). The joint parameters of the operating arm 300 in this state are as shown in the following equation (17):

ψ_c1 = [φ  L_2  θ_2  δ_2]^T    (17)

wherein ψ_c1 is the joint parameter vector of the operating arm 300 in the first working state, φ is the pivoting angle of the operating arm 300, and L_2, θ_2 and δ_2 have the same physical meanings as L_t, θ_t and δ_t of the segment 200 shown in FIG. 2.
The second working state: the second link 3202 and the first straight line segment 3203 participate in the pose control of the actuator (for example, the second link 3202 has entered the workspace entirely and the first straight line segment 3203 has entered it partially). The joint parameters of the operating arm 300 in this state are as shown in the following equation (18):

ψ_c2 = [φ  L_r  θ_2  δ_2]^T    (18)

wherein ψ_c2 is the joint parameter vector of the operating arm 300 in the second working state, and L_r is the feed amount of the first straight line segment 3203.
The third working state: the second link 3202, the first straight line segment 3203 and the first link 3201 participate in the pose control of the actuator (for example, the second link 3202 and the first straight line segment 3203 have entered the workspace entirely, and the first link 3201 has entered it partially). The joint parameters of the operating arm 300 in this state are as shown in the following equation (19):

ψ_c3 = [φ  L_r  θ_1  δ_1  θ_2  δ_2]^T    (19)

wherein ψ_c3 is the joint parameter vector of the operating arm 300 in the third working state, and θ_1 and δ_1 have the same physical meanings as θ_t and δ_t of the segment 200 shown in FIG. 2.
The fourth working state: the second link 3202, the first straight line segment 3203, the first link 3201 and the second straight line segment 3204 participate in the pose control of the actuator (for example, the second link 3202, the first straight line segment 3203 and the first link 3201 have entered the workspace entirely, and the second straight line segment 3204 has entered it partially). The joint parameters of the operating arm 300 in this state are as shown in the following equation (20):

ψ_c4 = [φ  L_s  θ_1  δ_1  θ_2  δ_2]^T    (20)

wherein ψ_c4 is the joint parameter vector of the operating arm 300 in the fourth working state, and L_s is the feed amount of the second straight line segment 3204.
In some embodiments, similarly to the single segment, the Cartesian space velocity of the distal end of the deformable arm may be determined based on equations (3) and (4), the parameter space velocity of the deformable arm based on equation (5), and the target joint parameters of the deformable arm based on equation (6), wherein the specific parameters contained in the parameter space velocity of equation (5) and in the target joint parameters of equation (6) are those given by equation (17), (18), (19) or (20) for the current working state. The driving amount of each structural bone of each segment may then be determined based on equation (15), and the driving signal of the driving unit (e.g., a motor) based on the driving amounts.
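By way of illustration, the state-dependent control flow can be strung together as one motion-control cycle. The sketch below is a generic resolved-rate reading of that flow, not the patent's exact equations (3) through (6), which are not reproduced here; the pseudo-inverse step and all names are assumptions:

```python
import numpy as np

def control_step(psi, pose_err_twist, jacobian, dt):
    """One motion-control cycle of the deformable arm (generic resolved-rate sketch).

    psi:            current joint parameters, e.g. one of eqs. (17)-(20)
    pose_err_twist: 6-vector [v; omega] built from the pose difference (cf. eqs. (3)-(4))
    jacobian:       6 x len(psi) velocity Jacobian of the current working state
    """
    psi_dot = np.linalg.pinv(jacobian) @ pose_err_twist  # parameter space velocity, cf. eq. (5)
    return psi + psi_dot * dt                            # target joint parameters, cf. eq. (6)

# Example with the fourth working state (6 joint parameters)
psi_c4 = np.zeros(6)
psi_next = control_step(psi_c4, np.array([1e-3, 0, 0, 0, 0, 0]), np.eye(6), dt=0.01)
# psi_next is then mapped to drive amounts with eq. (15) and to motor drive signals.
```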
Fig. 9 illustrates a flowchart of a method 900 for determining the pose of the manipulator arm coordinate system relative to a reference coordinate system, according to some embodiments of the present disclosure. For the manipulator arm 300 shown in fig. 3, the manipulator arm coordinate system may be a coordinate system of the manipulator arm tip. As shown in fig. 9, some or all of the steps of the method 900 may be performed by a control device (e.g., the control device 120 shown in fig. 1). Some or all of the steps of method 900 may be implemented by software, firmware, and/or hardware. In some embodiments, method 900 may be performed by a robotic system (e.g., surgical robotic system 1800 shown in fig. 18). In some embodiments, method 900 may be implemented as computer readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 9, in step 901, the roll angle of the pose identification coordinate system relative to the operating arm coordinate system is determined based on the angle identifier and the plurality of pose identifiers. In some embodiments, a first axial angle of the angle identifier in the operating arm coordinate system is determined, a second axial angle of the first pose identifier in the pose identification coordinate system is determined, and the roll angle of the pose identification coordinate system relative to the operating arm coordinate system is determined based on the first axial angle and the second axial angle. In some embodiments, this roll angle may be determined based on equation (2).
At step 903, based on the plurality of pose identifications, a pose of the pose identification coordinate system with respect to the reference coordinate system is determined. The coordinates of the pose identification in the corresponding coordinate system can be represented by the coordinates of the pose identification pattern corner points in the corresponding coordinate system. For example, the two-dimensional coordinates of the pose markers in the positioning image and the three-dimensional coordinates in the pose marker coordinate system may be represented by coordinates of pose marker pattern corner points. In some embodiments, the pose of the pose identification coordinate system with respect to the reference coordinate system is determined based on the two-dimensional coordinates of the pose identification pattern corner points in the positioning image and the three-dimensional coordinates of the pose identification pattern corner points in the pose identification coordinate system. In some embodiments, the pose of the pose identification coordinate system with respect to the reference coordinate system is determined based on the two-dimensional coordinates of the pose identification pattern corner points in the positioning image, the three-dimensional coordinates of the pose identification pattern corner points in the pose identification coordinate system, and the transformation relationship of the camera coordinate system with respect to the reference coordinate system.
In some embodiments, the three-dimensional coordinates of the pose identification pattern corner points in the pose identification coordinate system are determined based on the distribution of the pose identifiers. For example, referring to fig. 11, each pose identification pattern corner point is located on the circumference of a cross-sectional circle 1122 whose center and radius r are known. Setting the center of the cross-sectional circle 1122 as the origin of the pose identification coordinate system, with the XY plane lying on the cross-sectional circle 1122, the X axis may be chosen to point from the origin towards any determined pose identification pattern corner point (e.g., corner point P_11), and the three-dimensional coordinates of each pose identification pattern corner point in the pose identification coordinate system can then be determined from the distribution of the plurality of pose identifiers. For example, as shown in fig. 11, with the three-dimensional coordinates of corner point P_11 in the pose identification coordinate system being (r, 0, 0), the three-dimensional coordinates of the other pose identification pattern corner points in the pose identification coordinate system can be calculated according to the following formula:
C_m = [r·cos((m−1)·χ)  r·sin((m−1)·χ)  0]^T    (21)
wherein C_m is the three-dimensional coordinate, in the pose identification coordinate system, of the m-th pose identification pattern corner point counted from corner point P_11, and χ is the axial included angle between adjacent pose identification pattern corner points.
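As an illustration, equation (21) simply enumerates the corner points around the cross-sectional circle; a minimal numpy sketch with illustrative values:

```python
import numpy as np

def corner_coordinates(r, chi, n):
    """3-D coordinates of n pose identification pattern corner points, per equation (21).

    r:   radius of the cross-sectional circle
    chi: axial included angle between adjacent corner points (rad)
    """
    m = np.arange(n)                  # m = 0 is corner point P11 at (r, 0, 0)
    return np.stack([r * np.cos(m * chi),
                     r * np.sin(m * chi),
                     np.zeros(n)], axis=1)

corners_3d = corner_coordinates(r=4.0, chi=np.deg2rad(45.0), n=8)
```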
In some embodiments, the transformation relationship of the camera coordinate system relative to the reference coordinate system may be known. For example, the reference coordinate system is a world coordinate system, and the transformation relationship of the camera coordinate system relative to the world coordinate system can be determined according to the pose of the camera. In other embodiments, the reference coordinate system may be the camera coordinate system itself according to actual requirements.
In some embodiments, based on the camera imaging principle and a projection model, the pose of the pose identification coordinate system with respect to the camera coordinate system is determined from the two-dimensional coordinates of the pose identification pattern corner points in the positioning image and the three-dimensional coordinates of the pose identification pattern corner points in the pose identification coordinate system. Based on the pose of the pose identification coordinate system relative to the camera coordinate system and the transformation relationship of the camera coordinate system relative to the reference coordinate system, the pose of the pose identification coordinate system relative to the reference coordinate system can be obtained. In some embodiments, the intrinsic parameters of the camera may also be taken into account; for example, they may be the intrinsic parameters of the image capture device 110 shown in fig. 1. The intrinsic parameters of the camera may be known or calibrated in advance.
In some embodiments, the camera coordinate system may be understood as a coordinate system established with the camera origin. For example, a coordinate system established with the optical center of the camera as the origin or a coordinate system established with the lens center of the camera as the origin. When the camera is a binocular camera, the origin of the camera coordinate system may be the center of the left lens of the camera, or the center of the right lens, or any point on the line connecting the centers of the left and right lenses (e.g., the midpoint of the line).
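This perspective-n-point step is commonly solved with an off-the-shelf routine such as OpenCV's solvePnP; the disclosure does not name a library, so the following sketch, with an assumed intrinsic matrix and synthetic data, is purely illustrative:

```python
import cv2
import numpy as np

# Four corner points on a cross-sectional circle of radius 4 (cf. equation (21))
chi = np.deg2rad(90.0)
m = np.arange(4)
corners_3d = np.stack([4.0 * np.cos(m * chi), 4.0 * np.sin(m * chi), np.zeros(4)], axis=1)

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])       # assumed camera intrinsic matrix
dist = np.zeros(5)                     # assumed zero lens distortion

# Synthetic ground-truth marker pose in the camera frame, used to render 2-D points
rvec_gt, tvec_gt = np.array([0.1, 0.2, 0.3]), np.array([0.0, 0.0, 100.0])
corners_2d, _ = cv2.projectPoints(corners_3d, rvec_gt, tvec_gt, K, dist)

ok, rvec, tvec = cv2.solvePnP(corners_3d, corners_2d, K, dist)
R_lens_wm0, _ = cv2.Rodrigues(rvec)    # pose identification frame in the camera frame
```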
Referring to fig. 9, in step 905, the pose of the manipulator coordinate system relative to the reference coordinate system is determined based on the roll angle of the pose identification coordinate system relative to the manipulator coordinate system and the pose of the pose identification coordinate system relative to the reference coordinate system. In some embodiments, the pose of the manipulator arm coordinate system relative to the reference coordinate system may be taken as the current relative pose of the manipulator arm relative to the reference coordinate system.
For example, taking the reference coordinate system as a world coordinate system as an example, the pose of the manipulator coordinate system relative to the world coordinate system is as follows:
^w R_wm = ^w R_wm0 · Rot_z(α_0),    ^w P_wm = ^w P_wm0    (22)

wherein ^w R_wm is the attitude of the operating arm coordinate system relative to the world coordinate system, ^w P_wm is the position of the operating arm coordinate system relative to the world coordinate system, ^w R_wm0 is the attitude of the pose identification coordinate system relative to the world coordinate system, ^w P_wm0 is the position of the pose identification coordinate system relative to the world coordinate system, and Rot_z(α_0) denotes the rotation by the roll angle α_0 about the Z axis of the operating arm coordinate system.
In some embodiments, the specific calculation formula of the pose of the manipulator coordinate system relative to the world coordinate system is as follows:
^w R_wm = ^w R_lens · ^lens R_wm0 · ^wm0 R_wm
^w P_wm = ^w R_lens · ( ^lens R_wm0 · ^wm0 P_wm + ^lens P_wm0 ) + ^w P_lens    (23)

wherein ^w R_lens is the attitude of the camera coordinate system relative to the world coordinate system, ^w P_lens is the position of the camera coordinate system relative to the world coordinate system, ^lens R_wm0 is the attitude of the pose identification coordinate system relative to the camera coordinate system, ^lens P_wm0 is the position of the pose identification coordinate system relative to the camera coordinate system, ^wm0 R_wm is the attitude of the operating arm coordinate system relative to the pose identification coordinate system, and ^wm0 P_wm is the position of the operating arm coordinate system relative to the pose identification coordinate system.
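Putting equations (22) and (23) together is a short chain of rotations and translations. A minimal numpy sketch, assuming a calibrated camera pose in the world frame, the PnP result from above, and coincident origins of the marker and arm coordinate systems (all values are placeholders):

```python
import numpy as np

def rot_z(alpha):
    """Rotation by angle alpha about the Z axis."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_w_lens, P_w_lens = np.eye(3), np.array([0.0, 0.0, 0.5])      # camera in {w}
R_lens_wm0, P_lens_wm0 = np.eye(3), np.array([0.0, 0.0, 0.1])  # marker in camera frame
alpha0 = np.deg2rad(15.0)        # roll of the marker frame relative to the arm frame

R_wm0_wm = rot_z(alpha0)         # cf. equation (22): frames differ by a roll about Z
P_wm0_wm = np.zeros(3)           # origins assumed coincident

# Equation (23), expanded form
R_w_wm = R_w_lens @ R_lens_wm0 @ R_wm0_wm
P_w_wm = R_w_lens @ (R_lens_wm0 @ P_wm0_wm + P_lens_wm0) + P_w_lens
```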
Fig. 10 illustrates a flowchart of a method 1000 for determining the pose of the manipulator arm coordinate system relative to a reference coordinate system according to further embodiments of the present disclosure. Method 1000 may be an alternative embodiment of method 900 of fig. 9. As shown in fig. 10, some or all of the steps of the method 1000 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1820 shown in fig. 18). Some or all of the steps of method 1000 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1000 may be performed by a robotic system (e.g., surgical robotic system 1800 shown in fig. 18). In some embodiments, method 1000 may be implemented as computer readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 10, at step 1001, three-dimensional coordinates of a plurality of pose identifications in a manipulator coordinate system are determined based on a roll angle of the pose identification coordinate system with respect to the manipulator coordinate system and the three-dimensional coordinates of the plurality of pose identifications in the pose identification coordinate system. It can be understood that, given the roll angle of the pose identification coordinate system relative to the manipulator coordinate system, the three-dimensional coordinates of a plurality of pose identification pattern corner points in the pose identification coordinate system can be transformed into the three-dimensional coordinates in the manipulator coordinate system according to coordinate transformation.
In step 1003, the pose of the manipulator coordinate system with respect to the reference coordinate system is determined based on the two-dimensional coordinates of the plurality of pose identifications in the positioning image and the three-dimensional coordinates of the plurality of pose identifications in the manipulator coordinate system. In some embodiments, step 1003 may be implemented similarly to steps 903 and 905 of method 900.
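For illustration, folding the roll compensation into the three-dimensional points before a single PnP solve might look as follows; the rotation helper and all names are assumptions:

```python
import numpy as np

def rot_z(alpha):
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Step 1001: rotate marker-frame corner points into the manipulator arm frame.
# With R_wm0_wm = rot_z(alpha0), a point transforms as p_wm = rot_z(alpha0).T @ p_wm0,
# which in row-vector form is p_wm0 @ rot_z(alpha0).
alpha0 = np.deg2rad(15.0)
corners_wm0 = np.array([[4.0, 0.0, 0.0], [0.0, 4.0, 0.0],
                        [-4.0, 0.0, 0.0], [0.0, -4.0, 0.0]])
corners_wm = corners_wm0 @ rot_z(alpha0)

# Step 1003: feed corners_wm and the 2-D detections to solvePnP (as in the sketch
# above) to obtain the manipulator arm coordinate system directly in the camera frame.
```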
Fig. 12 illustrates a flowchart of a method 1200 for identifying pose identifiers, according to some embodiments of the present disclosure. As shown in fig. 12, some or all of the steps of the method 1200 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1820 shown in fig. 18). Some or all of the steps in method 1200 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1200 may be performed by a robotic system (e.g., surgical robotic system 1800 shown in fig. 18). In some embodiments, the method 1200 may be implemented as computer readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 12, at step 1201, a plurality of candidate pose identifiers are determined from the positioning image. In some embodiments, candidate pose identifiers may be represented by candidate pose identification pattern corner points. In some embodiments, the candidate pose identification pattern corner points may refer to possible pose identification pattern corner points obtained through preliminary processing or preliminary identification of the positioning image. In some embodiments, an ROI (Region of Interest) may first be cut out of the positioning image, and the plurality of candidate pose identifiers determined from the ROI. The ROI may be the full positioning image or a partial region of it. For example, the ROI of the current frame may be cut out based on a region within a certain range of the pose identification pattern corner points determined from the previous frame of image (e.g., the positioning image of the previous image processing cycle). For a positioning image that is not the first frame, the ROI may be a region within a certain distance range centered on an imaginary point formed by the coordinates of the plurality of pose identification pattern corner points of the previous image processing cycle. The certain distance range may be a fixed multiple, for example twice, of the average spacing distance of the pose identification pattern corner points. It should be understood that the multiple may also be a variable multiple of the average spacing distance of the candidate pose identification pattern corner points in the previous image processing cycle.
In some embodiments, the method 1200 may include: determining the corner likelihood value (CL) of each pixel point in the positioning image. In some embodiments, the corner likelihood value of a pixel point may be a numerical value characterizing the likelihood of that pixel point being a feature point (e.g., a corner point). In some embodiments, the positioning image may be preprocessed before the corner likelihood values are calculated, and the corner likelihood value of each pixel point then determined in the preprocessed image. The preprocessing of the image may include, for example, at least one of image graying, image denoising and image enhancement.
For example, the image preprocessing may include: cutting the ROI out of the positioning image and converting the ROI into a corresponding grayscale image.
In some embodiments, the corner likelihood value of each pixel point in the ROI may be determined by, for example, performing a convolution operation on each pixel point within the ROI to obtain its first and/or second derivatives, and then computing the corner likelihood value of each pixel point from these first and/or second derivatives. Exemplarily, the corner likelihood value of each pixel point may be calculated according to the following formula:
[Equation (24), which defines the corner likelihood value CL of a pixel point, appears in the source only as an image and is not reproduced here.]
wherein τ is a set constant, for example set to 2; I_x, I_45, I_y and I_n45 are respectively the first derivatives of the pixel point in the four directions 0, π/4, π/2 and −π/4; I_xy and I_45_45 are respectively the second derivatives of the pixel point in the direction pairs (0, π/2) and (π/4, −π/4).
In some embodiments, the ROI is divided into a plurality of sub-images. For example, a non-maximum suppression method may be used to divide the ROI equally into a plurality of sub-images. In some embodiments, the ROI may be divided equally into sub-images of 5 × 5 pixels. The above embodiments are exemplary and not limiting; it should be understood that the positioning image or the ROI may also be divided into sub-images of other sizes, for example 9 × 9 pixels. The pixel point with the largest CL value in each sub-image can then be determined, compared with a first threshold, and the set of pixel points whose CL values are greater than the first threshold determined. In some embodiments, the first threshold may be set to 0.06; it should be understood that the first threshold may also be set to other values. In some embodiments, the pixel points whose CL values are greater than the first threshold may be taken as candidate pose identification pattern corner points.
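A minimal sketch of this candidate screening, assuming a corner likelihood map cl has already been computed; the 5 × 5 block and the 0.06 threshold follow the text, everything else is illustrative:

```python
import numpy as np

def candidate_corners(cl, block=5, threshold=0.06):
    """Keep the best pixel of each block if its CL value exceeds the threshold."""
    h, w = cl.shape
    candidates = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            sub = cl[y:y + block, x:x + block]
            dy, dx = np.unravel_index(np.argmax(sub), sub.shape)
            if sub[dy, dx] > threshold:
                candidates.append((y + dy, x + dx, sub[dy, dx]))
    return candidates   # (row, col, CL) of candidate pose identification corner points

cands = candidate_corners(np.random.rand(120, 160) * 0.1)
```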
Referring to fig. 12, at step 1203, an initial pose identifier is identified from the plurality of candidate pose identifiers based on a pose pattern matching template. In some embodiments, by matching the pose pattern matching template with the image at each candidate pose identification pattern corner point, the candidate corner point that reaches a preset pose pattern matching degree standard is determined to be the initial pose identification pattern corner point.
In some embodiments, the pose pattern matching template has the same or similar features as the image of the area near the pose identification pattern corner. If the matching degree of the pose pattern matching template and the image of the area near the candidate pose identification pattern corner reaches a preset pose pattern matching degree standard (for example, the matching degree is higher than a threshold), the pattern of the area near the candidate pose identification pattern corner and the pose pattern matching template can be considered to have the same or similar characteristics, and then the current candidate pose identification pattern corner can be considered as the pose identification pattern corner.
In some embodiments, the pixel point with the maximum CL value in the pixel point set is determined to serve as the candidate pose identification pattern corner point to be matched. For example, all the pixel points in the pixel point set may be sorted in the order of the CL values from large to small, and the pixel point with the largest CL value may be used as the candidate pose identification pattern corner point to be matched. After the candidate pose identification pattern corner points to be matched are determined, matching is carried out by using a pose pattern matching template and patterns at the candidate pose identification pattern corner points to be matched, and if the preset pose pattern matching degree standard is reached, the candidate pose identification pattern corner points to be matched are determined to be the identified initial pose identification pattern corner points. And if the candidate pose identification pattern angular point to be matched does not reach the preset matching degree standard, selecting pixel points with the secondary CL value (pixel points with the second largest CL value) as the candidate pose identification pattern angular point to be matched, matching the candidate pose identification pattern angular point with the image at the candidate pose identification pattern angular point by using a pose pattern matching template, and repeating the steps until an initial pose identification pattern angular point is identified.
In some embodiments, the pose identification pattern may be a checkerboard pattern of alternating black and white, so the pose pattern matching template may be the same checkerboard pattern. The matching degree may be measured with the correlation coefficient (CC) between the gray-scale distribution G_M of the pose pattern matching template and the pixel-neighborhood gray-scale distribution G_image of the pixel point corresponding to the candidate pose identification pattern corner point. The pixel-neighborhood gray-scale distribution G_image of a pixel point is the gray-scale distribution of the pixels within a certain range (for example, 10 × 10 pixels) centered on that pixel point. The specific formula is as follows:
CC = Cov(G_M, G_image) / √( Var(G_M) · Var(G_image) )    (25)
where Var is the variance function and Cov is the covariance function. In some embodiments, when the CC value is less than 0.8, the correlation between the gray-scale distribution in the pixel neighborhood and the pose pattern matching template is low, and the candidate pose identification pattern corner point with the largest corner likelihood value is not regarded as a pose identification pattern corner point; otherwise, the candidate pose identification pattern corner point with the largest corner likelihood value is regarded as a pose identification pattern corner point.
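Equation (25) is the ordinary normalized correlation coefficient between the two gray-scale distributions; a minimal sketch with illustrative names:

```python
import numpy as np

def correlation_coefficient(template, neighborhood):
    """Equation (25): CC between template and pixel-neighborhood gray distributions."""
    g_m = template.astype(float).ravel()
    g_img = neighborhood.astype(float).ravel()
    cov = np.mean((g_m - g_m.mean()) * (g_img - g_img.mean()))
    return cov / np.sqrt(g_m.var() * g_img.var())

# Accept the candidate corner point only if CC reaches the matching standard (e.g., 0.8)
is_corner = correlation_coefficient(np.eye(10), np.eye(10)) >= 0.8
```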
In some embodiments, the method 1200 includes: determining the edge direction of a candidate pose identification pattern corner point. For example, fig. 13 includes a pose identification pattern 1311; if the candidate pose identification pattern corner point is the corner point P_13 in fig. 13, the edge direction of corner point P_13 refers to the directions of the edges that form corner point P_13, as indicated by the dashed arrows in fig. 13.
In some embodiments, the edge direction may be determined by calculating the first derivative values I_x and I_y, in the X and Y directions of the planar coordinate system, of each pixel in a neighborhood of a certain range (e.g., 10 × 10 pixels) centered on the candidate pose identification pattern corner point. For example, the edge direction may be calculated by the following formula:
I_angle = arctan2(I_y, I_x),    I_weight = √(I_x² + I_y²)    (26)
wherein the first derivatives I_x and I_y can be obtained by performing a convolution operation on each pixel point in the neighborhood. In some embodiments, the edge direction of the corner point is obtained by clustering the edge directions I_angle of the pixels in the neighborhood with their corresponding weights I_weight, and selecting the I_angle corresponding to the class with the largest weight as the edge direction. It should be noted that if there are a plurality of edge directions, the I_angle values corresponding to the several classes with the largest weight proportions are selected as the edge directions.
In some embodiments, the method used for the clustering calculation may be any one of the K-means method, the BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies) method, the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) method, and the GMM (Gaussian Mixture Model) method.
In some embodiments, the method 1200 includes: rotating the pose pattern matching template according to the edge direction, so as to align the pose pattern matching template with the image at the candidate pose identification pattern corner point.
The edge direction of the candidate pose identification pattern corner point can be used for determining the setting direction of the image at the candidate identification pattern corner point in the positioning image. In some embodiments, rotating the pose pattern matching template according to the edge orientation may adjust the pose pattern matching template to be the same or nearly the same as the image orientation at the candidate pose identification pattern corner point for image matching.
Referring to fig. 12, at step 1205, the pose identification is searched for with the initial pose identification as a starting point.
For example, fig. 14 shows a flowchart of a method 1400 for searching for pose identifiers, according to some embodiments of the present disclosure. As shown in fig. 14, some or all of the steps of the method 1400 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1820 shown in fig. 18). Some or all of the steps in method 1400 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1400 may be performed by a robotic system (e.g., surgical robotic system 1800 shown in fig. 18). In some embodiments, the method 1400 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 14, at step 1401, a second pose identifier is determined with the initial pose identifier as the starting point. In some embodiments, with the initial pose identification pattern corner point as the starting point, the second pose identification pattern corner point is searched for in a set search direction. In some embodiments, the set search direction may include at least one of the directions in front of (an angular direction of 0°), behind (180°), above (90°), below (−90°) and oblique to (e.g., ±45°) the initial pose identification pattern corner point.
In some embodiments, there are n set search directions. For example, when searching in 8 directions, each search direction v_sn may be calculated according to the following formula:
v_sn = [cos(n·π/4)  sin(n·π/4)],  (n = 1, 2, …, 8)    (27)
in some embodiments, the search direction set in the current step may be determined according to a deviation angle between adjacent pose marker pattern corner points in the plurality of pose marker pattern corner points determined in the previous frame. Illustratively, the predetermined search direction may be calculated according to the following formula:
[Equation (28), which gives the set search directions v_s1 and v_s2 from the previous-frame corner point coordinates, appears in the source only as an image and is not reproduced here.]
wherein (x_j, y_j) are the two-dimensional coordinates of the pose identification pattern corner points determined in the previous frame (or the previous image processing cycle); n_last is the number of pose identification pattern corner points determined in the previous frame; v_s1 is the first set search direction; and v_s2 is the second set search direction.
In some embodiments, as shown in fig. 15, searching for the coordinate position of the second pose identification pattern corner point P_152 in the set search direction, with the coordinate position of the initial pose identification pattern corner point P_151 as the search starting point, may specifically include: searching for pose identification pattern corner points with a search box (e.g., the dashed box in fig. 15) moved with a certain search step along the set search direction V_151, starting from the coordinate position of the initial pose identification pattern corner point P_151. If at least one candidate pose identification pattern corner point exists in the search box, the candidate pose identification pattern corner point with the largest corner likelihood value in the search box may preferentially be selected as the second pose identification pattern corner point P_152. With the search box limited to a suitable size, when searching for the second pose identification pattern corner point P_152 from the initial corner point P_151, the candidate corner point with the largest corner likelihood value among those appearing in the search box has a high probability of being a true pose identification pattern corner point; it can therefore be taken directly as the second pose identification pattern corner point P_152, which increases the data processing speed. In other embodiments, in order to improve the accuracy of corner point identification, when at least one candidate pose identification pattern corner point exists in the search box, the candidate corner point with the largest corner likelihood value is further verified: the pose pattern matching template is matched against the image within a certain range of that candidate corner point, and the candidate corner point is regarded as the searched second pose identification pattern corner point P_152 only if the preset pose pattern matching degree standard is met.
In some embodiments, with continued reference to FIG. 15, the size of the search box may be increased in steps, such that the search range is increased in steps. The search step size may vary synchronously with the side length of the search box. In other embodiments, the size of the search box may be fixed.
In some embodiments, the pose identification pattern may be a checkerboard pattern of alternating black and white, and the correlation coefficient CC of formula (25) may be used for the pattern matching. If CC is greater than the threshold, the candidate pose identification pattern corner point with the largest corner likelihood value is regarded as a pose identification pattern corner point and recorded as the second pose identification pattern corner point.
Referring to fig. 14, at step 1403, a search direction is determined based on the initial pose identifier and the second pose identifier. In some embodiments, the search direction comprises a first search direction and a second search direction. The first search direction may be the direction starting from the coordinate position of the initial pose identification pattern corner point and pointing away from the second pose identification pattern corner point. The second search direction may be the direction starting from the coordinate position of the second pose identification pattern corner point and pointing away from the initial pose identification pattern corner point, for example the search direction V_152 shown in fig. 15.
At step 1405, pose identifiers are searched for in the search direction, with the initial pose identifier or the second pose identifier as the starting point. In some embodiments, if the initial pose identification pattern corner point is taken as the new starting point, the first search direction of the above embodiments may be used as the search direction. If the second pose identification pattern corner point is taken as the new starting point, the second search direction of the above embodiments may be used as the search direction. In some embodiments, the search for a new pose identification pattern corner point (e.g., the third pose identification pattern corner point P_153 in fig. 15) may be performed similarly to step 1401. In some embodiments, the search step may be the distance L_1 between the initial pose identification pattern corner point and the second pose identification pattern corner point.
In some embodiments, in response to the number of pose identification pattern corner points being greater than or equal to the pose identification pattern corner point number threshold, the search for pose identification pattern corner points is stopped. For example, when four pose identification pattern corner points are searched (identified), the search for the pose identification pattern corner points is stopped.
In some embodiments, the search for the N-th pose identification pattern corner point is stopped in response to the searched distance being greater than a set multiple of the distance between the (N−1)-th and (N−2)-th pose identification pattern corner points, where N is greater than or equal to 3. For example, the end condition may be that the searched distance exceeds twice the distance between the previous two corner points; the maximum search distance for the third pose identification pattern corner point is then twice the distance between the initial and the second pose identification pattern corner points. If no pose identification pattern corner point has been found when this search distance is reached, the third corner point is considered not found and the search ends.
In some embodiments, if the total number of the searched pose identification pattern corner points is greater than or equal to a set threshold (for example, 4), it is considered that enough pose identification pattern corner points have been successfully identified. If the total number of the found pose identification pattern corner points is less than the set value, the search based on the initial pose identification pattern corner point is considered unsuccessful. In that case, a new initial pose identification pattern corner point is re-determined from the candidate pose identification pattern corner points, and the remaining pose identification pattern corner points are then searched for with the re-determined initial corner point as the search starting point. The new initial pose identification pattern corner point may be re-determined similarly to the method 1200, and the remaining corner points searched for similarly to the method 1400.
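Pulling steps 1401 through 1405 and the stopping rules together, a condensed sketch of the corner-chain search; the box-growth policy and all names are illustrative:

```python
import numpy as np

def search_chain(start, cands, direction, max_corners=4, step0=8.0):
    """Grow a chain of pose identification pattern corner points from `start`.

    start:     (x, y) of the initial pose identification pattern corner point
    cands:     list of ((x, y), CL) candidate corner points from step 1201
    direction: unit search direction, e.g. v_s1 or v_s2
    """
    chain, step = [np.asarray(start, float)], step0
    while len(chain) < max_corners:
        center = chain[-1] + step * direction        # advance the search box one step
        inside = [(cl, np.asarray(p, float)) for p, cl in cands
                  if np.all(np.abs(np.asarray(p, float) - center) <= step / 2)]
        if not inside:
            break                                    # nothing within the search distance
        best = max(inside, key=lambda t: t[0])[1]    # largest corner likelihood wins
        step = np.linalg.norm(best - chain[-1])      # next step = last corner spacing
        if step == 0.0:
            break
        direction = (best - chain[-1]) / step
        chain.append(best)
    return chain    # identification succeeds if len(chain) >= threshold (e.g., 4)

corners = search_chain((10, 10), [((18, 10), 0.9), ((26, 10), 0.8)], np.array([1.0, 0.0]))
```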
In some embodiments, after the pose identification pattern corner point is searched or identified, sub-pixel positioning may be performed on the determined pose identification pattern corner point to improve the position accuracy of the pose identification pattern corner point.
In some embodiments, a model-based fit of the CL values of the pixel points may be performed to determine the sub-pixel coordinates of the pose identification pattern corner points. For example, the fitting function of the CL values of the pixel points in the ROI may be a quadratic function whose extreme point is the sub-pixel corner location. The fitting function may be as follows:
S(x, y) = ax² + by² + cx + dy + exy + f    (29)
x_c = (ed − 2bc) / (4ab − e²),    y_c = (ec − 2ad) / (4ab − e²)    (30)
wherein S(x, y) is the fitting function of the CL values of all pixel points in each ROI; a, b, c, d, e and f are its coefficients; x_c is the sub-pixel x coordinate of the pose identification pattern corner point, and y_c is its sub-pixel y coordinate.
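A sketch of this sub-pixel refinement: fit equation (29) to a window of CL values by least squares and return the extreme point per equation (30); the window here is synthetic:

```python
import numpy as np

def subpixel_corner(cl_window):
    """Fit S(x,y) = ax^2 + by^2 + cx + dy + exy + f and return its extreme point."""
    h, w = cl_window.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x, y, s = xs.ravel(), ys.ravel(), cl_window.ravel()
    A = np.stack([x * x, y * y, x, y, x * y, np.ones_like(x)], axis=1).astype(float)
    a, b, c, d, e, f = np.linalg.lstsq(A, s, rcond=None)[0]
    det = 4.0 * a * b - e * e                       # cf. equation (30)
    return (e * d - 2.0 * b * c) / det, (e * c - 2.0 * a * d) / det   # (x_c, y_c)

# Synthetic quadratic CL window peaking at (2.3, 1.7)
yy, xx = np.mgrid[0:5, 0:5]
window = -((xx - 2.3) ** 2 + (yy - 1.7) ** 2)
print(subpixel_corner(window))    # approximately (2.3, 1.7)
```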
Fig. 16 illustrates a flowchart of a method 1600 of identifying an angle identifier, according to some embodiments of the present disclosure. As shown in fig. 16, some or all of the steps of the method 1600 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1820 shown in fig. 18). Some or all of the steps of method 1600 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1600 may be performed by a robotic system (e.g., surgical robotic system 1800 shown in fig. 18). In some embodiments, method 1600 may be implemented as computer readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as the processor 1820 shown in fig. 18. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 16, in step 1601, an imaging transformation relationship is determined based on the two-dimensional coordinates of the plurality of pose identifications in the positioning image and the three-dimensional coordinates of the plurality of pose identifications in the pose identification coordinate system. In some embodiments, the pose identification coordinate system may be the pose identification coordinate system detailed in the illustrated embodiment of method 700. For example, the pose identification coordinate system is as shown in fig. 6. In some embodiments, the imaging transformation relationship may refer to a transformation relationship of three-dimensional coordinates in the pose identification coordinate system and two-dimensional coordinates in the positioning image. It should be understood that based on the imaging transformation relationship, the two-dimensional coordinates in the positioning image may also be transformed into three-dimensional coordinates in the pose identification coordinate system. In some embodiments, the three-dimensional coordinates of the plurality of pose identifications in the pose identification coordinate system may be determined based on equation (21). In some embodiments, the number of the plurality of pose identifications may be greater than or equal to 4, for example, the imaging transformation relationship may be derived based on the two-dimensional coordinates of the 4 pose identifications in the positioning image and the corresponding 4 three-dimensional coordinates in the pose identification coordinate system.
Referring to fig. 16, in step 1603, a plurality of angle identification candidate regions are determined in the positioning image based on the imaging transformation relationship, the three-dimensional coordinates of the plurality of pose identifiers in the pose identification coordinate system, and the position association relationship. In some embodiments, an angle identification candidate region may represent a candidate region of the angle identification pattern. In some embodiments, a plurality of candidate three-dimensional coordinates of the angle identification pattern corner points are determined in the pose identification coordinate system based on the three-dimensional coordinates of the plurality of pose identification pattern corner points in the pose identification coordinate system and the position association relationship. For example, starting from the three-dimensional coordinates of a pose identification pattern corner point in the pose identification coordinate system, shifting it by a certain distance along the axial direction yields a candidate three-dimensional coordinate of an angle identification pattern corner point. For example, referring to fig. 4, the position association relationship is such that an angle identifier and the corresponding pose identifier are spaced a certain distance along the Z axis of the pose identification coordinate system. With the position of a pose identification pattern corner point determined, the positions obtained by moving that distance along the positive or negative direction of the Z axis can be regarded as candidate positions of the angle identification pattern corner point in the pose identification coordinate system.
In some embodiments, the plurality of angle identification candidate regions are determined in the positioning image based on the imaging transformation relationship and the plurality of candidate three-dimensional coordinates of the angle identification pattern corner points. For example, a plurality of candidate two-dimensional coordinates of the angle identification pattern corner points are obtained in the positioning image based on the imaging transformation relationship and the candidate three-dimensional coordinates. In some embodiments, the plurality of angle identification candidate regions are determined based on the plurality of candidate two-dimensional coordinates; for example, a region of a certain size (e.g., 5 × 5 pixels, 10 × 10 pixels, etc.) centered on each candidate two-dimensional coordinate is determined as an angle identification candidate region in the positioning image. In some embodiments, the region size is greater than or equal to the size of the imaged angle identification pattern, which can be obtained from the actual size of the angle identification pattern and the imaging transformation relationship.
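For illustration, projecting the candidate three-dimensional corner positions into the positioning image can reuse the PnP result as the imaging transformation; the sketch below uses OpenCV's projectPoints, and the axial offset and region size are assumptions:

```python
import cv2
import numpy as np

# Candidate 3-D positions: pose identification corner points shifted along +/- Z
corners_3d = np.array([[4.0, 0.0, 0.0], [0.0, 4.0, 0.0]])
offset = np.array([0.0, 0.0, 3.0])                # assumed axial offset of the angle marks
angle_cands_3d = np.vstack([corners_3d + offset, corners_3d - offset])

K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
rvec, tvec = np.zeros(3), np.array([0.0, 0.0, 100.0])   # assumed PnP result

pts_2d, _ = cv2.projectPoints(angle_cands_3d, rvec, tvec, K, np.zeros(5))
# Each projected point becomes the center of a 10 x 10 angle identification candidate region
regions = [(int(u) - 5, int(v) - 5, 10, 10) for (u, v) in pts_2d.reshape(-1, 2)]
```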
Referring to fig. 16, in step 1605, the angle identifier is identified from the plurality of angle identification candidate regions. In some embodiments, the angle identifier comprises an angle identification pattern and angle identification pattern corner points. In some embodiments, the method 1600 may include determining the pixel point with the largest corner likelihood value in each angle identification candidate region, these pixel points forming a pixel set. In some embodiments, the corner likelihood values of the pixel points may have been calculated during execution of the method 1200, or may be recalculated based on formula (24). The method 1600 further includes determining the angle identification candidate region corresponding to the pixel point with the largest corner likelihood value in the pixel set as the angle identification candidate region to be identified. The method 1600 further includes matching a plurality of angle pattern matching templates respectively against the angle identification candidate region to be identified, so as to identify the angle identifier.
In some embodiments, the angular marking pattern is a pattern with different graphical features. The plurality of angle pattern matching templates may refer to standard angle pattern templates having the same or similar graphic features corresponding to the plurality of angle identification patterns, respectively.
In some embodiments, by determining the plurality of angle identification candidate regions, the angle identification can be identified in the plurality of angle identification candidate regions, so that the identification of the angle identification in the whole image range is avoided, and the data processing speed is increased.
In some embodiments, the matching operation of the angle pattern matching template and the angle identification candidate region may be performed using any one of a square error matching method, a normalized square error matching method, a correlation matching method, a normalized correlation matching method, a correlation coefficient matching method, and a normalized correlation coefficient matching method.
In some embodiments, since an angle pattern matching template has the same or similar graphic features as the corresponding angle identification pattern, the pattern information of the angle identifier may include the pattern information of the corresponding angle pattern matching template, for example the shape of the template, the image features that can be identified, and so on. In some embodiments, each angle pattern matching template corresponds one-to-one with the axial angle identified by the corresponding angle identification pattern. The first axial angle is determined based on the specific angle pattern matching template, or the pattern information of the angle identification pattern, corresponding to the identified angle identifier.
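As an illustration, the template-to-region matching and the template-to-angle lookup could be realized with OpenCV's matchTemplate using one of the normalized methods listed above; the templates and threshold here are stand-ins, not the actual angle identification patterns:

```python
import cv2
import numpy as np

# Assumed: each angle pattern matching template corresponds to one axial angle
templates = {0.0: np.eye(10, dtype=np.float32),
             90.0: np.rot90(np.eye(10)).astype(np.float32)}

def identify_angle(region, threshold=0.8):
    """Return the axial angle whose template best matches the candidate region."""
    best_angle, best_score = None, threshold
    for angle, tmpl in templates.items():
        score = cv2.matchTemplate(region, tmpl, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_angle, best_score = angle, score
    return best_angle   # None means matching failed; try the next candidate region

region = np.eye(10, dtype=np.float32)   # stand-in for an angle identification candidate
print(identify_angle(region))           # 0.0
```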
In some embodiments, the method 1600 may include, in response to a failure in matching, determining an angle identification candidate region corresponding to a pixel point with a maximum corner likelihood value among remaining pixel points of the pixel set as the angle identification candidate region to be identified. In some embodiments, after determining a new angle identification candidate region to be identified, a plurality of angle pattern matching templates are used to respectively match with the angle identification candidate regions to be identified to identify the angle identification.
In some embodiments, a first pose identification having a position association relationship with the angle identification is determined based on the angle identification candidate region in which the identified angle identification is located. In some embodiments, the plurality of angle identifier candidate regions respectively correspond to at least one of the plurality of identified pose identifier pattern corner points, and after determining the angle identifier candidate region in which the identified angle identifier is located, the first pose identifier pattern corner point may be determined based on the correspondence between the plurality of angle identifier candidate regions and the plurality of pose identifier pattern corner points.
In some embodiments of the present disclosure, the present disclosure also provides a computer device comprising a memory and a processor. The memory may be configured to store at least one instruction, and the processor is coupled to the memory and configured to execute the at least one instruction to perform some or all of the steps of the methods of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 7, 8, 9, 10, 12, 14, and 16.
Fig. 17 illustrates a schematic block diagram of a computer device 1700 in accordance with some embodiments of the present disclosure. Referring to fig. 17, the computer device 1700 may include a Central Processing Unit (CPU) 1701, a system memory 1704 including a Random Access Memory (RAM) 1702 and a Read Only Memory (ROM) 1703, and a system bus 1705 connecting the various components. The computer device 1700 may also include an input/output system and a mass storage device 1707 for storing an operating system 1713, application programs 1714 and other program modules 1715. The input/output system includes a display 1708 and an input device 1709, which are connected to the system bus 1705 mainly through an input/output controller 1710.
The mass storage device 1707 is connected to the central processing unit 1701 through a mass storage controller (not shown) connected to the system bus 1705. The mass storage device 1707 or computer-readable medium provides non-volatile storage for the computer device. The mass storage device 1707 may include a computer-readable medium (not shown) such as a hard disk or Compact disk Read-Only Memory (CD-ROM) drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, flash memory or other solid state storage technology, CD-ROM, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory and mass storage devices described above may be collectively referred to as memory.
The computer device 1700 may connect to the network 1712 through a network interface unit 1711 connected to the system bus 1705.
The system memory 1704 or mass storage device 1707 may also be used to store one or more instructions. The central processor 1701 implements all or part of the steps of the method in some embodiments of the present disclosure by executing the one or more instructions.
In some embodiments of the present disclosure, the present disclosure also provides a computer-readable storage medium having stored therein at least one instruction, which is executable by a processor to cause a computer to perform some or all of the steps of the methods of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 7, 8, 9, 10, 12, 14, and 16. Examples of computer readable storage media include memories of computer programs (instructions), such as Read-Only memories (ROMs), random Access Memories (RAMs), compact Disc Read-Only memories (CD-ROMs), magnetic tapes, floppy disks, and optical data storage devices, among others.
Fig. 18 shows a schematic view of a surgical robotic system 1800, according to some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 18, the surgical robotic system 1800 may include: a surgical instrument 1850, an image collector 1810 and a processor 1820. The surgical instrument 1850 may include a manipulation arm 1840 and a tip 1830 disposed at the distal end of the manipulation arm 1840; the tip 1830 may include at least one angle identifier, a plurality of pose identifiers and an actuator. The image collector 1810 may be used to collect positioning images of the manipulation arm 1840. The processor 1820 is connected to the image collector 1810 and configured to perform some or all of the steps of the methods of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 7, 8, 9, 10, 12, 14 and 16.
While particular embodiments of the present disclosure have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the disclosure. Accordingly, it is intended to cover in the appended claims all such changes and modifications that are within the scope of this disclosure.

Claims (21)

1. A control method of an operation arm, comprising:
acquiring a positioning image;
identifying a plurality of pose identifications located on the operating arm in the positioning image;
identifying an angle marker located on the manipulator arm based on the plurality of pose markers, the angle marker having a position association relationship with a first pose marker of the plurality of pose markers;
determining a current relative pose of the manipulator relative to a reference coordinate system based on the angle identifier and the plurality of pose identifiers; and
determining a drive signal of the operation arm based on the current relative pose and the target pose of the operation arm.
2. The control method according to claim 1, comprising:
determining a pose difference based on the current relative pose and a target pose of the manipulator arm; and
determining a drive signal of the manipulator arm based on the pose difference and an inverse kinematics model of the manipulator arm.
3. The control method according to claim 2, the control method comprising:
determining a current pose of the operating arm in a world coordinate system based on the current relative pose; and
determining the pose difference based on a target pose of the operating arm and a current pose of the operating arm in a world coordinate system, wherein the target pose of the operating arm is the target pose of the operating arm in the world coordinate system.
4. The control method according to claim 3, comprising:
determining a Cartesian space velocity based on the pose difference;
determining a parameter space velocity based on the cartesian space velocity;
determining target joint parameters based on the parameter space velocity and the current joint parameters; and
determining the drive signal based on the target joint parameter.
5. The control method according to claim 4, the pose differences including position differences and attitude differences, the Cartesian space velocities including Cartesian space linear velocities and Cartesian space angular velocities, the control method further comprising:
determining the linear cartesian space velocity based on the position difference; and
determining the Cartesian spatial angular velocity based on the attitude difference.
6. The control method according to claim 4, the operation arm comprising:
at least one structural segment and a driving unit, wherein the structural segment comprises a fixed disk and a plurality of structural bones, first ends of the structural bones being fixedly connected with the fixed disk, and second ends of the structural bones being connected with the driving unit;
the control method further comprises the following steps:
determining a driving amount of the plurality of structural bones based on the target joint parameter; and
determining a drive signal for the drive unit based on the drive amounts of the plurality of structural bones.
7. The control method according to any one of claims 1 to 6, further comprising:
receiving a control command; and
and determining the target pose of the operating arm based on the control command.
8. The control method according to any one of claims 1 to 6, further comprising:
determining the drive signal of the operating arm at a predetermined period, so as to achieve real-time control over a plurality of motion control cycles.
9. The control method according to claim 1, comprising:
determining a roll angle of a pose identifier coordinate system relative to an operating arm coordinate system based on the angle identifier and the plurality of pose identifiers;
determining a pose of the pose identifier coordinate system relative to the reference coordinate system based on the plurality of pose identifiers; and
determining a pose of the operating arm coordinate system relative to the reference coordinate system based on the roll angle of the pose identifier coordinate system relative to the operating arm coordinate system and the pose of the pose identifier coordinate system relative to the reference coordinate system.
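In homogeneous-transform terms, claim 9 composes the camera-measured pose of the pose identifier frame with a pure roll correction to recover the arm frame. A sketch assuming the roll is about the local z-axis; the actual axis and sign convention depend on how the frames are defined on the arm:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def arm_pose_in_reference(T_ref_tag, roll):
    """T_ref_arm = T_ref_tag @ T_tag_arm, where the pose identifier (tag)
    frame and the operating arm frame differ only by a roll about the
    shared axis (assumed z here)."""
    T_tag_arm = np.eye(4)
    T_tag_arm[:3, :3] = rot_z(-roll)   # assumed sign/axis convention
    return T_ref_tag @ T_tag_arm
```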
10. The control method according to claim 9, comprising:
determining the pose of the pose identifier coordinate system relative to the reference coordinate system based on two-dimensional coordinates of the plurality of pose identifiers in the positioning image and three-dimensional coordinates of the plurality of pose identifiers in the pose identifier coordinate system.
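Claim 10 is the classical Perspective-n-Point (PnP) setup: known three-dimensional marker coordinates plus their two-dimensional detections determine the marker frame's pose relative to the camera. A sketch using OpenCV's generic solver, assuming a calibrated camera with intrinsics K:

```python
import numpy as np
import cv2

def tag_pose_from_markers(pts_3d, pts_2d, K, dist=None):
    """Pose of the pose identifier frame w.r.t. the camera via PnP.

    pts_3d: (N,3) marker coordinates in the pose identifier coordinate system
    pts_2d: (N,2) the same markers detected in the positioning image
    K:      3x3 camera intrinsic matrix
    """
    ok, rvec, tvec = cv2.solvePnP(
        pts_3d.astype(np.float64), pts_2d.astype(np.float64),
        K, dist if dist is not None else np.zeros(5))
    R, _ = cv2.Rodrigues(rvec)         # rotation of the tag frame in the camera frame
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return ok, T
```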
11. The control method according to claim 1, comprising:
determining a roll angle of a pose identifier coordinate system relative to an operating arm coordinate system based on the angle identifier and the plurality of pose identifiers;
determining three-dimensional coordinates of the plurality of pose identifiers in the operating arm coordinate system based on the roll angle of the pose identifier coordinate system relative to the operating arm coordinate system and three-dimensional coordinates of the plurality of pose identifiers in the pose identifier coordinate system; and
determining a pose of the operating arm coordinate system relative to the reference coordinate system based on two-dimensional coordinates of the plurality of pose identifiers in the positioning image and the three-dimensional coordinates of the plurality of pose identifiers in the operating arm coordinate system.
12. The control method according to any one of claims 9-11, comprising:
determining a first axial angle of the angle identifier in the operating arm coordinate system;
determining a second axial angle of the first pose identifier in the pose identifier coordinate system; and
determining the roll angle of the pose identifier coordinate system relative to the operating arm coordinate system based on the first axial angle and the second axial angle.
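Claim 12 reduces the roll angle to a difference of two axial (circumferential) angles measured around the arm's axis. A one-line sketch with angle wrapping; which angle is subtracted from which is a convention the claim leaves open:

```python
import numpy as np

def roll_angle(first_axial_angle, second_axial_angle):
    """Roll of the pose identifier frame w.r.t. the arm frame (claim 12).

    first_axial_angle:  angle identifier's position around the arm axis,
                        in the operating arm coordinate system
    second_axial_angle: first pose identifier's position around the axis,
                        in the pose identifier coordinate system
    """
    # wrap the difference into (-pi, pi]
    return (first_axial_angle - second_axial_angle + np.pi) % (2 * np.pi) - np.pi
```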
13. The control method according to any one of claims 1 to 3 and 9 to 11, wherein the position association relationship comprises:
an axial correspondence between the angle identifier and the first pose identifier.
14. The control method according to claim 1, comprising:
determining three-dimensional coordinates of the plurality of pose identifiers in a pose identifier coordinate system based on a distribution of the plurality of pose identifiers;
determining an imaging transformation relationship based on two-dimensional coordinates of the plurality of pose identifiers in the positioning image and the three-dimensional coordinates of the plurality of pose identifiers in the pose identifier coordinate system;
determining a plurality of angle identifier candidate regions in the positioning image based on the imaging transformation relationship, the three-dimensional coordinates of the plurality of pose identifiers in the pose identifier coordinate system, and the position association relationship; and
identifying the angle identifier from the plurality of angle identifier candidate regions.
15. The control method according to claim 14, comprising:
determining a plurality of candidate three-dimensional coordinates of the angle identifier in the pose identifier coordinate system based on the three-dimensional coordinates of the plurality of pose identifiers in the pose identifier coordinate system and the position association relationship; and
determining the plurality of angle identifier candidate regions in the positioning image based on the imaging transformation relationship and the plurality of candidate three-dimensional coordinates.
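Claims 14-15 predict where the angle identifier can appear by pushing its candidate three-dimensional coordinates through the already-recovered imaging transformation. A sketch using cv2.projectPoints, where the window half-size is an assumed tuning parameter:

```python
import numpy as np
import cv2

def angle_id_candidate_regions(cand_pts_3d, rvec, tvec, K, dist, half=8):
    """Project candidate 3D coordinates into the positioning image and
    return a square search window around each projected point."""
    pts_2d, _ = cv2.projectPoints(cand_pts_3d.astype(np.float64),
                                  rvec, tvec, K, dist)
    regions = []
    for (u, v) in pts_2d.reshape(-1, 2):
        regions.append((int(u - half), int(v - half),
                        int(u + half), int(v + half)))  # (x0, y0, x1, y1)
    return regions
```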
16. The control method according to claim 14 or 15, comprising:
determining, in each angle identifier candidate region, the pixel with the maximum corner likelihood value, to form a pixel set;
determining the angle identifier candidate region corresponding to the pixel with the maximum corner likelihood value in the pixel set as the angle identifier candidate region to be identified; and
matching the angle identifier candidate region to be identified against a plurality of angle pattern matching templates, respectively, so as to identify the angle identifier.
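Claim 16 scores each candidate region by a corner likelihood, keeps the strongest region, and identifies the angle pattern by template matching. A sketch using OpenCV's minimum-eigenvalue corner response and normalized cross-correlation; the actual likelihood measure and the template set are design choices the claim leaves open:

```python
import numpy as np
import cv2

def identify_angle_marker(gray, regions, templates):
    """gray: uint8 image; regions: (x0,y0,x1,y1) windows inside the image;
    templates: dict name -> uint8 patch no larger than a region."""
    corner = cv2.cornerMinEigenVal(gray, 5)   # per-pixel corner likelihood
    # Best pixel per region, then the single strongest region overall
    best = max(regions, key=lambda r: corner[r[1]:r[3], r[0]:r[2]].max())
    patch = gray[best[1]:best[3], best[0]:best[2]]
    # Match each angle pattern template against the winning region
    scores = {name: cv2.matchTemplate(patch, tpl, cv2.TM_CCOEFF_NORMED).max()
              for name, tpl in templates.items()}
    return max(scores, key=scores.get), best  # matched angle pattern + its region
```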
17. The control method according to claim 14 or 15, comprising:
determining the first pose identifier having the position association relationship with the angle identifier based on the angle identifier candidate region in which the angle identifier is located.
18. The control method according to claim 1, comprising:
determining a plurality of candidate pose identifiers from the positioning image;
identifying an initial pose identifier from the plurality of candidate pose identifiers based on a pose pattern matching template; and
searching for the plurality of pose identifiers by taking the initial pose identifier as a starting point.
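Claim 18 bootstraps detection: confirm one pose identifier by template matching, then grow outward from it. A sketch assuming roughly uniform marker spacing, so that an assumed displacement step predicts where the next identifier should appear:

```python
import numpy as np

def search_from_initial(candidates, initial, step, max_ids=12, tol=5.0):
    """Greedy search for pose identifiers starting from a confirmed one.

    candidates: (N,2) 2D positions of candidate pose identifiers
    initial:    2D position of the identifier confirmed by template matching
    step:       assumed 2D displacement between adjacent identifiers
    """
    found, cur = [np.asarray(initial)], np.asarray(initial)
    for _ in range(max_ids):
        predicted = cur + step
        d = np.linalg.norm(candidates - predicted, axis=1)
        if d.min() > tol:          # no candidate near the prediction: stop
            break
        cur = candidates[d.argmin()]
        found.append(cur)
    return found
```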
19. A computer device, the computer device comprising:
a memory for storing at least one instruction; and
a processor, coupled with the memory, configured to execute the at least one instruction so as to perform the method of any one of claims 1-18.
20. A computer-readable storage medium having stored therein at least one instruction, the at least one instruction being executable by a processor to cause a computer to perform the method of any one of claims 1-18.
21. A surgical robotic system, comprising:
a surgical tool comprising an operating arm, an end effector disposed at a distal end of the operating arm, and at least one angle identifier and a plurality of pose identifiers disposed at the distal end of the operating arm;
an image collector configured to collect a positioning image of the operating arm; and
a processor connected to the image collector, configured to perform the method of any one of claims 1-18 so as to determine the drive signal of the operating arm.
CN202110946424.7A 2021-08-18 2021-08-18 Control method of operation arm and surgical robot system Pending CN115708128A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110946424.7A CN115708128A (en) 2021-08-18 2021-08-18 Control method of operation arm and surgical robot system

Publications (1)

Publication Number Publication Date
CN115708128A true CN115708128A (en) 2023-02-21

Family

ID=85212263


Country Status (1)

Country Link
CN (1) CN115708128A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination