CN115731290A - Method for determining pose of object and surgical robot system - Google Patents

Method for determining pose of object and surgical robot system

Publication number: CN115731290A
Application number: CN202111016342.9A
Authority: CN (China)
Legal status: Pending
Inventors: 徐凯, 吴百波
Applicant/Assignee: Shurui Shanghai Technology Co., Ltd.
Original language: Chinese (zh)
Prior art keywords: coordinate system, pose, marker, determining, composite
Abstract

The present disclosure relates to the field of positioning technology, and discloses a method for determining the pose of an object, a computer device, a computer-readable storage medium, and a surgical robot system. The method for determining the pose of an object comprises: acquiring a positioning image; identifying a plurality of markers located on the object in the positioning image, wherein the plurality of markers comprise a plurality of pose markers for identifying a pose and at least one composite marker for identifying a pose and an angle; and determining the pose of the object relative to a reference coordinate system based on the at least one composite marker and the plurality of pose markers.

Description

Method for determining pose of object and surgical robot system
Technical Field
The present disclosure relates to positioning technologies, and in particular, to a method for determining a pose of an object and a surgical robot system.
Background
As technology develops, it is becoming increasingly common for machine equipment, whether human-controlled or computer-controlled, to perform desired actions to assist or replace human operators. For example, logistics robots are used to sort parcels, and surgical robots are used to assist doctors in performing surgery.
In such applications, the pose of a movable part, such as a controlled device or structure, needs to be determined in order to control the machine equipment.
Disclosure of Invention
In some embodiments, the present disclosure provides a method for determining a pose of an object, comprising: acquiring a positioning image; identifying a plurality of markers located on the object in the positioning image, wherein the plurality of markers comprise a plurality of pose markers for identifying a pose and at least one composite marker for identifying a pose and an angle; and determining the pose of the object relative to a reference coordinate system based on the at least one composite marker and the plurality of pose markers.
In some embodiments, the present disclosure provides a computer device comprising: a memory for storing at least one instruction; and a processor, coupled with the memory, for executing at least one instruction to perform the method of the present disclosure.
In some embodiments, the present disclosure provides a computer-readable storage medium having at least one instruction stored therein, the at least one instruction being executable by a processor to cause a computer to perform the method of the present disclosure.
In some embodiments, the present disclosure provides a surgical robot system comprising: a surgical tool comprising an operation arm, an actuator arranged at the distal end of the operation arm tip, and at least one composite marker and a plurality of pose markers arranged on the operation arm tip; an image collector for collecting a positioning image of the operation arm; and a processor, coupled to the image collector, for performing the method of the present disclosure to determine the pose of the actuator.
Drawings
FIG. 1 illustrates a schematic diagram of a control system according to some embodiments of the present disclosure;
FIG. 2 illustrates a schematic diagram of a tag including multiple identifiers, according to some embodiments of the present disclosure;
FIG. 3 shows a schematic view of a label disposed around the distal end of an arm and formed into a cylindrical shape, according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of an implementation scenario, according to some embodiments of the present disclosure;
FIG. 5 illustrates a flow chart of a method of determining a pose of an object according to some embodiments of the present disclosure;
FIG. 6 illustrates a flow chart of a method of determining a pose of an object relative to a reference coordinate system, according to some embodiments of the present disclosure;
FIG. 7 illustrates a flow chart of a method of determining a pose of an object relative to a reference coordinate system, according to further embodiments of the present disclosure;
FIG. 8 illustrates a flow chart of a method for identifying markers, according to some embodiments of the present disclosure;
fig. 9 shows a schematic view of a pose identification pattern according to some embodiments of the present disclosure;
FIG. 10 illustrates a flow chart of a method for searching for markers, according to some embodiments of the present disclosure;
FIG. 11 illustrates a schematic diagram of searching for markers, according to some embodiments of the present disclosure;
FIG. 12 shows a schematic block diagram of a computer device in accordance with some embodiments of the present disclosure;
fig. 13 illustrates a schematic view of a surgical robotic system, according to some embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the drawings, and those skilled in the art will understand that the scope of the present disclosure is not limited to these embodiments. The present disclosure may be susceptible to various modifications and changes based on the following embodiments. Such modifications and variations are intended to be included within the scope of the present disclosure. Like reference numerals refer to like parts throughout the various embodiments shown in the figures of the present disclosure.
In the present disclosure, the term "position" refers to the location of an object or a portion of an object in three-dimensional space (e.g., three translational degrees of freedom that can be described using changes in Cartesian X, Y, and Z coordinates, such as along the Cartesian X, Y, and Z axes, respectively). In the present disclosure, the term "orientation" refers to the rotational placement of an object or a portion of an object (e.g., three rotational degrees of freedom that can be described using roll, pitch, and yaw). In the present disclosure, the term "pose" refers to the combination of position and orientation of an object or a portion of an object, which may be described, for example, using six parameters in the six degrees of freedom mentioned above.
In the present disclosure, a reference coordinate system may be understood as a coordinate system describing the pose of an object. According to the actual positioning requirement, the origin of a virtual reference object or the origin of a physical reference object may be selected as the origin of the coordinate system. In some embodiments, the reference coordinate system may be a world coordinate system, a camera coordinate system, the operator's own perceptual coordinate system, or the like. In some embodiments, the pose of an object coordinate system is used to represent the pose of the object, so the pose of the object coordinate system relative to the reference coordinate system may represent the pose of the object relative to the reference coordinate system. In some embodiments, the object may be understood as the object or target to be positioned, such as a manipulator arm, a manipulator arm tip, or an actuator disposed at the distal end of a manipulator arm tip. The manipulator arm may be a rigid arm or a deformable arm (e.g., the manipulator arm 140 shown in fig. 1).
In some embodiments, the method for determining the pose of an object of the present disclosure may be applied to application scenarios where the pose of an object needs to be acquired. For example, when an actuator of a surgical robot performs grasping, clamping, cutting, electrocoagulation, or suturing, achieving precise control of the actuator requires acquiring both the actual position of the actuator relative to the world coordinate system and the orientation of the actuator relative to the world coordinate system (which may include, for example, the roll angle, pitch angle, and yaw angle of the actuator). The surgical robot may be an endoscopic surgical robot, an orthopedic surgical robot, a vascular interventional surgical robot, or the like.
Fig. 1 illustrates a schematic diagram of a control system 100, according to some embodiments of the present disclosure. As shown in fig. 1, an object whose pose needs to be determined in the control system 100 may include a manipulator arm 140. The control system 100 may comprise an image acquisition device 110, at least one manipulator arm 140, and a control device 120. The image acquisition device 110 and the at least one manipulator arm 140 are each communicatively connected to the control device 120. In some embodiments, as shown in fig. 1, the control device 120 may be used to control the movement of the at least one manipulator arm 140 to adjust its pose, to coordinate multiple arms with each other, and the like. In some embodiments, the at least one manipulator arm 140 may include a manipulator arm tip 130 at its distal end. The control device 120 may control the movement of the at least one manipulator arm 140 to move the manipulator arm tip 130 to a desired position and orientation. Those skilled in the art will appreciate that the control system 100 may be applied to a surgical robot system, such as an endoscopic surgical robot system. For example, an actuator 160 may be provided at the distal end of the manipulator arm tip 130, as shown in fig. 1. It should be understood that the control system 100 may also be applied to special-purpose or general-purpose robot systems in other fields (e.g., manufacturing, machinery, etc.).
In the present disclosure, the control device 120 may be communicatively connected with the driving unit 150 (e.g., a motor) of the at least one manipulator arm 140 and send a driving signal to the driving unit 150, so that the driving unit 150 controls the at least one manipulator arm 140 to move to the corresponding target pose based on the driving signal. For example, the driving unit 150 for controlling the movement of the manipulator arm 140 may be a servo motor, which may receive commands from the control device to control the movement of the manipulator arm 140. The control device 120 may also be communicatively connected, for example through a communication interface, to a sensor coupled to the driving unit 150 in order to receive motion data of the manipulator arm 140 and thereby monitor its motion state. In one example of the present disclosure, the communication interface may be a CAN (Controller Area Network) bus communication interface, which enables the control device 120 to communicate with the driving unit 150 and the sensor through a CAN bus. In some embodiments, the control device 120 may include a local processor (e.g., a local computer device) or a cloud processor (e.g., a cloud server or cloud computing platform).
In some embodiments, the manipulator arm 140 may comprise a continuum deformable arm, for example a manipulator arm with multiple degrees of freedom formed from multiple joints, such as a manipulator arm capable of implementing 6 degrees of freedom of motion.
In some embodiments, the image acquisition device 110 may be used to acquire positioning images. A positioning image may include an image of part or all of the manipulator arm 140. In some embodiments, the image acquisition device 110 may be configured to capture an image of the manipulator arm tip 130, on which a plurality of different pose markers comprising different pose marker patterns may be disposed. For example, a positioning tag 170 may be disposed on the manipulator arm tip 130 (the positioning tag 170 may be, for example, the tag 200 shown in fig. 2). The positioning tag 170 may include a plurality of markers, including a plurality of pose markers for identifying a pose and at least one composite marker for identifying a pose and an angle (described in detail below). As shown in fig. 1, the manipulator arm tip 130 is within the field of view of the image acquisition device 110, and the acquired positioning image may include an image of the manipulator arm tip 130.
In some embodiments, the control device 120 may receive the positioning image from the image acquisition device 110 and process it. For example, the control device 120 may identify a plurality of markers located on the manipulator arm 140 in the positioning image and determine the relative pose of the manipulator arm 140 or the actuator 160 with respect to a reference coordinate system (e.g., the world coordinate system).
In some embodiments, the image acquisition device 110 may include, but is not limited to, a dual-lens or single-lens image acquisition device, such as a binocular or monocular camera. Depending on the application scenario, the image acquisition device 110 may be an industrial camera, an underwater camera, a miniature electronic camera, an endoscopic camera, or the like. In some embodiments, the image acquisition device 110 may be fixed in position or variable in position, for example, an industrial camera fixed at a monitoring position or an endoscopic camera with adjustable position or orientation. In some embodiments, the image acquisition device 110 may implement at least one of visible-light imaging, infrared imaging, CT (Computed Tomography) imaging, acoustic imaging, and the like. Depending on the type of image to be acquired, those skilled in the art may select a suitable image acquisition device 110.
In some embodiments, a plurality of pose markers are distributed on the object (e.g., the manipulator arm 140 or the manipulator arm tip 130 shown in fig. 1). In some embodiments, a plurality of pose markers are disposed on an outer surface of a columnar portion of an object. For example, a plurality of pose markers are circumferentially distributed on the manipulator arm tip 130. For example, a plurality of posture markers are provided on the outer surface of the columnar portion of the operation arm tip 130. The plurality of markers may include a plurality of pose markers for identifying a pose and a plurality of composite markers for identifying a pose and an angle (e.g., an axial angle or a roll angle).
In some embodiments, a positioning tag (e.g., the tag 200 shown in fig. 2) is disposed on an outer surface of the columnar portion of the object, and the plurality of markers may include a plurality of marker patterns distributed on the positioning tag along the circumferential direction of the columnar portion, together with the marker pattern corner points in those patterns. The plurality of marker patterns includes a plurality of different composite marker patterns and a plurality of pose marker patterns, which may be identical to one another. A composite marker pattern and its pattern corner points can be used to identify a pose and an angle, while a pose marker pattern and its pattern corner points can be used to identify a pose. In some embodiments, the plurality of different composite marker patterns and the plurality of pose marker patterns are located in the same pattern distribution zone. In some embodiments, at least one of every N consecutive marker patterns is a composite marker pattern, where 2 ≤ N ≤ 4. For example, the marker patterns may be uniformly distributed on the outer surface of the columnar portion, with the composite marker patterns evenly interspersed among the pose marker patterns, for example one composite marker pattern inserted after every 3 pose marker patterns, as shown in fig. 2.
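By way of illustration only, the following sketch (in Python) enumerates such a layout for a hypothetical tag with 16 identification patterns, with one composite pattern inserted after every 3 pose patterns; the pattern count and spacing are assumptions of this sketch and not values fixed by the disclosure.

```python
# Hypothetical layout: 16 identification patterns evenly spaced around the
# cylinder, one composite pattern inserted after every 3 pose patterns.
N_PATTERNS = 16
ALPHA_0 = 360.0 / N_PATTERNS  # distribution angle between adjacent corner points

layout = []
for m in range(N_PATTERNS):
    kind = "composite" if m % 4 == 3 else "pose"
    layout.append({"index": m, "kind": kind, "around_axis_angle_deg": m * ALPHA_0})

for item in layout[:5]:
    print(item)  # e.g. {'index': 0, 'kind': 'pose', 'around_axis_angle_deg': 0.0}
```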
In some embodiments, the identification pattern may be provided on a label on the end of the manipulator arm, or may be printed on the end of the manipulator arm, or may be a pattern formed by the physical configuration of the end of the manipulator arm itself, e.g., may include depressions or protrusions, and combinations thereof. In some embodiments, the identification pattern may include a pattern formed in brightness, grayscale, color, or the like. In some embodiments, the identification pattern may include a pattern that actively (e.g., self-illuminating) or passively (e.g., reflected light) provides information that is detected by the image capture device. Those skilled in the art will appreciate that in some embodiments, the pose of the marker or the pose of the marker pattern may be represented by the pose of the marker pattern corner point coordinate system. In some embodiments, the identification pattern is provided on the distal end of the manipulator arm in an area suitable for capturing images by the image capture device, for example, an area that may be covered by the field of view of the image capture device during operation or an area that is not easily disturbed or obscured during operation.
Fig. 2 illustrates a schematic diagram of a tag 200 including a plurality of markers according to some embodiments. Fig. 3 shows a schematic view of a tag 300 which is disposed around the periphery of the manipulator arm tip and thereby formed into a cylindrical shape. It will be appreciated that, for simplicity of illustration, the tag 200 and the tag 300 include the same identification patterns.
Referring to fig. 2, the plurality of markers includes a plurality of pose identification patterns 210 together with the pose identification pattern corner points therein, and a composite identification pattern 220 together with the composite identification pattern corner point therein. In some embodiments, as shown in fig. 2, the plurality of pose identification patterns 210 and the composite identification pattern 220 are disposed in the same pattern distribution zone. In the present disclosure, pose identification pattern corner points are indicated by the "o" symbol, and composite identification pattern corner points are indicated by the "Δ" symbol. In some embodiments, a pose marker may be determined by identifying the pose identification pattern 210 or the pose identification pattern corner point P_210 therein, and a composite marker may be determined by identifying the composite identification pattern 220 or the composite identification pattern corner point R_220 therein.
Referring to fig. 3, in the circumferentially disposed state, the tag 200 becomes the tag 300, spatially configured in a cylindrical shape. In some embodiments, the around-axis angle or roll angle of each marker may be represented by the around-axis angle of its identification pattern or identification pattern corner point, where the identification patterns include the pose identification pattern 310 and the composite identification pattern 320. The around-axis angle identified by each identification pattern or identification pattern corner point is known or predetermined. In some embodiments, the around-axis angle identified by each marker may be determined based on the distribution of the plurality of markers (identification patterns or identification pattern corner points). In some embodiments, the plurality of markers may be uniformly distributed (e.g., the identification pattern corner points in tag 200 are distributed at equal spacing, and the identification pattern corner points in tag 300 are distributed at equal spacing and equal angles). In some embodiments, based on the distribution of the plurality of markers, each marker may be used to identify a specific around-axis angle, with a one-to-one correspondence between each marker and the identified around-axis angle. In the present disclosure, the around-axis angle or roll angle refers to an angle around the Z-axis (e.g., the Z-axis of an object coordinate system or a marker coordinate system).
As shown in fig. 3, in the tag 300 the plurality of identification patterns are uniformly distributed along the circumference of the cylindrical structure, and the plurality of identification pattern corner points are uniformly distributed on the cross-section circle 330, so that the distribution angle between any two adjacent identification pattern corner points (for example, the angle α_0) is equal. Taking the identification pattern corner point P_301 in the set X-axis direction as the reference corner point identifying the 0° around-axis angle (the identification pattern where P_301 is located serving as the reference pattern), the around-axis angle identified by any identification pattern corner point can be determined from the positional relationship between that corner point and the reference corner point P_301.
In some embodiments, the around-axis angle identified by an identification pattern corner point in a set coordinate system (e.g., the marker coordinate system {wm0} ≡ [X_wm0, Y_wm0, Z_wm0]^T shown in fig. 3) may be determined based on the following equation (1):

α_m = α_0 · (m − 1)    (1)

where α_m is the around-axis angle identified by the m-th identification pattern corner point, counted in the clockwise direction of the cross-section circle 330 with a selected identification pattern corner point (e.g., the identification pattern corner point P_301) as the first identification pattern corner point.
In some embodiments, the plurality of pose identification patterns may be the same pattern or different patterns. In some embodiments, the plurality of composite identification patterns are different patterns, each composite identification pattern may be used to identify a particular axial angle, and each composite identification pattern has a one-to-one correspondence with the identified axial angle.
Fig. 4 illustrates a schematic diagram of an implementation scenario 400, according to some embodiments of the present disclosure. As shown in fig. 4, the manipulator arm 440 includes a tip 430 and a distal actuator 460, and a plurality of markers (e.g., the pose marker pattern 410 and the composite marker pattern 420) may be circumferentially disposed on the tip 430. For example, the tag 200 shown in fig. 2 is circumferentially disposed on the manipulator arm tip 430. The plurality of marker pattern corner points are distributed on a cross-section circle 431 of the manipulator arm tip 430. In some embodiments, a marker coordinate system {wm0} ≡ [X_wm0, Y_wm0, Z_wm0]^T is established based on the identified markers: the origin of the coordinate system {wm0} is the center of the cross-section circle 431, the X-axis points from the origin toward one of the marker pattern corner points (e.g., the pattern corner point P_401 corresponding to one of the identified pose markers), the Z-axis is parallel to the axial direction of the object (e.g., manipulator arm 440), and the Y-axis is perpendicular to the XZ-plane.
In some embodiments, an object coordinate system {wm} ≡ [X_wm, Y_wm, Z_wm]^T is established based on the plurality of composite markers: the origin of the object coordinate system {wm} is the center of the cross-section circle 431, the X-axis points toward the composite marker pattern corner point R_401, the Z-axis is parallel to or coincident with the axial direction of the object (e.g., manipulator arm 440), and the Y-axis is perpendicular to the XZ-plane. In some embodiments, based on the distribution of the plurality of composite marker patterns, the around-axis angle identified by the corner point of any composite marker pattern may be determined, for example, from the positional relationship between that composite marker pattern and the composite marker pattern corresponding to the corner point R_401.
Some embodiments of the present disclosure provide a method for determining a pose of an object. Fig. 5 illustrates a flow chart of a method 500 of determining a pose of an object according to some embodiments of the present disclosure. Some or all of the steps in method 500 may be performed by a control device (e.g., the control device 120) of the control system 100. The control device 120 may be configured on a computing device. Some or all of the steps in method 500 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 500 may be performed by a robotic system (e.g., the surgical robotic system 1300 shown in fig. 13). In some embodiments, the method 500 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor (e.g., the processor 1320 shown in fig. 13). In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to FIG. 5, in step 501, a positioning image is acquired. In some embodiments, the positioning image includes a plurality of markers on the object. In some embodiments, the plurality of markers includes a plurality of pose markers for identifying a pose and at least one composite marker for identifying a pose and an angle. In some embodiments, the positioning image may be received from an image acquisition device 110 as shown in FIG. 1. For example, the control device 120 may receive positioning images actively transmitted by the image acquisition device 110. Alternatively, the control device 120 may send an image request instruction to the image acquisition device 110, and the image acquisition device 110 sends the positioning image to the control device 120 in response to the image request instruction.
With continued reference to FIG. 5, in step 503, a plurality of markers located on the object are identified in the positioning image. For example, exemplary methods of identifying a plurality of markers located on an object include the methods shown in fig. 8 and fig. 10. In some embodiments, the control device 120 may identify some or all of the markers in the positioning image through an image processing algorithm. In some embodiments, the image processing algorithm may include a feature recognition algorithm that extracts or recognizes the features of the markers. For example, the image processing algorithm may comprise a corner detection algorithm for detecting identification pattern corner points. The corner detection algorithm may be, but is not limited to, one of gray-scale-image-based corner detection, binary-image-based corner detection, and contour-curve-based corner detection. For example, the image processing algorithm may be a color feature extraction algorithm for detecting color features in the identification pattern. As another example, the image processing algorithm may be a contour detection algorithm for detecting contour features of the identification pattern. In some embodiments, the control device may identify some or all of the markers in the positioning image through a recognition model.
With continued reference to FIG. 5, at step 505, the pose of the object with respect to the reference coordinate system is determined based on the at least one composite marker and the plurality of pose markers. In some embodiments, the pose of the object coordinate system with respect to the reference coordinate system may be determined as the pose of the object with respect to the reference coordinate system based on the two-dimensional coordinates in the positioning image and the three-dimensional coordinates in the object coordinate system of the at least one composite marker and the plurality of pose markers.
In some embodiments, method 500 may further include determining the two-dimensional coordinates of the plurality of markers in the positioning image. In some embodiments, the coordinates of a marker may be represented by the coordinates of its marker pattern corner points. For example, the two-dimensional coordinates of a marker in the positioning image and its three-dimensional coordinates in the object coordinate system may be represented by the coordinates of the marker pattern corner points. In some embodiments, determining the two-dimensional coordinates of the plurality of markers in the positioning image may include determining the two-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the positioning image. In some embodiments, the method 500 may further include determining, based on the at least one composite marker, the three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the object coordinate system.
In some embodiments, the method 500 may further include determining the pose of the object coordinate system relative to the reference coordinate system based on the two-dimensional coordinates of the at least one composite marker pattern corner point and the plurality of pose marker pattern corner points in the positioning image, their three-dimensional coordinates in the object coordinate system, and the transformation relationship of the camera coordinate system relative to the reference coordinate system. In some embodiments, the transformation relationship of the camera coordinate system relative to the reference coordinate system may be known. For example, when the reference coordinate system is the world coordinate system, the transformation relationship of the camera coordinate system relative to the world coordinate system can be determined from the pose of the camera. In other embodiments, the reference coordinate system may, according to actual requirements, be the camera coordinate system itself. In some embodiments, the pose of the object coordinate system relative to the camera coordinate system is determined based on the camera imaging principle and projection model, from the two-dimensional coordinates of the at least one composite marker pattern corner point and the plurality of pose marker pattern corner points in the positioning image and their three-dimensional coordinates in the object coordinate system. The pose of the object coordinate system relative to the reference coordinate system is then obtained from the pose of the object coordinate system relative to the camera coordinate system and the transformation relationship of the camera coordinate system relative to the reference coordinate system.
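For illustration only, the 2D–3D corner correspondences described above can be fed to a standard perspective-n-point (PnP) solver to recover the pose of the object coordinate system relative to the camera coordinate system. The sketch below uses OpenCV's cv2.solvePnP as one possible solver; the corner coordinates, camera intrinsics, and variable names are assumptions of this sketch, not values from the disclosure.

```python
import cv2
import numpy as np

# Illustrative 3D corner coordinates in the object coordinate system {wm}
# (e.g., equation (3): points on a cross-section circle of radius r = 4 mm).
r = 4.0
alphas = np.deg2rad([0.0, 22.5, 45.0, 67.5])
object_points = np.stack([r * np.cos(alphas),
                          r * np.sin(alphas),
                          np.zeros_like(alphas)], axis=1)

# Matched 2D pixel coordinates of the same corner points in the positioning
# image (placeholder values standing in for detected corners).
image_points = np.array([[412.3, 288.1],
                         [398.7, 301.5],
                         [379.2, 309.8],
                         [357.6, 312.4]])

# Assumed (calibrated) camera intrinsics and zero lens distortion.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Solve for the pose of {wm} relative to the camera coordinate system {lens}.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
lens_R_wm, _ = cv2.Rodrigues(rvec)  # 3x3 rotation; tvec is lens_P_wm
```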
In some embodiments, the intrinsic parameters of the camera may also be taken into account. For example, the camera intrinsic parameters may be those of the image acquisition device 110 shown in fig. 1. The intrinsic parameters of the camera may be known or obtained by calibration. In some embodiments, the camera coordinate system may be understood as a coordinate system established at the camera origin, for example, a coordinate system established with the optical center of the camera as the origin, or with the lens center of the camera as the origin. When the camera is a binocular camera, the origin of the camera coordinate system may be the center of the left lens, the center of the right lens, or any point on the line connecting the centers of the left and right lenses (e.g., the midpoint of that line).
In some embodiments, the pose of the object coordinate system relative to the reference coordinate system (e.g., the world coordinate system) may be determined based on the following equation (2):

^w R_wm = ^w R_lens · ^lens R_wm
^w P_wm = ^w R_lens · ^lens P_wm + ^w P_lens    (2)

where ^w R_wm is the attitude of the object coordinate system relative to the world coordinate system, ^w P_wm is the position of the object coordinate system relative to the world coordinate system, ^w R_lens is the attitude of the camera coordinate system relative to the world coordinate system, ^w P_lens is the position of the camera coordinate system relative to the world coordinate system, ^lens R_wm is the attitude of the object coordinate system relative to the camera coordinate system, and ^lens P_wm is the position of the object coordinate system relative to the camera coordinate system.
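As a minimal numerical sketch of equation (2), assuming the camera-in-world pose (^w R_lens, ^w P_lens) is already known (e.g., from calibration) and the object-in-camera pose comes from the projection step above:

```python
import numpy as np

def compose_pose(w_R_lens, w_P_lens, lens_R_wm, lens_P_wm):
    """Equation (2): chain the camera-in-world pose with the
    object-in-camera pose to obtain the object-in-world pose."""
    w_R_wm = w_R_lens @ lens_R_wm
    w_P_wm = w_R_lens @ lens_P_wm + w_P_lens
    return w_R_wm, w_P_wm

# Example: camera at (0.1, 0, 0.5) m in the world, not rotated (assumed values).
w_R_wm, w_P_wm = compose_pose(np.eye(3), np.array([0.1, 0.0, 0.5]),
                              np.eye(3), np.array([0.0, 0.0, 0.2]))
```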
FIG. 6 illustrates a flow chart of a method 600 of determining a pose of an object relative to a reference coordinate system, according to some embodiments of the present disclosure. Some or all of the steps in method 600 may be performed by a control device (e.g., the control device 120) of the control system 100. The control device 120 may be configured on a computing device. Some or all of the steps in method 600 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 600 may be performed by a robotic system (e.g., the surgical robotic system 1300 shown in fig. 13). In some embodiments, method 600 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor (e.g., the processor 1320 shown in fig. 13). In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 6, in step 601, three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the marker coordinate system are determined. In some embodiments, the three-dimensional coordinate of each marker pattern corner point in the marker coordinate system {wm0} may be determined based on the following equation (3):

C_m = [r·cos α_m, r·sin α_m, 0]^T    (3)

where C_m is the three-dimensional coordinate, in the marker coordinate system, of the m-th marker pattern corner point, counted in the clockwise direction of the cross-section circle 431 with a selected marker pattern corner point (e.g., the pose marker pattern corner point P_401) as the first marker pattern corner point, and r is the radius of the cross-section circle 431.
In some embodiments, the around-axis angle α_m identified by the m-th marker pattern corner point is determined based on equation (1), and then the three-dimensional coordinate C_m of the m-th marker pattern corner point in the marker coordinate system {wm0} is determined from the around-axis angle α_m using equation (3).
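A short sketch combining equations (1) and (3); the corner spacing α_0 and the radius r used in the example are illustrative assumptions:

```python
import numpy as np

def corner_coords_wm0(m, alpha0_deg, r):
    """Equations (1) and (3): around-axis angle of the m-th corner point
    (1-indexed, counted clockwise) and its 3D coordinate in {wm0}."""
    alpha_m = np.deg2rad(alpha0_deg * (m - 1))   # equation (1)
    return np.array([r * np.cos(alpha_m),        # equation (3)
                     r * np.sin(alpha_m),
                     0.0])

# Example: 16 corner points spaced 22.5 degrees apart, radius 4 mm.
C_3 = corner_coords_wm0(3, 22.5, 4.0)
```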
Referring to FIG. 6, in step 603, a roll angle of the marker coordinate system relative to the object coordinate system is determined based on the at least one composite marker. In some embodiments, a first around-axis angle identified in the object coordinate system by one of the at least one composite marker may be determined, and a second around-axis angle identified in the marker coordinate system by the same composite marker may be determined. Based on the first and second around-axis angles, the roll angle of the marker coordinate system relative to the object coordinate system may be determined. In some embodiments, referring to FIG. 4, the roll angle Δα may refer to the angle of rotation of the marker coordinate system {wm0} about the Z-axis relative to the object coordinate system {wm}. In some embodiments, the roll angle Δα may be determined based on the following equation (4):

Δα = α_1 − α_2    (4)

where α_1 is the first around-axis angle, i.e., the angle identified in the object coordinate system by a composite marker pattern corner point (e.g., the composite marker pattern corner point R_402), and α_2 is the second around-axis angle, i.e., the angle identified in the marker coordinate system by the same composite marker pattern corner point.
In some embodiments, when the X-axis of the marker coordinate system {wm0} points to a composite marker pattern corner point (e.g., the composite marker pattern corner point R_402), the method 600 may further include determining the first around-axis angle identified by that composite marker in the object coordinate system as the roll angle of the marker coordinate system relative to the object coordinate system. In some embodiments, the first around-axis angle may be determined based on the pattern included in the composite marker.
Referring to fig. 6, in step 605, three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the object coordinate system are determined based on the roll angle of the marker coordinate system relative to the object coordinate system and the three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the marker coordinate system. It is understood that, given the roll angle of the marker coordinate system relative to the object coordinate system, the three-dimensional coordinates of a plurality of marker pattern corner points (e.g., composite marker pattern corner points and pose marker pattern corner points) in the marker coordinate system can be transformed into three-dimensional coordinates in the object coordinate system according to coordinate transformation.
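A minimal sketch of the coordinate transformation in step 605, assuming the roll angle Δα has already been obtained via equation (4); the sign convention of the rotation depends on the clockwise corner numbering and is an assumption of this sketch:

```python
import numpy as np

def rot_z(delta_alpha):
    """Rotation about the Z-axis by the roll angle (radians)."""
    c, s = np.cos(delta_alpha), np.sin(delta_alpha)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def wm0_to_wm(points_wm0, delta_alpha):
    """Rotate corner coordinates from the marker coordinate system {wm0}
    into the object coordinate system {wm}; points_wm0 has shape (N, 3)."""
    return (rot_z(delta_alpha) @ np.asarray(points_wm0).T).T
```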
Referring to fig. 6, in step 607, the pose of the object coordinate system with respect to the reference coordinate system is determined as the pose of the object with respect to the reference coordinate system based on the two-dimensional coordinates in the positioning image of the at least one composite marker and the plurality of pose markers and the three-dimensional coordinates in the object coordinate system. In some embodiments, step 607 in method 600 may be implemented similarly to determining the pose of the object coordinate system relative to the reference coordinate system in method 500.
FIG. 7 illustrates a flow chart of a method 700 of determining a pose of an object relative to a reference coordinate system, according to further embodiments of the present disclosure. Method 700 may be an alternative embodiment of method 600 of fig. 6. Some or all of the steps of method 700 may be performed by a control device (e.g., the control device 120) of the control system 100. The control device 120 may be configured on a computing device. Some or all of the steps of method 700 may be implemented by software, firmware, and/or hardware. In some embodiments, method 700 may be performed by a robotic system (e.g., surgical robotic system 1300 shown in fig. 13). In some embodiments, method 700 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor (e.g., the processor 1320 shown in fig. 13). In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 7, in step 701, the pose of the marker coordinate system relative to the reference coordinate system is determined based on the two-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the positioning image and their three-dimensional coordinates in the marker coordinate system. In some embodiments, determining the three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the marker coordinate system may be implemented similarly to step 601 in method 600.
Referring to FIG. 7, in step 703, the roll angle of the marker coordinate system relative to the object coordinate system is determined based on the at least one composite marker. In some embodiments, determining the roll angle of the marker coordinate system relative to the object coordinate system may be implemented similarly to step 603 in method 600.
Referring to fig. 7, in step 705, the pose of the object coordinate system with respect to the reference coordinate system is determined as the pose of the object with respect to the reference coordinate system based on the roll angle of the identified coordinate system with respect to the object coordinate system and the pose of the identified coordinate system with respect to the reference coordinate system.
For example, taking the reference coordinate system to be the world coordinate system, the pose of the object coordinate system relative to the world coordinate system may be determined based on the following equation (5):

^w R_wm = ^w R_wm0 · Rot_z(Δα)
^w P_wm = ^w P_wm0    (5)

where ^w R_wm is the attitude of the object coordinate system relative to the world coordinate system, ^w P_wm is the position of the object coordinate system relative to the world coordinate system, ^w R_wm0 is the attitude of the marker coordinate system relative to the world coordinate system, ^w P_wm0 is the position of the marker coordinate system relative to the world coordinate system, and Rot_z(Δα) denotes a rotation by the roll angle Δα about the Z-axis of the object coordinate system.
Fig. 8 illustrates a flow chart of a method 800 for identifying markers, in accordance with some embodiments of the present disclosure. As shown in fig. 8, some or all of the steps of the method 800 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1 or the processor 1320 shown in fig. 13). Some or all of the steps in method 800 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 800 may be performed by a robotic system (e.g., the surgical robotic system 1300 shown in fig. 13). In some embodiments, method 800 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor (e.g., the processor 1320 shown in fig. 13). In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 8, in step 801, a plurality of candidate markers is determined from the positioning image. In some embodiments, a marker may comprise marker pattern corner points in the identification pattern. The coordinates or the coordinate system origin of a candidate marker may be represented by the candidate marker pattern corner points. In some embodiments, candidate marker pattern corner points may refer to possible marker pattern corner points obtained by performing preliminary processing or preliminary identification on the positioning image.

In some embodiments, method 800 may include determining a Region of Interest (ROI) in the positioning image. For example, the ROI may first be cropped from the positioning image, and the plurality of candidate markers may then be determined from the ROI. The ROI may be the full positioning image or a partial region thereof. For example, the ROI of the current frame may be cropped based on a region within a certain range of the plurality of marker pattern corner points determined from the previous frame image (e.g., the positioning image of the previous image processing cycle). For a positioning image that is not the first frame, the ROI may be a region within a certain distance range centered on a virtual point formed by the coordinates of the plurality of marker pattern corner points of the previous image processing cycle. The distance range may be a fixed multiple, e.g., twice, the average spacing distance of the marker pattern corner points. It should be understood that the multiple may also be a variable multiple of the average spacing distance of the plurality of candidate marker pattern corner points in the previous image processing cycle.
In some embodiments, the method 800 may include determining a Corner Likelihood value (CL) for each pixel point in the positioning image. In some embodiments, the corner likelihood value of a pixel point may be a numerical value characterizing the likelihood that the pixel point is a feature point (e.g., a corner point). In some embodiments, the positioning image may be preprocessed before the corner likelihood values are calculated, and the corner likelihood value of each pixel point is then determined in the preprocessed image. The preprocessing of the image may include, for example, at least one of image graying, image denoising, and image enhancement. For example, the image preprocessing may include cropping the ROI from the positioning image and converting the ROI into a corresponding grayscale image.
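One possible form of this preprocessing, sketched with OpenCV; the ROI bounds and the assumption of a BGR input image are illustrative, not fixed by the disclosure:

```python
import cv2

def preprocess(positioning_image, roi):
    """Crop the region of interest and convert it to grayscale.
    roi = (x, y, w, h), e.g. derived from the previous frame's corners."""
    x, y, w, h = roi
    patch = positioning_image[y:y + h, x:x + w]
    return cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)
```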
In some embodiments, determining the corner likelihood value of each pixel point in the ROI may include performing convolution operations on each pixel point within the ROI to obtain the first- and/or second-order derivatives of each pixel point, and then computing the corner likelihood value of each pixel point from those derivatives. Illustratively, the corner likelihood value of each pixel point may be determined based on the following equation (6):

CL = max(c_xy, c_45)
c_xy = τ²·|I_xy| − 1.5·τ·(|I_45| + |I_n45|)
c_45 = τ²·|I_45_45| − 1.5·τ·(|I_x| + |I_y|)    (6)

where τ is a set constant, for example set to 2; I_x, I_45, I_y, and I_n45 are the first derivatives of the pixel point in the four directions 0, π/4, π/2, and −π/4, respectively; and I_xy and I_45_45 are the second derivatives of the pixel point in the (0, π/2) and (π/4, −π/4) direction pairs, respectively.
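The sketch below computes a corner likelihood map in the spirit of equation (6); the choice of Sobel filters for the directional derivatives and the normalization of the input are assumptions of this sketch rather than details fixed by the disclosure:

```python
import cv2
import numpy as np

def corner_likelihood(gray, tau=2.0):
    """Per-pixel corner likelihood in the spirit of equation (6).
    Input is normalized to [0, 1] so the 0.06 threshold below is meaningful
    (an assumption of this sketch)."""
    g = gray.astype(np.float64) / 255.0
    Ix  = cv2.Sobel(g, cv2.CV_64F, 1, 0, ksize=3)
    Iy  = cv2.Sobel(g, cv2.CV_64F, 0, 1, ksize=3)
    Ixy = cv2.Sobel(g, cv2.CV_64F, 1, 1, ksize=3)
    Ixx = cv2.Sobel(g, cv2.CV_64F, 2, 0, ksize=3)
    Iyy = cv2.Sobel(g, cv2.CV_64F, 0, 2, ksize=3)
    I45   = (Ix + Iy) / np.sqrt(2.0)   # first derivative along +45 degrees
    In45  = (Ix - Iy) / np.sqrt(2.0)   # first derivative along -45 degrees
    I4545 = (Ixx - Iyy) / 2.0          # mixed second derivative, 45/-45 pair
    c_xy = tau**2 * np.abs(Ixy)   - 1.5 * tau * (np.abs(I45) + np.abs(In45))
    c_45 = tau**2 * np.abs(I4545) - 1.5 * tau * (np.abs(Ix)  + np.abs(Iy))
    return np.maximum(c_xy, c_45)
```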
In some embodiments, the method 800 may include dividing the ROI into a plurality of sub-regions. For example, a non-maximum suppression method may be used to divide the ROI into a plurality of sub-images of equal size. In some embodiments, the ROI may be equally divided into a plurality of sub-images of 5 × 5 pixels. The above embodiments are exemplary and not limiting; it should be understood that the positioning image or ROI may also be divided into a plurality of sub-images of other sizes, for example, a plurality of sub-images of 9 × 9 pixels.

In some embodiments, the method 800 may include determining the pixel with the largest corner likelihood value in each sub-region to form a set of pixels. For example, the pixel point with the largest CL value in each sub-image may be determined and compared with a first threshold, and the pixel points whose CL values are larger than the first threshold form the pixel set. In some embodiments, the first threshold may be set to 0.06. It should be understood that the first threshold may also be set to other values.
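A minimal sketch of the sub-region maximum selection described above; the 5 × 5 cell size and the 0.06 threshold follow the text, while the assumption that the CL map is normalized is the sketch's own:

```python
import numpy as np

def nms_candidates(cl_map, cell=5, threshold=0.06):
    """Keep, per cell x cell sub-image, the pixel with the largest corner
    likelihood, then discard maxima at or below the first threshold."""
    h, w = cl_map.shape
    candidates = []
    for y0 in range(0, h, cell):
        for x0 in range(0, w, cell):
            block = cl_map[y0:y0 + cell, x0:x0 + cell]
            dy, dx = np.unravel_index(np.argmax(block), block.shape)
            if block[dy, dx] > threshold:
                candidates.append((x0 + dx, y0 + dy, block[dy, dx]))
    # Sort by likelihood, largest first, for the matching stage.
    return sorted(candidates, key=lambda c: c[2], reverse=True)
```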
Referring to FIG. 8, in step 803, a first marker of the plurality of markers is identified from the plurality of candidate markers. In some embodiments, the first marker is identified based on a marker pattern matching template. In some embodiments, the marker pattern matching templates include at least one pose marker pattern matching template and a plurality of composite marker pattern matching templates with mutually different patterns. In some embodiments, a composite marker is identified based on the plurality of composite marker pattern matching templates with different patterns. For example, when the pose markers share the same identification pattern, the pose marker pattern matching template may first be matched against a candidate marker; if the matching fails, the plurality of different composite marker pattern matching templates are matched against the candidate marker one by one until the matching succeeds.

In some embodiments, the first marker is identified by using a marker pattern matching template to match the pattern at the candidate marker pattern corner points. For example, a candidate marker pattern corner point reaching a preset pattern matching degree criterion is determined as the first marker pattern corner point. In some embodiments, the marker pattern matching template has the same or similar features as the pattern in the region near a marker pattern corner point. If the matching degree between the marker pattern matching template and the pattern in the region near a candidate marker pattern corner point reaches the preset pattern matching degree criterion (for example, the matching degree is higher than a threshold), the pattern in that region and the marker pattern matching template may be considered to have the same or similar features, and the current candidate marker pattern corner point may be regarded as a marker pattern corner point.

In some embodiments, the pixel point with the largest CL value in the pixel set is determined as the candidate marker pattern corner point. For example, all pixel points in the pixel set may be sorted in descending order of CL value, and the pixel point with the largest CL value is taken as the candidate marker pattern corner point. In some embodiments, after the candidate marker pattern corner point is determined, the pattern at the candidate marker pattern corner point is matched using the marker pattern matching template, and if the preset pattern matching degree criterion is reached, the candidate marker pattern corner point is determined as the first identified marker pattern corner point.

In some embodiments, the method 800 may further include, in response to a matching failure, determining the pixel point with the largest corner likelihood value among the remaining pixel points in the pixel set as the new candidate marker pattern corner point. For example, if a candidate marker pattern corner point does not meet the preset matching degree criterion, the pixel point with the second largest CL value is selected as the candidate marker pattern corner point, the pattern at that corner point is matched using the marker pattern matching template, and these steps are repeated until the first marker pattern corner point is identified.
In some embodiments, the identification pattern may be a black-and-white checkerboard pattern, and the marker pattern matching template may then be the same checkerboard pattern. The matching degree may be evaluated by the Correlation Coefficient (CC) between the gray-scale distribution G_M of the marker pattern matching template and the pixel-neighborhood gray-scale distribution G_image of the pixel point corresponding to a candidate marker pattern corner point. The pixel-neighborhood gray-scale distribution G_image of a pixel point is the gray-scale distribution of the pixels within a certain range (for example, 10 × 10 pixels) centered on that pixel point. The correlation coefficient may be determined based on the following equation (7):

CC = Cov(G_M, G_image) / sqrt(Var(G_M) · Var(G_image))    (7)

where Var(·) is the variance function and Cov(·,·) is the covariance function. In some embodiments, when the correlation coefficient is less than 0.8, the correlation between the neighborhood gray-scale distribution and the marker pattern matching template is considered low, and the candidate marker pattern corner point with the largest corner likelihood value is determined not to be a marker pattern corner point; otherwise, it is determined to be a marker pattern corner point.
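A minimal sketch of the template match in equation (7); the template and the neighborhood are assumed to be equally sized gray-scale patches (e.g., 10 × 10 pixels):

```python
import numpy as np

def correlation_coefficient(template, neighborhood):
    """Equation (7): Pearson correlation between the template gray-scale
    distribution G_M and the corner-neighborhood distribution G_image."""
    g_m = template.astype(np.float64).ravel()
    g_i = neighborhood.astype(np.float64).ravel()
    cov = np.cov(g_m, g_i, bias=True)[0, 1]
    return cov / np.sqrt(g_m.var() * g_i.var())

def is_marker_corner(template, neighborhood, cc_threshold=0.8):
    return correlation_coefficient(template, neighborhood) >= cc_threshold
```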
In some embodiments, method 800 may further include determining the edge directions of candidate marker pattern corner points. For example, as shown in fig. 9, when the candidate pose marker pattern corner point is the corner point P_901 in the pose identification pattern 900, the edge directions of the corner point P_901 refer to the directions of the edges that form the corner point P_901, as indicated by the dashed arrows in fig. 9.
In some embodiments, the edge direction may be determined from the first derivative values I_x and I_y (in the X and Y directions of the planar coordinate system) of each pixel in a neighborhood of a certain range (e.g., 10 × 10 pixels) centered on the candidate marker pattern corner point. For example, the edge direction may be determined based on the following equation (8):

I_angle = arctan2(I_y, I_x), I_weight = sqrt(I_x² + I_y²)    (8)

where the first derivatives I_x and I_y may be obtained by performing convolution operations on each pixel point in the neighborhood. In some embodiments, the edge directions I_angle of the pixels in the neighborhood are clustered together with their corresponding weights I_weight, and the I_angle corresponding to the cluster with the largest weight is selected as the edge direction of the corner point. Note that if there are multiple edge directions, the I_angle values corresponding to the several clusters with the largest weight proportions are selected as the edge directions.
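The following sketch realizes equation (8) together with a simple weighted clustering of the edge directions; a weighted histogram stands in for the K-means/BIRCH/DBSCAN/GMM clustering named in the next paragraph, and the bin count is an assumption:

```python
import cv2
import numpy as np

def dominant_edge_direction(gray_patch, n_bins=32):
    """Equation (8): per-pixel edge angle I_angle weighted by gradient
    magnitude I_weight, followed by picking the heaviest direction cluster
    (here, the heaviest histogram bin)."""
    g = gray_patch.astype(np.float64)
    Ix = cv2.Sobel(g, cv2.CV_64F, 1, 0, ksize=3)
    Iy = cv2.Sobel(g, cv2.CV_64F, 0, 1, ksize=3)
    angle = np.arctan2(Iy, Ix)             # I_angle, equation (8)
    weight = np.sqrt(Ix**2 + Iy**2)        # I_weight, equation (8)
    hist, edges = np.histogram(angle, bins=n_bins,
                               range=(-np.pi, np.pi), weights=weight)
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])  # center of the heaviest cluster
```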
In some embodiments, the method used for the clustering calculation may be any one of the K-means method, the BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies) method, the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) method, and the GMM (Gaussian Mixture Model) method.
In some embodiments, method 800 may include rotating the marker pattern matching template based on the edge direction. Rotating the marker pattern matching template based on the edge direction may align the template with the image at the candidate marker pattern corner point. The edge direction of the candidate marker pattern corner point can be used to determine the orientation, in the positioning image, of the image at the candidate marker pattern corner point. In some embodiments, by rotating the marker pattern matching template based on the edge direction, the template may be adjusted to the same or nearly the same orientation as the image at the candidate marker pattern corner point, to facilitate image matching.
Referring to fig. 8, in step 805, other markers are searched for using the first marker as a starting point. In some embodiments, in response to identifying a composite marker, the other markers are identified based on the pose marker pattern matching template. In some embodiments, the other markers include pose markers or composite markers. Fig. 10 illustrates a flow chart of a method 1000 for searching for markers, in accordance with some embodiments of the present disclosure. As shown in fig. 10, some or all of the steps of the method 1000 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1 or the processor 1320 shown in fig. 13). Some or all of the steps of method 1000 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 1000 may be performed by a robotic system (e.g., the surgical robotic system 1300 shown in fig. 13). In some embodiments, method 1000 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor (e.g., the processor 1320 shown in fig. 13). In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 10, in step 1001, a second marker is determined using the first marker as a starting point. In some embodiments, the second marker pattern corner point is searched for in a set search direction with the first marker pattern corner point as the starting point. In some embodiments, the set search direction may include at least one of: directly in front of the first marker pattern corner point (corresponding to an angular direction of 0°), directly behind (180°), directly above (90°), directly below (−90°), and oblique (e.g., angular directions of ±45°).
In some embodiments, n search directions are set; for example, with n = 8, each search direction v_sn may be determined based on the following equation (9):

v_sn = [cos(n·π/4)  sin(n·π/4)],  (n = 1, 2, …, 8)   (9)
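For illustration, the eight unit direction vectors of equation (9) can be generated directly; this is a sketch, and the variable name is not from this disclosure:

```python
import numpy as np

# Unit search directions of equation (9): eight directions spaced 45 degrees apart.
search_dirs = [np.array([np.cos(n * np.pi / 4), np.sin(n * np.pi / 4)])
               for n in range(1, 9)]
```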
in some embodiments, the search direction set in the current step may be determined according to a deviation angle between adjacent identification pattern corner points in the plurality of identification pattern corner points determined in the previous frame. Illustratively, the predetermined search direction is determined based on the following formula (10):
[Equation (10) appears only as an image in the source; it defines the two set search directions v_s1 and v_s2 based on the deviation angle between adjacent identification pattern corner points determined in the previous frame.]
where (x_j, y_j) are the two-dimensional coordinates of the identification pattern corner points determined in the previous frame (or the previous image processing cycle); n_last is the number of identification pattern corner points determined in the previous frame; v_s1 is the first set search direction; and v_s2 is the second set search direction.
In some embodiments, as shown in FIG. 11, the first identification pattern corner point P1101 is used as the search starting point, and the coordinate position of the second identification pattern corner point P1102 is searched for in the set search direction. For example, with the first identification pattern corner point P1101 as the search starting point, identification pattern corner points are searched for with a search box (e.g., the dashed box in fig. 11) advancing at a certain search step along the set search direction V1101.
In some embodiments, if at least one candidate identifier exists in the search box, the candidate identification pattern corner point with the largest corner likelihood value in the search box is preferentially selected as the second identification pattern corner point P1102. When the search box is limited to a suitable size, a candidate identification pattern corner point that has the largest corner likelihood value among the candidates appearing in the search box, during the search for the second identification pattern corner point P1102 starting from the first identification pattern corner point P1101, is highly likely to be a true identification pattern corner point. Therefore, to increase the data processing speed, the candidate with the largest corner likelihood value in the search box can be directly taken as the second identification pattern corner point P1102. In other embodiments, to improve the accuracy of identifying identification pattern corner points, when at least one candidate identifier exists in the search box, the candidate identification pattern corner point with the largest corner likelihood value among the candidates appearing in the search box is selected for identification, so as to determine whether it is indeed an identification pattern corner point. For example, a pose identification pattern matching template or a composite identification pattern matching template may be matched against the image within a certain range around the candidate identification pattern corner point with the largest corner likelihood value, and a candidate identification pattern corner point that satisfies the predetermined pattern matching degree criterion is regarded as the searched second identification pattern corner point P1102.
In some embodiments, with continued reference to FIG. 11, the size of the search box may be increased in steps, thereby increasing the search range in steps. The search step size may vary synchronously with the side length of the search box. In other embodiments, the size of the search box may be fixed.
In some embodiments, the identification pattern may be a black-and-white pattern, and pattern matching may be performed based on the correlation coefficient in equation (7). If the correlation coefficient is greater than the threshold, the candidate identification pattern corner point with the largest corner likelihood value is considered to be an identification pattern corner point and is recorded as the second identification pattern corner point.
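Equation (7) itself appears earlier in the document and is not reproduced here; the sketch below uses the normalized cross-correlation coefficient, a common choice for matching black-and-white patterns, with an assumed acceptance threshold. The names are illustrative.

```python
import numpy as np

def correlation_coefficient(patch, template):
    """Normalized cross-correlation between an image patch and a matching
    template (one common form of a correlation coefficient; the equation (7)
    referenced in the text is not reproduced here)."""
    p = patch.astype(np.float64).ravel()
    t = template.astype(np.float64).ravel()
    p -= p.mean()
    t -= t.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float(np.dot(p, t) / denom) if denom > 0 else 0.0

# A candidate corner is accepted when the coefficient exceeds a threshold,
# e.g. correlation_coefficient(patch, template) > 0.8 (threshold assumed).
```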
Referring to fig. 10, in step 1003, the search direction is determined based on the first identifier and the second identifier. In some embodiments, the search direction comprises a first search direction and a second search direction. The first search direction may be the direction starting from the coordinate position of the first identification pattern corner point and pointing away from the second identification pattern corner point. The second search direction may be the direction starting from the coordinate position of the second identification pattern corner point and pointing away from the first identification pattern corner point, for example, the search direction V1102 shown in FIG. 11.
In step 1005, the identifier is searched for in the search direction with the first identifier or the second identifier as the starting point. In some embodiments, if the first identification pattern corner point is taken as the new starting point, the first search direction in the above embodiments may be used as the search direction for searching identification pattern corner points. If the second identification pattern corner point is taken as the new starting point, the second search direction in the above embodiments may be used as the search direction. In some embodiments, the search for a new identification pattern corner point (e.g., the third identification pattern corner point P1103 in FIG. 11) may be performed similarly to step 1001. In some embodiments, the search step may be the distance L1 between the first identification pattern corner point P1101 and the second identification pattern corner point P1102.
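Putting steps 1001–1005 together, a directed search along a given direction might look like the following minimal sketch. The corner-likelihood map `cl_map`, the search-box size, and the stopping rule are illustrative assumptions rather than the disclosure's implementation.

```python
import numpy as np

def search_corner(cl_map, start, direction, step, max_dist, box=11):
    """Walk from `start` along `direction` in increments of `step`,
    looking inside a square search box for the pixel with the largest
    corner likelihood (CL) value. Returns its (x, y) position, or None
    once the accumulated search distance exceeds `max_dist`."""
    direction = np.asarray(direction, dtype=np.float64)
    direction /= np.linalg.norm(direction)
    pos = np.asarray(start, dtype=np.float64)
    travelled = 0.0
    half = box // 2
    h, w = cl_map.shape
    while travelled <= max_dist:
        pos = pos + step * direction
        travelled += step
        cx, cy = int(round(pos[0])), int(round(pos[1]))
        x0, x1 = max(cx - half, 0), min(cx + half + 1, w)
        y0, y1 = max(cy - half, 0), min(cy + half + 1, h)
        window = cl_map[y0:y1, x0:x1]
        if window.size and window.max() > 0:
            dy, dx = np.unravel_index(np.argmax(window), window.shape)
            return (x0 + dx, y0 + dy)
    return None

# Example: search for the third corner starting from the second one, with
# the step equal to the distance L1 between the first two corners and the
# maximum search distance equal to 2 * L1 (the threshold described below).
```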
In some embodiments, in response to the search distance being greater than the search distance threshold, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a candidate identification pattern corner point, and the identification pattern matching template is matched with the identification pattern at the position of that candidate corner point to identify the first identifier. In some embodiments, after determining the pixel with the largest corner likelihood value among the remaining pixels in the pixel set as the new candidate identification pattern corner point, the new first identifier may be identified based on a method similar to step 803. In some embodiments, the search distance being greater than the search distance threshold may be understood as the search distance being greater than the search distance threshold in some or all of the search directions. In some embodiments, the search distance threshold may be a set multiple of the distance between the (N-1)th pose identification pattern corner point and the (N-2)th pose identification pattern corner point, where N ≧ 3. For example, the search distance threshold may be twice the distance between the first two identification pattern corner points, so that the maximum search distance when searching for the third identification pattern corner point is twice the distance between the first and second identification pattern corner points. If no identification pattern corner point is found in the search direction within this distance, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a new candidate pose identification pattern corner point, a new first identifier is identified, and the current search process is stopped accordingly. In some embodiments, similar to method 800, a new first identification pattern corner point may be re-determined, and similar to method 1000, the remaining identification pattern corner points may be searched for with the new identification pattern corner point as the search starting point.
In some embodiments, in response to the number of identified markers being greater than or equal to the marker number threshold, a pose of the object relative to the reference coordinate system may be determined based on the identified markers, and the search for the markers may be stopped accordingly. For example, in response to the number of identified marker pattern corner points being greater than or equal to the marker number threshold, the search for marker pattern corner points is stopped. For example, when four marker pattern corners are identified, the search for marker pattern corners is stopped.
In some embodiments, in response to the number of identified identifiers being less than the identifier number threshold, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a candidate identification pattern corner point, and the identification pattern matching template is matched with the identification pattern at that corner point position to identify the first identifier. In some embodiments, if the total number of identified identification pattern corner points is less than the identifier number threshold, the search based on the first identifier in the above steps is considered to have failed. In some embodiments, if the identified identifiers do not include any composite identifier (for example, the identified identification pattern corner points do not include a composite identification pattern corner point), the search based on the first identifier in the above steps is likewise considered to have failed. In some embodiments, in the case of a search failure, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a new candidate identification pattern corner point, after which a new first identifier may be identified based on a method similar to step 803. In some embodiments, similar to method 800, a new first identification pattern corner point may be re-determined, and similar to method 1000, the remaining identification pattern corner points may be searched for with the new identification pattern corner point as the search starting point.
In some embodiments, if the identified identifiers include a composite identifier, the types of the other identifiers found by the search need not be determined (it being understood that the identifier types include pose identifiers and composite identifiers). For example, if the first identifier is a composite identifier, whether the second identifier is specifically a pose identifier or a composite identifier may be left undetermined.
In some embodiments, if the identified identifiers do not include a composite identifier, the type of each newly found identifier needs to be determined. For example, if the first identifier is not a composite identifier, it needs to be determined whether the second identifier is specifically a pose identifier or a composite identifier; if neither the first identifier nor the second identifier is a composite identifier, it needs to be determined whether the third identifier is a pose identifier or a composite identifier.
In some embodiments, after the identification pattern corner is searched or identified, the determined identification pattern corner may be sub-pixel positioned to improve the position accuracy of the identification pattern corner.
In some embodiments, a model-based fit may be applied to the CL values of the pixel points to determine the coordinates of the sub-pixel-located identification pattern corner point. For example, the fitting function of the CL values of the pixel points in each ROI may be a quadratic function whose extreme point is the sub-pixel corner location. The fitting function may be determined based on the following equations (11) and (12):
S(x, y) = ax² + by² + cx + dy + exy + f   (11)

x_c = (de − 2bc) / (4ab − e²),  y_c = (ce − 2ad) / (4ab − e²)   (12)

where S(x, y) is the fitting function of the CL values of the pixel points in each ROI; a, b, c, d, e, and f are coefficients; x_c is the x coordinate of the pose identification corner point; and y_c is the y coordinate of the pose identification corner point.
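The sub-pixel step can be illustrated with a least-squares fit of equation (11) over an ROI, returning the extremum given by equation (12). The helper below is a sketch with assumed names, not the disclosure's implementation.

```python
import numpy as np

def subpixel_corner(cl_roi):
    """Fit S(x, y) = ax^2 + by^2 + cx + dy + exy + f to the CL values in
    an ROI by least squares and return the extremum (x_c, y_c) of the
    fitted quadratic as the sub-pixel corner location."""
    h, w = cl_roi.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x, y, s = xs.ravel(), ys.ravel(), cl_roi.ravel().astype(np.float64)
    # Design matrix for the six coefficients a, b, c, d, e, f.
    A = np.column_stack([x * x, y * y, x, y, x * y, np.ones_like(s)])
    (a, b, c, d, e, f), *_ = np.linalg.lstsq(A, s, rcond=None)
    det = 4.0 * a * b - e * e
    if abs(det) < 1e-12:
        return None  # degenerate fit: no well-defined extremum
    x_c = (d * e - 2.0 * b * c) / det
    y_c = (c * e - 2.0 * a * d) / det
    return x_c, y_c
```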
In some embodiments of the present disclosure, the present disclosure also provides a computer device comprising a memory storing at least one instruction; and a processor coupled with the memory for executing at least one instruction to perform some or all of the steps in the methods of the present disclosure, such as some or all of the steps in the methods disclosed in fig. 5, 6, 7, 8, and 10.
Fig. 12 shows a block schematic diagram of a computer device 1200 according to some embodiments of the present disclosure. Referring to fig. 12, the computer device 1200 includes a Central Processing Unit (CPU) 1201, a system memory 1204 including a Random Access Memory (RAM) 1202 and a Read Only Memory (ROM) 1203, and a system bus 1205 connecting these components. The computer device 1200 also includes an input/output system and a mass storage device 1207 for storing an operating system 1212, application programs 1214, and other program modules 1215. The input/output system includes a display 1208, an input device 1209, and an input/output controller 1210 that primarily controls them.
The mass storage device 1207 is connected to the central processing unit 1201 through a mass storage controller (not shown) connected to the system bus 1205. The mass storage device 1207 and its associated computer-readable media provide non-volatile storage for the computer device. That is, the mass storage device 1207 may include a computer-readable medium (not shown) such as a hard disk or Compact disk Read-Only Memory (CD-ROM) drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, flash memory or other solid state storage technology, CD-ROM, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory and mass storage devices described above may be collectively referred to as memory.
The computer device 1200 may connect to the network 1212 through a network interface unit 1211 connected to the system bus 1205.
The system memory 1204 or mass storage device 1207 is also used to store one or more instructions. The central processor 1201, by executing the one or more instructions, implements all or part of the steps of the methods in some embodiments of the present disclosure described above.
In some embodiments of the present disclosure, the present disclosure also provides a computer-readable storage medium, in which at least one instruction is stored, and the at least one instruction is executed by a processor to cause a computer to perform some or all of the steps of the positioning method of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 5, fig. 6, fig. 7, fig. 8, and fig. 10.
In some embodiments, the present disclosure also provides a non-transitory computer-readable storage medium comprising instructions, such as a memory comprising computer programs (instructions), which are executable by a processor of a computer device to perform the methods shown in the various embodiments of the present application. For example, the non-transitory computer readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 13 illustrates a modular schematic of a surgical robotic system 1300 according to some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 13, a surgical robotic system 1300 may comprise: a surgical tool 1350, an image collector 1310, and a processor 1320. The surgical tool 1350 may include a manipulator arm 1340, an effector 1330 disposed on a distal end of the tip of the manipulator arm 1340, and at least one composite marker and a plurality of pose markers disposed on the tip of the manipulator arm 1340. The image collector 1310 may be used to collect positioning images of the operation arm 1340. Processor 1320 is coupled to image collector 1310 for performing some or all of the steps of the methods of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 5, 6, 7, 8, and 10.
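As a final illustration, the following hypothetical sketch shows how a processor such as processor 1320 might combine the identified markers' two-dimensional image coordinates with their known three-dimensional coordinates in the object coordinate system to recover the pose, here using OpenCV's perspective-n-point solver. `identify_markers` and `marker_points_3d` are placeholders standing in for the identification procedure and marker layout described above; none of the names are from this disclosure.

```python
import numpy as np
import cv2

def estimate_pose(image, camera_matrix, dist_coeffs,
                  identify_markers, marker_points_3d):
    """Sketch of the overall pipeline: identify composite and pose markers
    in the positioning image (2D), pair them with their known 3D
    coordinates in the object coordinate system, and solve a
    perspective-n-point problem for the pose of the object relative to
    the camera (reference) coordinate system."""
    # `identify_markers` stands in for the corner detection, template
    # matching, and search procedure described above; it returns the IDs
    # of the recognized markers and their 2D corner coordinates.
    ids, points_2d = identify_markers(image)
    points_3d = np.array([marker_points_3d[i] for i in ids],
                         dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(points_3d,
                                  np.asarray(points_2d, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)  # rotation matrix of the object w.r.t. camera
    return rot, tvec
```

A design note: solving PnP needs at least four 2D–3D correspondences, which is consistent with the example above of stopping the search once four identification pattern corner points have been found.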
While particular embodiments of the present disclosure have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the disclosure. Accordingly, it is intended to cover in the appended claims all such changes and modifications that are within the scope of this disclosure.

Claims (25)

1. A method for determining a pose of an object, comprising:
acquiring a positioning image;
identifying a plurality of markers on the object in the positioning image, the plurality of markers including a plurality of pose markers for identifying poses and at least one composite marker for identifying poses and angles; and
determining a pose of the object relative to a reference coordinate system based on the at least one composite marker and the plurality of pose markers.
2. The method of claim 1, comprising:
determining two-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the positioning image.
3. The method of claim 2, comprising:
and determining three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the object coordinate system based on the at least one composite marker.
4. The method of claim 3, comprising:
and determining the pose of the object coordinate system relative to the reference coordinate system as the pose of the object relative to the reference coordinate system based on the two-dimensional coordinates of the at least one composite mark and the plurality of pose marks in the positioning image and the three-dimensional coordinates in the object coordinate system.
5. The method of claim 2, comprising:
determining three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in a marker coordinate system;
determining a roll angle of the marker coordinate system relative to an object coordinate system based on the at least one composite marker;
determining three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the object coordinate system based on a roll angle of the marker coordinate system relative to the object coordinate system and the three-dimensional coordinates of the at least one composite marker and the plurality of pose markers in the marker coordinate system; and
and determining the pose of the object coordinate system relative to the reference coordinate system as the pose of the object relative to the reference coordinate system based on the two-dimensional coordinates of the at least one composite identifier and the plurality of pose identifiers in the positioning image and the three-dimensional coordinates in the object coordinate system.
6. The method of claim 5, comprising:
determining a first axial angle of one of the at least one composite marker identified in the object coordinate system;
determining a second axial angle of one of the at least one composite marker identified in the marker coordinate system; and
and determining the roll angle of the identification coordinate system relative to the object coordinate system based on the first axial angle and the second axial angle.
7. The method of claim 5, the X-axis of the marker coordinate system pointing to the composite marker, the method comprising:
and determining a first axial angle of the identified composite marker in the object coordinate system as the roll angle of the marker coordinate system relative to the object coordinate system.
8. The method of claim 6 or 7, comprising:
determining the first axial angle based on a pattern included in the composite marker.
9. The method of claim 1, comprising:
determining a plurality of candidate identifiers from the positioning image;
identifying a first token of the plurality of tokens from the plurality of candidate tokens; and
and searching other identifications by taking the first identification as a starting point.
10. The method of claim 9, comprising:
identifying the first identifier based on an identifier pattern matching template.
11. The method of claim 9, comprising:
and identifying the composite mark based on a plurality of composite mark pattern matching templates with different patterns.
12. The method of claim 9, comprising:
in response to identifying the composite marker, identifying other markers based on a pose marker pattern matching template.
13. The method of claim 9, the marker comprising a marker pattern and marker pattern corner points in the marker pattern, the method comprising:
determining a region of interest in the positioning image;
dividing the region of interest into a plurality of sub-regions;
determining the pixel with the maximum corner likelihood value in each sub-region to form a pixel set;
determining the pixel with the maximum corner likelihood value in the pixel set as a candidate identification pattern corner; and
and matching the identification pattern matching template with the identification patterns at the positions of the corner points of the candidate identification patterns to identify the first identification.
14. The method of claim 13, comprising:
and in response to the failure of matching, determining the pixel with the maximum corner likelihood value in the rest pixels in the pixel set as a candidate identification pattern corner.
15. The method of claim 9, comprising:
searching for a second identifier by taking the first identifier as a starting point;
determining a search direction based on the first identifier and the second identifier; and
and searching for the identifier in the search direction by taking the first identifier or the second identifier as a starting point.
16. The method of claim 15, comprising:
in response to the fact that the search distance is larger than the search distance threshold value, determining a pixel with the largest corner likelihood value of the rest pixels in the pixel set as a candidate identification pattern corner; and
and matching the identification pattern matching template with the identification patterns at the positions of the corner points of the candidate identification patterns to identify the first identification.
17. The method of claim 15, comprising:
in response to the number of identified markers being greater than or equal to a marker number threshold, determining a pose of the object relative to a reference coordinate system based on the identified markers.
18. The method of claim 15, comprising:
in response to the fact that the number of the identified identifications is smaller than the identification number threshold value, determining the pixel with the largest corner likelihood value of the rest pixels in the pixel set as a candidate identification pattern corner; and
and matching the identification pattern matching template with the identification patterns at the corner positions of the candidate identification patterns to identify the first identification.
19. The method of any of claims 1-7, 9-18, comprising:
determining a pose of a tip instrument of the object relative to the reference coordinate system based on the pose of the object relative to the reference coordinate system.
20. The method of any of claims 1-7, 9-18, the plurality of markers being disposed on an outer surface of a cylindrical portion of the object.
21. The method according to any one of claims 1-7 and 9-18, wherein the outer surface of the cylindrical portion of the object is provided with a positioning tag, the positioning tag comprises a plurality of identification patterns, the plurality of identification patterns comprises a plurality of different composite identification patterns and a plurality of pose identification patterns, and the plurality of different composite identification patterns and the plurality of pose identification patterns are located in the same pattern distribution band.
22. The method of claim 21, wherein N consecutive ones of the plurality of identification patterns comprise at least one composite identification pattern, wherein 2 ≦ N ≦ 4.
23. A computer device, the computer device comprising:
a memory for storing at least one instruction; and
a processor, coupled with the memory, to execute the at least one instruction to perform the method of any of claims 1-22.
24. A computer-readable storage medium having stored therein at least one instruction, the at least one instruction being executable by a processor to cause a computer to perform the method of any one of claims 1-22.
25. A surgical robotic system comprising:
a surgical tool comprising a manipulator arm, an effector disposed on a distal end of a tip of the manipulator arm, and at least one composite marker and a plurality of pose markers disposed on the tip of the manipulator arm;
the image collector is used for collecting a positioning image of the operating arm; and
a processor coupled to the image collector for performing the method of any of claims 1-22 to determine the pose of the actuator.
CN202111016342.9A 2021-08-31 2021-08-31 Method for determining pose of object and surgical robot system Pending CN115731290A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111016342.9A CN115731290A (en) 2021-08-31 2021-08-31 Method for determining pose of object and surgical robot system

Publications (1)

Publication Number Publication Date
CN115731290A true CN115731290A (en) 2023-03-03

Family

ID=85291702

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination