CN115624387A - Suturing control method, control system, readable storage medium and robot system - Google Patents


Info

Publication number: CN115624387A
Application number: CN202211058599.5A
Authority: CN (China)
Prior art keywords: suture, needle, needle holder, suturing, real-time
Legal status: Pending
Inventors: (name withheld at inventor's request), Zhu Xiang (朱祥)
Current assignee: Shanghai Microport Medbot Group Co Ltd
Original assignee: Shanghai Microport Medbot Group Co Ltd
Application filed by Shanghai Microport Medbot Group Co Ltd
Priority application: CN202211058599.5A
Publication: CN115624387A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30: Surgical robots
    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107: Visualisation of planned trajectories or target regions
    • A61B17/00: Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/04: Surgical instruments, devices or methods for suturing wounds; Holders or packages for needles or suture materials
    • A61B17/0469: Suturing instruments for use in minimally invasive surgery, e.g. endoscopic surgery


Abstract

The invention provides a suturing control method, a control system, a readable storage medium and a robot system. The suturing control method comprises the following steps. Step one: acquiring a planned suture trajectory. Step two: driving a first needle holder to clamp a suture needle and penetrate into a suture object along the planned suture trajectory, acquiring real-time morphological information of the suture object based on real-time image information of the suture region, and updating the planned suture trajectory based on that morphological information; the first needle holder is then driven to clamp the suture needle so that it passes out of the suture object along the updated planned suture trajectory. Step three: after the suture needle has passed out of the suture object by a preset length, the second needle holder is driven to clamp the suture needle, the first needle holder is released, and the second needle holder is then driven to move the clamped suture needle to a preset position. With this configuration, position and shape changes of the suture object caused by needle insertion can be effectively compensated, so that the suture needle passes out at the intended exit point and suturing accuracy is effectively improved.

Description

Suturing control method, control system, readable storage medium and robot system
Technical Field
The invention relates to the technical field of medical instruments, in particular to a suture control method, a suture control system, a readable storage medium and a robot system.
Background
A surgical robot system is designed to perform complex surgical operations accurately and in a minimally invasive manner. Where traditional surgery faces many limitations, surgical robot systems offer smaller wounds, less bleeding and faster recovery, greatly shortening postoperative hospital stays and significantly improving postoperative survival and recovery rates. They are therefore favored by both doctors and patients and, as high-end medical instruments, are now widely used in many clinical operations.
In some application scenarios, the surgical robot system may also be used to perform a suturing operation; typically, two mechanical arms hold the suture needle in turn to carry out the stitches. During suturing, however, needle insertion can displace the sutured tissue, so that the actual suture trajectory deviates significantly from the preset one and suturing accuracy is low.
Disclosure of Invention
The invention aims to provide a suturing control method, a suturing control system, a readable storage medium and a robot system that address the low accuracy of existing suturing.
In order to solve the above technical problem, a first aspect of the present invention provides a suturing control method comprising:
Step one: acquiring a planned suture trajectory;
Step two: driving a first needle holder to clamp a suture needle and penetrate into a suture object along the planned suture trajectory, acquiring real-time morphological information of the suture object based on real-time image information of the suture region, and updating the planned suture trajectory based on that morphological information; the first needle holder is then driven to clamp the suture needle so that it passes out of the suture object along the updated planned suture trajectory;
Step three: after the suture needle has passed out of the suture object by a preset length, driving a second needle holder to clamp the suture needle, releasing the first needle holder, and then driving the second needle holder to move the clamped suture needle to a preset position.
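The three claimed steps can be sketched as an abstract handover sequence. All names below (`MockHolder`, `update_plan`, the command tuples) are illustrative stand-ins for the disclosed behavior, not an API from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class MockHolder:
    """Stand-in for a needle-holder arm; records the commands it receives."""
    log: list = field(default_factory=list)

    def clamp(self):
        self.log.append("clamp")

    def release(self):
        self.log.append("release")

    def drive(self, command):
        self.log.append(command)

def suture_one_stitch(holder1, holder2, plan, observe_shape, update_plan):
    """Steps one to three as an abstract sequence: insert, re-plan from the
    observed tissue shape, withdraw along the updated plan, then hand over."""
    holder1.clamp()
    holder1.drive(("insert", plan))             # penetrate along the planned trajectory
    plan = update_plan(plan, observe_shape())   # update from real-time morphology
    holder1.drive(("withdraw", plan))           # pass out along the updated trajectory
    holder2.clamp()                             # after the preset protruding length
    holder1.release()
    holder2.drive(("move_to_preset", None))
    return plan
```

The essential ordering constraint of step three, clamp with the second holder before releasing the first, is what keeps the needle under control throughout the handover.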
Optionally, before step two, the suturing control method further comprises: calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle head of the suture needle. In step two, after the first needle holder is driven to clamp the suture needle and penetrate into the suture object along the planned suture trajectory, the real-time coordinates of the needle head are obtained from the real-time image information of the suture region and the calibrated relative coordinate relationship; the driving trajectory of the first needle holder is then corrected, based on the updated planned suture trajectory and the real-time coordinates of the needle head, so as to drive the suture needle out of the suture object.
Optionally, the step of calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle head of the suture needle comprises:
identifying the head of the first needle holder and the suture needle with a target detection algorithm applied to real-time image information of the suture region;
cropping the image of the head of the first needle holder and obtaining the coordinates of its joint point with a keypoint algorithm;
cropping the image of the suture needle and obtaining the coordinates of its needle head with a keypoint algorithm; and
calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle head of the suture needle from these two sets of coordinates.
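A minimal planar sketch of this calibration step. Expressing the joint-to-needle-head offset in the holder's own frame keeps it valid as the holder rotates; the 2-D model, the angle convention, and the function name are assumptions, since the patent does not fix a coordinate convention:

```python
import math

def calibrate_offset(joint_xy, tip_xy, holder_angle):
    """Relative coordinate relationship: the needle-head position relative to
    the holder's joint point, rotated into the holder's frame (2-D sketch).
    `holder_angle` is the holder's orientation at calibration time."""
    dx = tip_xy[0] - joint_xy[0]
    dy = tip_xy[1] - joint_xy[1]
    c, s = math.cos(-holder_angle), math.sin(-holder_angle)
    return (c * dx - s * dy, s * dx + c * dy)
```

With `holder_angle = 0` this reduces to the plain coordinate difference between the two keypoints detected above.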
Optionally, in step three, after the second needle holder is driven to clamp the suture needle, whether it has successfully clamped the needle is further determined from real-time image information of the suture region; if so, the first needle holder is released and the second needle holder is driven to move the clamped suture needle to a preset position; if not, the step of driving the second needle holder to clamp the suture needle is repeated.
Optionally, in step three, the step of determining whether the second needle holder has successfully clamped the suture needle comprises: identifying the head of the second needle holder with a target detection algorithm applied to real-time image information of the suture region; and cropping the image of the second needle holder and judging with a classification algorithm whether the suture needle is clamped.
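The verify-and-retry behavior around this grip check reduces to a small loop. `drive_clamp` and `grip_succeeded` are hypothetical callbacks standing in for the motion command and the crop-then-classify check described above:

```python
def clamp_with_check(drive_clamp, grip_succeeded, max_attempts=3):
    """Drive the holder to clamp, then verify from imagery; if the
    classifier reports failure, re-attempt up to `max_attempts` times.
    Returns the attempt number that succeeded."""
    for attempt in range(1, max_attempts + 1):
        drive_clamp()
        if grip_succeeded():
            return attempt
    raise RuntimeError("needle holder failed to grasp the suture needle")
```

The same loop serves step four, where the roles of the two holders are swapped.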
Optionally, in step three, the step of driving the second needle holder to clamp the suture needle comprises:
identifying a segmentation image of the suture needle with an image segmentation algorithm applied to real-time image information of the suture region, and extracting the centerline and outer contour of the suture needle from that segmentation image;
setting the point at 2/3 to 4/5 of the centerline's length, measured from the tail end of the suture needle toward the needle head, as a second clamping point;
setting the direction perpendicular to the tangent vector at the second clamping point, determined together with the outer contour of the suture needle, as a second clamping direction; and
driving the second needle holder to clamp the suture needle at the second clamping point along the second clamping direction.
Optionally, in step one, the step of obtaining the planned suture trajectory comprises:
identifying a segmentation image of the outer contour of the suture object with an image segmentation algorithm applied to real-time image information of the suture region, and extracting the central axis of the suture object from that segmentation image; and
setting a pair of entry and exit points in the direction perpendicular to the central axis of the suture object, according to the segmentation image of the outer contour, to obtain the planned suture trajectory.
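A planar sketch of this planning step, placing entry/exit pairs perpendicular to the extracted central axis. The equal spacing follows the preferred embodiment described later; `spacing` and `margin` are illustrative parameters, not values fixed by the patent:

```python
import math

def plan_stitches(axis, spacing, margin):
    """Walk the wound's central axis (a list of (x, y) points) and, every
    `spacing` of arc length, emit an (entry, exit) pair offset by `margin`
    to either side of the axis along the local normal."""
    pairs = []
    acc = 0.0
    next_s = spacing
    for p, q in zip(axis, axis[1:]):
        length = math.dist(p, q)
        while acc + length >= next_s:
            t = (next_s - acc) / length
            x = p[0] + t * (q[0] - p[0])
            y = p[1] + t * (q[1] - p[1])
            # unit normal: tangent rotated by 90 degrees
            nx, ny = -(q[1] - p[1]) / length, (q[0] - p[0]) / length
            pairs.append(((x + margin * nx, y + margin * ny),
                          (x - margin * nx, y - margin * ny)))
            next_s += spacing
        acc += length
    return pairs
```

In practice the margin would be derived from the outer-contour segmentation so that each point lands on healthy tissue beside the wound edge.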
Optionally, in step one, the planned suture trajectory is further updated according to input information; and/or, in step two, after the first needle holder is driven to clamp the suture needle and penetrate into the suture object along the planned suture trajectory, the planned suture trajectory is updated according to input information.
Optionally, if there is a next stitch to be made, the suturing control method further comprises:
Step four: driving the first needle holder to clamp the suture needle, and releasing the second needle holder; and
Step five: updating the planned suture trajectory for the next stitch, and returning to repeat step two and step three.
Optionally, in step four, after the first needle holder is driven to clamp the suture needle, whether it has successfully clamped the needle is further determined from real-time image information of the suture region; if so, the second needle holder is released; if not, the step of driving the first needle holder to clamp the suture needle is repeated.
Optionally, in step four, the step of determining whether the first needle holder has successfully clamped the suture needle comprises: identifying the head of the first needle holder with a target detection algorithm applied to real-time image information of the suture region; and cropping the image of the first needle holder and judging with a classification algorithm whether the suture needle is clamped.
Optionally, in step four, the step of driving the first needle holder to clamp the suture needle comprises:
identifying a segmentation image of the suture needle with an image segmentation algorithm applied to real-time image information of the suture region, and extracting the centerline and outer contour of the suture needle from that segmentation image;
setting the point at 1/5 to 1/3 of the centerline's length, measured from the tail end of the suture needle toward the needle head, as a first clamping point;
setting the direction perpendicular to the tangent vector at the first clamping point, determined together with the outer contour of the suture needle, as a first clamping direction; and
driving the first needle holder to clamp the suture needle at the first clamping point along the first clamping direction.
In order to solve the above technical problem, a second aspect of the present invention further provides a suturing control system for implementing the suturing control method described above; the suturing control system comprises an image acquisition module, an analysis processing module and a motion control module;
the image acquisition module is used for acquiring real-time image information of a suture region;
the analysis processing module is used for acquiring a planned suture track and updating the planned suture track based on real-time morphological information of the suture object;
the motion control module is used for driving the first needle holder and the second needle holder to move according to the analysis processing result of the analysis processing module.
In order to solve the above technical problem, a third aspect of the present invention also provides a readable storage medium, on which a program is stored, the program implementing the steps of the suture control method as described above when executed.
In order to solve the above technical problem, a fourth aspect of the present invention further provides a robotic system, which includes a first needle holder, a second needle holder, and the suture control system as described above.
In summary, in the suturing control method, control system, readable storage medium and robot system provided by the present invention, the suturing control method comprises: step one: acquiring a planned suture trajectory; step two: driving a first needle holder to clamp a suture needle and penetrate into a suture object along the planned suture trajectory, acquiring real-time morphological information of the suture object based on real-time image information of the suture region, and updating the planned suture trajectory based on that morphological information, the first needle holder then being driven to clamp the suture needle so that it passes out of the suture object along the updated planned suture trajectory; and step three: after the suture needle has passed out of the suture object by a preset length, driving a second needle holder to clamp the suture needle, releasing the first needle holder, and then driving the second needle holder to move the clamped suture needle to a preset position.
With this configuration, after the suture needle penetrates into the suture object, the real-time morphological information of the suture object can be monitored through the real-time image information of the suture region and the planned suture trajectory updated accordingly, so that position and shape changes of the suture object caused by needle insertion are effectively compensated, the suture needle passes out at the intended exit point, and suturing accuracy is effectively improved.
Drawings
It will be appreciated by those skilled in the art that the drawings are provided for a better understanding of the invention and do not constitute any limitation to the scope of the invention. Wherein:
FIG. 1 is a schematic view of a suturing control system in accordance with an embodiment of the present invention;
FIGS. 2a to 2f are schematic views illustrating the operation of a suturing control method according to an embodiment of the present invention;
FIGS. 3a to 3c are schematic views illustrating the deformation of the suture object caused by penetration of the suture needle, according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an object detection algorithm of an embodiment of the present invention;
FIGS. 5a and 5b are schematic diagrams of a keypoint algorithm of an embodiment of the invention;
FIGS. 6a and 6b are schematic diagrams of a classification algorithm according to an embodiment of the present invention;
FIG. 7 is a schematic illustration of determining a clamping point and a clamping direction for an embodiment of the invention;
FIG. 8 is a schematic diagram of an image segmentation algorithm according to an embodiment of the present invention;
FIG. 9 is a schematic operational flow diagram of obtaining a planned suture trajectory in accordance with an embodiment of the present invention;
FIG. 10 is a schematic diagram of a planned suture trajectory gradual update according to an embodiment of the present invention.
Detailed Description
To further clarify the objects, advantages and features of the present invention, a more particular description of the invention is given below with reference to specific embodiments illustrated in the appended drawings. It is to be noted that the drawings are in simplified form and not to scale, and are provided merely to facilitate a clear description of the embodiments. Furthermore, the structures shown in the drawings are often only part of the actual structures; individual drawings may emphasize different aspects and use different proportions.
As used in this application, the singular forms "a", "an" and "the" include plural referents, and the term "or" is generally employed in a sense including "and/or". The terms "first", "second" and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features; a feature defined as "first", "second" or "third" may explicitly or implicitly include one or more of that feature. The terms "one end" and "the other end", as well as "proximal end" and "distal end", generally refer to the two corresponding parts, not only their endpoints. In a manual or hand-operated application scenario, the terms "proximal" and "distal" are defined with respect to an operator such as a surgeon or clinician: "proximal" refers to a position of an element closer to the operator, and "distal" refers to a position closer to the surgical instrument and thus farther from the operator. Furthermore, the terms "mounted", "connected" and "disposed" on another element should be construed broadly: they merely indicate that a connection, coupling, fit or drive relationship exists between two elements, whether direct or indirect through intervening elements, and should not be read as implying any particular spatial relationship; unless the content clearly indicates otherwise, an element may be located in any orientation inside, outside, above, below or to one side of another element. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
Moreover, directional terminology, such as above, below, up, down, upward, downward, left, right, etc., is used with respect to the exemplary embodiments as they are shown in the figures, with the upward or upward direction being toward the top of the corresponding figure and the downward or downward direction being toward the bottom of the corresponding figure.
The invention aims to provide a suturing control method, a control system, a readable storage medium and a robot system that address the problem of low suturing accuracy in the prior art.
The following description refers to the accompanying drawings.
An embodiment of the present invention provides a robot system for an automatic-suturing application scenario. The robot system comprises an execution end with at least two mechanical arms, a first needle holder 11 and a second needle holder 12 (see fig. 2d). The first needle holder 11 and the second needle holder 12 are respectively mounted on two of the mechanical arms, which move under instruction control and drive the needle holders mounted on them to perform the suturing operation. In some application scenarios, the robot system is a surgical robot system. It can be understood that the surgical robot system may be a master-slave teleoperated surgical robot system whose slave end is the execution end, or a single-end surgical robot system whose execution end the operator operates directly; the present invention is not limited in this respect.
As shown in fig. 1, the surgical robot system includes a suturing control system comprising an image acquisition module 81, an analysis processing module 82 and a motion control module 83. The image acquisition module 81 is configured to acquire real-time image information of the suture region; the analysis processing module 82 is configured to acquire a planned suture trajectory and update it based on real-time morphological information of the suture object; and the motion control module 83 is configured to drive the first needle holder 11 and the second needle holder 12 according to the analysis results of the analysis processing module 82. In one example, the image acquisition module 81 includes a binocular endoscope for acquiring stereoscopic image data in the abdominal cavity; it may further include an extracorporeal image capturing device such as a camera. Preferably, the image acquisition module 81 further comprises a cold light source 12 for supplementary lighting during binocular-endoscope imaging. Preferably, the suturing control system further comprises a display device for showing the real-time image information of the suture region obtained by the image acquisition module 81. Optionally, the analysis processing module 82 comprises a server, an embedded device, or another device with image-data analysis capability common in the art. The motion control module 83 comprises, for example, the main control device and motion controller of the surgical robot, and can drive and control the mechanical arms, the first needle holder 11 and the second needle holder 12 according to instructions.
Preferably, the suturing control system further comprises a human-computer interaction module 84, wherein the human-computer interaction module 84 includes, but is not limited to, a mouse, a keyboard, a touch screen, and the like, and is used for inputting information and outputting feedback information such as image information.
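The acquire-analyze-drive decomposition above can be wired as a simple per-cycle pipeline. The interfaces below are illustrative placeholders for the three modules, not the system's actual APIs:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class SutureControlSystem:
    """Image acquisition -> analysis processing -> motion control."""
    acquire_image: Callable[[], Any]   # image acquisition module (81)
    analyze: Callable[[Any], Any]      # analysis processing module (82)
    drive: Callable[[Any], None]       # motion control module (83)

    def tick(self):
        """One control cycle: grab a frame, (re-)plan, drive the holders."""
        frame = self.acquire_image()
        trajectory = self.analyze(frame)
        self.drive(trajectory)
        return trajectory
```

Keeping the modules behind plain callables mirrors how the analysis step can be swapped between automatic computation and operator input through the human-computer interaction module.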
Based on the suturing control system described above, and referring to figs. 2a to 2f, an embodiment of the present invention provides a suturing control method for performing a suturing operation on a suture object 7. The method can be implemented with the suturing control system. In an exemplary embodiment, the suture object 7 may be a patient's wound, but it is not limited to this: it may also be, for example, a wound model phantom used for operator training or procedure verification; the present invention does not limit the suturing application scenarios of the robot system.
The suture control method includes:
Step one S1: acquiring a planned suture trajectory. As shown in fig. 2a, the planned suture trajectory comprises an entry point 101 and an exit point 102 distributed on the two sides of the suture object 7. If multiple stitches are to be made, there may be multiple planned suture trajectories along the suture object 7, which together constitute a planned suture trajectory set; the trajectories are preferably arranged at equal intervals. The planned suture trajectory set may be set in advance, for example calculated by the analysis processing module 82, or obtained from operator input; step S1 does not limit how the planned suture trajectory or the trajectory set is acquired. Specifically, the planned suture trajectory acquired in step S1 is one trajectory from the set, preferably the trajectory of the first stitch located at one end of the suture object 7. Of course, if only one stitch is needed, the set contains only one planned suture trajectory.
Step two S2: as shown in fig. 2b, the first needle holder 11 is driven to clamp the suture needle 2 and penetrate into the suture object 7 along the planned suture trajectory; real-time morphological information of the suture object 7 is acquired from real-time image information of the suture region, and the planned suture trajectory is updated based on that morphological information; the first needle holder is then driven to clamp the suture needle so that it passes out of the suture object 7 along the updated planned suture trajectory. The suture region is a continuous region comprising the suture object 7 and the space adjacent to it, and its extent can be set according to the actual surgical requirements. Preferably, the real-time image information of the suture region covers at least part of the suture object 7, the suture needle 2, the first needle holder 11 and the second needle holder 12. It will be appreciated that the suture needle 2 enters the suture object 7 at the entry point 101 of the planned suture trajectory and leaves it at the exit point 102.
Referring to figs. 3a to 3c, since the suture object 7 is generally elastic tissue, it deforms under the piercing force while the suture needle 2 passes into and out of it, so that the planned suture trajectory acquired before suturing is no longer suitable for the remaining suture path. In step S2, the suture needle 2 may therefore pause after penetrating the suture object 7 by a certain length; at that moment, real-time morphological information of the suture object 7 is obtained by recognizing the real-time image information of the suture region, and the planned suture trajectory is updated accordingly, for example by adjusting the position of the exit point 102. In this way, position and shape changes of the suture object 7 caused by needle insertion are effectively compensated, the suture needle 2 passes out at the updated exit point 102, and suturing accuracy is effectively improved. The acquisition of the real-time morphological information and the trajectory update may be computed by the analysis processing module 82, or the operator may recognize the current shape of the suture object 7 manually and redraw and input the trajectory through the human-computer interaction module 84.
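The patent leaves the exact update rule open. One minimal sketch is to track a tissue landmark near the planned exit point and shift the exit point by the landmark's observed displacement; this rigid-shift assumption and all names are illustrative:

```python
def update_exit_point(exit_point, landmark_before, landmark_after):
    """Shift the planned exit point (reference 102) by the displacement of a
    nearby tissue landmark measured before and after needle insertion."""
    dx = landmark_after[0] - landmark_before[0]
    dy = landmark_after[1] - landmark_before[1]
    return (exit_point[0] + dx, exit_point[1] + dy)
```

A fuller implementation would fit a deformation field over many landmarks from the segmentation, but the single-landmark shift already captures the claimed behavior: the exit point follows the tissue, not the original plan.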
Step three S3: after the suture needle 2 has passed out of the suture object 7 by a preset length (as shown in fig. 2c), the second needle holder 12 is driven to clamp the suture needle 2 (fig. 2d), the first needle holder 11 is released (fig. 2e), and the second needle holder 12 is then driven to move the clamped suture needle 2 to a preset position. In this step, the preset length may be set according to the length and shape of the suture needle 2, the size of the suture object 7, and so on, and the preset position may be set according to the actual surgical scene. In one example, the second needle holder 12 is driven to carry the suture needle 2 over the suture object 7, avoiding it and other obstacles, until the preset position is reached.
Optionally, if there is a next stitch to be made, the suturing control method further comprises a step four S4: driving the first needle holder 11 to clamp the suture needle 2 (as shown in fig. 2f) and releasing the second needle holder 12; and
Step five S5: updating the planned suture trajectory for the next stitch, and returning to repeat step two S2 and step three S3. Optionally, updating the planned suture trajectory for the next stitch means selecting, from the planned suture trajectory set, the trajectory adjacent to the stitch just completed, so that when steps S2 and S3 are repeated the adjacent stitch is sutured. Suturing is finished once all planned suture trajectories in the set have been executed along the suture object 7.
With the above arrangement, after the suture needle 2 penetrates the suture object, the real-time morphological information of the suture object 7 can be monitored through the real-time image information of the suture region and the planned suture track updated accordingly. The position change and shape change of the suture object 7 caused by the needle-insertion action are thereby handled effectively, the suture needle 2 can pass out at the intended exit point 102, and suturing accuracy is effectively improved.
Optionally, before the second step S2, the suture control method further includes:
step S1a: calibrating the relative coordinate relationship between the joint point 111 of the first needle holder 11 and the needle head 21 of the suture needle 2;
in the second step S2, after the first needle holder 11 is driven to clamp the suture needle 2 and penetrate the suture object 7 along the planned suture track, the real-time coordinates of the needle head 21 of the suture needle 2 are obtained from the real-time image information of the suture region and the calibrated relative coordinate relationship; the driving track of the first needle holder 11 is then corrected, based on the updated planned suture track and the real-time coordinates of the needle head 21 of the suture needle 2, so as to drive the suture needle 2 out of the suture object 7.
Generally, as shown in FIG. 3a, the suture needle 2 has a needle head 21 and a tail end 22; the needle head 21 is used for puncture, and the tail end 22 is connected with the suture thread. When the first needle holder 11 is driven to clamp the suture needle 2 and penetrate the suture object 7, the needle head 21 first enters at the penetration point 101 of the planned suture track, after which the first needle holder 11 rotates the suture needle 2 into the suture object 7 following the needle's curvature. Before penetration, the needle head 21 can be captured by the image acquisition module 81 and its real-time coordinates calculated by the analysis processing module 82. Once the suture needle 2 has penetrated the suture object 7, however, the needle head 21 is occluded by the suture object 7 (as shown in fig. 3 b) and can no longer be captured directly by the image acquisition module 81. Its real-time coordinates are then lost, and when the planned suture track changes because of the position and shape changes caused by the needle-insertion action, the trajectory actually traveled by the suture needle 2 is difficult to control accurately. The inventor found that the first needle holder 11 includes a joint point 111 whose coordinates are fixed relative to the needle head 21 once the holder grips the needle, and that the joint point 111 remains outside the suture object 7 throughout suturing; the coordinates of the needle head 21 can therefore be obtained indirectly by monitoring the coordinates of the joint point 111.
Based on the above findings, in this embodiment the relative coordinate relationship between the joint point 111 of the first needle holder 11 and the needle head 21 of the suture needle 2 is calibrated in step S1a before the second step S2. The real-time coordinates of the needle head 21 inside the suture object 7 can then be derived merely by capturing the coordinates of the joint point 111 of the first needle holder 11 with the image acquisition module 81, so the advancing trajectory of the suture needle 2 can be accurately controlled and suturing accuracy effectively improved.
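The indirect tip-tracking idea above can be illustrated with a minimal sketch. This is not the patent's implementation: it assumes the joint-point-to-needle-head relation can be treated as a constant translation in camera coordinates (i.e., the holder's orientation does not change between calibration and use); a full implementation would track the holder's pose and apply a rigid transform. All coordinate values are made up for the example.

```python
import numpy as np

def calibrate_offset(joint_xyz, needle_head_xyz):
    """Calibrate the fixed offset from the holder's joint point to the
    needle head while both are still visible (before penetration).
    Simplifying assumption: the relation is a pure translation."""
    return np.asarray(needle_head_xyz, float) - np.asarray(joint_xyz, float)

def estimate_needle_head(joint_xyz, offset):
    """Recover the occluded needle-head coordinate from the joint point,
    which stays outside the tissue and visible to the camera."""
    return np.asarray(joint_xyz, float) + offset

# Before penetration: both keypoints detected in the image (made-up values).
offset = calibrate_offset(joint_xyz=[10.0, 5.0, 2.0],
                          needle_head_xyz=[12.5, 6.0, 2.0])
# After penetration: only the joint point is visible; infer the needle head.
head = estimate_needle_head(joint_xyz=[11.0, 5.5, 2.1], offset=offset)
print(head)  # [13.5  6.5  2.1]
```

Under this simplification, a single frame in which both keypoints are detected suffices to calibrate; afterwards only the joint point needs to be tracked.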
Optionally, the step S1a of calibrating the relative coordinate relationship between the joint point 111 of the first needle holder 11 and the needle head 21 of the suture needle 2 includes:
step S1a1: identifying and obtaining the head of the first needle holder 11 and the suture needle 2 according to a target detection algorithm based on real-time image information of a suture region;
step S1a2: intercepting an image of the head of the first needle holder 11, and obtaining coordinates of a joint point 111 of the first needle holder 11 according to a key point algorithm;
step S1a3: intercepting the image of the suture needle 2, and obtaining the coordinate of the needle head 21 of the suture needle 2 according to a key point algorithm;
step S1a4: and calibrating the relative coordinate relationship between the joint point 111 of the first needle holder 11 and the needle head 21 of the suture needle 2 according to the coordinate of the joint point 111 of the first needle holder 11 and the coordinate of the needle head 21 of the suture needle 2.
The target detection algorithm and the key point algorithm may be based on a neural network algorithm, and those skilled in the art may select an appropriate target detection algorithm and key point algorithm according to the prior art to implement the above steps. The target detection algorithm and the key point algorithm are exemplarily described below with reference to fig. 4 and 5.
Referring to FIG. 4, a target frame 110 of the head of the first needle holder 11 and a target frame 20 of the suture needle 2 are identified by the target detection algorithm. Here Conv denotes a convolution operation, NMS denotes non-maximum suppression, FC denotes a fully connected layer, and Relu denotes the rectified linear unit (nonlinear mapping). The convolutional backbone uses the Google inceptionV1 network, corresponding to the first stage in fig. 4, with 20 layers in total; this stage mainly performs feature extraction and improves the generalization ability of the model. The target detection stage then passes through 4 convolutional layers and 2 fully connected layers and finally produces a 7x7x30 output; the purpose of the 4 preceding convolutional layers is likewise to improve the model's generalization ability. In the YOLO scheme, the 448x448 original image is divided into a 7x7 grid, and each cell is responsible for detecting objects whose center points fall within it. The NMS screening layer selects the most suitable of the many candidate bounding boxes: boxes whose scores fall below a threshold are first filtered out, and non-maximum suppression is applied to the remainder to remove boxes with high overlap, yielding the final boxes and their categories.
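The NMS screening described above can be sketched as a generic standalone routine; the score and IoU thresholds here are illustrative, as the patent does not specify values:

```python
import numpy as np

def nms(boxes, scores, score_thresh=0.5, iou_thresh=0.5):
    """Non-maximum suppression: drop boxes below the score threshold, then
    greedily keep the highest-scoring box and suppress remaining boxes that
    overlap it too much. Boxes are [x1, y1, x2, y2]."""
    boxes = np.asarray(boxes, float)
    scores = np.asarray(scores, float)
    keep_mask = scores >= score_thresh
    boxes, scores = boxes[keep_mask], scores[keep_mask]
    order = scores.argsort()[::-1]          # highest score first
    kept = []
    while order.size:
        i = order[0]
        kept.append(i)
        # Intersection of the kept box with all remaining candidates.
        xx1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        yy1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        xx2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        yy2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        order = order[1:][iou <= iou_thresh]  # suppress high-overlap boxes
    return boxes[kept]

boxes = [[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]]
scores = [0.9, 0.8, 0.95]
kept_boxes = nms(boxes, scores)
print(kept_boxes)  # the two near-duplicates collapse to one box
```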
Please refer to fig. 5a and fig. 5b, which illustrate extraction of the joint point 111 of the first needle holder 11 by the keypoint algorithm. In the flow of fig. 5a, Conv denotes a convolution operation, Concat denotes feature-map channel fusion, Upsample denotes upsampling, BatchNorm2d denotes batch normalization, and Relu denotes the nonlinear mapping. The real-time image information of the suture region acquired by the image acquisition module 81 is compressed to 512x512 and fed into a feature extraction network (feature extractor); the features f4, f3, f2 and f1 output by layers layer1, layer2, layer3 and layer4 are sent to a feature fusion layer. f1 is upsampled and concatenated with f2, and 1x1 and 3x3 convolutions then yield h2; h3 and h4 are obtained in the same way. h4 is convolved with a 3x3 kernel and fed to a convolutional prediction branch to obtain the Score map. The coordinates of the first needle holder 11 are represented in the Score map as a heat map; the coordinates of all joint points 111 of the first needle holder 11 are obtained by peak detection and mapped back onto the original image by coordinate mapping.
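Reading a joint-point coordinate out of the Score map — peak detection followed by mapping back to the original image — can be sketched as follows. This is a minimal version assuming a single peak and a plain argmax; real systems often add sub-pixel refinement:

```python
import numpy as np

def heatmap_to_keypoint(score_map, orig_size):
    """Take the peak of a heat-map Score map and map it back to the
    original image by the size ratio between map and image."""
    h, w = score_map.shape
    peak = np.unravel_index(np.argmax(score_map), score_map.shape)  # (row, col)
    scale_y = orig_size[0] / h
    scale_x = orig_size[1] / w
    return peak[0] * scale_y, peak[1] * scale_x

score_map = np.zeros((128, 128))
score_map[32, 64] = 1.0  # synthetic peak for the example
kp = heatmap_to_keypoint(score_map, orig_size=(512, 512))
print(kp)  # (128.0, 256.0)
```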
Optionally, in the third step S3, after the step of driving the second needle holder 12 to hold the suture needle 2, whether the second needle holder 12 successfully holds the suture needle 2 is further determined based on real-time image information of a suture region; if the judgment result is yes, continuing to loosen the first needle holder 11 and further driving the second needle holder 12 to clamp the suture needle 2 and move to a preset position; if the judgment result is negative, the step of driving the second needle holder 12 to clamp the suture needle 2 is executed again.
In step S3, once the length of the suture needle 2 that has passed out reaches the preset length, the second needle holder 12 can clamp the suture needle 2 and the needle hand-off begins. In practice, however, the hand-off frequently fails, so this embodiment adds an analysis and determination step checking whether the second needle holder 12 has successfully clamped the suture needle 2. Only when the second needle holder 12 clamps the suture needle 2 accurately does the first needle holder 11 release it and the second needle holder 12 pull it out of the suture object 7; otherwise the position of the second needle holder 12 is adjusted repeatedly until the second needle holder 12 can clamp the suture needle 2 accurately.
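The grasp-verify-retry logic above can be sketched as a simple control loop; `second_holder` and `verifier` are hypothetical stand-ins for the motion control module and the image-based classification check, and `max_attempts` is an illustrative safety bound not specified by the patent:

```python
def hand_off(second_holder, verifier, max_attempts=5):
    """Repeat the grasp until the classifier confirms the second holder
    actually holds the needle; only then is it safe to release the first
    holder. Returns False if no attempt succeeds."""
    for _ in range(max_attempts):
        second_holder.grasp()
        if verifier():          # classification on the cropped holder image
            return True         # safe to release the first holder
        second_holder.adjust()  # re-position and try again
    return False

# Fake holder for illustration: the grasp "succeeds" on the third try.
class FakeHolder:
    def __init__(self):
        self.grasps = 0
    def grasp(self):
        self.grasps += 1
    def adjust(self):
        pass

holder = FakeHolder()
ok = hand_off(holder, verifier=lambda: holder.grasps >= 3)
print(ok, holder.grasps)  # True 3
```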
Optionally, in step three S3, the step of determining whether the second needle holder successfully holds the suture needle based on the real-time image information of the suture region includes:
step S31: identifying and obtaining the head of the second needle holder 12 according to a target detection algorithm based on real-time image information of a suture region;
step S32: and intercepting the image of the second needle holder 12, and judging whether the suture needle 2 is successfully clamped or not according to a classification algorithm.
The target detection algorithm here is as described above and is not repeated. The classification algorithm includes, but is not limited to, image processing algorithms, artificial intelligence algorithms, deep learning algorithms and the like, and is described below with reference to fig. 6a and 6b. In the flow of fig. 6a, Conv denotes a convolution operation, Maxpool denotes max pooling, and FC denotes a fully connected layer. Traditional convolutional or fully connected networks suffer to some degree from information loss during forward propagation, as well as vanishing or exploding gradients that make deep networks hard to train. The classification network of this embodiment is similar to VGG but improved on that basis: residual units are added through a shortcut mechanism. The basic unit is still the structure of convolution, BN and an activation function, but a residual connection is added at the output of each unit, the unit's output and input being summed and passed through an activation function to form the final output. This mitigates the information-loss problem to a certain extent: bypassing the input directly to the output protects the integrity of the information, and the network only has to learn the difference between input and output, which simplifies the learning objective and its difficulty.
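The effect of the residual (shortcut) connection can be illustrated with a toy unit in which a linear layer stands in for the convolution/BN sub-block. When the learned transform F(x) is zero, the unit reduces to the activation applied to the unchanged input, which is why the network only has to learn the input-output difference:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_unit(x, w):
    """One residual unit in the spirit described above: F(x) (here a toy
    linear layer standing in for conv + BN) is added to the shortcut input
    before the final activation."""
    fx = x @ w            # stand-in for convolution + batch norm
    return relu(fx + x)   # shortcut connection, then activation

x = np.array([1.0, -2.0, 3.0])
w = np.zeros((3, 3))      # F(x) = 0, so the unit reduces to relu(x)
y = residual_unit(x, w)
print(y)  # [1. 0. 3.]
```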
Further, referring to fig. 7, in the third step S3, the step of driving the second needle holder 12 to clamp the suture needle 2 includes:
step S61: based on the real-time image information of the suture region, the segmentation image of the suture needle 2 is identified through an image segmentation algorithm, and a central line 23 (which refers to a central axis extending along the bending direction of the suture needle 2) and an outer contour 24 of the suture needle 2 are extracted according to the segmentation image of the suture needle 2.
Step S62: the position of 2/3 to 4/5 of the length of the center line 23 is set as a second clamping point 26 along the extending direction of the center line 23 of the sewing needle from the tail end 22 of the sewing needle to the needle head 21.
Step S63: the direction of the tangent 261 to the second clamping point 26 is set as the second clamping direction 28 in conjunction with the outer contour 24 of the suture needle 2.
Step S64: the second needle holder 12 is driven to hold the suture needle 2 at the second holding point 26 along the second holding direction 28.
In step S61, a person skilled in the art can select a suitable image segmentation algorithm from the prior art. Fig. 8 shows an exemplary image segmentation algorithm whose underlying framework is a classical fully convolutional network (i.e., the network contains no fully connected operation). The network input is a 572x572 image (input image tile) whose edges have been mirror-padded; the real-time image information of the suture region may be compressed to this size. The left side of the network is a series of downsampling operations consisting of convolution and Max Pooling. This compression path consists of 4 blocks, each using 3x3 valid (unpadded) convolutions and one Max Pooling downsampling, and the number of Feature Maps doubles after each downsampling, giving the Feature Map size changes shown in fig. 8; a Feature Map of size 32x32 is finally obtained. The right part of the network is called the expansion path. Each block first doubles the Feature Map size by deconvolution and halves the number of Feature Maps (the last layer differs slightly), then merges with the Feature Map of the symmetric compression path on the left. The convolutions of the expansion path are still valid convolutions, and the resulting Feature Map has size 388x388. Since this is a binary segmentation task, the network has two output Feature Maps.
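With the centerline from step S61 in hand, the geometric selection of steps S62 to S64 can be sketched numerically. The sketch picks the midpoint of the 2/3 to 4/5 arc-length window (an illustrative choice; the patent only specifies the window) on a centerline ordered from tail to needle head, and uses the local segment direction as the tangent:

```python
import numpy as np

def clamp_point_and_tangent(centerline, lo=2/3, hi=4/5):
    """Pick a clamping point on a polyline centerline ordered from the
    needle tail (index 0) to the needle head: take the midpoint of the
    [lo, hi] arc-length window and the unit tangent of its segment."""
    pts = np.asarray(centerline, float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # arc length at each vertex
    target = 0.5 * (lo + hi) * s[-1]              # midpoint of the window
    i = int(np.searchsorted(s, target))
    i = min(max(i, 1), len(pts) - 1)              # clamp to a valid segment
    t = (target - s[i - 1]) / (s[i] - s[i - 1])
    point = pts[i - 1] + t * (pts[i] - pts[i - 1])
    tangent = (pts[i] - pts[i - 1]) / np.linalg.norm(pts[i] - pts[i - 1])
    return point, tangent

# Straight test centerline of length 10 from tail (0,0) to head (10,0):
# the window midpoint sits at 11/15 of the length.
point, tangent = clamp_point_and_tangent([[0, 0], [10, 0]])
print(point, tangent)
```

For a curved needle the centerline would be a dense polyline from the segmentation mask, and the same arc-length bookkeeping applies unchanged.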
Optionally, in the fourth step S4, the step of driving the first needle holder 11 to clamp the suture needle 2 includes:
step S71: based on the real-time image information of the suture region, the segmentation image of the suture needle 2 is identified through an image segmentation algorithm, and a central line 23 (which refers to a central axis extending along the bending direction of the suture needle 2) and an outer contour 24 of the suture needle 2 are extracted according to the segmentation image of the suture needle 2.
Step S72: the position of 1/5 to 1/3 of the length of the center line 23 is set as a first pinch point 25 in the direction from the trailing end 22 of the suture needle to the needle head 21 in the extending direction of the center line 23 of the suture needle.
Step S73: the direction of the tangential vector 251 passing through the first clamping point 25 is set as the first clamping direction 27 in conjunction with the outer contour 24 of the suture needle 2.
Step S74: the first needle holder 11 is driven to hold the suture needle 2 at the first holding point 25 along the first holding direction 27. The setting principle, the setting purpose and the implementation manner of steps S71 to S74 can refer to the aforementioned steps S61 to S64, which are not repeated here.
Optionally, in the step four S4, after the step of driving the first needle holder 11 to hold the suture needle 2, whether the first needle holder 11 successfully holds the suture needle 2 is further determined based on real-time image information of a suture region; if the judgment result is yes, continuing to perform the step of loosening the second needle holder 12; if the judgment result is no, the step of driving the first needle holder 11 to clamp the suture needle 2 is executed again.
Optionally, in the fourth step S4, the step of determining whether the first needle holder 11 successfully holds the suture needle 2 based on the real-time image information of the suture region includes:
step S41: identifying and obtaining the head of the first needle holder 11 according to a target detection algorithm based on real-time image information of a suture region;
step S42: and intercepting the image of the first needle holder 11, and judging whether the suture needle 2 is successfully clamped or not according to a classification algorithm. The setting principle, the setting purpose and the implementation manner of step S41 and step S42 can refer to the aforementioned step S31 and step S32, and are not repeated here.
Similarly, after the second needle holder 12 holds the suture needle 2 and moves it to the preset position, an analysis and determination step checking whether the first needle holder 11 has successfully clamped the suture needle 2 may be added. Only when the first needle holder 11 clamps the suture needle 2 accurately does the second needle holder 12 release it; otherwise the position of the first needle holder 11 is adjusted repeatedly until the first needle holder 11 can clamp the suture needle 2 accurately, preventing the suture needle 2 from dropping while held by neither needle holder.
Optionally, referring to fig. 9, in step one S1, the step of obtaining the planned suture trajectory includes:
step S11: identifying a segmentation image 71 of the outer contour of the suturing object 7 through an image segmentation algorithm based on real-time image information of a suturing area, and extracting a central axis 72 of the suturing object 7 according to the segmentation image 71 of the outer contour of the suturing object 7; the image segmentation algorithm in step S11 can be referred to above and is not repeated here.
Step S12: according to the segmented image 71 of the outer contour of the suturing object 7, a set of an entry point 101 and an exit point 102 is set in a direction perpendicular to the central axis 72 of the suturing object 7, and the planned suturing trajectory is obtained.
Preferably, in step one S1, the planned suture trajectory is further updated according to input information; and/or, in the second step S2, after the first needle holder 11 is driven to clamp the suture needle 2 and penetrate the suture object 7 along the planned suture trajectory, the planned suture trajectory is updated according to input information. In some embodiments, the planned suture trajectory is not limited to being automatically calculated by the analysis processing module 82 but may also be input by an operator (e.g., a physician). Optionally, in the first step S1 and the second step S2, the operator may evaluate the feasibility of the planned suture trajectory automatically calculated by the analysis processing module 82 and input a modified trajectory on top of it to guide its adjustment and update, or may draw and input the planned suture trajectory entirely manually without an automatically calculated one. To this end, the suture control system further comprises a human-machine interaction module for acquiring the information input by the operator. Fig. 10 illustrates how the planned suture trajectory is updated progressively as each needle is sutured.
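The entry/exit placement of step S12 — one point pair per stitch, set perpendicular to the central axis 72 of the suture object — can be sketched in 2D image coordinates; the stitch margin here is an illustrative parameter the patent does not specify:

```python
import numpy as np

def stitch_points(axis_point, axis_dir, margin):
    """Place one entry/exit pair for a planned stitch: offset a point on
    the wound's central axis by +/- margin along the in-plane normal, so
    the stitch crosses the wound perpendicular to the axis."""
    d = np.asarray(axis_dir, float)
    d = d / np.linalg.norm(d)
    normal = np.array([-d[1], d[0]])        # 90-degree rotation in the plane
    p = np.asarray(axis_point, float)
    entry = p + margin * normal
    exit_ = p - margin * normal
    return entry, exit_

# Wound axis running along +x; stitch points land 3 px either side in y.
entry, exit_ = stitch_points(axis_point=[10, 20], axis_dir=[1, 0], margin=3.0)
print(entry, exit_)  # [10. 23.] [10. 17.]
```

Repeating this at sampled points along the central axis yields the planned suture trajectory set from which each next stitch is drawn.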
Further, an embodiment of the present invention provides a readable storage medium on which a program is stored; when executed, the program implements the steps of the suture control method described above. Still further, an embodiment of the present invention provides a computer device comprising a processor and the readable storage medium as above, the processor being configured to execute the program stored on the readable storage medium. The readable storage medium may be provided independently or integrated into a robot system, which is not limited by the present invention.
In summary, in the suture control method, control system, readable storage medium and robot system according to the present invention, the suture control method includes the following. Step one: acquiring a planned suture track. Step two: driving a first needle holder to clamp a suture needle and penetrate into a suture object along the planned suture track, acquiring real-time morphological information of the suture object based on real-time image information of a suture region, and updating the planned suture track based on the real-time morphological information of the suture object; the first needle holder is further driven to clamp the suture needle and pass out of the suture object along the updated planned suture track. Step three: after the suture needle has passed out of the suture object by a preset length, driving a second needle holder to clamp the suture needle, releasing the first needle holder, and further driving the second needle holder to move the clamped suture needle to a preset position. With this configuration, after the suture needle penetrates the suture object, the real-time morphological information of the suture object can be monitored through the real-time image information of the suture region and the planned suture track updated accordingly; the position change and shape change of the suture object caused by the needle-insertion action are thereby effectively handled, the suture needle can pass out at the intended exit point, and suturing accuracy is effectively improved.
It should be noted that the above embodiments may be combined with each other. The above description is only for the purpose of describing the preferred embodiments of the present invention, and is not intended to limit the scope of the present invention, and any variations and modifications made by those skilled in the art based on the above disclosure are within the scope of the appended claims.

Claims (15)

1. A suture control method, comprising:
the method comprises the following steps: acquiring a planned suture track;
step two: driving a first needle holder to clamp a suture needle to penetrate into a suture object along the planned suture track, acquiring real-time morphological information of the suture object based on real-time image information of a suture area, and updating the planned suture track based on the real-time morphological information of the suture object; the first needle holder is further driven to clamp the suture needle and penetrate out of the suture object along the updated planned suture track;
step three: and after the suture needle penetrates out of the suture object to reach a preset length, driving a second needle holder to clamp the suture needle, loosening the first needle holder, and further driving the second needle holder to clamp the suture needle and move to a preset position.
2. The suture control method according to claim 1, wherein before the second step, the suture control method further comprises: calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle head of the suture needle; in the second step, after the first needle holder is driven to clamp the suture needle and penetrate into the suture object along the planned suture track, the real-time coordinate of the needle head of the suture needle is obtained based on the real-time image information of the suture area and the relative coordinate relation; and correcting the driving track of the first needle holder based on the updated planned suture track according to the real-time coordinates of the needle head of the suture needle so as to drive the suture needle to penetrate out of the suture object.
3. The suturing control method according to claim 2, wherein the step of calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle tip of the suturing needle comprises:
identifying and obtaining the head of the first needle holder and the suture needle according to a target detection algorithm based on real-time image information of a suture region;
intercepting an image of the head of the first needle holder, and obtaining coordinates of a joint point of the first needle holder according to a key point algorithm;
intercepting the image of the suture needle, and obtaining the coordinate of the needle head of the suture needle according to a key point algorithm;
and calibrating the relative coordinate relationship between the joint point of the first needle holder and the needle head of the suture needle according to the coordinate of the joint point of the first needle holder and the coordinate of the needle head of the suture needle.
4. The suturing control method according to claim 1, wherein in the third step, after the step of driving the second needle holder to clamp the suturing needle, it is further determined whether the second needle holder successfully clamps the suturing needle based on real-time image information of a suturing area; if the judgment result is yes, continuing to loosen the first needle holder and further driving the second needle holder to clamp the suture needle and move to a preset position; if the judgment result is negative, the step of driving the second needle holder to clamp the suture needle is executed again.
5. The suturing control method according to claim 4, wherein the step three of determining whether the second needle holder successfully holds the suturing needle based on real-time image information of the suturing area comprises: identifying and obtaining the head of the second needle holder according to a target detection algorithm based on real-time image information of a suture region; and intercepting the image of the second needle holder, and judging whether the suture needle is successfully clamped or not according to a classification algorithm.
6. The suture control method according to claim 1, wherein in the third step, the step of driving the second needle holder to hold the suture needle comprises:
identifying a segmentation image of the suture needle through an image segmentation algorithm based on real-time image information of a suture region, and extracting a central line and an outer contour of the suture needle according to the segmentation image of the suture needle;
setting the position of 2/3-4/5 of the length of the central line as a second clamping point along the extending direction of the central line of the sewing needle from the tail end of the sewing needle to the needle head;
setting the direction of a tangent vector passing through the second clamping point as a second clamping direction, in combination with the outer contour of the suture needle;
and driving the second needle holder to clamp the suture needle at the second clamping point along the second clamping direction.
7. The suturing control method according to claim 1, wherein in the first step, the step of obtaining a planned suturing trajectory comprises:
identifying a segmentation image of the outer contour of the suture object through an image segmentation algorithm based on real-time image information of a suture area, and extracting a central axis of the suture object according to the segmentation image of the outer contour of the suture object;
and setting a group of entry points and exit points in a direction perpendicular to a central axis of the suture object according to the segmentation image of the outer contour of the suture object, to obtain the planned suture track.
8. The suturing control method of claim 1, wherein in the first step, the planned suturing trajectory is further updated based on input information; and/or in the second step, after the first needle holder is driven to clamp the suture needle and penetrate into the suture object along the planned suture track, the planned suture track is updated according to input information.
9. The suturing control method according to claim 1, wherein if there is a next needle to be sutured, the suturing control method further comprises:
step four: driving the first needle holder to clamp the suture needle and loosening the second needle holder; and
step five: and updating the planned suture track of the next needle to be sutured, and returning to repeatedly execute the second step and the third step.
10. The suturing control method according to claim 9, wherein in the fourth step, after the step of driving the first needle holder to clamp the suturing needle, it is further determined whether the first needle holder successfully clamps the suturing needle based on real-time image information of a suturing area; if the judgment result is yes, continuing to perform the step of loosening the second needle holder; if the judgment result is negative, the step of driving the first needle holder to clamp the suture needle is executed again.
11. The suturing control method according to claim 10, wherein the step of determining whether the first needle holder successfully holds the suturing needle based on real-time image information of the suturing area in the fourth step includes: identifying and obtaining the head of the first needle holder according to a target detection algorithm based on real-time image information of a suture region; and intercepting the image of the first needle holder, and judging whether the suture needle is successfully clamped or not according to a classification algorithm.
12. The suture control method according to claim 10, wherein in the fourth step, the step of driving the first needle holder to hold the suture needle comprises:
identifying a segmentation image of the suture needle through an image segmentation algorithm based on real-time image information of a suture region, and extracting a central line and an outer contour of the suture needle according to the segmentation image of the suture needle;
setting the position of 1/5-1/3 of the length of the central line as a first clamping point along the extending direction of the central line of the sewing needle from the tail end of the sewing needle to the direction of a needle head;
setting the direction of a tangent vector passing through the first clamping point as a first clamping direction, in combination with the outer contour of the suture needle;
and driving the first needle holder to clamp the suture needle at the first clamping point along the first clamping direction.
13. A suture control system for implementing the suture control method according to any one of claims 1 to 12; the suture control system comprises: an image acquisition module, an analysis processing module and a motion control module;
the image acquisition module is used for acquiring real-time image information of a suture region;
the analysis processing module is used for acquiring a planned suture track and updating the planned suture track based on real-time morphological information of the suture object;
the motion control module is used for driving the first needle holder and the second needle holder to move according to the analysis processing result of the analysis processing module.
14. A readable storage medium on which a program is stored, characterized in that the program, when executed, implements the steps of the suture control method according to any one of claims 1 to 12.
15. A robotic system comprising a first needle holder, a second needle holder, and the suture control system of claim 13.
CN202211058599.5A 2022-08-30 2022-08-30 Stitching control method, control system, readable storage medium and robot system Pending CN115624387A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211058599.5A CN115624387A (en) 2022-08-30 2022-08-30 Stitching control method, control system, readable storage medium and robot system


Publications (1)

Publication Number Publication Date
CN115624387A true CN115624387A (en) 2023-01-20

Family

ID=84903193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211058599.5A Pending CN115624387A (en) 2022-08-30 2022-08-30 Stitching control method, control system, readable storage medium and robot system

Country Status (1)

Country Link
CN (1) CN115624387A (en)

Similar Documents

Publication Publication Date Title
KR102013857B1 (en) Method and apparatus for generating learning data based on surgical video
AU2019352792B2 (en) Indicator system
US20200330159A1 (en) Path planning method with artificial potential field based on obstacle classification and medical system for steering flexible needle
US20190008598A1 (en) Fully autonomic artificial intelligence robotic system
JP2021531910A (en) Robot-operated surgical instrument location tracking system and method
WO2014068106A1 (en) Imaging system, operating device with the imaging system and method for imaging
DE112016006299T5 (en) Medical safety control device, medical safety control method and medical support system
WO2017098505A1 (en) Autonomic system for determining critical points during laparoscopic surgery
WO2017098506A9 (en) Autonomic goals-based training and assessment system for laparoscopic surgery
KR20210056239A (en) Surgical scene assessment based on computer vision
US20220104887A1 (en) Surgical record creation using computer recognition of surgical events
CN112704566B (en) Surgical consumable checking method and surgical robot system
US20240099556A1 (en) Systems and methods for object measurement in minimally invasive robotic surgery
US20210267692A1 (en) Systems and methods for performing robotic surgery
JP7395125B2 (en) Determining the tip and orientation of surgical tools
WO2021067591A2 (en) Systems and methods for use of stereoscopy and color change magnification to enable machine learning for minimally invasive robotic surgery
CN115624387A (en) Stitching control method, control system, readable storage medium and robot system
US20240164765A1 (en) Systems and methods for estimating needle pose
CN115005979A (en) Computer-readable storage medium, electronic device, and surgical robot system
US20230126545A1 (en) Systems and methods for facilitating automated operation of a device in a surgical space
KR20120052573A (en) Surgical robitc system and method of controlling the same
CN114041874B (en) Interface display control method and device, computer equipment and system and medium
US20230215059A1 (en) Three-dimensional model reconstruction
US20230210627A1 (en) Three-dimensional instrument pose estimation
Lu Suture thread detection and 3D model reconstruction for automated surgical knot tying with a vision-based robotic system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination