CN113616336B - Surgical robot simulation system, simulation method, and readable storage medium - Google Patents

Surgical robot simulation system, simulation method, and readable storage medium

Info

Publication number
CN113616336B
CN113616336B (application CN202111069142.XA)
Authority
CN
China
Prior art keywords
virtual
model
virtual instrument
instrument model
surgical robot
Legal status
Active
Application number
CN202111069142.XA
Other languages
Chinese (zh)
Other versions
CN113616336A
Inventor
刘鹭
马菁阳
杨智媛
Current Assignee
Shanghai Weiwei Aviation Robot Co ltd
Original Assignee
Shanghai Weiwei Aviation Robot Co ltd
Application filed by Shanghai Weiwei Aviation Robot Co ltd
Priority to CN202111069142.XA
Publication of CN113616336A
Application granted
Publication of CN113616336B
Legal status: Active


Classifications

    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/30: Surgical robots
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • G06F 30/20: Computer-aided design [CAD]; design optimisation, verification or simulation
    • G16H 50/50: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104: Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/107: Visualisation of planned trajectories or target regions

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a surgical robot simulation system, a simulation method, and a readable storage medium. The surgical robot simulation system comprises a modeling module, an interaction module, and a behavior judgment module. The modeling module is configured to perform three-dimensional reconstruction of a predetermined tissue from a medical image to obtain a virtual tissue model, and to perform three-dimensional reconstruction of a surgical instrument to obtain a virtual instrument model. The interaction module is configured to receive an instruction input through an operating device and convert it into motion parameters of the virtual instrument model, so as to drive the virtual instrument model to move within the virtual tissue model according to those parameters. The behavior judgment module is configured to judge whether the motion of the virtual instrument model meets a preset specification and, if not, to issue a warning. With this configuration, the surgical procedure can be simulated functionally, allowing a doctor to practice repeatedly with the operating device before a real operation and shortening the doctor's learning curve on the surgical robot platform.

Description

Surgical robot simulation system, simulation method, and readable storage medium
Technical Field
The invention relates to the technical field of medical instruments, in particular to a surgical robot simulation system, a simulation method and a readable storage medium.
Background
With the continuous development of the economy and of science and technology, surgical simulation training systems have been developed so that doctors can accumulate as much operative experience as possible before performing clinical surgery, and so that less experienced doctors can obtain hands-on practice. The simplest form is a medical manikin: a trainee can repeatedly insert, withdraw, or mount nursing instruments on a model provided with openings, and the model can give intuitive feedback or alarms (such as simulated bleeding, sudden changes in heartbeat, or sound and light signals). Alternatively, a simulation platform can be built entirely in software, for example by importing human CT data to generate a three-dimensional model of virtual tissues and organs on which partitioning, segmentation, cutting, and similar operations are performed on a software interface. Simulation systems and platforms for operations on various parts of the human body have been developed one after another, but they are all based on traditional surgical instruments and surgical techniques.
Surgical robots have gradually entered hospitals over the last two decades. Compared with traditional surgery, a robot offers greatly improved instrument-holding capacity, long-term stability, and operating precision, and for patients it brings smaller wounds, shorter recovery periods, and a lower probability of infection. The adoption of surgical robots in hospitals is therefore inevitable, and a number of surgical robots have appeared worldwide. However, surgical simulation systems designed for robots remain absent: before operating on patients with differing conditions, doctors still know only the surgical plan rather than the actual manipulation. Doctors in the relevant departments urgently need a simulation system or platform to help shorten the learning curve of using a surgical robot and to improve learning efficiency.
Disclosure of Invention
The invention aims to provide a surgical robot simulation system, a simulation method, and a readable storage medium, so as to address the current lack of simulation systems designed for surgical robots.
In order to solve the above technical problem, the present invention provides a surgical robot simulation system, including: the system comprises a modeling module, an interaction module and a behavior judgment module;
the modeling module is used for carrying out three-dimensional reconstruction on a preset tissue according to the medical image so as to obtain a virtual tissue model; the modeling module is also used for carrying out three-dimensional reconstruction on the surgical instrument so as to obtain a virtual instrument model;
the interaction module is used for receiving an instruction input by the operating device and converting the instruction into the motion parameters of the virtual instrument model so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameters;
the behavior judging module is used for judging whether the motion of the virtual instrument model meets a preset standard or not, and if not, a warning is sent out.
Optionally, the surgical robot simulation system further includes a planning module and a navigation module;
the planning module is used for obtaining a planned path of the virtual instrument model according to the virtual tissue model based on a preset algorithm;
the navigation module is used for guiding the motion direction of the virtual instrument model according to the planned path.
Optionally, the preset algorithm includes a skeletonization algorithm.
Optionally, the preset algorithm further includes a shortest path search algorithm.
Optionally, the navigation module is further configured to establish a virtual safety boundary along the planned path based on a preset distance.
Optionally, the virtual safety boundary comprises an inner layer and an outer layer defined by their distance from the planned path.
Optionally, the predetermined tissue comprises a thoracoabdominal region, and the virtual tissue model comprises a breathing cycle variation model.
Optionally, the breathing cycle variation model is obtained by interpolating key points based on a plurality of groups of medical images; or the breathing cycle variation model is obtained based on a volume-to-time correspondence of the lung lobes.
Optionally, the virtual instrument model includes at least one of a shape, a size, and an operation manner of the surgical instrument.
Optionally, the preset specification includes:
the virtual instrument model moves along a planned path, and/or the virtual instrument model is operated in accordance with the operation flow of the corresponding type of surgical instrument.
Optionally, the surgical robot simulation system further includes: an operating device;
the operating device is communicatively connected with the interaction module and is used for outputting instructions to drive the virtual instrument model to move.
Optionally, the surgical robot simulation system further includes: an interface interaction module;
the interface interaction module is used for displaying at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model and the judgment result of the behavior judgment module;
the interface interaction module is further configured to accept input interaction information to adjust parameters of at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the preset specification.
In order to solve the above technical problem, the present invention further provides a surgical robot simulation method, which includes:
performing three-dimensional reconstruction on a predetermined tissue according to the medical image to obtain a virtual tissue model;
performing three-dimensional reconstruction on the surgical instrument to obtain a virtual instrument model;
receiving an instruction input by an operating device and converting the instruction into a motion parameter of the virtual instrument model so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameter;
and judging whether the motion of the virtual instrument model meets a preset standard or not, and if not, giving a warning.
Optionally, after obtaining the virtual tissue model and the virtual instrument model, the surgical robot simulation method further includes:
obtaining a planned path of the virtual instrument model based on a preset algorithm according to the virtual tissue model;
and guiding the motion direction of the virtual instrument model according to the planned path.
Optionally, the surgical robot simulation method further includes:
and establishing a virtual safety boundary according to a preset distance based on the planned path.
Optionally, the virtual safety boundary comprises an inner layer and an outer layer defined by their distance from the planned path.
Optionally, the predetermined tissue comprises a thoracoabdominal region, and the virtual tissue model comprises a breathing cycle variation model.
Optionally, the breathing cycle variation model is obtained by interpolating key points based on a plurality of groups of the medical images; or the breathing cycle variation model is obtained based on a volume-to-time correspondence of the lung lobes.
Optionally, the preset specification includes:
the virtual instrument model moves along a planned path, and/or the virtual instrument model is operated in accordance with the operation flow of the corresponding type of surgical instrument.
Optionally, the surgical robot simulation method further includes:
displaying at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the judgment result of the behavior judgment module; and
accepting input interaction information to adjust parameters of at least one of the virtual tissue model, the virtual instrument model, motion of the virtual instrument model, and the preset specification.
In order to solve the above technical problem, the present invention further provides a readable storage medium, on which a program is stored, and when the program runs, the surgical robot simulation method as described above is implemented.
In summary, in the surgical robot simulation system, the simulation method and the readable storage medium provided by the present invention, the surgical robot simulation system includes: the system comprises a modeling module, an interaction module and a behavior judgment module; the modeling module is used for carrying out three-dimensional reconstruction on a preset tissue according to the medical image so as to obtain a virtual tissue model; the modeling module is also used for carrying out three-dimensional reconstruction on the surgical instrument to obtain a virtual instrument model; the interaction module is used for receiving an instruction input by the operating device and converting the instruction into the motion parameters of the virtual instrument model so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameters; the behavior judging module is used for judging whether the motion of the virtual instrument model meets a preset standard or not, and if not, a warning is sent out.
With this configuration, based on the instruction input through the operating device and received by the interaction module, the virtual instrument model of the surgical instrument can be driven to move within the virtual tissue model, while the behavior judgment module judges whether the movement of the virtual instrument model meets the preset specification and issues a warning if it does not. The surgical procedure can therefore be simulated functionally: through the external operating device, a doctor can practice repeatedly before a real operation with a device similar to the one used in surgery, receive corresponding warning feedback for errors made during the simulated procedure, and thereby shorten the learning curve on the surgical robot platform.
Drawings
It will be appreciated by those skilled in the art that the drawings are provided for a better understanding of the invention and do not constitute any limitation to the scope of the invention. Wherein:
FIG. 1 is a schematic diagram of an application scenario of a surgical robot simulation system according to the present invention;
FIG. 2 is a block diagram of a surgical robot simulation system in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of a surgical robot simulation method according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of a virtual tissue model according to an embodiment of the present invention;
FIGS. 5a and 5b are schematic diagrams of a breathing cycle variation model according to an embodiment of the present invention;
FIGS. 6a to 6c are schematic diagrams of a virtual instrument model according to an embodiment of the present invention;
FIG. 7 is a schematic view of a simulated surgical interface showing a planned path in accordance with an embodiment of the present invention;
FIG. 8 is a schematic view of a simulated surgical interface showing a navigation module directing the direction of motion of a virtual instrument model according to a planned path in accordance with an embodiment of the present invention;
FIG. 9 is a schematic view of a simulated surgical interface showing selection of a category of virtual instrument models in accordance with an embodiment of the present invention;
FIG. 10 is a schematic view of a simulated surgical interface showing one example of a behavior determination module issuing an alert where a virtual catheter exceeds a virtual safety boundary in accordance with an embodiment of the present invention;
FIG. 11 is a schematic view of a simulated surgical interface showing one example of a behavior decision module issuing a warning in which a virtual catheter deviates from a planned path in accordance with an embodiment of the present invention;
FIG. 12 is a schematic view of a simulated surgical interface showing one example of a warning issued by the behavior determination module in which the virtual instrument model is not aligned with the nodule sample in accordance with an embodiment of the present invention;
FIG. 13 is a schematic illustration of the operational flow of a virtual instrument model of an embodiment of the present invention;
FIG. 14 is a flow chart of warning behavior determination during the navigation process according to an embodiment of the present invention;
FIG. 15 is a flow chart of warning behavior determination during the sampling process according to an embodiment of the present invention.
In the drawings:
01-host; 02-display device;
10-modeling module; 11-pulmonary blood vessels; 12-pulmonary trachea; 13-target nodule; 14-pleura; 15-sternum; 16-sampling forceps; 17-sampling needle; 18-sampling brush; 20-planning module; 21-sagittal lung plane; 22-lung cross section; 23-coronal lung plane; 30-navigation module; 40-interaction module; 50-behavior determination module; 60-operating device; 70-interface interaction module.
Detailed Description
To further clarify the objects, advantages, and features of the present invention, a more particular description of the invention is given below with reference to the specific embodiments illustrated in the accompanying drawings. It should be noted that the drawings are in simplified form and not to scale; they are provided only to facilitate a clear description of the embodiments. Furthermore, the structures illustrated in the drawings are often only part of the actual structures, and individual drawings may emphasize different aspects and use different proportions.
As used in this application, the singular forms "a", "an", and "the" include plural referents, and the term "or" is generally employed in a sense including "and/or". The terms "a" and "an" are generally employed in the sense of "at least one", and the term "at least two" is generally employed in the sense of "two or more". The terms "first", "second", and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated; thus, a feature defined as "first", "second", or "third" may explicitly or implicitly include one or at least two of such features. The term "proximal" generally refers to the end near the operator, and the term "distal" generally refers to the end near the patient, i.e. near the lesion; "one end" and "the other end", like "proximal end" and "distal end", generally refer to the corresponding two portions rather than only to the end points. The terms "mounted", "connected", and "coupled" are to be understood in a broad sense: a connection may be fixed or detachable, or the parts may be integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or internal between two elements. In addition, the arrangement of one element with respect to another generally means only that there is a connection, coupling, fit, or transmission relationship between the two elements, which may be direct or indirect through an intermediate element, and it cannot be understood as indicating or implying a spatial positional relationship; that is, one element may be inside, outside, above, below, or to one side of another element, unless the content clearly indicates otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
The invention aims to provide a surgical robot simulation system, a simulation method, and a readable storage medium, so as to address the current lack of simulation systems designed for surgical robots.
The following description is made with reference to the accompanying drawings.
Reference is made to FIGS. 1 to 15, which are described above in the description of the drawings.
As shown in FIGS. 1 and 2, an embodiment of the present invention provides a surgical robot simulation system, which includes a modeling module 10, an interaction module 40, and a behavior determination module 50. The modeling module 10 is configured to perform three-dimensional reconstruction of a predetermined tissue from a medical image (mainly a two-dimensional medical image, such as a CT or MRI image) to obtain a virtual tissue model, and to perform three-dimensional reconstruction of a surgical instrument to obtain a virtual instrument model. The interaction module 40 is configured to receive an instruction input through an operating device and convert it into motion parameters of the virtual instrument model, so as to drive the virtual instrument model to move within the virtual tissue model according to those parameters. The behavior determination module 50 is configured to determine whether the motion of the virtual instrument model meets a preset specification and, if not, to issue a warning. So configured, the virtual instrument model of the surgical instrument can be driven to move within the virtual tissue model on the basis of the operating-device instructions received by the interaction module 40, while the behavior determination module 50 determines whether the movement meets the preset specification and issues a warning if it does not. The surgical procedure can therefore be simulated functionally: through the external operating device, a doctor can practice repeatedly before a real operation with an operating device similar or identical to the one used in surgery, receive corresponding warning feedback for errors made during the simulated procedure, and thereby shorten the learning curve on the surgical robot platform. Moreover, the operating device may be the very control handle of the surgical robot used in the corresponding operation, driving the virtual instrument model within the virtual tissue model in exactly the same control mode as in the actual operation. A doctor who is about to perform a real operation can therefore carry out targeted, repeated practice beforehand with the same control handle, improving surgical efficiency and the success rate. When no operation is scheduled, medical images of different patients can be imported for simulation training, which greatly shortens the learning curve on the surgical robot platform.
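The conversion performed by the interaction module, from an input instruction to motion parameters of the virtual instrument model, can be pictured with the minimal sketch below. The command names, step sizes, and state variables are illustrative assumptions invented for the example and are not taken from the patent.

```python
# Minimal sketch: mapping operating-device commands to motion parameters
# of the virtual instrument model. All names and step sizes are assumptions.
ADVANCE_STEP_MM = 1.0
BEND_STEP_DEG = 2.0

def command_to_motion(command: str) -> dict:
    """Convert one input command into motion parameters for the virtual instrument."""
    table = {
        "advance":    {"translate_mm": +ADVANCE_STEP_MM, "bend_deg": 0.0},
        "retract":    {"translate_mm": -ADVANCE_STEP_MM, "bend_deg": 0.0},
        "bend_left":  {"translate_mm": 0.0, "bend_deg": +BEND_STEP_DEG},
        "bend_right": {"translate_mm": 0.0, "bend_deg": -BEND_STEP_DEG},
    }
    return table.get(command, {"translate_mm": 0.0, "bend_deg": 0.0})

# Example: a sequence of handle inputs drives the virtual catheter tip.
state = {"insertion_mm": 0.0, "bend_deg": 0.0}
for cmd in ["advance", "advance", "bend_right", "advance"]:
    motion = command_to_motion(cmd)
    state["insertion_mm"] += motion["translate_mm"]
    state["bend_deg"] += motion["bend_deg"]
print(state)  # {'insertion_mm': 3.0, 'bend_deg': -2.0}
```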
Preferably, the surgical robot simulation system further comprises a planning module 20 and a navigation module 30; the planning module 20 is configured to obtain a planned path of the virtual instrument model based on a preset algorithm according to the virtual tissue model; the navigation module 30 is configured to guide a movement direction of the virtual instrument model according to the planned path.
Referring to FIG. 4, in an exemplary embodiment the predetermined tissue includes the thoracoabdominal region. After the medical image is acquired, the modeling module 10 can distinguish different types of tissue (such as the heart, the sternum, and the lung bronchi) by selecting different grayscale thresholds, and then three-dimensionally reconstruct the tissue of interest with a three-dimensional reconstruction algorithm to obtain the virtual tissue model. The three-dimensional reconstruction algorithm may be, for example, a multi-planar reconstruction (MPR) algorithm, although those skilled in the art may select other three-dimensional reconstruction algorithms according to the prior art. FIG. 4 shows an example of a three-dimensionally reconstructed virtual tissue model, which includes the pulmonary blood vessels 11, the pulmonary trachea 12, the target nodule 13, and the pleura 14. The sternum 15 is also shown in FIG. 4 to illustrate that the pleura 14 needs to be reconstructed by volume segmentation and cannot be reconstructed from a segmentation on a single sectional plane; in the example of FIG. 4, the sternum 15 itself does not need to be reconstructed in three dimensions. Furthermore, the target nodule 13 may be reconstructed as a sphere or as a shape closer to its true irregular form; in the exemplary embodiment of FIG. 4 a sphere is chosen to represent the target nodule 13, while in other embodiments other shapes may be used. Of course, the thoracoabdominal region is only an example of the predetermined tissue and does not limit it; the surgical robot simulation system provided in this embodiment may also be applied to surgical simulation of other body parts, and the invention is not limited in this respect.
Preferably, for thoracoabdominal tissue such as the lungs, the virtual tissue model further includes a breathing cycle variation model. After the tissue modeling is completed, respiratory motion modeling of the tissue is preferably performed to obtain the breathing cycle variation model, so that the simulation is closer to the state of a real intraoperative patient. FIGS. 5a and 5b show an example of a breathing cycle variation model: FIG. 5a shows the inspiratory state, in which the lung tissue volume increases, the spacing between airways increases, and the airway diameters expand; FIG. 5b shows the expiratory state, in which the lung tissue volume decreases, the spacing between airways decreases, and the airway diameters contract. The target nodule 13 likewise exhibits a regular reciprocating motion with the breathing motion.
In some embodiments, the breathing cycle variation model is obtained by interpolating key points based on a plurality of groups of the medical images. Specifically, several groups of medical images (such as CT or MRI images) of the patient in a normal breathing state can be acquired before the operation, and key points are then marked on each group. The key points are preferably chosen as large, easily distinguishable bifurcation nodes of the lung bronchi, such as the protruding point of the main airway or the first-level bifurcation points of the left and right main airways. The same key point in the different image groups is then ordered according to its time within the breathing cycle, yielding a curve of the key point's position over time. Because the medical images cannot be acquired at equal time intervals while the patient breathes, this curve needs to be interpolated so that, as far as possible, values of the key point exist at equally spaced time intervals. Applying the same procedure to all key points, and using the structural relationship between the key points and the airway model, a set of airway models that vary periodically in space and time can be reconstructed, that is, a breathing cycle variation model is obtained.
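The key-point interpolation described above can be sketched as follows, assuming NumPy is available. The acquisition phases and key-point coordinates are invented sample values; only the resampling idea, interpolating each coordinate onto equally spaced phases of the breathing cycle, is illustrated.

```python
# Minimal sketch: resample an airway key-point trajectory to equally spaced
# phases of the breathing cycle. All numeric values are illustrative.
import numpy as np

# Acquisition times of the image groups, as a fraction of one breathing cycle.
acq_phase = np.array([0.00, 0.18, 0.41, 0.66, 0.87])

# One key point (e.g. a main-airway bifurcation) tracked across the groups: (x, y, z) in mm.
keypoint_xyz = np.array([
    [102.1, 87.4, 210.0],
    [102.9, 88.0, 212.3],
    [103.8, 88.9, 215.1],
    [103.0, 88.1, 212.8],
    [102.3, 87.5, 210.6],
])

# Interpolate each coordinate onto equally spaced phases so every key point
# is defined at the same time samples of the cycle (periodic interpolation).
uniform_phase = np.linspace(0.0, 1.0, 21)
resampled = np.stack([
    np.interp(uniform_phase, acq_phase, keypoint_xyz[:, axis], period=1.0)
    for axis in range(3)
], axis=1)

print(resampled.shape)  # (21, 3): key-point position at each uniform phase
```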
In other embodiments, the breathing cycle variation model is obtained from a volume-to-time correspondence of the lung lobes. Specifically, the change of the overall volume of each lung lobe over the breathing cycle is evaluated to obtain a volume-time correspondence, so that the amount of deformation of each tissue's three-dimensional model can be computed directly from the volume ratio between different phases of the cycle. Since the airways within a lung lobe change along with the lobe's volume, a set of airway models that vary periodically in space and time is obtained, that is, the breathing cycle variation model.
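The lobe-volume approach can likewise be sketched in a few lines: a volume ratio for the current breathing phase is converted into an isotropic scale factor applied to the airway vertices. The toy volume curve, the cube-root scaling, and the function names are illustrative assumptions rather than the patent's formula.

```python
# Minimal sketch: deform an airway mesh according to the lobe volume ratio
# at a given breathing phase. The volume curve below is a toy assumption.
import numpy as np

def lobe_volume_ratio(phase: float) -> float:
    """Toy lobe volume relative to end-expiration over one breathing cycle."""
    return 1.0 + 0.15 * np.sin(2.0 * np.pi * phase) ** 2

def scale_mesh(vertices: np.ndarray, centroid: np.ndarray, phase: float) -> np.ndarray:
    """Isotropically scale airway vertices about the lobe centroid.

    A volume ratio r corresponds to a linear scale factor r ** (1/3).
    """
    s = lobe_volume_ratio(phase) ** (1.0 / 3.0)
    return centroid + s * (vertices - centroid)

verts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
print(scale_mesh(verts, centroid=verts.mean(axis=0), phase=0.25))
```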
Optionally, the virtual instrument model includes at least one of the shape, the size, and the operation mode of the surgical instrument. In practice there are many different types of surgical instrument, differing in shape, size, and manner of operation. FIGS. 6a to 6c show, by way of example, virtual models of several surgical instruments commonly used in bronchial surgery. These virtual models may differ from the real surgical instruments in many engineering details, but their shape and size are substantially the same, and the main operation mode of each virtual model is clearly defined. The dashed lines in FIGS. 6a to 6c illustrate the operation mode of the corresponding virtual instrument model, that is, the change of motion that occurs after a command from the operating device is received. FIG. 6a shows a virtual model of the sampling forceps 16, which operate by opening and closing at the tip; FIG. 6b shows a virtual model of the sampling needle 17, which operates by penetration and withdrawal; FIG. 6c shows a virtual model of the sampling brush 18, which operates by brushing back and forth. It will be appreciated that FIGS. 6a to 6c are only a few examples of virtual instrument models and do not limit them; those skilled in the art may model other kinds of surgical instruments according to actual surgical needs, and the virtual instrument models may also include, for example, a virtual catheter.
Preferably, the preset algorithm includes a skeletonization algorithm. Skeletonization describes the topology of an object by means of a medial-axis transform and can characterize the object's shape. After the virtual tissue model is obtained, the topological structure of the virtual tissue can be extracted with a skeletonization algorithm to enable path planning. In some embodiments, the topology of the pulmonary airway is extracted by a skeletonization algorithm, and the spatial coordinates of the branch points of the pulmonary trachea 12 are obtained and marked in sequence for path planning. Optionally, the skeletonization algorithm includes, but is not limited to, topological thinning methods, distance-field methods, and generalized potential-field methods; those skilled in the art can understand the specific principles of these algorithms from the prior art, which are not described further here. In this embodiment, whichever specific algorithm is used, a topology that satisfies the computational requirements can be obtained.
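As an illustration of the skeletonization step, the sketch below thins a synthetic binary airway segmentation to its centerline, assuming a scikit-image version whose skeletonize function accepts 3-D input (older releases expose a separate skeletonize_3d function instead). The synthetic tube volume is a stand-in for a real airway segmentation.

```python
# Minimal sketch: extract an airway centerline by skeletonizing a binary
# segmentation. The segmentation volume here is synthetic.
import numpy as np
from skimage.morphology import skeletonize  # assumes 3-D support in this version

# Synthetic binary airway segmentation: a simple tube along the z axis.
airway = np.zeros((40, 40, 40), dtype=bool)
airway[18:23, 18:23, 5:35] = True

# Topology-preserving thinning reduces the tube to (approximately) its centerline.
centerline = skeletonize(airway)

# Voxel coordinates of the skeleton, usable as nodes for path planning.
skeleton_voxels = np.argwhere(centerline)
print(skeleton_voxels.shape)
```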
Further, the preset algorithm also includes a shortest-path search algorithm. After the topological structure of the virtual tissue has been obtained, a start point and an end point of the planned path can be set, and the planned path of the virtual instrument model is then computed by a shortest-path search algorithm. In some embodiments, after the topology of the pulmonary trachea 12 is obtained, the position of the target nodule 13 is marked as the end point of the planned path, and the start point may be, for example, any point within the main airway. Alternatively, both the target nodule 13 and the start point of the planned path may be designated by the doctor. Once the start and end points are set, the planned path is obtained by the shortest-path search algorithm. Optionally, the shortest-path search algorithm includes, but is not limited to, depth-first search, breadth-first search, and Dijkstra's algorithm; those skilled in the art can understand the specific principles of these algorithms from the prior art, which are not described further here.
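A minimal sketch of the shortest-path step with Dijkstra's algorithm is given below. The airway branch graph (node names, coordinates, and edges) is an invented toy example; in practice the nodes would be the branch points extracted by the skeletonization step.

```python
# Minimal sketch: Dijkstra's algorithm over a toy airway branch graph.
import heapq
import math

nodes = {                      # node id -> (x, y, z) in mm (illustrative)
    "trachea":     (0.0,   0.0,   0.0),
    "carina":      (0.0,   0.0,  80.0),
    "left_main":   (-30.0, 0.0, 110.0),
    "right_main":  (30.0,  0.0, 105.0),
    "target_node": (-55.0, 10.0, 140.0),
}
edges = {                      # adjacency list
    "trachea": ["carina"],
    "carina": ["trachea", "left_main", "right_main"],
    "left_main": ["carina", "target_node"],
    "right_main": ["carina"],
    "target_node": ["left_main"],
}

def dist(a, b):
    return math.dist(nodes[a], nodes[b])

def dijkstra(start, goal):
    best, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > best.get(u, math.inf):
            continue
        for v in edges[u]:
            nd = d + dist(u, v)
            if nd < best.get(v, math.inf):
                best[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

print(dijkstra("trachea", "target_node"))  # ['trachea', 'carina', 'left_main', 'target_node']
```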
FIG. 7 shows an exemplary simulated surgical interface. The left window of the interface displays images of the sagittal lung plane 21, the lung cross section 22, and the coronal lung plane 23, with dots indicating the position of the target nodule 13 in the three images. The middle window displays the virtual tissue model, showing the effect of the segmented reconstruction of the whole pulmonary trachea 12; the dot represents the target nodule 13 and the line connected to it represents the planned path. The right window displays a list of target nodules 13: the doctor can select the "add target point" button to add target nodules 13, and multiple planned paths can be computed for multiple target nodules 13.
Preferably, the navigation module is further configured to establish a virtual safety boundary along the planned path based on a preset distance. After the planning module obtains the planned path, a circle is constructed centered on the planned path with the preset distance as its radius and is swept along the path; that is, corresponding circles are constructed with different points on the planned path as their centers and the preset distance as their radius, and the set of these circles forms a virtual safety boundary along the planned path. It should be noted that the preset distance can be set and adjusted according to the specific situation, for example according to the distance between the planned path and virtual tissue that must be avoided. In some embodiments, different values are chosen depending on the location within the pulmonary trachea 12: a larger value may be used in the main airway and a correspondingly smaller one in the branch airways. In one example, the preset distance may be, for example, 0.5 cm to 1 cm, with a relatively large value chosen in regions where airway branches overlap.
Optionally, the preset specification includes: the virtual instrument model moves along the planned path. The virtual safety boundary defines a spatial alarm range, and the virtual instrument model is expected to remain within it while moving along the planned path. When the doctor controls the virtual instrument model through the external operating device to advance and retreat along the planned path within the virtual tissue model, improper operation may cause the direction of motion to deviate from the planned path, so that the virtual instrument model exceeds the virtual safety boundary. From the perspective of the virtual tissue model, this means the virtual instrument model either contacts the airway wall or directly penetrates it. Therefore, if the virtual instrument model exceeds the virtual safety boundary, its motion is deemed not to meet the preset specification, and the behavior determination module 50 issues a warning accordingly.
Further, the virtual safety boundary includes an inner layer and an outer layer defined by their distance from the planned path. In some embodiments the virtual safety boundary is set as two layers: an inner virtual safety boundary established at a first preset distance, and an outer virtual safety boundary established at a second preset distance greater than the first. So configured, warnings of different levels can be triggered by computing how far the virtual instrument model penetrates each of the two layers of the virtual safety boundary. Preferably, the inner and outer virtual safety boundaries can also be used, through virtual modeling, to reproduce on the simulated surgical interface the force-feedback effect felt at the tip of a real surgical instrument. It should be noted that, according to actual needs, more layers of virtual safety boundaries can be added to implement more complex safety warning rules, and the invention places no particular limitation on this.
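The two-layer boundary check can be sketched as follows: the distance from the virtual catheter tip to the nearest point of the planned path is compared with an inner and an outer radius, and the warning level is graded accordingly. The radii, the straight toy path, and the function name are illustrative assumptions.

```python
# Minimal sketch: grade warnings with a two-layer virtual safety boundary.
import numpy as np

planned_path = np.array([[0.0, 0.0, z] for z in np.arange(0.0, 100.0, 1.0)])  # mm, toy straight path

def boundary_warning(tip_xyz, path_pts, inner_mm=5.0, outer_mm=10.0):
    """Return None, 'minor' or 'severe' depending on which layer is exceeded."""
    d = np.linalg.norm(path_pts - np.asarray(tip_xyz), axis=1).min()
    if d <= inner_mm:
        return None                 # within the inner safety boundary
    if d <= outer_mm:
        return "minor"              # between the inner and outer layers
    return "severe"                 # beyond the outer layer (e.g. airway wall penetrated)

print(boundary_warning((0.0, 3.0, 42.0), planned_path))   # None
print(boundary_warning((0.0, 7.0, 42.0), planned_path))   # minor
print(boundary_warning((0.0, 12.0, 42.0), planned_path))  # severe
```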
Further, the surgical robot simulation system also includes an operating device 60, which is communicatively connected to the interaction module 40 and outputs instructions to drive the virtual instrument model to move. Referring to FIG. 1, the operating device 60 may be, for example, an operating handle: the doctor's manipulation of the handle is converted into control commands, which are output to the interaction module 40. Optionally, the modeling module 10, the planning module 20, the navigation module 30, the interaction module 40, and the behavior determination module 50 are all integrated in a host 01 or in the surgical robot. In other embodiments, at least one of these modules may instead be arranged independently outside the host 01, and the invention is not limited in this respect.
Further, the surgical robot simulation system also includes an interface interaction module 70. The interface interaction module 70 is configured to display at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the judgment result of the behavior determination module, and preferably also the planned path. The interface interaction module 70 is further configured to receive input interaction information so as to adjust parameters (such as the warning level or the number of sampling operations) of at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the preset specification, and preferably also parameters of the planned path. Referring to FIG. 1, the interface interaction module 70 may include, for example, a display device 02 and an input device. The input device may be integrated with the display device 02 to form a touch screen, or it may be separate from the display device 02, such as a keyboard or mouse commonly used in the art. The display device 02 displays the simulated surgical interface, for example at least one of the virtual tissue model, the virtual instrument model, the planned path, the motion of the virtual instrument model, and the judgment result of the behavior determination module.
Referring to FIG. 8, the doctor inputs instructions through the external operating device to control the virtual instrument model to advance and retreat along the planned path, while the display device 02 of the interface interaction module 70 shows the simulated surgical interface. During this process the interface can give direction prompts and angle-adjustment prompts to help the doctor control the virtual instrument model more accurately; after the preset target area is reached, the interface prompts a change of virtual instrument model and virtual sampling is performed. Throughout the process, real-time warnings can be triggered for the various virtual injuries, unsafe behaviors, and non-standard operations caused by erroneous manipulation. FIG. 8 takes a virtual catheter as an example of the virtual instrument model and shows it advancing along the planned path inside the virtual lung bronchi. The virtual endoscopic view is the same as the view a doctor sees during a real airway endoscopy: the virtual catheter advances along the current airway, a bifurcation appears ahead, and the dashed line indicates the direction of the planned path. According to this direction, the doctor needs to steer the virtual catheter to the right after entering the upper-left branch airway. When a bifurcation region is encountered, a direction prompt appears on the simulated surgical interface, and preferably key guidance is also provided (the position of the black dot on the schematic of the function keys of the external operating device in the upper-left corner of FIG. 8 indicates the key that must be pressed to turn right).
Optionally, the preset specification may further include: the virtual instrument model is operated in accordance with the operation flow of the corresponding type of surgical instrument. The operation flow of the virtual instrument model mainly refers to the sampling procedure of the virtual sampling instruments. Referring to FIG. 13, in an exemplary embodiment the sampling procedure comprises, in order: SA0: sampling preparation; SA1: select the sampling brush; SA2: brush the aligned target position once; SA3: select the sampling forceps; SA4: clamp the target position five times; SA5: select the sampling brush; SA6: brush the aligned target position once; SA7: flushing; SA8: sampling completed. If the virtual instrument model is not operated according to this sampling flow, its motion is deemed not to meet the preset specification and the behavior determination module 50 issues a warning. It should be noted that the above sampling procedure is merely the standard sample-handling procedure of a conventional bronchoscopy department and is an exemplary, non-limiting operation flow; those skilled in the art can define different operation flows for different surgical instruments and different surgical requirements. Furthermore, through the interface interaction module the doctor can adjust the parameters of each step of the sampling procedure, such as the sampling order or the number of sampling operations, according to his or her own training needs.
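A minimal sketch of the instrument-order check for the sampling flow SA1 to SA7 is shown below. The step identifiers follow the flow above, while the list encoding and the function name are illustrative assumptions.

```python
# Minimal sketch: compare the performed sampling steps against the expected
# sequence SA1-SA7 and warn on the first mismatch. Encoding is illustrative.
EXPECTED_SEQUENCE = [
    "sampling_brush",    # SA1: select the sampling brush
    "brush_once",        # SA2: brush the aligned target position once
    "sampling_forceps",  # SA3: select the sampling forceps
    "clamp_five_times",  # SA4: clamp the target position five times
    "sampling_brush",    # SA5: select the sampling brush again
    "brush_once",        # SA6: brush the aligned target position once
    "flush",             # SA7: flushing
]

def check_sampling_order(performed_steps):
    """Return a warning message for the first out-of-order step, else None."""
    for i, step in enumerate(performed_steps):
        expected = EXPECTED_SEQUENCE[i] if i < len(EXPECTED_SEQUENCE) else "end of flow"
        if step != expected:
            return f"warning: step {i + 1} was '{step}', expected '{expected}'"
    return None

print(check_sampling_order(["sampling_brush", "brush_once", "sampling_forceps"]))  # None
print(check_sampling_order(["sampling_forceps"]))  # warning: step 1 ...
```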
Optionally, the warnings issued by the behavior determination module 50 can be divided into different categories and different levels. Table 1 below shows an example of the warning types in a simulated surgery; it should be understood that the table is only an example and does not limit the warning types.
TABLE 1
(Table 1, listing the warning categories and levels, appears as an image in the original publication.)
The alert type table is described in detail below in connection with several exemplary examples of simulated surgical interfaces.
In the example shown in FIG. 9, the left window of the simulated surgical interface shows the state in which the doctor has reached the target area along the planned path and completed the navigation. At this point the sampling-instrument selection interface is entered, shown in the window on the right of the interface. The doctor should perform the subsequent operations in the order of the sampling flow; if the operations are not performed in that order, the behavior determination module 50 issues a warning.
In the example shown in FIG. 10, if the tip of the virtual catheter exceeds the virtual safety boundary because of mishandling while traveling along the planned path, the distance by which it exceeds the boundary determines which level of warning within this category is triggered. In FIG. 10, the tip of the virtual catheter has penetrated the virtual airway wall and entered the outer virtual safety boundary (the dashed line), triggering a level-three severe warning; the virtual endoscopic view switches to a global view so that the doctor can easily see how far the virtual catheter has penetrated the virtual airway, and the penetration is rendered in solid black in the virtual endoscopic view.
In the example shown in FIG. 11, if the tip of the virtual catheter deviates from the planned path because of improper steering while traveling along it, warnings of several levels within this category are triggered according to the degree of deviation (linear distance, angle, and so on). At the same time, the wrong-way situation is rendered in solid white in the virtual endoscopic view so that the doctor notices it in time.
In the example shown in FIG. 12, if a sampling instruction (such as brushing or clamping) is issued during the virtual sampling stage while the instrument is not aligned with the target nodule 13, a sampling-failure warning appears on the simulated surgical interface. At the same time, the situation in which the center of the target nodule 13 does not coincide with the tip of the virtual sampling instrument is represented by a central crosshair in the virtual endoscopic view, and the doctor must operate the external operating device again to bring the intersection of the crosshair within the range of the target nodule 13 before sampling.
The flowchart of FIG. 14 exemplarily shows, for each motion action performed while the virtual catheter travels along the planned path, the determination of whether a corresponding warning is triggered.
The flowchart of FIG. 15 exemplarily shows the determination flow for warnings that may be triggered in the virtual sampling stage by the various non-standard or erroneous operations of the doctor. The basis for the instrument-usage-order warning and the sampling-count warning in this flow comes from FIG. 13: for example, each time step SC1 is performed, the order of the instruments selected so far is compared with the instrument usage order of FIG. 13, and if the order is found to be wrong a warning is triggered and the doctor must reselect the instrument.
Optionally, step SC2 in FIG. 15 is achieved by the doctor sending a direction-adjustment command through the external operating device, while steps SC3 and SC4 are implemented by a specific algorithm, for example a virtual-surgery collision detection algorithm. In one example, a collision detection algorithm between a rigid body (the virtual instrument model) and a deformable body (the target nodule) is used. Virtual-surgery collision detection algorithms include, but are not limited to, bounding-box methods, spatial-subdivision methods, and spatial-distance methods. In some embodiments the bounding-sphere variant of the bounding-box method is used (the target nodule 13 can be treated approximately as a sphere), and an OBB hierarchy tree is chosen for the motion modeling of the virtual instrument model to handle rotation and translation. An accurate collision result is obtained by computing intersection tests between the bounding sphere and the other bounding volumes (of the virtual instrument model) and by iteratively updating the basic primitives (the triangular patches of the virtual instrument model) contained in the two leaf nodes whose bounding volumes collide with the bounding sphere; the collision result is whether an intersection occurs. Those skilled in the art can understand virtual-surgery collision detection algorithms from the prior art, and they are not described in detail here.
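The broad-phase part of such a collision test can be sketched as follows: the target nodule is treated as a bounding sphere and the instrument tip as a simple bounding box, and an overlap test decides whether a finer (e.g. per-triangle) check is needed. This is a simplified stand-in for the OBB-hierarchy scheme described above; all coordinates are invented sample values.

```python
# Minimal sketch: sphere-vs-box broad-phase collision test between the target
# nodule (bounding sphere) and the instrument tip (axis-aligned bounding box).
import numpy as np

def sphere_aabb_intersect(center, radius, box_min, box_max):
    """True if a sphere overlaps an axis-aligned bounding box."""
    center = np.asarray(center, dtype=float)
    closest = np.clip(center, box_min, box_max)   # closest point of the box to the sphere center
    return np.linalg.norm(center - closest) <= radius

nodule_center, nodule_radius = (12.0, 8.0, 140.0), 6.0      # target nodule as a sphere (mm)
tip_min, tip_max = (10.0, 6.0, 133.0), (13.0, 9.0, 139.0)   # instrument-tip bounding box (mm)

print(sphere_aabb_intersect(nodule_center, nodule_radius, tip_min, tip_max))  # True: sampling would hit the nodule
```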
Based on the above simulation system for a surgical robot, an embodiment of the present invention further provides a simulation method for a surgical robot, and referring to fig. 3, the simulation method for a surgical robot includes:
Step S1: performing three-dimensional reconstruction on a predetermined tissue according to the medical image to obtain a virtual tissue model;
Step S2: performing three-dimensional reconstruction on the surgical instrument to obtain a virtual instrument model;
Step S3: receiving an instruction input by an operating device and converting the instruction into motion parameters of the virtual instrument model, so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameters;
Step S4: judging whether the motion of the virtual instrument model meets a preset specification and, if not, issuing a warning.
Preferably, after obtaining the virtual tissue model and the virtual instrument model, the surgical robot simulation method further includes:
Step S31: obtaining a planned path of the virtual instrument model based on a preset algorithm according to the virtual tissue model;
Step S32: guiding the motion direction of the virtual instrument model according to the planned path.
Preferably, the predetermined tissue includes a thoracoabdominal region, and the virtual tissue model includes a breathing cycle variation model.
Preferably, the breathing cycle variation model is obtained by interpolating key points based on a plurality of groups of medical images; or the breathing cycle variation model is obtained based on a volume-to-time correspondence of the lung lobes.
Preferably, the surgical robot simulation method further includes: and establishing a virtual safety boundary according to a preset distance based on the planned path.
Preferably, the virtual safety boundary comprises an inner layer and an outer layer defined by their distance from the planned path.
Preferably, the preset specification includes: the virtual instrument model moves along a planned path, and/or the virtual instrument model is operated in accordance with the operation flow of the corresponding type of surgical instrument.
Preferably, the surgical robot simulation method further includes:
displaying at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the judgment result of the behavior judgment module; preferably, the planned path can also be displayed; and
accepting input interaction information to adjust parameters of at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the preset specification; preferably, the parameters of the planned path are also adjustable.
The above steps can be understood with reference to the description of the surgical robot simulation system described above. Further, the embodiment of the present invention also provides a readable storage medium, on which a program is stored, and when the program runs, the surgical robot simulation method as described above is implemented. The readable storage medium may be disposed independently, or may be disposed in the surgical robot simulation system in an integrated manner, such as being integrated in the host 01 or in the surgical robot, which is not limited in the present invention.
In summary, in the surgical robot simulation system, simulation method, and readable storage medium provided by the present invention, the surgical robot simulation system includes a modeling module, an interaction module, and a behavior judgment module. The modeling module is used for performing three-dimensional reconstruction on a predetermined tissue according to a medical image to obtain a virtual tissue model, and is also used for performing three-dimensional reconstruction on a surgical instrument to obtain a virtual instrument model. The interaction module is used for receiving an instruction input by the operating device and converting the instruction into motion parameters of the virtual instrument model, so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameters. The behavior judgment module is used for judging whether the motion of the virtual instrument model meets a preset specification, and issuing a warning if it does not. With this configuration, the virtual instrument model of the surgical instrument can be driven to move in the virtual tissue model based on the instruction received by the interaction module from the operating device, while the behavior judgment module judges whether the motion of the virtual instrument model meets the preset specification and issues a warning if it does not. The surgical process can therefore be simulated functionally: through the external operating device, doctors can practice repeatedly with a similar operating device before a real operation and receive corresponding warning feedback for errors occurring during the simulated procedure, thereby shortening their learning curve on the surgical robot platform.
It should be noted that several of the above embodiments may be combined with each other. The above description is only intended to describe the preferred embodiments of the present invention and not to limit its scope; any variations and modifications made by those skilled in the art based on the above disclosure fall within the scope of the appended claims.

Claims (16)

1. A surgical robot simulation system, comprising: a modeling module, an interaction module, a behavior judgment module, an interface interaction module, and an operating device;
the modeling module is used for carrying out three-dimensional reconstruction on a preset tissue according to the medical image so as to obtain a virtual tissue model; the modeling module is also used for carrying out three-dimensional reconstruction on the surgical instrument so as to obtain a virtual instrument model;
the interaction module is used for receiving an instruction input by the operating device and converting the instruction into the motion parameters of the virtual instrument model so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameters;
the behavior judgment module is used for judging whether the motion of the virtual instrument model meets a preset specification, and if not, issuing a warning;
the predetermined tissue comprises a thoracoabdominal region, and the virtual tissue model comprises a breathing cycle variation model;
the interface interaction module is used for displaying at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model and the judgment result of the behavior judgment module;
the interface interaction module is further used for receiving input interaction information so as to adjust parameters of at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model and the preset specification;
the operating device is in communication connection with the interaction module and is used for outputting instructions to drive the virtual instrument model to move; the operating device is a control handle of the surgical robot to which it actually corresponds during the surgical operation.
2. The surgical robot simulation system of claim 1, further comprising a planning module and a navigation module;
the planning module is used for obtaining a planned path of the virtual instrument model based on a preset algorithm according to the virtual tissue model;
the navigation module is used for guiding the motion direction of the virtual instrument model according to the planned path.
3. The surgical robot simulation system of claim 2, wherein the preset algorithm comprises a skeletonization algorithm.
4. The surgical robot simulation system of claim 3, wherein the preset algorithm further comprises a shortest path search algorithm.
5. The surgical robot simulation system of claim 2, wherein the navigation module is further configured to establish a virtual safety boundary based on a preset distance along the planned path.
6. The surgical robot simulation system of claim 5, wherein the virtual safety boundary comprises an inner layer and an outer layer according to a distance relative to the planned path.
7. The surgical robot simulation system according to claim 1, wherein the breathing cycle variation model is obtained by interpolating key points based on a plurality of sets of the medical images; or the breathing cycle variation model is obtained based on a volume-to-time correspondence of the lung lobes.
8. The surgical robot simulation system of claim 1, wherein the virtual instrument model includes at least one of a morphology, a size, and a manner of operation of a surgical instrument.
9. The surgical robot simulation system of claim 1, wherein
the virtual instrument model moves along a planned path, and/or the operation mode of the virtual instrument model is operated according to the operation flow of the corresponding type of surgical instrument.
10. A surgical robot simulation method, comprising:
performing three-dimensional reconstruction on a predetermined tissue according to the medical image to obtain a virtual tissue model; the predetermined tissue comprises a thoracoabdominal region, and the virtual tissue model comprises a breathing cycle variation model;
performing three-dimensional reconstruction on the surgical instrument to obtain a virtual instrument model;
receiving an instruction input by an operating device and converting the instruction into a motion parameter of the virtual instrument model so as to drive the virtual instrument model to move in the virtual tissue model according to the motion parameter; the operating device is a control handle of the surgical robot to which it actually corresponds during the surgical operation;
judging whether the motion of the virtual instrument model meets a preset specification, and if not, sending a warning;
displaying at least one of the virtual tissue model, the virtual instrument model, the motion of the virtual instrument model, and the judgment result of the behavior judgment module; and
accepting input interaction information to adjust parameters of at least one of the virtual tissue model, the virtual instrument model, motion of the virtual instrument model, and the preset specification.
11. A surgical robot simulation method according to claim 10, wherein after obtaining the virtual tissue model and the virtual instrument model, the surgical robot simulation method further comprises:
obtaining a planned path of the virtual instrument model based on a preset algorithm according to the virtual tissue model;
and guiding the motion direction of the virtual instrument model according to the planned path.
12. The surgical robot simulation method according to claim 11, further comprising:
establishing a virtual safety boundary along the planned path according to a preset distance.
13. A surgical robot simulation method according to claim 12, wherein the virtual safety boundary comprises an inner layer and an outer layer according to a distance relative to the planned path.
14. The surgical robot simulation method according to claim 12, wherein the breathing cycle variation model is obtained by interpolating key points based on a plurality of sets of the medical images; or the breathing cycle variation model is obtained based on a volume-to-time correspondence of the lung lobes.
15. The surgical robot simulation method according to claim 12, wherein the preset specification includes:
the virtual instrument model moves along a planned path, and/or the operation mode of the virtual instrument model is operated according to the operation flow of the corresponding type of surgical instrument.
16. A readable storage medium on which a program is stored, characterized in that the program, when executed, implements a surgical robot simulation method according to any one of claims 10 to 15.
CN202111069142.XA 2021-09-13 2021-09-13 Surgical robot simulation system, simulation method, and readable storage medium Active CN113616336B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111069142.XA CN113616336B (en) 2021-09-13 2021-09-13 Surgical robot simulation system, simulation method, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111069142.XA CN113616336B (en) 2021-09-13 2021-09-13 Surgical robot simulation system, simulation method, and readable storage medium

Publications (2)

Publication Number Publication Date
CN113616336A CN113616336A (en) 2021-11-09
CN113616336B true CN113616336B (en) 2023-04-14

Family

ID=78389852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111069142.XA Active CN113616336B (en) 2021-09-13 2021-09-13 Surgical robot simulation system, simulation method, and readable storage medium

Country Status (1)

Country Link
CN (1) CN113616336B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114081624B (en) * 2021-11-10 2023-06-27 武汉联影智融医疗科技有限公司 Virtual simulation system of surgical robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108778113A (en) * 2015-09-18 2018-11-09 奥瑞斯健康公司 The navigation of tubulose network
CN113035038A (en) * 2021-03-29 2021-06-25 安徽工业大学 Virtual orthopedic surgery exercise system and simulation training method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190088162A1 (en) * 2016-03-04 2019-03-21 Covidien Lp Virtual and/or augmented reality to provide physical interaction training with a surgical robot
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 The preoperative planning of minimally invasive cardiac surgery and operation virtual reality simulation method
CN106974730A (en) * 2017-04-01 2017-07-25 上海术理智能科技有限公司 Surgical simulation method, device and equipment based on virtual reality and medical image
KR101880246B1 (en) * 2017-12-28 2018-07-19 (주)휴톰 Method, apparatus and program for controlling surgical image play
CN108320645B (en) * 2018-01-19 2020-02-07 中南大学湘雅二医院 Medical simulation training method
CN113317877B (en) * 2020-02-28 2022-06-17 上海微创卜算子医疗科技有限公司 Augmented reality surgical robot system and augmented reality equipment


Also Published As

Publication number Publication date
CN113616336A (en) 2021-11-09

Similar Documents

Publication Publication Date Title
US20230088056A1 (en) Systems and methods for navigation in image-guided medical procedures
US20230145309A1 (en) Graphical user interface for planning a procedure
US20230301725A1 (en) Systems and methods of registration for image-guided procedures
US11864856B2 (en) Systems and methods of continuous registration for image-guided surgery
US12004820B2 (en) Systems and methods of registration for image-guided surgery
EP3576598B1 (en) System of registration for image-guided procedures
US10373719B2 (en) Systems and methods for pre-operative modeling
US10706543B2 (en) Systems and methods of registration for image-guided surgery
JP2020168374A (en) Systems and methods for filtering localization data
CN110087576A (en) System and method for elongated devices to be registrated to 3-D image in the program that image guides
CN113616333A (en) Catheter movement assistance method, catheter movement assistance system, and readable storage medium
US20230030727A1 (en) Systems and methods related to registration for image guided surgery
CN112423652A (en) Systems and methods related to registration for image guided surgery
CN113616336B (en) Surgical robot simulation system, simulation method, and readable storage medium
US20220054202A1 (en) Systems and methods for registration of patient anatomy
US20240164853A1 (en) User interface for connecting model structures and associated systems and methods
WO2023056188A1 (en) Systems and methods for target nodule identification
WO2023060198A1 (en) Medical instrument guidance systems, including guidance systems for percutaneous nephrolithotomy procedures, and associated devices and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220323

Address after: 201203 room 209, floor 2, building 1, No. 1601, Zhangdong Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai (actual floor 3)

Applicant after: Shanghai Weiwei aviation robot Co.,Ltd.

Address before: Room 101, block B, building 1, No. 1601, Zhangdong Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203

Applicant before: Shanghai minimally invasive medical robot (Group) Co.,Ltd.

GR01 Patent grant