CN113197665A - Minimally invasive surgery simulation method and system based on virtual reality - Google Patents

Minimally invasive surgery simulation method and system based on virtual reality

Info

Publication number
CN113197665A
Authority
CN
China
Prior art keywords
module
data
dimensional image
dimensional
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110482041.9A
Other languages
Chinese (zh)
Inventor
曹立华
牛振凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202110482041.9A
Publication of CN113197665A
Current legal status: Withdrawn

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/102Modelling of surgical devices, implants or prosthesis
    • A61B2034/104Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention belongs to the technical field of operation simulation and discloses a minimally invasive surgery simulation method and system based on virtual reality. The virtual reality-based minimally invasive surgery simulation system comprises a data preprocessing module, a data importing module, a central control module, a three-dimensional image reconstruction module, a three-dimensional image generating module, a three-dimensional image processing module, a surgery simulation module, a warning reminding module, an effect evaluation module, a data storage module and an updating display module. The invention uses the virtual environment provided by VR equipment to let the doctor observe the three-dimensional model of the site to be operated on from all directions in the virtual space, helping the doctor gain a full understanding of the patient's condition; a simulated-operation platform run in the virtual space with the VR equipment lets the doctor rehearse before the operation so as to produce a more concrete preoperative plan; and the three-dimensional medical image model presented in virtual reality is more realistic, realizing interaction between the doctor and the three-dimensional image data.

Description

Minimally invasive surgery simulation method and system based on virtual reality
Technical Field
The invention belongs to the technical field of operation simulation, and particularly relates to a minimally invasive surgical operation simulation method and system based on virtual reality.
Background
The term anorectal refers to the anus and rectum. Anorectal diseases are common and frequently occurring; in a broad sense, the various diseases occurring in the anus, anal canal and large intestine are collectively called anorectal diseases, and more than 100 common anorectal diseases are recognized.
An electronic anorectal imaging inspection system is an internationally recognized anorectal imaging examination system. It adopts medical video camera technology and overcomes the shortcomings of traditional anorectal endoscopy: during the examination, doctor and patient can observe the lesion site at the same time; a lesion can be locked and then printed and imaged, or magnified for observation; and images of lesion sites deep in the anorectal region can be acquired for real-time diagnosis, so that doctor and patient understand the condition clearly, accurately and intuitively and misdiagnosis and missed diagnosis are avoided. At present, operations for treating anorectal diseases are performed with the aid of an electronic anorectal imaging inspection system, which greatly improves the convenience of the operation; however, because of the location of the anorectum and the particularity of anorectal surgical treatment, such operations require abundant experience. The prior art provides no method for simulating anorectal minimally invasive surgery, so simulation of anorectal minimally invasive surgery cannot be realized, which is unfavorable for the study and practice of interns.
Through the above analysis, the problems and defects of the prior art are as follows: the prior art provides no method for simulating anorectal minimally invasive surgery, so simulation of anorectal minimally invasive surgery cannot be realized, which is unfavorable for the study and practice of interns.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a minimally invasive surgery simulation method and system based on virtual reality, with the aim of solving the problems that the prior art currently provides no method for simulating anorectal minimally invasive surgery, that simulation of anorectal minimally invasive surgery therefore cannot be realized, and that this is unfavorable for the study and practice of interns.
The invention is realized in such a way that a minimally invasive surgery simulation method based on virtual reality comprises the following steps:
step one, establishing a normal image coordinate system by using a data import program through a data import module, and importing medical image data of CT, MRI and PET scanning into a computer; preprocessing the imported medical image data scanned by CT, MRI or PET by using the eFilm Workstation medical image processing software through a data preprocessing module, and segmenting to obtain two-dimensional image data sets of different enhancement time phases;
step two, the central control module, using a central processing unit, a single-chip microcomputer or a controller, coordinately controls the three-dimensional image reconstruction module to respectively reconstruct three-dimensional images in three-dimensional space from the acquired two-dimensional image data sets of different enhancement time phases by using computer image processing technology;
the reconstructing a three-dimensional image of a three-dimensional space from two-dimensional image datasets of different enhancement time phases obtained by using a computer image processing technology comprises:
(2.1) processing the obtained two-dimensional image data sets with different enhancement time phases to obtain forward projection data required by reconstruction; reconstructing the data by adopting an iterative reconstruction algorithm OSEM to obtain a plurality of groups of reconstructed data;
the formula for reconstructing the data by adopting the iterative reconstruction algorithm OSEM is as follows:
Figure BDA0003049634900000021
wherein,
Figure BDA0003049634900000022
for the image intensity of the jth voxel at different angles deg at the kth iteration, deg is defined as follows:
Figure BDA0003049634900000023
wherein S islIs the L (1, 2, 1, L) subset, L is the total subset number of partitions, M is the total prime number, giIs forward projection data; p is a radical ofijIs an element of the system response matrix P, i.e. the probability that a voxel j is detected by the corresponding line i;
the OSEM algorithm introduces the concept of ordered subsets into a maximum likelihood algorithm (MLEM), each reconstruction uses projection data in one subset to correct each pixel at the same time, the reconstructed image is updated once, one iteration (all subsets correct the pixels once) is completed, namely, the image is reconstructed n times in the process of one iteration, and therefore the convergence speed of the image is accelerated. Meanwhile, the algorithm can maintain the image quality while reducing the acquisition time of the projection data by adopting a large-angle sampling interval.
(2.2) based on the obtained reconstruction data groups, calculating the offset vector between the coordinates of the registration points of two adjacent-angle reconstruction results to obtain the system geometric offset vector;
(2.3) except for the initial-angle reconstruction result, rotating the reconstruction data of the other angles by a certain angle in the direction opposite to the rotation direction of the system, so as to reconstruct a three-dimensional image in three-dimensional space;
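As a concrete illustration of the OSEM update above, the following Python sketch performs the subset-by-subset multiplicative correction on a dense system matrix. It is only a minimal numerical sketch, not the patent's implementation: the array shapes, the uniform initial image and the name osem_reconstruct are illustrative assumptions, and a practical reconstruction would organize the projection data by acquisition angle rather than as a dense matrix.

```python
import numpy as np

def osem_reconstruct(g, P, subsets, n_iters=10):
    """Minimal OSEM sketch (illustrative, not the patent's code).

    g       : measured forward-projection data, shape (n_rays,)
    P       : system response matrix, shape (n_rays, n_voxels);
              P[i, j] is the probability that voxel j is seen by ray i
    subsets : list of index arrays S_l partitioning the rays by angle
    returns : reconstructed voxel intensities, shape (n_voxels,)
    """
    f = np.ones(P.shape[1])                       # start from a uniform image
    eps = 1e-12                                   # guard against division by zero
    for _ in range(n_iters):                      # one full iteration ...
        for S in subsets:                         # ... updates once per subset S_l
            P_S = P[S, :]
            estimate = P_S @ f + eps              # forward-project the current image
            ratio = g[S] / estimate               # measured / estimated projections
            correction = P_S.T @ ratio            # backproject the ratio
            sensitivity = P_S.sum(axis=0) + eps   # sum_i p_ij over the subset
            f = f * correction / sensitivity      # multiplicative OSEM update
    return f
```

Because the image is corrected once per subset, a single pass over all L subsets already refines the estimate L times, which is the convergence advantage described above.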
step three, introducing the medical image at the two-dimensional voxel level into the Unity3D engine through a three-dimensional stereo image generation module, and converting the two-dimensional image into a three-dimensional image through a computer to generate a real-time dynamic three-dimensional stereo image;
step four, respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image by using an image processing program through a three-dimensional image processing module; constructing an operation planning and simulated operation platform by using the unity3D engine through an operation simulation module, inputting the virtual-reality patient data, and performing minimally invasive surgery simulation based on the generated three-dimensional image and the real-time dynamic three-dimensional image;
step five, warning and reminding are carried out on operation steps that are wrong or risky during the simulated operation by using a warning and reminding module and a warning device; the simulated operation effect of the doctor is evaluated after the operation by utilizing an evaluation program through an effect evaluation module; the medical image data, two-dimensional image data sets, three-dimensional images, real-time dynamic three-dimensional image, warning reminding data and effect evaluation results are stored by using a database through a data storage module;
step six, updating and displaying the real-time data of the medical image data, the two-dimensional image data sets, the three-dimensional images, the real-time dynamic three-dimensional image, the warning reminding data and the effect evaluation results by using the display through the updating and displaying module.
Further, in step one, the preprocessing of the imported medical image data of CT, MRI or PET scanning by the data preprocessing module using the eFilm Workstation medical image processing software includes: normalizing the acquired scanned medical image data to obtain a normalized image.
Further, the normalizing process of the acquired scanned medical image data includes:
(1) obtaining a medical image through the scanned medical image data;
(2) establishing a normalized map library, wherein the normalized map library is a set of normalized maps normalized to the same image coordinate system;
(3) calculating the distance between a normalization map and a medical image to be processed, wherein the medical image to be processed and the normalization map are normalized to the same image coordinate system;
(4) determining a reference map based on the distances between the plurality of normalized maps and the medical image to be processed, and segmenting the medical image data to be processed based on the reference map; a minimal sketch of this reference-map selection is given below.
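The sketch below illustrates the reference-map selection of steps (2) to (4). The voxel-wise Euclidean distance and the function name select_reference_map are assumptions introduced for illustration; the patent does not specify which distance measure is used between the normalized maps and the image to be processed.

```python
import numpy as np

def select_reference_map(image, normalized_maps):
    """Pick the normalized map closest to the medical image to be processed.

    image           : image already normalized to the common coordinate system
    normalized_maps : list of arrays of the same shape (the normalized map library)
    Returns the index of the chosen reference map and its distance.
    """
    distances = [float(np.linalg.norm(image - m)) for m in normalized_maps]
    best = int(np.argmin(distances))   # smallest distance -> reference map
    return best, distances[best]
```

The chosen reference map can then drive the segmentation of the image to be processed, for example by transferring its labelled regions; the exact segmentation step is not detailed in the source.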
Further, the rendering processing of each part of the three-dimensional stereoscopic image by the stereoscopic image processing module using the image processing program respectively includes:
(1) obtaining a depth map, a normal map, a color map of a three-dimensional image to be rendered, and coordinates of each point in the image in a projection space and a world space;
(2) obtaining a normal vector of each point in the three-dimensional image to be rendered in the world space; according to the color map, obtaining color information of each point in the three-dimensional image to be rendered;
(3) performing illumination calculation according to coordinates of each point in the three-dimensional image to be rendered in a world space, a normal vector of each point in the world space and color information to obtain pixel colors of each point in the three-dimensional image to be rendered;
(4) outputting pixels according to the coordinates of each point of the three-dimensional image to be rendered in the projection space and their pixel colors, to obtain the rendered three-dimensional image; a minimal shading sketch covering these steps is given below.
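The following Python sketch illustrates the per-point illumination calculation of steps (1) to (4) with a single Lambertian point light. It is a simplified assumption for illustration only: a production renderer (for example inside Unity3D) would add specular terms, multiple lights and tone mapping, and the array layout chosen here is arbitrary.

```python
import numpy as np

def shade(world_pos, normals, albedo, light_pos, light_color):
    """Diffuse (Lambertian) shading from world positions, normals and a color map.

    world_pos   : (H, W, 3) world-space coordinates of each point
    normals     : (H, W, 3) unit normal vectors in world space
    albedo      : (H, W, 3) color read from the color map, values in [0, 1]
    light_pos   : (3,) position of one point light (illustrative)
    light_color : (3,) light intensity
    Returns an (H, W, 3) array of pixel colors.
    """
    to_light = light_pos - world_pos
    to_light = to_light / (np.linalg.norm(to_light, axis=-1, keepdims=True) + 1e-8)
    n_dot_l = np.clip(np.sum(normals * to_light, axis=-1, keepdims=True), 0.0, 1.0)
    return np.clip(albedo * light_color * n_dot_l, 0.0, 1.0)  # per-point pixel color
```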
Further, the performing minimally invasive surgical simulation comprises:
(1) performing preoperative planning of the operation through a simulated 3D model of the patient's lesion according to the input virtual-reality patient data;
(2) performing minimally invasive surgery simulation through the control handle in a virtual reality environment by using VR glasses.
Further, performing minimally invasive surgical simulation using an interactive method, comprising: design of incision size, selection of incision location, adjustment of endoscope access path, adjustment of endoscope field of view, and use of surgical instruments.
Further, the evaluation of the simulated operation effect of the doctor by the effect evaluation module through an evaluation program after the operation comprises the following steps:
evaluating the doctor's simulated operation effect after the operation by utilizing an evaluation program through an effect evaluation module; if the evaluation result is only fair, the preoperative plan is unreasonable and preoperative planning should be carried out again; if the evaluation result is good, the preoperative plan is reasonable, as sketched below.
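The evaluation rule just described reduces to a simple threshold decision. The sketch below is only an assumed illustration, since the patent does not state how the evaluation score or its threshold is computed; the 0-to-1 scale and the 0.8 cutoff are hypothetical.

```python
def judge_preoperative_plan(score: float, good_threshold: float = 0.8) -> str:
    """Map a simulated-surgery evaluation score in [0, 1] to a planning decision.

    The score range and the threshold are illustrative assumptions.
    """
    if score >= good_threshold:
        return "preoperative plan is reasonable"
    return "preoperative plan is unreasonable; redo preoperative planning"
```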
Another object of the present invention is to provide a virtual reality-based minimally invasive surgery simulation system applying the virtual reality-based minimally invasive surgery simulation method, the virtual reality-based minimally invasive surgery simulation system comprising:
the system comprises a data preprocessing module, a data importing module, a central control module, a three-dimensional image reconstruction module, a three-dimensional image generating module, a three-dimensional image processing module, a surgery simulation module, a warning reminding module, an effect evaluation module, a data storage module and an updating display module;
the data preprocessing module is connected with the central control module and used for preprocessing the imported medical image data scanned by CT, MRI or PET through the eFilm Workstation medical image processing software and manually segmenting to obtain two-dimensional image data sets of different enhancement time phases;
the data import module is connected with the central control module and used for establishing a normal image coordinate system through a data import program and importing medical image data scanned by CT, MRI and PET into a computer;
the central control module is connected with the data preprocessing module, the data importing module, the three-dimensional image reconstruction module, the three-dimensional image generating module, the three-dimensional image processing module, the operation simulation module, the warning reminding module, the effect evaluation module, the data storage module and the updating display module and is used for coordinating and controlling the normal operation of each module of the minimally invasive surgery simulation system based on the virtual reality through the central processing unit;
the three-dimensional image reconstruction module is connected with the central control module and is used for reconstructing a three-dimensional image of the obtained two-dimensional medical image on a three-dimensional space through a computer image processing technology;
the three-dimensional image generation module is connected with the central control module and used for guiding the two-dimensional voxel-level medical images into a Unity3D engine, and converting the two-dimensional images into three-dimensional images through a computer to generate real-time dynamic three-dimensional images;
the three-dimensional image processing module is connected with the central control module and is used for respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image through an image processing program;
the operation simulation module is connected with the central control module and used for constructing a platform for operation planning and operation simulation through a unity3D engine, inputting virtual and realistic patient data and carrying out minimally invasive surgery simulation;
the warning and reminding module is connected with the central control module, and in the process of simulating the operation, the minimally invasive surgery simulation system based on the virtual reality can provide instant feedback for the operation of a doctor and warn and remind wrong or dangerous operation steps through the early warning device;
the effect evaluation module is connected with the central control module and is used for evaluating the simulated operation effect of the doctor after the operation through an evaluation program;
the data storage module is connected with the central control module and used for storing medical image data, a two-dimensional image data set, a real-time dynamic three-dimensional image, warning reminding data and an effect evaluation result through a database;
and the updating display module is connected with the central control module and is used for updating and displaying the medical image data, the two-dimensional image data set, the real-time dynamic three-dimensional image, the warning reminding data and the real-time data of the effect evaluation result through the display.
It is a further object of the present invention to provide a computer program product stored on a computer readable medium, comprising a computer readable program for providing a user input interface for implementing said virtual reality based minimally invasive surgical simulation method when executed on an electronic device.
It is another object of the present invention to provide a computer-readable storage medium storing instructions that, when executed on a computer, cause the computer to perform the virtual reality-based minimally invasive surgical simulation method.
Combining all the above technical solutions, the invention has the following advantages and positive effects: with the minimally invasive surgery simulation method based on virtual reality provided by the invention, a doctor can observe the three-dimensional model of the site to be operated on from all directions in the virtual space by using the virtual environment provided by the VR equipment, so that the doctor can fully understand the patient's condition; and the simulated-operation platform run in the virtual space with the VR equipment makes it convenient for the doctor to rehearse the operation before surgery so as to produce a more concrete preoperative plan. Meanwhile, compared with the prior art, the invention also has the following advantages:
(1) compared with the complicated computer software of the prior art, the equipment provided by the invention is more convenient to use: it requires no special training, is simple to operate, and allows tools such as a simulated scalpel to be touched and used in the virtual reality environment;
(2) the system provided by the invention can help the doctor intuitively obtain a model of the patient's site to be operated on before surgery and make a reasonable surgical plan according to the size, position and severity of the lesion presented by the model, so that the best effect can be achieved during the operation.
(3) Compared with a traditional two-dimensional medical image, the three-dimensional medical image model presented in virtual reality is more realistic: the doctor can diagnose and analyze the patient's image data from multiple angles and all directions, and the virtual reality technology realizes interaction between the doctor and the three-dimensional image data.
(4) By displaying, in the three-dimensional model, the size of the lesion of the diseased organ, the depth of the lesion within the organ and the distance between the lesion and the main blood vessels in the organ, the doctor is helped to quantify parameters that would otherwise be judged from personal experience and skill during preoperative planning; a simulated-operation platform is provided so that, before surgery, the doctor can rehearse the operation in a virtual surgical scene using VR equipment and assess the risk and difficulty of the operation during the simulation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It is apparent that the drawings described below show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a virtual reality-based minimally invasive surgery simulation method provided by an embodiment of the invention.
Fig. 2 is a schematic diagram of a virtual reality-based minimally invasive surgery simulation method provided by an embodiment of the invention.
Fig. 3 is a flowchart of a method for reconstructing a three-dimensional image from an acquired two-dimensional medical image in a three-dimensional space by using a computer image processing technique through a three-dimensional image reconstruction module according to an embodiment of the present invention.
Fig. 4 is a flowchart of a method for rendering each part of a three-dimensional stereoscopic image by using an image processing program through a stereoscopic image processing module according to an embodiment of the present invention.
FIG. 5 is a block diagram of a virtual reality-based simulation system for minimally invasive surgery provided by an embodiment of the invention;
in the figure: 1. a data preprocessing module; 2. a data import module; 3. a central control module; 4. a three-dimensional image reconstruction module; 5. a three-dimensional image generation module; 6. a stereo image processing module; 7. a surgical simulation module; 8. a warning alert module; 9. an effect evaluation module; 10. a data storage module; 11. and updating the display module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the problems in the prior art, the invention provides a minimally invasive surgery simulation method and system based on virtual reality, and the invention is described in detail below with reference to the accompanying drawings.
As shown in fig. 1 to 2, a virtual reality-based minimally invasive surgery simulation method provided by an embodiment of the present invention includes the following steps:
s101, establishing a normal image coordinate system by a data import program through a data import module, and importing medical image data of CT, MRI and PET scanning into a computer;
s102, preprocessing imported medical image data of CT, MRI or PET scanning by using an eFilm Workstation medical image processing software through a data preprocessing module, and manually segmenting to obtain two-dimensional image data sets of different enhanced time phases;
s103, the central control module coordinately controls the three-dimensional image reconstruction module through the central processing unit, and a three-dimensional image is reconstructed from the obtained two-dimensional medical image on a three-dimensional space by using a computer image processing technology;
s104, introducing the two-dimensional voxel-level medical image into a Unity3D engine through a three-dimensional stereo image generation module, and converting the two-dimensional image into a three-dimensional image through a computer to generate a real-time dynamic three-dimensional stereo image;
s105, respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image by using an image processing program through a three-dimensional image processing module;
s106, constructing an operation planning and simulation operation platform by using a unity3D engine through an operation simulation module, inputting virtual and realistic patient data, and performing minimally invasive surgery simulation based on the generated three-dimensional image and the dynamic three-dimensional image;
s107, warning and reminding are carried out on the operation steps with errors or risks in the process of the simulated operation by using a warning and reminding module and a warning device; evaluating the simulated operation effect of the doctor after the operation by utilizing an evaluation program through an effect evaluation module;
s108, storing medical image data, a two-dimensional image data set, a real-time dynamic three-dimensional image, warning reminding data and an effect evaluation result by using a database through a data storage module;
and S109, updating and displaying the medical image data, the two-dimensional image data set, the real-time dynamic three-dimensional image, the warning reminding data and the real-time data of the effect evaluation result by the display through the updating and displaying module.
As shown in fig. 5, a virtual reality-based minimally invasive surgery simulation system according to an embodiment of the present invention includes: a data preprocessing module 1, a data importing module 2, a central control module 3, a three-dimensional image reconstruction module 4, a three-dimensional image generating module 5, a three-dimensional image processing module 6, an operation simulation module 7, a warning reminding module 8, an effect evaluation module 9, a data storage module 10 and an updating display module 11.
The data preprocessing module 1 is connected with the central control module 3 and used for preprocessing medical image data from CT, MRI or PET scanning through the eFilm Workstation medical image processing software and obtaining two-dimensional image data sets of different enhancement time phases through manual segmentation;
the data import module 2 is connected with the central control module 3 and used for establishing a normal image coordinate system through a data import program and importing medical image data of CT, MRI and PET scanning into a computer;
the central control module 3 is connected with the data preprocessing module 1, the data importing module 2, the three-dimensional image reconstructing module 4, the three-dimensional image generating module 5, the three-dimensional image processing module 6, the operation simulation module 7, the warning reminding module 8, the effect evaluating module 9, the data storage module 10 and the updating display module 11, and is used for coordinating and controlling the normal operation of each module of the virtual reality-based minimally invasive surgery simulation system through a central processing unit;
the three-dimensional image reconstruction module 4 is connected with the central control module 3 and is used for reconstructing a three-dimensional image of the obtained two-dimensional medical image on a three-dimensional space through a computer image processing technology;
the three-dimensional image generation module 5 is connected with the central control module 3 and is used for introducing the two-dimensional voxel-level medical images into a Unity3D engine, and converting the two-dimensional images into three-dimensional images through a computer to generate real-time dynamic three-dimensional images;
the three-dimensional image processing module 6 is connected with the central control module 3 and is used for respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image through an image processing program;
the operation simulation module 7 is connected with the central control module 3 and used for constructing a platform for operation planning and simulation operation through a unity3D engine, inputting virtual and realistic patient data and carrying out minimally invasive surgery simulation;
the warning and reminding module 8 is connected with the central control module 3, and in the process of the simulated operation, the minimally invasive surgery simulation system based on the virtual reality can provide instant feedback for the operation of a doctor, and warning and reminding are carried out on wrong or dangerous operation steps through an early warning device;
the effect evaluation module 9 is connected with the central control module 3 and is used for evaluating the simulated operation effect of the doctor after the operation through an evaluation program;
the data storage module 10 is connected with the central control module 3 and is used for storing medical image data, a two-dimensional image data set, a real-time dynamic three-dimensional image, warning reminding data and an effect evaluation result through a database;
and the updating display module 11 is connected with the central control module 3 and is used for updating and displaying the medical image data, the two-dimensional image data set, the real-time dynamic three-dimensional image, the warning reminding data and the real-time data of the effect evaluation result through a display.
The technical solution of the present invention is further described below with reference to specific examples.
Example 1
As shown in fig. 1, in a preferred embodiment of the virtual reality-based minimally invasive surgery simulation method according to an embodiment of the present invention, the data preprocessing module preprocesses the medical image data from CT, MRI or PET scanning using the eFilm Workstation medical image processing software, which includes: normalizing the acquired scanned medical image data to obtain a normalized image.
The normalization processing of the acquired scanned medical image data provided by the embodiment of the invention comprises the following steps:
obtaining a medical image through the scanned medical image data; establishing a normalized map library, wherein the normalized map library is a set of normalized maps normalized to the same image coordinate system; calculating the distance between a normalization map and a medical image to be processed, wherein the medical image to be processed and the normalization map are normalized to the same image coordinate system; determining a reference map based on the distances between the plurality of normalized maps and the medical image to be processed; and segmenting the medical image data to be processed based on the reference map.
Example 2
In a preferred embodiment of the virtual reality-based minimally invasive surgery simulation method shown in fig. 1, as shown in fig. 3, the method for reconstructing a three-dimensional image in three-dimensional space from the obtained two-dimensional medical image by the three-dimensional image reconstruction module using computer image processing technology comprises the following steps:
s201, the data coincidence processing system is used for performing coincidence processing on the acquired two-dimensional image data to acquire forward projection data required by reconstruction;
s202, reconstructing data by adopting an iterative reconstruction algorithm OSEM to obtain a plurality of groups of reconstructed data;
s203, based on the reconstruction data set, calculating an offset vector between the coordinates of the registration points between two adjacent angle reconstruction results to obtain a system geometric offset vector;
s204, besides the initial angle reconstruction result, the reconstruction data of other angles rotate a certain angle in the direction opposite to the rotation direction of the system, and a three-dimensional image is reconstructed in a three-dimensional space.
The formula for reconstructing data by adopting the iterative reconstruction algorithm OSEM provided by the embodiment of the invention is as follows:

\[ f_j^{(k,\,l+1)} = \frac{f_j^{(k,\,l)}}{\sum_{i \in S_l} p_{ij}} \sum_{i \in S_l} p_{ij}\, \frac{g_i}{\sum_{m=1}^{M} p_{im}\, f_m^{(k,\,l)}} \]

wherein f_j^(k,l) is the image intensity of the j-th voxel after the l-th angular subset has been processed in the k-th iteration; S_l is the l-th (l = 1, 2, ..., L) subset, L is the total number of subsets into which the projection data are partitioned, M is the total number of pixels (voxels), g_i is the forward projection data, and p_ij is an element of the system response matrix P, i.e. the probability that voxel j is detected by the corresponding projection line i.
Example 3
As shown in fig. 1 and fig. 4, in a preferred embodiment of the virtual reality-based minimally invasive surgery simulation method according to an embodiment of the present invention, the rendering processing performed on each part of the three-dimensional image by the stereo image processing module using an image processing program includes:
s301, obtaining a depth map, a normal map and a color map of a three-dimensional image to be rendered, and coordinates of each point in the image in a projection space and a world space;
s302, obtaining a normal vector of each point in the three-dimensional image to be rendered in the world space; according to the color map, obtaining color information of each point in the three-dimensional image to be rendered;
s303, performing illumination calculation according to coordinates of each point in the three-dimensional image to be rendered in a world space, a normal vector of each point in the world space and color information to obtain the pixel color of each point in the three-dimensional image to be rendered;
s304, outputting pixels according to the coordinates and pixel colors of each point in the three-dimensional image to be rendered in the projection space, and obtaining the rendered three-dimensional image.
Example 4
As shown in fig. 1, in a preferred embodiment of the virtual reality-based minimally invasive surgery simulation method according to the embodiment of the present invention, constructing an operation planning and simulation platform through the operation simulation module by using the unity3D engine, inputting the virtual-reality patient data, and performing minimally invasive surgery simulation includes:
performing preoperative planning on the operation through a simulated 3D model of the focus of the patient according to the real condition of the patient; and performing minimally invasive surgery simulation through the control handle in a virtual reality environment by utilizing VR glasses.
Example 5
As shown in fig. 1, the virtual reality-based minimally invasive surgical simulation method according to an embodiment of the present invention is a preferred embodiment, and the minimally invasive surgical simulation performed by using an interactive method according to an embodiment of the present invention includes: design of incision size, selection of incision location, adjustment of endoscope access path, adjustment of endoscope field of view, and use of surgical instruments.
Example 6
As shown in fig. 1, the virtual reality-based minimally invasive surgery simulation method according to an embodiment of the present invention is a preferred embodiment, and the evaluation of the simulated surgery effect of a doctor after surgery by using an evaluation program through an effect evaluation module according to an embodiment of the present invention includes:
evaluating the simulated operation effect of the doctor after the operation by utilizing an evaluation program through an effect evaluation module;
if the evaluation result is only fair, the preoperative plan is unreasonable and preoperative planning should be carried out again;
if the evaluation result is good, the preoperative plan is reasonable.
Example 7
The minimally invasive surgery simulation method based on virtual reality provided by the embodiment of the invention specifically comprises the following steps:
(1) medical image data transmission: the system uses the eFilm Workstation medical image processing software (Version 2.0.1) to preprocess the data from CT or MRI scanning and obtains two-dimensional image data sets of different enhancement time phases by manual segmentation, thereby establishing a normal image coordinate system and importing the scanned medical images of CT, MRI, PET and the like into the computer;
(2) three-dimensional modeling: a three-dimensional image is reconstructed in three-dimensional space from the obtained two-dimensional medical images by using computer image processing technology. The three-dimensional modeling mainly comprises four steps: first, a data coincidence processing system performs coincidence processing on the acquired two-dimensional image data to obtain the forward projection data required for reconstruction; second, the data are reconstructed by adopting the iterative reconstruction algorithm OSEM to obtain a plurality of groups of reconstructed data; third, the system geometric offset vector is obtained; fourth, except for the initial-angle reconstruction result, the reconstruction data of the other angles are rotated by a certain angle in the direction opposite to the rotation direction of the system. Three-dimensional modeling is thereby realized;
(3) outputting the medical image model in virtual reality: the medical image at the two-dimensional voxel level is introduced into the Unity3D engine, and the computer converts the two-dimensional image into a three-dimensional image to generate a real-time dynamic three-dimensional stereo image. Each part of the three-dimensional image is then segmented, modeled and rendered, and an interactive method for simulating the surgical procedure in the three-dimensional view is provided, comprising designing the incision size, selecting the incision position, adjusting the entry path of the endoscope, adjusting the field of view of the endoscope and using the surgical instruments, so that the doctor is not limited to viewing a two-dimensional planar image on a display but can also observe the three-dimensional image at close range and more realistically in the virtual reality environment and interact with it;
(4) surgery planning and simulation on real patient images: the unity3D engine is used to construct a platform for surgical planning and simulated surgery, and the virtual-reality patient data are input; the virtual reality devices supported by the system include HTC VIVE and Oculus Rift. First, the doctor plans the operation preoperatively through a simulated 3D model of the patient's lesion according to the patient's real condition, and then performs the simulated operation through the control handle in the virtual reality environment using VR glasses. If the evaluation result is only fair, the preoperative plan is unreasonable and preoperative planning should be carried out again; if the evaluation result is good, the preoperative plan is reasonable.
Further, the three-dimensional modeling specifically includes:
(1) the data coincidence processing system is used for performing coincidence processing on the acquired two-dimensional image data to acquire forward projection data required by reconstruction;
(2) reconstructing the data by adopting an iterative reconstruction algorithm OSEM to obtain a plurality of groups of reconstructed data;
(3) obtaining a system geometric offset vector;
(4) besides the initial angle reconstruction result, the reconstruction data of other angles rotate a certain angle in the direction opposite to the rotation direction of the system, and three-dimensional modeling is realized.
Further, the interactive method for simulating the surgical procedure in the three-dimensional view in step (3) includes: selecting the incision position by placing the surgical instrument simulated by the VR handle at the desired location in the virtual reality environment; adjusting the incision size by controlling the length of the VR handle's movement path while incising the organ; adjusting the position and field of view of the medical imaging device by touching the two-dimensional touch disc on the VR handle; and using the simulated surgical instrument by pulling the trigger button on the VR handle, as sketched below.
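Since the patent implements these interactions inside the Unity3D engine, the following Python sketch only illustrates the mapping from controller input to simulated surgical actions described above; the HandleState fields, the action names and the per-frame structure are assumptions introduced for illustration, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import Tuple, List

@dataclass
class HandleState:
    """Simplified snapshot of the VR handle; field names are illustrative."""
    position: Tuple[float, float, float]   # world-space position of the handle
    trackpad: Tuple[float, float]          # touch coordinates on the 2D touch disc
    trigger_pressed: bool                  # state of the trigger button

def map_handle_to_actions(prev: HandleState, cur: HandleState) -> List:
    """Translate one frame of controller input into simulated surgical actions."""
    actions = []
    if cur.trigger_pressed and not prev.trigger_pressed:
        actions.append("activate_instrument")                    # trigger pull -> use the instrument
    if cur.trackpad != prev.trackpad:
        actions.append(("adjust_endoscope_view", cur.trackpad))  # touch disc -> position / field of view
    # incision length follows the distance the handle travels while cutting
    dx = [c - p for c, p in zip(cur.position, prev.position)]
    path_length = sum(d * d for d in dx) ** 0.5
    if cur.trigger_pressed and path_length > 0.0:
        actions.append(("extend_incision", path_length))
    return actions
```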
Further, the step (4) specifically includes:
firstly, a doctor carries out preoperative planning on a surgery through a simulated 3D model of a focus of a patient according to the real state of illness of the patient;
then, VR glasses are used to carry out the simulated surgery through the control handle in the virtual reality environment; during the simulated surgery, the system provides instant feedback on the doctor's operations, warns and reminds about wrong or risky operation steps, and evaluates the doctor's simulated surgery effect after the surgery; if the evaluation result is only fair, the preoperative plan is unreasonable and preoperative planning should be carried out again; if the evaluation result is good, the preoperative plan is reasonable.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product that includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The above description is only intended to illustrate specific embodiments of the present invention and is not intended to limit the scope of protection of the invention; any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention are intended to be included within the scope of the appended claims.

Claims (10)

1. A virtual reality-based minimally invasive surgery simulation method is characterized by comprising the following steps:
step one, establishing a normal image coordinate system by using a data import program through a data import module, and importing medical image data of CT, MRI and PET scanning into a computer; preprocessing the imported medical image data scanned by CT, MRI or PET by using the eFilm Workstation medical image processing software through a data preprocessing module, and segmenting to obtain two-dimensional image data sets of different enhancement time phases;
step two, the central control module, using a central processing unit, a single-chip microcomputer or a controller, coordinately controls the three-dimensional image reconstruction module to respectively reconstruct three-dimensional images in three-dimensional space from the acquired two-dimensional image data sets of different enhancement time phases by using computer image processing technology;
the reconstructing a three-dimensional image of a three-dimensional space from two-dimensional image datasets of different enhancement time phases obtained by using a computer image processing technology comprises:
(2.1) processing the obtained two-dimensional image data sets with different enhancement time phases to obtain forward projection data required by reconstruction; reconstructing the data by adopting an iterative reconstruction algorithm OSEM to obtain a plurality of groups of reconstructed data;
the formula for reconstructing the data by adopting the iterative reconstruction algorithm OSEM is as follows:
Figure FDA0003049634890000011
wherein,
Figure FDA0003049634890000012
for the image intensity of the jth voxel at different angles deg at the kth iteration, deg is defined as follows:
Figure FDA0003049634890000013
wherein S islIs the L (1, 2, 1, L) subset, L is the total subset number of partitions, M is the total prime number, giIs forward projection data; p is a radical ofijIs an element of the system response matrix P, i.e. the probability that a voxel j is detected by the corresponding line i;
the OSEM algorithm introduces the concept of ordered subsets into a maximum likelihood algorithm (MLEM), each reconstruction uses projection data in one subset to correct each pixel at the same time, the reconstructed image is updated once, one iteration (all subsets correct the pixels once) is completed, namely, the image is reconstructed n times in the process of one iteration, and therefore the convergence speed of the image is accelerated. Meanwhile, the algorithm can maintain the image quality while reducing the acquisition time of the projection data by adopting a large-angle sampling interval.
(2.2) based on the obtained reconstruction data groups, calculating the offset vector between the coordinates of the registration points of two adjacent-angle reconstruction results to obtain the system geometric offset vector;
(2.3) except for the initial-angle reconstruction result, rotating the reconstruction data of the other angles by a certain angle in the direction opposite to the rotation direction of the system, so as to reconstruct a three-dimensional image in three-dimensional space;
step three, introducing the medical image at the two-dimensional voxel level into the Unity3D engine through a three-dimensional stereo image generation module, and converting the two-dimensional image into a three-dimensional image through a computer to generate a real-time dynamic three-dimensional stereo image;
step four, respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image by using an image processing program through a three-dimensional image processing module; constructing an operation planning and simulated operation platform by using the unity3D engine through an operation simulation module, inputting the virtual-reality patient data, and performing minimally invasive surgery simulation based on the generated three-dimensional image and the real-time dynamic three-dimensional image;
step five, warning and reminding are carried out on operation steps that are wrong or risky during the simulated operation by using a warning and reminding module and a warning device; the simulated operation effect of the doctor is evaluated after the operation by utilizing an evaluation program through an effect evaluation module; the medical image data, two-dimensional image data sets, three-dimensional images, real-time dynamic three-dimensional image, warning reminding data and effect evaluation results are stored by using a database through a data storage module;
step six, updating and displaying the real-time data of the medical image data, the two-dimensional image data sets, the three-dimensional images, the real-time dynamic three-dimensional image, the warning reminding data and the effect evaluation results by using the display through the updating and displaying module.
2. The virtual reality-based minimally invasive surgical simulation method of claim 1, wherein, in step one, the preprocessing of the imported medical image data of the CT, MRI or PET scan by the data preprocessing module using the eFilm Workstation medical image processing software comprises: normalizing the acquired scanned medical image data to obtain a normalized image.
3. The virtual reality-based minimally invasive surgical simulation method of claim 1, wherein the normalization process of the acquired scanned medical image data comprises:
(1) obtaining a medical image through the scanned medical image data;
(2) establishing a normalized map library, wherein the normalized map library is a set of normalized maps normalized to the same image coordinate system;
(3) calculating the distance between a normalization map and a medical image to be processed, wherein the medical image to be processed and the normalization map are normalized to the same image coordinate system;
(4) determining a reference map based on the distances between the plurality of normalized maps and the medical image to be processed; and segmenting the medical image data to be processed based on the reference map.
4. The virtual reality-based minimally invasive surgery simulation method according to claim 1, wherein the rendering processing of each part of the three-dimensional stereoscopic image by the stereoscopic image processing module by using the image processing program respectively comprises:
(1) obtaining a depth map, a normal map, a color map of a three-dimensional image to be rendered, and coordinates of each point in the image in a projection space and a world space;
(2) obtaining a normal vector of each point in the three-dimensional image to be rendered in the world space; according to the color map, obtaining color information of each point in the three-dimensional image to be rendered;
(3) performing illumination calculation according to coordinates of each point in the three-dimensional image to be rendered in a world space, a normal vector of each point in the world space and color information to obtain pixel colors of each point in the three-dimensional image to be rendered;
(4) outputting pixels according to the coordinates of each point of the three-dimensional image to be rendered in the projection space and their pixel colors, to obtain the rendered three-dimensional image.
5. The virtual reality-based minimally invasive surgical simulation method of claim 1, wherein the performing minimally invasive surgical simulation comprises:
(1) performing preoperative planning of the operation through a simulated 3D model of the patient's lesion according to the input virtual-reality patient data;
(2) performing minimally invasive surgery simulation through the control handle in a virtual reality environment by using VR glasses.
6. The virtual reality-based minimally invasive surgical simulation method of claim 1, wherein the minimally invasive surgical simulation is performed using an interactive method comprising: design of incision size, selection of incision location, adjustment of endoscope access path, adjustment of endoscope field of view, and use of surgical instruments.
7. The virtual reality-based minimally invasive surgical simulation method of claim 1, wherein the evaluation of the simulated surgical effect of the doctor by the effect evaluation module using an evaluation procedure after surgery comprises:
evaluating the doctor's simulated operation effect after the operation by utilizing an evaluation program through an effect evaluation module; if the evaluation result is only fair, the preoperative plan is unreasonable and preoperative planning should be carried out again; if the evaluation result is good, the preoperative plan is reasonable.
8. A virtual reality-based minimally invasive surgery simulation system applying the virtual reality-based minimally invasive surgery simulation method according to any one of claims 1 to 7, wherein the virtual reality-based minimally invasive surgery simulation system comprises:
the system comprises a data preprocessing module, a data importing module, a central control module, a three-dimensional image reconstruction module, a three-dimensional image generating module, a three-dimensional image processing module, a surgery simulation module, a warning reminding module, an effect evaluation module, a data storage module and an updating display module;
the data preprocessing module is connected with the central control module and used for preprocessing medical image data from CT, MRI or PET scanning through the eFilm Workstation medical image processing software and manually segmenting to obtain two-dimensional image data sets of different enhancement time phases;
the data import module is connected with the central control module and used for establishing a normal image coordinate system through a data import program and importing medical image data scanned by CT, MRI and PET into a computer;
the central control module is connected with the data preprocessing module, the data importing module, the three-dimensional image reconstruction module, the three-dimensional image generating module, the three-dimensional image processing module, the operation simulation module, the warning reminding module, the effect evaluation module, the data storage module and the updating display module and is used for coordinating and controlling the normal operation of each module of the minimally invasive surgery simulation system based on the virtual reality through the central processing unit;
the three-dimensional image reconstruction module is connected with the central control module and is used for reconstructing a three-dimensional image of the obtained two-dimensional medical image on a three-dimensional space through a computer image processing technology;
the three-dimensional image generation module is connected with the central control module and used for guiding the two-dimensional voxel-level medical images into a Unity3D engine, and converting the two-dimensional images into three-dimensional images through a computer to generate real-time dynamic three-dimensional images;
the three-dimensional image processing module is connected with the central control module and is used for respectively carrying out segmentation, modeling and rendering processing on each part of the three-dimensional image through an image processing program;
the operation simulation module is connected with the central control module and used for constructing a platform for operation planning and operation simulation through a unity3D engine, inputting virtual and realistic patient data and carrying out minimally invasive surgery simulation;
the warning and reminding module is connected with the central control module, and in the process of simulating the operation, the minimally invasive surgery simulation system based on the virtual reality can provide instant feedback for the operation of a doctor and warn and remind wrong or dangerous operation steps through the early warning device;
the effect evaluation module is connected with the central control module and is used for evaluating the simulated operation effect of the doctor after the operation through an evaluation program;
the data storage module is connected with the central control module and used for storing medical image data, a two-dimensional image data set, a real-time dynamic three-dimensional image, warning reminding data and an effect evaluation result through a database;
and the update display module is connected with the central control module and is used for updating and displaying, in real time through the display, the medical image data, the two-dimensional image data set, the real-time dynamic three-dimensional image, the warning reminding data and the effect evaluation result.
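To make the module topology of claim 8 concrete, here is a small illustrative Python sketch in which a central control module holds references to the other modules and dispatches data between them; the module interface, the dispatch method and the example pipeline order are assumptions made only for this sketch, not a description of the claimed implementation.

class Module:
    # Base class standing in for any of the claimed functional modules
    def __init__(self, name: str):
        self.name = name

    def run(self, payload):
        print(f"[{self.name}] processing {payload!r}")
        return payload

class CentralControlModule:
    # Connected to every other module; coordinates and controls their operation
    def __init__(self):
        self.modules = {}

    def connect(self, module: Module):
        self.modules[module.name] = module

    def dispatch(self, name: str, payload):
        return self.modules[name].run(payload)

central = CentralControlModule()
for name in ("data preprocessing", "data import", "three-dimensional image reconstruction",
             "three-dimensional image generation", "three-dimensional image processing",
             "surgery simulation", "warning reminding", "effect evaluation",
             "data storage", "update display"):
    central.connect(Module(name))

data = central.dispatch("data import", "CT/MRI/PET series")
data = central.dispatch("data preprocessing", data)
central.dispatch("three-dimensional image reconstruction", data)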
9. A computer program product stored on a computer-readable medium, comprising a computer-readable program which, when executed on an electronic device, provides a user input interface for implementing the virtual reality-based minimally invasive surgery simulation method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing instructions which, when executed on a computer, cause the computer to perform the virtual reality-based minimally invasive surgery simulation method according to any one of claims 1 to 7.
CN202110482041.9A 2021-04-30 2021-04-30 Minimally invasive surgery simulation method and system based on virtual reality Withdrawn CN113197665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110482041.9A CN113197665A (en) 2021-04-30 2021-04-30 Minimally invasive surgery simulation method and system based on virtual reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110482041.9A CN113197665A (en) 2021-04-30 2021-04-30 Minimally invasive surgery simulation method and system based on virtual reality

Publications (1)

Publication Number Publication Date
CN113197665A true CN113197665A (en) 2021-08-03

Family

ID=77029905

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110482041.9A Withdrawn CN113197665A (en) 2021-04-30 2021-04-30 Minimally invasive surgery simulation method and system based on virtual reality

Country Status (1)

Country Link
CN (1) CN113197665A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114795468A (en) * 2022-04-19 2022-07-29 首都医科大学附属北京天坛医院 Intraoperative navigation method and system for intravascular treatment
CN114795468B (en) * 2022-04-19 2022-11-15 首都医科大学附属北京天坛医院 Intraoperative navigation method and system for intravascular treatment
CN115363752A (en) * 2022-08-22 2022-11-22 华平祥晟(上海)医疗科技有限公司 Intelligent operation path guiding system
CN115363752B (en) * 2022-08-22 2023-03-28 华平祥晟(上海)医疗科技有限公司 Intelligent operation path guiding system
CN116959697A (en) * 2023-06-19 2023-10-27 华平祥晟(上海)医疗科技有限公司 Preoperative scheme auxiliary evaluation system and method based on artificial intelligence
CN117160029A (en) * 2023-08-31 2023-12-05 江西格如灵科技股份有限公司 VR handle detection method and system

Similar Documents

Publication Publication Date Title
CN113197665A (en) Minimally invasive surgery simulation method and system based on virtual reality
US10580325B2 (en) System and method for performing a computerized simulation of a medical procedure
EP2637593B1 (en) Visualization of anatomical data by augmented reality
WO2020177348A1 (en) Method and apparatus for generating three-dimensional model
CN112734776A (en) Minimally invasive surgical instrument positioning method and system
CN110660130A (en) Medical image-oriented mobile augmented reality system construction method
Kumar et al. Stereoscopic visualization of laparoscope image using depth information from 3D model
CN114445431A (en) Method and device for arbitrarily cutting medical three-dimensional image
WO2021030995A1 (en) Inferior vena cava image analysis method and product based on vrds ai
WO2021081771A1 (en) Vrds ai medical image-based analysis method for heart coronary artery, and related devices
Advincula et al. Development and future trends in the application of visualization toolkit (VTK): the case for medical image 3D reconstruction
Amara et al. Augmented reality visualization and interaction for covid-19 ct-scan nn automated segmentation: A validation study
CN112331311B (en) Method and device for fusion display of video and preoperative model in laparoscopic surgery
KR102213412B1 (en) Method, apparatus and program for generating a pneumoperitoneum model
CN116712167A (en) Navigation method and system for pulmonary nodule operation
Stoyanov et al. Intra-operative visualizations: Perceptual fidelity and human factors
CN111329589A (en) Handheld intelligent fusion radiography navigation system
WO2022223042A1 (en) Surgical path processing system, method, apparatus and device, and storage medium
WO2021081839A1 (en) Vrds 4d-based method for analysis of condition of patient, and related products
Novotny et al. Towards placental surface vasculature exploration in virtual reality
Wu et al. AI-Enhanced Virtual Reality in Medicine: A Comprehensive Survey
JP2022506708A (en) Systems and methods for optical tracking
CN115990032B (en) Priori knowledge-based ultrasonic scanning visual navigation method, apparatus and device
Kirmizibayrak Interactive volume visualization and editing methods for surgical applications
WO2021081842A1 (en) Intestinal neoplasm and vascular analysis method based on vrds ai medical image and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210803)