CN110537980A - Puncture surgery navigation method based on motion capture and mixed reality technology - Google Patents

Puncture surgery navigation method based on motion capture and mixed reality technology

Info

Publication number
CN110537980A
CN110537980A (Application CN201910905977.0A)
Authority
CN
China
Prior art keywords
puncture
motion capture
puncture needle
marker ball
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910905977.0A
Other languages
Chinese (zh)
Inventor
王浴屺
王殊轶
于德旺
康健
朱自强
豆建生
梁巨宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201910905977.0A priority Critical patent/CN110537980A/en
Publication of CN110537980A publication Critical patent/CN110537980A/en
Pending legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107Visualisation of planned trajectories or target regions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a puncture surgery navigation method based on motion capture and mixed reality technology. The key points of the technical scheme are as follows. A puncture surgery navigation method based on motion capture and mixed reality technology comprises the following steps: S1, attaching a marker ball to the skin around the position where the patient needs to be punctured; S2, capturing the spatial position coordinates of the marker ball with a Qualisys optical motion capture system; S3, performing CT or MRI scanning of the patient together with the marker ball; S4, modeling the ball and the surgical site in Unity; S5, importing the model into the augmented reality device; S6, during the operation, capturing the spatial position coordinates of the puncture needle fitted with a marker ball via the Qualisys optical motion capture system and feeding the coordinates back to the augmented reality device; S7, the doctor moves the puncture needle fitted with the marker ball, and after a puncture path has been planned, the position of the puncture needle relative to the kidney is displayed in the AR glasses in real time. The invention can track and position the puncture needle in real time and display three-dimensional information of the kidney model.

Description

Puncture surgery navigation method based on motion capture and mixed reality technology
Technical Field
The invention relates to the field of surgical navigation, and in particular to a puncture surgery navigation method based on motion capture and mixed reality technology.
Background
Percutaneous nephrolithotomy (PNO) originated in the early 1940s, while its actual clinical application began in the early 1950s. It comprises three parts:
1. Percutaneous nephrostomy (PCN) was pioneered by Goodwin, who in 1955 first used the trocar technique as the primary approach to PCN.
2. Percutaneous nephrolithotomy (PNL), also known as percutaneous nephrolithotripsy (PNCL), developed through three stages: X-ray fluoroscopic lithotomy, nephroscopic stone removal under direct vision, and lithotripsy techniques such as ultrasonic and electrohydraulic lithotripsy.
3. Percutaneous renal biopsy (PNB) entered clinical use in the early 1950s. Chinese scholars took up PNB relatively early: in 1958, Zhao Kuidan, Zhou Huiying, and others reported its clinical application.
At present, the puncture channel is established mainly under B-ultrasound and X-ray positioning. The image is a two-dimensional plane, so the operator cannot easily grasp the puncture angle or clearly distinguish the distribution of the renal vessels and their relation to neighboring organs, and thus cannot reliably avoid injuring the vessels and surrounding organs. Moreover, some patients present complicated conditions with anatomical variations such as horseshoe kidney or renal malrotation, which increase the operative risk. Although color Doppler ultrasound can display arterial and venous blood flow, helping the puncture avoid major vessels and reducing the risk of severe intraoperative bleeding to some degree, the two-dimensional image offers little support for planning the puncture channel, displays the internal structure of the kidney unclearly, and makes rational puncture positioning difficult. The technique demands considerable experience from the doctor, places psychological pressure on the surgeon during the operation, excludes young doctors who cannot use ultrasound, and makes the operation slow.
A clinician positioning with ordinary B-ultrasound and a C-arm, worried about injuring the pleura and adjacent organs, often punctures below the 12th rib, which puts the operation at a disadvantage. A puncture through the inferior calyx frequently cannot reach the superior calyx, so with multiple calculi, stones may remain and require a second operation; in some patients even the second operation cannot remove all stones. The main reason is that the distribution of stone positions cannot be grasped: although the puncture channel enters the collecting system, it cannot reach some calyces, limiting how many stones can be retrieved. Augmented reality technology can display a three-dimensional model of the kidney and its surrounding organs and tissues, helping the doctor check before puncture whether other organs might be injured; the position of a calculus or lesion relative to the kidney is clear at a glance, superior in field of view to a two-dimensional picture. Moreover, the path of the surgical instrument can be displayed in real time, so an irregular puncture path can be corrected promptly, avoiding the extra punctures that a wrongly planned needle path would cause and that would aggravate the patient's pain.
With the rapid development of computer-assisted surgery, surgical navigation systems are widely used in operations. A surgical navigation system can locate surgical instruments, display in real time on a screen their position relative to the lesion together with the sagittal, axial, and coronal tissue structure of the lesion area, and thereby guide the clinician in adjusting the instruments so that the operation is completed faster, more safely, and more accurately. Augmented reality (AR) integrates a computer-generated virtual model with the real scene in which the user is located, by means of optoelectronic display technology, sensor technology, and computer graphics, so that the user perceives the virtual object as a component of the surrounding real environment. Since Microsoft introduced the mixed reality device HoloLens in 2016, HoloLens-based surgical navigation methods have been widely used clinically. However, current HoloLens-based surgical navigation systems suffer from the following problems:
(1) Three-dimensional registration relies mainly on a marker image, and the placement of the marker image relative to the computer model seriously affects the accuracy of the navigation system.
(2) If the patient's position changes, the HoloLens must rescan the marker image, which reduces efficiency.
(3) Judging the position of the virtual model of the surgical site is largely subjective, which limits precision, and objective data on the position of the surgical instrument relative to the virtual model are lacking.
Therefore, there is a need for an improved method that overcomes the above-mentioned deficiencies.
Disclosure of Invention
The invention aims to provide a puncture surgery navigation method based on motion capture and mixed reality technology that can track and position the puncture needle in real time and display three-dimensional information of the kidney model.
The technical purpose of the invention is realized by the following technical scheme: a puncture surgery navigation method based on motion capture and mixed reality technology comprises the following steps:
S1, attaching a marker ball to physiological points on the surface of the skin around the position where the patient needs to puncture;
S2, capturing the spatial position coordinates of the marker ball with a Qualisys optical motion capture system;
S3, performing CT or MRI scanning of the patient together with the marker ball;
s4, modeling the small ball and the operation part by using Unity, and matching and unifying the Qualisys optical motion capture system and the space coordinate system of Unity;
S5, importing the model into augmented reality equipment;
S6, performing the operation: during the operation, the Qualisys optical motion capture system captures the spatial position coordinates of the puncture needle fitted with a marker ball, calculates the relative position data of the puncture needle and the virtual model through an algorithm, and feeds the data back to the augmented reality device, providing the doctor with accurate position information;
S7, after the virtual model is superimposed on the real anatomy, the doctor plans a puncture path by moving the puncture needle fitted with a marker ball, selects a puncture point on the body surface for needle insertion, and the position of the puncture needle relative to the kidney is displayed in the AR glasses in real time.
The invention further provides that: in step S4, the spatial position of the marker ball is imported into Unity so that Unity's coordinate system coincides with that of Qualisys.
The invention further provides that: in step S7, during the operation, the position of the puncture needle is captured by the capture cameras to obtain its position relative to the virtual model, and the real-time relative distance data can be displayed on the augmented reality device.
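The coordinate-system matching of step S4 is not spelled out in the patent; a common way to realize it is a rigid-body (Kabsch/SVD) registration that maps the marker-ball coordinates measured by Qualisys onto the same balls as modeled in Unity. The sketch below is an illustrative assumption, not the patented algorithm: the function names are hypothetical, and the axis flip needed in practice between Qualisys's right-handed frame and Unity's left-handed frame is omitted for clarity.

```python
import numpy as np

def rigid_transform(qualisys_pts, unity_pts):
    """Estimate the rotation R and translation t mapping Qualisys marker
    coordinates onto their corresponding Unity-model points (Kabsch/SVD).
    Inputs are (N, 3) arrays of matched points, N >= 3, not all collinear."""
    p = np.asarray(qualisys_pts, dtype=float)
    q = np.asarray(unity_pts, dtype=float)
    cp, cq = p.mean(axis=0), q.mean(axis=0)   # centroids of each point set
    H = (p - cp).T @ (q - cq)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def to_unity(point, R, t):
    """Map one Qualisys coordinate into the Unity model frame."""
    return R @ np.asarray(point, dtype=float) + t
```

With three or more non-collinear marker balls, `rigid_transform` recovers the alignment exactly on noise-free data and in a least-squares sense on noisy data; `to_unity` can then place any captured needle coordinate into the model's frame.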
In conclusion, the invention has the following beneficial effects:
1. It displays a three-dimensional model of the kidney and its surrounding organs and tissues, making it convenient for doctors to observe the positions and structures of calculi and lesions.
2. The method is simple to operate: after three-dimensional reconstruction of CT, MRI, or other imaging pictures, the virtual model is projected through the AR glasses onto the skin bearing the marker balls, finally achieving superimposition of the virtual model on the real anatomy.
3. With the puncture path planned, young doctors can perform percutaneous renal puncture more easily.
4. The cost is low; the entire hardware setup requires only cameras and AR glasses.
5. It is more efficient than traditional ultrasound-guided puncture surgery: for illness at irregular hours or in emergencies, the operation can proceed after merely reconstructing a CT picture in three dimensions and importing it into the AR glasses, and less-experienced doctors can take part without the guidance of an ultrasonographer.
Drawings
FIG. 1 is a flow chart of an implementation of the AR percutaneous renal procedure navigation system of the present invention;
FIG. 2 is a schematic diagram of a use scenario of the present invention;
Fig. 3 is a preoperative planning and surgical flow diagram of the present invention.
Detailed Description
In order to make the technical means, original features, achieved objectives, and effects of the invention easy to understand, the invention is further described below with reference to the figures and specific embodiments.
As shown in Fig. 1, the puncture surgery navigation method based on motion capture and mixed reality technology provided by the invention comprises the following steps:
S1, attaching a marker ball to a physiological point on the skin surface around the patient's kidney to be punctured;
S2, capturing the spatial position coordinates of the marker ball with a Qualisys optical motion capture system; the Qualisys system captures real-time positions with sub-millimeter accuracy. The patient has marker balls attached at bony landmarks wherever augmented reality surgical navigation is needed, and the spatial positions of the marker balls are captured by the Qualisys capture cameras;
S3, performing CT or MRI scanning of the patient together with the marker ball;
S4, modeling the ball and the surgical site in Unity, matching and unifying the coordinate systems of the Qualisys optical motion capture system and Unity, and importing the spatial position of the marker ball into Unity so that Unity's coordinate system coincides with that of Qualisys;
S5, importing the model into the augmented reality device to form the virtual model;
S6, performing the operation: during the operation, the Qualisys optical motion capture system captures the spatial position coordinates of the puncture needle fitted with a marker ball, calculates the relative position data of the puncture needle and the virtual model through an algorithm, and feeds the data back to the augmented reality device, providing the doctor with accurate position information;
S7, after the virtual model is superimposed on the real anatomy, the doctor plans a puncture path by moving the puncture needle fitted with a marker ball, selects a puncture point on the body surface for needle insertion, and the position of the puncture needle relative to the kidney is displayed in the AR glasses in real time. During the operation, the position of the puncture needle is captured by the capture cameras to obtain its position relative to the virtual model; the real-time relative distance data can be displayed on the augmented reality device, and whether the lesion has been reached is judged by observing the position of the virtual needle tip.
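The real-time distance readout described in step S7 can be sketched as follows. The patent gives no formulas, so this assumes the usual arrangement: the tracked marker ball sits on the needle hub, the needle tip lies at a fixed offset along the needle axis, and the Euclidean tip-to-lesion distance is the number the AR display would update. All names and the geometry are illustrative assumptions, not the patented calculation.

```python
import numpy as np

def needle_tip(marker_pos, needle_dir, tip_offset_mm):
    """Tip position from the tracked hub marker: marker position plus a
    fixed offset (in mm) along the needle's unit direction vector."""
    d = np.asarray(needle_dir, dtype=float)
    d = d / np.linalg.norm(d)                 # normalize the needle axis
    return np.asarray(marker_pos, dtype=float) + tip_offset_mm * d

def tip_to_target_mm(tip, target):
    """Euclidean tip-to-lesion distance, the value an AR display
    would show in real time."""
    return float(np.linalg.norm(np.asarray(tip, dtype=float)
                                - np.asarray(target, dtype=float)))
```

Judging "whether the lesion has been reached" then reduces to watching this distance approach zero as the virtual needle tip nears the target.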
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are given by way of illustration of the principles of the present invention, and that various changes and modifications may be made without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (3)

1. A puncture surgery navigation method based on motion capture and mixed reality technology, characterized by comprising the following steps:
S1, attaching a marker ball to physiological points on the surface of the skin around the position where the patient needs to puncture;
S2, capturing the spatial position coordinates of the marker ball through a Qualisys optical motion capture system;
S3, carrying out CT shooting or MRI on the patient together with the marker ball;
S4, modeling the small ball and the operation part by using Unity, and matching and unifying the Qualisys optical motion capture system and the space coordinate system of Unity;
S5, importing the model into augmented reality equipment;
S6, performing the operation, wherein during the operation the Qualisys optical motion capture system captures the spatial position coordinates of the puncture needle fitted with a marker ball, calculates the relative position data of the puncture needle and the virtual model through an algorithm, feeds the data back to the augmented reality device, and provides accurate position information for the doctor;
And S7, after the virtual model is superposed on the real model, the doctor plans a puncture path by moving the puncture needle with the marker ball, selects a puncture point on the surface of the human body to insert the needle, and displays the position of the puncture needle relative to the kidney in the AR glasses in real time.
2. The puncture surgery navigation method based on the motion capture and mixed reality technology according to claim 1, characterized in that: in step S4, the spatial position of the marker ball is imported into Unity so that Unity coincides with the spatial coordinate system of Qualisys.
3. The puncture surgery navigation method based on the motion capture and mixed reality technology according to claim 1, characterized in that: in step S7, during the operation, the position of the puncture needle is captured by the capture camera, so as to obtain the relative position data of the puncture needle and the virtual model, and the real-time relative position distance data can be displayed on the augmented reality device.
CN201910905977.0A 2019-09-24 2019-09-24 puncture surgery navigation method based on motion capture and mixed reality technology Pending CN110537980A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910905977.0A CN110537980A (en) 2019-09-24 2019-09-24 puncture surgery navigation method based on motion capture and mixed reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910905977.0A CN110537980A (en) 2019-09-24 2019-09-24 puncture surgery navigation method based on motion capture and mixed reality technology

Publications (1)

Publication Number Publication Date
CN110537980A true CN110537980A (en) 2019-12-06

Family

ID=68714407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910905977.0A Pending CN110537980A (en) 2019-09-24 2019-09-24 puncture surgery navigation method based on motion capture and mixed reality technology

Country Status (1)

Country Link
CN (1) CN110537980A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112083799A (en) * 2020-07-23 2020-12-15 常州锦瑟医疗信息科技有限公司 Augmented reality assisted puncture positioning method
CN112107366A (en) * 2020-07-23 2020-12-22 常州锦瑟医疗信息科技有限公司 Mixed reality ultrasonic navigation system
CN113133814A (en) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Augmented reality-based puncture surgery navigation device and computer-readable storage medium
CN113662594A (en) * 2021-09-10 2021-11-19 上海联影医疗科技股份有限公司 Breast puncture positioning/biopsy method, device, computer device and storage medium
CN113662594B (en) * 2021-09-10 2024-04-23 上海联影医疗科技股份有限公司 Mammary gland puncture positioning/biopsy method, device, computer equipment and storage medium
CN114840110A (en) * 2022-03-17 2022-08-02 杭州未名信科科技有限公司 Puncture navigation interactive assistance method and device based on mixed reality
CN114767270A (en) * 2022-04-26 2022-07-22 广州柏视医疗科技有限公司 Navigation display system for lung operation puncture
CN114767270B (en) * 2022-04-26 2023-05-12 广州柏视医疗科技有限公司 Navigation display system for pulmonary surgery puncture
WO2024021637A1 (en) * 2022-07-26 2024-02-01 湖南卓世创思科技有限公司 Puncture path determination system, method and apparatus for brain lesions, and medium

Similar Documents

Publication Publication Date Title
CN110537980A (en) puncture surgery navigation method based on motion capture and mixed reality technology
US11871913B2 (en) Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
CN109416841B Method for enhancing image fidelity and its application to surgical guidance on wearable glasses
CA2973479C (en) System and method for mapping navigation space to patient space in a medical procedure
US10258413B2 (en) Human organ movement monitoring method, surgical navigation system and computer readable medium
Freysinger et al. Image-guided endoscopic ENT surgery
US11712307B2 (en) System and method for mapping navigation space to patient space in a medical procedure
EP2584990B1 (en) Focused prostate cancer treatment system
US10357317B2 (en) Handheld scanner for rapid registration in a medical navigation system
CN103479430A (en) Image guiding intervention operation navigation system
JP2007508913A (en) Intraoperative targeting system and method
WO2015148529A1 (en) Interactive systems and methods for real-time laparoscopic navigation
CN111887988B (en) Positioning method and device of minimally invasive interventional operation navigation robot
Zhu et al. A neuroendoscopic navigation system based on dual-mode augmented reality for minimally invasive surgical treatment of hypertensive intracerebral hemorrhage
Ferguson et al. Toward practical and accurate touch-based image guidance for robotic partial nephrectomy
Azimi et al. Interactive navigation system in mixed-reality for neurosurgery
CN111631814B (en) Intraoperative blood vessel three-dimensional positioning navigation system and method
Vogt et al. Augmented reality system for MR-guided interventions: Phantom studies and first animal test
Freysinger et al. A full 3D-navigation system in a suitcase
CN113940756B (en) Operation navigation system based on mobile DR image
Meinzer et al. Computer-assisted soft tissue interventions
Chen et al. Development and evaluation of ultrasound-based surgical navigation system for percutaneous renal interventions
CN113940737A (en) Visual aspiration biopsy external member
CN115607275A (en) Image display mode, device, storage medium and electronic equipment
CN116172704A (en) Craniocerebral positioning puncture method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191206
