CN102949240B - Image-guided lung interventional operation system - Google Patents


Info

Publication number
CN102949240B
Authority
CN
China
Prior art keywords: image, module, unit, space, operative region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110246692.4A
Other languages
Chinese (zh)
Other versions
CN102949240A (en)
Inventor
高欣
刘海红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201110246692.4A priority Critical patent/CN102949240B/en
Publication of CN102949240A publication Critical patent/CN102949240A/en
Application granted granted Critical
Publication of CN102949240B publication Critical patent/CN102949240B/en
Current legal status: Expired - Fee Related


Abstract

The invention discloses an image-guided lung interventional operation system comprising a first image acquisition module, a lung region reconstruction module, a lung respiratory motion model construction module, a surgical instrument and operative region image space mapping module, a dynamic lesion spatial positioning and guidance module, and a second image acquisition module. The first image acquisition module is connected to the lung region reconstruction module, which is connected to the lung respiratory motion model construction module; the lung respiratory motion model construction module is connected to the dynamic lesion spatial positioning and guidance module; and the second image acquisition module is connected to the surgical instrument and operative region image space mapping module, which is connected to the dynamic lesion spatial positioning and guidance module. The system achieves real-time spatial positioning of moving lesions, supports early diagnosis of malignant transformation of pulmonary nodules, and keeps the spatial positioning error within 2 mm.

Description

Image-guided lung interventional operation system
Technical field
The present invention relates to a system in the technical field of medical devices, specifically an image-guided lung interventional operation system.
Background art
Percutaneous transthoracic needle biopsy is the preferred method for diagnosing small peripheral lung cancer nodules, and performing the intervention under image guidance greatly improves its accuracy and safety. Existing image-guided percutaneous lung puncture techniques fall into two categories: conventional CT guidance and CT fluoroscopy guidance.
Both methods achieve three-dimensional display to some extent: the interventional physician guides the minimally invasive procedure by observing the actual position of the surgical instrument with reference to the 3D image shown on a computer monitor. Both techniques, however, have serious limitations. Neither conventional CT guidance nor CT fluoroscopy guidance accounts for the effect of respiratory motion on image-guided intervention; small lesions are easily displaced by diaphragmatic respiratory motion, so the lesion region in the image no longer matches the actual lesion position in space, which reduces the first-pass puncture hit rate.
The conventional CT-guided puncture workflow is also cumbersome: the interventional physician must move back and forth between the scanning room and the control room, and once the needle has been inserted into the patient's thorax a further CT scan is needed to confirm whether it has reached the lesion region. This process easily displaces the needle and compromises the extraction of lesion tissue.
CT fluoroscopy guidance achieves real-time three-dimensional display and shortens the needle placement time, and its sensitivity and negative biopsy rate are not significantly different from those of conventional CT guidance; however, it increases the radiation dose to both the physician and the patient and readily damages the CT equipment.
In addition, there is no independent software platform; only the software bundled with the CT scanner can be used, and it provides few of the necessary image processing functions. For nodules smaller than 1 cm in diameter the diagnostic accuracy is 66.7%, compared with 82.8% for lesions of 1-3 cm, so the precision still needs to be improved.
Summary of the invention
To address the above shortcomings of the prior art, the present invention provides an image-guided lung interventional operation system. The system integrates a lung respiratory motion model that accounts for tracheal and vascular motion into a three-dimensional localization technique, performs real-time spatial positioning and guidance of the moving lesion and the surgical instrument, and overcomes the mismatch, in existing CT-guided percutaneous transthoracic biopsy, between the static CT information and the dynamic position of internal organs, in particular of small lesions.
To achieve the above technical purpose and technical effect, the present invention is realised through the following technical solution:
An image-guided lung interventional operation system comprises a preoperative preparation module and an intraoperative implementation module. The preoperative preparation module comprises a first image acquisition module, a lung region reconstruction module and a lung respiratory motion model construction module; the intraoperative implementation module comprises a surgical instrument and operative region image space mapping module, a dynamic lesion spatial positioning and guidance module and a second image acquisition module. The first image acquisition module is connected to the lung region reconstruction module and outputs the acquired 4D sequential image information. The lung region reconstruction module is connected to the lung respiratory motion model construction module and outputs the 4D structural information of the lung surface, trachea and pulmonary vessels. The lung respiratory motion model construction module is connected to the dynamic lesion spatial positioning and guidance module and outputs a lung respiratory motion model that accounts for tracheal and vascular motion. The second image acquisition module is connected to the surgical instrument and operative region image space mapping module and outputs low-resolution 3D image information with marker points. The surgical instrument and operative region image space mapping module is connected to the dynamic lesion spatial positioning and guidance module and outputs the physical-space coordinates of the surgical instrument mapped into the operative region image space. The dynamic lesion spatial positioning and guidance module constructs dynamic images, computes the spatial positions of the lesion and the surgical instrument in the operative region, and provides real-time intraoperative guidance information through a visualization platform.
Preferably, the preoperative preparation module further comprises a surgical path planning module that operates on the reference 3D image in the 4D sequence. The output of the first image acquisition module is also connected to the surgical path planning module and supplies the acquired 4D sequential image information; the surgical path planning module is connected to the dynamic lesion spatial positioning and guidance module and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
Preferably, the intraoperative implementation module further comprises a surgical path planning module that operates on the low-resolution 3D image. The output of the second image acquisition module is connected to the surgical path planning module and supplies the acquired low-resolution 3D image information with marker points; the surgical path planning module is connected to the dynamic lesion spatial positioning and guidance module and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
Further, the intraoperative implementation module also comprises a fluoroscopy guidance module, which is connected to the dynamic lesion spatial positioning and guidance module and outputs 2D fluoroscopic image information.
Further, the intraoperative implementation module also comprises a lesion acquisition module, to which the dynamic lesion spatial positioning and guidance module is connected.
Further, the lung region reconstruction module comprises an image segmentation unit and a reconstruction unit. The image segmentation unit first extracts the chest region from the 4D sequence images, then segments the lung surface, trachea and vessels within the extracted region, and outputs the result to the reconstruction unit; the reconstruction unit performs 3D volume reconstruction of the sequence segmentation results and outputs the result to the lung respiratory motion model construction module.
Further, the lung respiratory motion model construction module comprises a deformable registration unit and a deformation field description unit. The deformable registration unit performs 4D deformable registration of the lung region structures and outputs the motion vector set to the deformation field description unit; the deformation field description unit expresses the motion vector set as a deformation field and transfers the result to the dynamic lesion spatial positioning and guidance module.
Further, the surgical path planning module comprises a human-computer interaction lesion region marking unit and a human-computer interaction entry point planning unit. The lesion region marking unit allows the interventional physician to delineate the lesion position and size in the reference image; the entry point planning unit allows the physician to mark the skin entry point in the reference image and to plan the angle and depth at which the surgical instrument enters the body through the entry point. The outputs of both units, together with the reference image space coordinate system, are transferred to the dynamic lesion spatial positioning and guidance module.
Further, the surgical instrument and operative region image space mapping module comprises a marker point selection unit and a registration unit. The marker point selection unit requires the interventional physician to pick corresponding marker points, one by one, in the physical space of the operative region and in the low-resolution 3D image space containing the markers, and outputs the marker points in both coordinate systems to the registration unit. The registration unit maps these corresponding marker points to obtain the spatial mapping between the physical space of the operative region and the image space, and outputs the operative region coordinates expressed in the low-resolution 3D image space to the dynamic lesion spatial positioning and guidance module.
Preferably, the dynamic lesion spatial positioning and guidance module comprises a registration unit, a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit. The registration unit maps the low-resolution 3D image space coordinate system into the reference image space coordinate system, simultaneously maps the operative region coordinates expressed in the low-resolution 3D image space directly into the reference image space, and outputs the low-resolution 3D image and the operative region coordinates, both in the reference image space, to the dynamic image construction unit, so that all spatial coordinates are unified into the reference image space. The dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit. The three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit. The image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth at which the surgical instrument enters the body through the entry point.
Preferably, the dynamic lesion spatial positioning and guidance module comprises a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit. The dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit. The three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit. The image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth of instrument entry through the entry point. The module further comprises a 2D-3D registration unit, which maps the 2D fluoroscopic image information onto the dynamic images and passes the mapped result to the three-dimensional localization unit; the three-dimensional localization unit then uses the operative region coordinates, the dynamic images and the fluoroscopic information displayed on the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit.
Compared with the prior art, the present invention introduces a respiratory motion model into image-guided interventional surgery, solving the mismatch between the operative region, which moves with spontaneous respiration, and the static guidance image space coordinates, and thereby achieving real-time spatial positioning of a moving lesion. The invention enables early diagnosis of malignant transformation of small pulmonary nodules, with a spatial positioning error within 2 mm.
The above description is only an overview of the technical solution of the present invention. To make the technical means of the invention clearer and implementable according to the description, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings described here provide a further understanding of the invention and form part of the application; the schematic embodiments and their descriptions explain the invention and do not unduly limit it. In the drawings:
Fig. 1 shows the structural schematic of one embodiment of the image-guided lung interventional operation system of the present invention.
Fig. 2 shows the structural schematic of another embodiment of the image-guided lung interventional operation system of the present invention.
Reference numerals in the figures: 1, preoperative preparation module; 101, first image acquisition module; 102, lung region reconstruction module; 103, lung respiratory motion model construction module; 12, surgical path planning module; 2, intraoperative implementation module; 201, surgical instrument and operative region image space mapping module; 202, dynamic lesion spatial positioning and guidance module; 203, lesion acquisition module; 204, fluoroscopy guidance module; 205, second image acquisition module.
Detailed description of the embodiments
The present invention is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Embodiment 1:
As shown in Fig. 1, the image-guided lung interventional operation system of this embodiment comprises a preoperative preparation module 1 and an intraoperative implementation module 2. The preoperative preparation module 1 comprises a first image acquisition module 101, a lung region reconstruction module 102 and a lung respiratory motion model construction module 103; the intraoperative implementation module 2 comprises a surgical instrument and operative region image space mapping module 201, a dynamic lesion spatial positioning and guidance module 202 and a second image acquisition module 205. The first image acquisition module 101 is connected to the lung region reconstruction module 102 and outputs the acquired 4D sequential image information. The lung region reconstruction module 102 is connected to the lung respiratory motion model construction module 103 and outputs the 4D structural information of the lung surface, trachea and pulmonary vessels. The lung respiratory motion model construction module 103 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs a lung respiratory motion model that accounts for tracheal and vascular motion. The second image acquisition module 205 is connected to the surgical instrument and operative region image space mapping module 201 and outputs low-resolution 3D image information with marker points. The surgical instrument and operative region image space mapping module 201 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs the physical-space coordinates of the surgical instrument mapped into the operative region image space. The dynamic lesion spatial positioning and guidance module 202 constructs dynamic images, computes the spatial positions of the lesion and the surgical instrument in the operative region, and provides real-time intraoperative guidance information through a visualization platform.
Further, the preoperative preparation module 1 also comprises a surgical path planning module 12 that operates on the reference 3D image in the 4D sequence. The output of the first image acquisition module 101 is also connected to the surgical path planning module 12 and supplies the acquired 4D sequential image information; the surgical path planning module 12 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
Further, the intraoperative implementation module 2 also comprises a fluoroscopy guidance module 204, which is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs 2D fluoroscopic image information.
Further, the intraoperative implementation module 2 also comprises a lesion acquisition module 203, to which the dynamic lesion spatial positioning and guidance module 202 is connected.
Further, the lung region reconstruction module 102 comprises an image segmentation unit and a reconstruction unit. The image segmentation unit extracts the chest region from the 4D sequence images, then segments the lung surface, trachea and vessels within the extracted region, and outputs the result to the reconstruction unit; the reconstruction unit performs 3D volume reconstruction of the sequence segmentation results and outputs the result to the lung respiratory motion model construction module 103.
4D computed tomography of the patient's chest (lungs), including the lesion region, is performed on a Toshiba Aquilion ONE 320-row, 640-slice CT scanner; the scan range extends from the thoracic inlet to the level of the diaphragm, and the patient is instructed to breathe normally. The scan parameters are as follows: 120 kV tube voltage, 25 effective mAs, 0.5 s rotation time, 0.5-1 mm reconstructed slice thickness, 0.5-1 mm reconstruction interval, total scan duration 10.5 s, covering 3-5 respiratory cycles in all. MPR, MIP and VR post-processing are performed on a 3D workstation. The acquired 4D sequential image data are input to the lung region reconstruction module in DICOM (Digital Imaging and Communications in Medicine) format. The image segmentation unit extracts the chest region from the 4D chest scan images and, on this basis, segments the lung surface, trachea and vascular tissue, labelling each segmented region with a different value. The reconstruction unit performs 3D volume reconstruction of the labelled sequential slice images, and the resulting sequential anatomical structures of the lung region (lung surface, trachea, vessels) are transferred to the lung respiratory motion model construction module 103.
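The kind of processing performed by the image segmentation and reconstruction units can be illustrated with off-the-shelf tools. The sketch below, using SimpleITK, loads one phase of the 4D DICOM series and produces a rough, threshold-based lung mask; the directory layout, HU thresholds and morphological parameters are illustrative assumptions and not the implementation described in the patent.

```python
# Minimal sketch, assuming one DICOM directory per respiratory phase.
import SimpleITK as sitk

def load_phase(dicom_dir: str) -> sitk.Image:
    """Read one 3D volume (one time point of the 4D sequence) from DICOM."""
    reader = sitk.ImageSeriesReader()
    reader.SetFileNames(reader.GetGDCMSeriesFileNames(dicom_dir))
    return reader.Execute()

def segment_lung(volume: sitk.Image) -> sitk.Image:
    """Rough lung mask: air-like voxels, largest two components kept."""
    # Lung parenchyma is roughly -950..-400 HU; this range is an assumption.
    air = sitk.BinaryThreshold(volume, lowerThreshold=-950, upperThreshold=-400,
                               insideValue=1, outsideValue=0)
    air = sitk.BinaryMorphologicalClosing(air, [3, 3, 3])
    labels = sitk.ConnectedComponent(air)
    labels = sitk.RelabelComponent(labels, sortByObjectSize=True)
    # Keep the two largest components (nominally the left and right lung).
    return sitk.BinaryThreshold(labels, lowerThreshold=1, upperThreshold=2,
                                insideValue=1, outsideValue=0)

if __name__ == "__main__":
    phase0 = load_phase("4dct/phase00")          # hypothetical path
    sitk.WriteImage(segment_lung(phase0), "lung_mask_phase00.nii.gz")
```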
Further, the lung respiratory motion model construction module 103 comprises a deformable registration unit and a deformation field description unit. The deformable registration unit performs 4D deformable registration of the lung region structures and outputs the motion vector set to the deformation field description unit; the deformation field description unit expresses the motion vector set as a deformation field and transfers the result to the dynamic lesion spatial positioning and guidance module 202.
The deformable registration unit performs deformable registration of the sequential 3D structures of the lung surface, trachea and vessels. The image data acquired at the first breath-hold is taken as the reference time point; the lung region anatomy at every other time point is deformably registered to the anatomy at the reference time point, yielding motion vector sets of the lung surface, trachea and vessels at the different time points. The motion vector sets at corresponding time points of all cycles are then averaged, giving the motion vector set of the lung surface, trachea and vascular anatomy over one respiratory cycle. The deformation field description unit performs principal component analysis on these motion vector sets, expresses them as a deformation field, and transfers the result to the dynamic lesion spatial positioning and guidance module 202.
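A minimal sketch of this general approach (register each phase to the reference phase, average corresponding phases across cycles, compress the cyclic motion with principal component analysis) is given below. The Demons algorithm, its parameters and the number of components are assumptions for illustration; the patent does not prescribe a specific registration algorithm here.

```python
import numpy as np
import SimpleITK as sitk

def displacement_field(reference: sitk.Image, moving: sitk.Image) -> np.ndarray:
    """Deformably register one phase to the reference phase and return the
    displacement field as an array of shape (z, y, x, 3) in physical units."""
    demons = sitk.DemonsRegistrationFilter()
    demons.SetNumberOfIterations(50)
    demons.SetStandardDeviations(1.5)            # Gaussian smoothing of the field
    field = demons.Execute(sitk.Cast(reference, sitk.sitkFloat32),
                           sitk.Cast(moving, sitk.sitkFloat32))
    return sitk.GetArrayFromImage(field)

def mean_cycle_fields(phases_per_cycle, cycles):
    """Average the displacement fields of corresponding phases over all cycles.
    `cycles` is a list of lists of sitk.Image, one inner list per breathing
    cycle; the first phase of the first cycle serves as the reference."""
    reference = cycles[0][0]
    return [np.mean([displacement_field(reference, cycle[p]) for cycle in cycles],
                    axis=0)
            for p in range(phases_per_cycle)]

def pca_deformation_model(mean_fields, n_components=3):
    """Compact description of the cyclic motion: PCA over the per-phase fields."""
    X = np.stack([f.ravel() for f in mean_fields])       # (phases, voxels * 3)
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components], U[:, :n_components] * S[:n_components]
```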
Further, the surgical path planning module 12 comprises a human-computer interaction lesion region marking unit and a human-computer interaction entry point planning unit. The lesion region marking unit allows the interventional physician to delineate the lesion position and size in the reference image; the entry point planning unit allows the physician to mark the skin entry point in the reference image and to plan the angle and depth at which the surgical instrument enters the body through the entry point. The outputs of both units, together with the reference image space coordinate system, are transferred to the dynamic lesion spatial positioning and guidance module 202.
The surgical path planning module 12 operates on the 3D volume data at the reference time point of the sequential volume data (the reference image). Using the lesion region marking unit, the interventional physician semi-automatically delineates the lesion region in the 3D volume data of the reference time point: browsing the 3D slice data, the physician locates the lesion (small nodule) and marks its centre point with a mouse or stylus, and a boundary curve expands outward from the centre point until it stops at the nodule edge; if the physician considers the boundary inaccurate, the lesion region (position and size) can be re-defined with the mouse or stylus. The physician then browses the 3D slice images according to the lesion position, avoids vessels and liquefied or necrotic tissue, selects a suitable puncture entry point, and marks the entry point on the body surface. The line between this point and the lesion region defines the instrument path, including the angle and depth at which the surgical instrument enters the body through the entry point. The choice of entry point is critical: a suitable choice largely avoids pneumothorax. The above information is transmitted to the dynamic lesion spatial positioning and guidance module 202.
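Once the entry point and the lesion centre are fixed in the reference image space, the insertion depth and angles reported by the path planning module are simple geometry; the sketch below shows one way that arithmetic could be done (the coordinates are made up and the angle convention is an assumption).

```python
import numpy as np

def plan_path(entry_point_mm, lesion_centre_mm):
    """Depth (mm) and direction angles (degrees) of the line from the skin
    entry point to the lesion centre, both given in the reference image space."""
    entry = np.asarray(entry_point_mm, dtype=float)
    target = np.asarray(lesion_centre_mm, dtype=float)
    direction = target - entry
    depth = float(np.linalg.norm(direction))
    unit = direction / depth
    # Angle of the needle axis out of the axial (x-y) plane, and within it.
    craniocaudal_angle = float(np.degrees(np.arcsin(unit[2])))
    in_plane_angle = float(np.degrees(np.arctan2(unit[1], unit[0])))
    return depth, craniocaudal_angle, in_plane_angle

# Example with made-up coordinates (mm) in the reference image space.
depth, cc_angle, ip_angle = plan_path((120.0, 80.0, 40.0), (95.0, 110.0, 55.0))
print(f"depth = {depth:.1f} mm, craniocaudal = {cc_angle:.1f} deg, "
      f"in-plane = {ip_angle:.1f} deg")
```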
Further, the surgical instrument and operative region image space mapping module 201 comprises a marker point selection unit and a registration unit. The marker point selection unit requires the interventional physician to pick corresponding marker points, one by one, in the physical space of the operative region and in the low-resolution 3D image space containing the markers, and outputs the marker points in both coordinate systems to the registration unit. The registration unit maps these corresponding marker points to obtain the spatial mapping between the physical space of the operative region and the image space, and outputs the operative region coordinates expressed in the low-resolution 3D image space to the dynamic lesion spatial positioning and guidance module 202.
Before the operation is carried out, the interventional physician attaches several medical marker points to the surface of the operative region for subsequent registration. With the patient in the most comfortable position possible, a low-dose, low-resolution 3D scan is acquired during breath-hold; this 3D volume serves as the "reference map" for image guidance. The marker point selection unit requires the physician to pick, under the spatial tracking localizer, the marker points attached to the operative region in turn, and then to pick the corresponding marker points, in the same order, in the low-resolution 3D slice images, so that the marker points in the physical-space coordinate system of the operative region correspond one to one with those in the image coordinate system. The registration unit registers the two sets of marker points in the different coordinate systems, matching the actual physical space to the image space used for guidance, and outputs the operative region coordinates expressed in the low-resolution 3D image space to the dynamic lesion spatial positioning and guidance module 202. In this example, the spatial tracking localizer is the Aurora 3D position measurement system produced by Northern Digital Inc. of Canada.
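Mapping the marker points picked with the tracker onto those picked in the low-resolution 3D image is a classic paired-point rigid registration problem. The sketch below is a generic least-squares (SVD-based) solution with made-up marker coordinates; it illustrates the kind of mapping the registration unit computes, not the unit's actual algorithm.

```python
import numpy as np

def rigid_registration(physical_pts, image_pts):
    """Least-squares rigid transform (R, t) mapping physical-space marker
    coordinates onto their image-space counterparts: x_img ~= R @ x_phys + t."""
    P = np.asarray(physical_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def fiducial_registration_error(R, t, physical_pts, image_pts):
    """Root-mean-square distance between mapped markers and image markers."""
    mapped = (R @ np.asarray(physical_pts, float).T).T + t
    return float(np.sqrt(((mapped - np.asarray(image_pts, float)) ** 2)
                         .sum(axis=1).mean()))

# Made-up coordinates (mm): five markers picked with the tracker and the
# corresponding points picked in the low-resolution 3D image.
phys = [[0, 0, 0], [60, 5, 2], [10, 70, -3], [55, 65, 4], [30, 35, 40]]
img = [[102, 51, 33], [161, 58, 36], [108, 121, 29], [152, 118, 38], [131, 87, 72]]
R, t = rigid_registration(phys, img)
print("FRE =", fiducial_registration_error(R, t, phys, img), "mm")
```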
Further, the dynamic lesion spatial positioning and guidance module 202 comprises a registration unit, a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit. The registration unit maps the low-resolution 3D image space coordinate system into the reference image space coordinate system, simultaneously maps the operative region coordinates expressed in the low-resolution 3D image space directly into the reference image space, and outputs the low-resolution 3D image and the operative region coordinates, both in the reference image space, to the dynamic image construction unit, so that all spatial coordinates are unified into the reference image space. The dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit. The three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit. The image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth at which the surgical instrument enters the body through the entry point.
Based on the respiratory frequency at which the sequential images were acquired, the interventional physician trains the patient to breathe at the normal rate. The registration unit registers the low-resolution 3D volume (acquired at breath-hold) with the 3D volume at the reference time point of the sequence (the reference image), unifying all spatial coordinates into the reference image space: the operative region coordinates expressed in the low-resolution 3D image space are mapped directly into the reference image space, and the low-resolution 3D image and the operative region coordinates in the reference image space are output to the dynamic image construction unit. The dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit. Using the coordinate mapping relations, the three-dimensional localization unit computes, in the reference image space and in combination with the dynamic images, the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit. The image visualization unit displays, in 2D and 3D, the operative region organs, including the lesion shape, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth of instrument entry through the entry point.
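One way the dynamic image construction and three-dimensional localization units could combine the static plan with the respiratory model is to interpolate the per-phase displacement field at the planned lesion position. The sketch below assumes the field layout stated in the comments and is purely illustrative, not the patented implementation.

```python
import numpy as np

def sample_displacement(field, index_zyx):
    """Trilinearly interpolate a displacement field of shape (z, y, x, 3) at a
    continuous voxel index (z, y, x). Sketch only: no bounds checking."""
    z, y, x = index_zyx
    z0, y0, x0 = int(np.floor(z)), int(np.floor(y)), int(np.floor(x))
    dz, dy, dx = z - z0, y - y0, x - x0
    disp = np.zeros(3)
    for kz, wz in ((z0, 1 - dz), (z0 + 1, dz)):
        for ky, wy in ((y0, 1 - dy), (y0 + 1, dy)):
            for kx, wx in ((x0, 1 - dx), (x0 + 1, dx)):
                disp += wz * wy * wx * field[kz, ky, kx]
    return disp

def lesion_position_at_phase(lesion_mm, origin_mm, spacing_mm, phase_fields, phase):
    """Expected lesion centre at a given breathing phase: the statically planned
    position plus the displacement interpolated from that phase's field.
    The field is assumed indexed (z, y, x) with components ordered (dx, dy, dz)."""
    idx_xyz = (np.asarray(lesion_mm, float) - np.asarray(origin_mm, float)) \
              / np.asarray(spacing_mm, float)
    disp_xyz = sample_displacement(phase_fields[phase], idx_xyz[::-1])
    return np.asarray(lesion_mm, float) + disp_xyz
```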
This example works as follows. Before the operation the patient is first asked to relax and, while breathing naturally and steadily, undergoes sequential CT; the scan range is the patient's chest, covering the whole lungs, and three consecutive respiratory cycles are scanned. Each respiratory cycle yields on average 2-3 sequential 3D volumes. The acquired 4D image data are input in DICOM format to the lung region reconstruction module 102, which reconstructs the sequential structural information of the lung surface, trachea and vessels of the patient's operative region; using this 4D volume data, the lung respiratory motion model construction module 103 obtains the deformation field of the lung surface, trachea and vessels caused by respiratory motion. At the same time, the 3D data of the first breath-hold time point are chosen from the 4D data as the reference volume and transferred to the surgical path planning module 12, where the intervention-relevant parameters are determined: the position and size of the lesion region, the entry point, the needle angle and depth, and so on. The spatial domain of these 3D data is the reference coordinate system of the image-guided surgery.
When the operation begins, the interventional physician first attaches 5-6 marker points to the surface of the patient's lung region. With the patient lying still on the CT table in a comfortable posture, a low-dose, low-resolution CT scan is acquired during breath-hold, giving a set of low-resolution 3D image volume data. The physician then selects the marker points in turn on the patient's lung surface and in the low-resolution 3D image data, so that the actual physical-space coordinates of the operative region are transformed into the coordinate space of the low-resolution 3D image volume. In the dynamic lesion spatial positioning and guidance module 202, the low-resolution 3D image space is mapped into the reference image coordinate system, so that the actual physical space of the operative region is likewise transformed into the reference image coordinate system. Combined with the lung region deformation field, the static low-resolution 3D image volume becomes a dynamic 3D image volume that moves periodically in time, and the lesion position and entry point imported into this module move periodically with it.
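Chaining the two registrations described above (tracker physical space to low-resolution image space, then low-resolution image space to the reference image space) is a composition of spatial transforms. Assuming, purely for illustration, that both mappings are expressed as 4x4 homogeneous matrices, the composition looks like this:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def compose(physical_to_lowres, lowres_to_reference):
    """Map physical tracker coordinates straight into the reference image space."""
    return lowres_to_reference @ physical_to_lowres

def apply(T, point):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    p = np.append(np.asarray(point, float), 1.0)
    return (T @ p)[:3]

# Made-up transforms: a tracked needle tip mapped through both registrations.
T_phys_to_low = to_homogeneous(np.eye(3), [102.0, 51.0, 33.0])
T_low_to_ref = to_homogeneous(np.eye(3), [-4.0, 2.5, 0.0])
needle_tip_physical = [30.0, 35.0, 40.0]
print(apply(compose(T_phys_to_low, T_low_to_ref), needle_tip_physical))
```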
During the operation, the interventional physician uses the dynamic lesion spatial positioning and guidance module 202 to keep the dynamic lung image consistent with the patient's respiratory motion, and selects the entry point on the operative region with reference to the multiplanar image information and the 3D stereoscopic image on the display. As the needle enters the body, the physician inserts it at the pre-planned angle; the pose of the needle inside the body is displayed in real time in the dynamic lung image of the operative region, and during needle advancement the physician adjusts the angle and depth at any time according to the needle position in the dynamic image and the pre-planned intervention path. When the needle reaches the lesion region, the physician performs the biopsy and extracts lesion tissue for pathological diagnosis. If needed, the physician can enable the fluoroscopy guidance function during the puncture, reflecting the 2D fluoroscopic image information into the 3D lung volume data to compensate for the displacement of the lesion, vessels and trachea caused by respiratory motion, so that the needle position relative to each lung structure can be controlled accurately.
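The fluoroscopy compensation step reconciles the position predicted by the dynamic model with what the 2D projection actually shows. A deliberately crude sketch, assuming an idealised parallel projection along the fluoroscope axis and a purely translational in-plane correction, is given below; real 2D-3D registration must model the C-arm geometry and is considerably more involved.

```python
import numpy as np

def parallel_project(points_3d, view_axis=2):
    """Drop the coordinate along the (assumed) fluoroscope axis to obtain an
    idealised 2D projection of 3D points."""
    keep = [i for i in range(3) if i != view_axis]
    return np.asarray(points_3d, float)[:, keep]

def in_plane_correction(predicted_3d, observed_2d, view_axis=2):
    """Translation that brings the projected prediction onto the observed 2D
    landmark positions; applied back in 3D, the out-of-plane component stays
    unchanged, since it is unobservable in a single projection."""
    predicted_2d = parallel_project(predicted_3d, view_axis)
    shift_2d = np.mean(np.asarray(observed_2d, float) - predicted_2d, axis=0)
    shift_3d = np.zeros(3)
    keep = [i for i in range(3) if i != view_axis]
    shift_3d[keep] = shift_2d
    return shift_3d

# Made-up example: the dynamic model predicts two landmark positions, fluoroscopy
# shows them roughly 3 mm further along y; the correction is added to the model.
pred = np.array([[100.0, 80.0, 40.0], [110.0, 95.0, 42.0]])
obs = np.array([[100.5, 83.0], [110.2, 98.1]])
print("compensation (mm):", in_plane_correction(pred, obs))
```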
Embodiment 2:
As shown in Fig. 2, the image-guided lung interventional operation system of this embodiment comprises a preoperative preparation module 1 and an intraoperative implementation module 2. The preoperative preparation module 1 comprises a first image acquisition module 101, a lung region reconstruction module 102 and a lung respiratory motion model construction module 103; the intraoperative implementation module 2 comprises a surgical instrument and operative region image space mapping module 201, a dynamic lesion spatial positioning and guidance module 202 and a second image acquisition module 205. The first image acquisition module 101 is connected to the lung region reconstruction module 102 and outputs the acquired 4D sequential image information. The lung region reconstruction module 102 is connected to the lung respiratory motion model construction module 103 and outputs the 4D structural information of the lung surface, trachea and pulmonary vessels. The lung respiratory motion model construction module 103 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs a lung respiratory motion model that accounts for tracheal and vascular motion. The second image acquisition module 205 is connected to the surgical instrument and operative region image space mapping module 201 and outputs low-resolution 3D image information with marker points. The surgical instrument and operative region image space mapping module 201 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs the physical-space coordinates of the surgical instrument mapped into the operative region image space. The dynamic lesion spatial positioning and guidance module 202 constructs dynamic images, computes the spatial positions of the lesion and the surgical instrument in the operative region, and provides real-time intraoperative guidance information through a visualization platform.
Further, the intraoperative implementation module 2 also comprises a surgical path planning module 12 that operates on the low-resolution 3D image. The output of the second image acquisition module 205 is connected to the surgical path planning module 12 and supplies the acquired low-resolution 3D image information with marker points; the surgical path planning module 12 is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
Further, the intraoperative implementation module 2 also comprises a fluoroscopy guidance module 204, which is connected to the dynamic lesion spatial positioning and guidance module 202 and outputs 2D fluoroscopic image information.
Further, the intraoperative implementation module 2 also comprises a lesion acquisition module 203, to which the dynamic lesion spatial positioning and guidance module 202 is connected.
Further, the lung region reconstruction module 102 comprises an image segmentation unit and a reconstruction unit. The image segmentation unit first extracts the chest region from the 4D sequence images, then segments the lung surface, trachea and vessels within the extracted region, and outputs the result to the reconstruction unit; the reconstruction unit performs 3D volume reconstruction of the sequence segmentation results and outputs the result to the lung respiratory motion model construction module 103.
Further, the lung respiratory motion model construction module 103 comprises a deformable registration unit and a deformation field description unit. The deformable registration unit performs 4D deformable registration of the lung region structures and outputs the motion vector set to the deformation field description unit; the deformation field description unit expresses the motion vector set as a deformation field and transfers the result to the dynamic lesion spatial positioning and guidance module 202.
Further, the surgical path planning module 12 comprises a human-computer interaction lesion region marking unit and a human-computer interaction entry point planning unit. The lesion region marking unit allows the interventional physician to delineate the lesion position and size in the reference image; the entry point planning unit allows the physician to mark the skin entry point in the reference image and to plan the angle and depth at which the surgical instrument enters the body through the entry point. The outputs of both units, together with the reference image space coordinate system, are transferred to the dynamic lesion spatial positioning and guidance module 202.
Further, the surgical instrument and operative region image space mapping module 201 comprises a marker point selection unit and a registration unit. The marker point selection unit requires the interventional physician to pick corresponding marker points, one by one, in the physical space of the operative region and in the low-resolution 3D image space containing the markers, and outputs the marker points in both coordinate systems to the registration unit. The registration unit maps these corresponding marker points to obtain the spatial mapping between the physical space of the operative region and the image space, and outputs the operative region coordinates expressed in the low-resolution 3D image space to the dynamic lesion spatial positioning and guidance module 202.
Further, the dynamic lesion spatial positioning and guidance module 202 comprises a registration unit, a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit. The registration unit maps the low-resolution 3D image space coordinate system into the reference image space coordinate system, simultaneously maps the operative region coordinates expressed in the low-resolution 3D image space directly into the reference image space, and outputs the low-resolution 3D image and the operative region coordinates, both in the reference image space, to the dynamic image construction unit, so that all spatial coordinates are unified into the reference image space. The dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit. The three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit. The image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth of instrument entry through the entry point.
Further, the dynamic lesion spatial positioning and guidance module 202 also comprises a 2D-3D registration unit, which maps the 2D fluoroscopic image information onto the dynamic images and passes the mapped result to the three-dimensional localization unit; the three-dimensional localization unit then uses the operative region coordinates, the dynamic images and the fluoroscopic image information displayed on the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit.
The foregoing describes only the preferred embodiments of the present invention and does not limit it; various modifications and variations are possible for those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (11)

1. An image-guided lung interventional operation system, characterized in that it comprises a preoperative preparation module (1) and an intraoperative implementation module (2), the preoperative preparation module (1) comprising a first image acquisition module (101), a lung region reconstruction module (102) and a lung respiratory motion model construction module (103), and the intraoperative implementation module (2) comprising a surgical instrument and operative region image space mapping module (201), a dynamic lesion spatial positioning and guidance module (202) and a second image acquisition module (205); the first image acquisition module (101) is connected to the lung region reconstruction module (102) and outputs the acquired 4D sequential image information; the lung region reconstruction module (102) is connected to the lung respiratory motion model construction module (103) and outputs the 4D structural information of the lung surface, trachea and pulmonary vessels; the lung respiratory motion model construction module (103) is connected to the dynamic lesion spatial positioning and guidance module (202) and outputs a lung respiratory motion model that accounts for tracheal and vascular motion; the second image acquisition module (205) is connected to the surgical instrument and operative region image space mapping module (201) and outputs low-resolution 3D image information with marker points; the surgical instrument and operative region image space mapping module (201) is connected to the dynamic lesion spatial positioning and guidance module (202) and outputs the physical-space coordinates of the surgical instrument mapped into the operative region image space; the dynamic lesion spatial positioning and guidance module (202) constructs dynamic images, computes the spatial positions of the lesion and the surgical instrument in the operative region, and obtains real-time intraoperative guidance information through a visualization platform; the lung respiratory motion model construction module (103) comprises a deformable registration unit and a deformation field description unit, wherein the deformable registration unit performs 4D deformable registration of the lung region structures and outputs the motion vector set to the deformation field description unit, and the deformation field description unit expresses the motion vector set as a deformation field and transfers the result to the dynamic lesion spatial positioning and guidance module (202).
2. The image-guided lung interventional operation system according to claim 1, characterized in that the preoperative preparation module (1) further comprises a surgical path planning module (12) that operates on the reference 3D image in the 4D sequence; the output of the first image acquisition module (101) is also connected to the surgical path planning module (12) and supplies the acquired 4D sequential image information; the surgical path planning module (12) is connected to the dynamic lesion spatial positioning and guidance module (202) and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
3. The image-guided lung interventional operation system according to claim 1, characterized in that the intraoperative implementation module (2) further comprises a surgical path planning module (12) that operates on the low-resolution 3D image; the output of the second image acquisition module (205) is connected to the surgical path planning module (12) and supplies the acquired low-resolution 3D image information with marker points; the surgical path planning module (12) is connected to the dynamic lesion spatial positioning and guidance module (202) and outputs the lesion position and size, the skin entry point, the angle and depth at which the surgical instrument enters the operative region, and the reference image space coordinate system.
4. The image-guided lung interventional operation system according to claim 2 or 3, characterized in that the intraoperative implementation module (2) further comprises a fluoroscopy guidance module (204), which is connected to the dynamic lesion spatial positioning and guidance module (202) and outputs 2D fluoroscopic image information.
5. The image-guided lung interventional operation system according to claim 4, characterized in that the intraoperative implementation module (2) further comprises a lesion acquisition module (203), to which the dynamic lesion spatial positioning and guidance module (202) is connected.
6. The image-guided lung interventional operation system according to claim 1, 2 or 3, characterized in that the lung region reconstruction module (102) comprises an image segmentation unit and a reconstruction unit, wherein the image segmentation unit first extracts the chest region from the 4D sequence images, then segments the lung surface, trachea and vessels within the extracted region and outputs the result to the reconstruction unit, and the reconstruction unit performs 3D volume reconstruction of the sequence segmentation results and outputs the result to the lung respiratory motion model construction module (103).
7. The image-guided lung interventional operation system according to claim 2 or 3, characterized in that the surgical path planning module (12) comprises a human-computer interaction lesion region marking unit and a human-computer interaction entry point planning unit, wherein the lesion region marking unit allows the interventional physician to delineate the lesion position and size in the reference image, the entry point planning unit allows the physician to mark the skin entry point in the reference image and to plan the angle and depth at which the surgical instrument enters the body through the entry point, and the outputs of both units, together with the reference image space coordinate system, are transferred to the dynamic lesion spatial positioning and guidance module (202).
8. The image-guided lung interventional operation system according to claim 1, 2 or 3, characterized in that the surgical instrument and operative region image space mapping module (201) comprises a marker point selection unit and a registration unit, wherein the marker point selection unit requires the interventional physician to pick corresponding marker points, one by one, in the physical space of the operative region and in the low-resolution 3D image space containing the markers, and outputs the marker points in both coordinate systems to the registration unit; the registration unit maps these corresponding marker points to obtain the spatial mapping between the physical space of the operative region and the image space, and outputs the operative region coordinates expressed in the low-resolution 3D image space to the dynamic lesion spatial positioning and guidance module (202).
9. The image-guided lung interventional operation system according to claim 2, characterized in that the dynamic lesion spatial positioning and guidance module (202) comprises a registration unit, a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit; the registration unit maps the low-resolution 3D image space coordinate system into the reference image space coordinate system, simultaneously maps the operative region coordinates expressed in the low-resolution 3D image space directly into the reference image space, and outputs the low-resolution 3D image and the operative region coordinates, both in the reference image space, to the dynamic image construction unit, so that all spatial coordinates are unified into the reference image space; the dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit; the three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit; the image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth at which the surgical instrument enters the body through the entry point.
10. The image-guided lung interventional operation system according to claim 3, characterized in that the dynamic lesion spatial positioning and guidance module (202) comprises a dynamic image construction unit, a three-dimensional localization unit and an image visualization unit; the dynamic image construction unit takes the low-resolution 3D image in the reference image space as the basis and, combined with the lung deformation field, builds a dynamic image data set that simulates spontaneous respiratory motion, and outputs the dynamic images to the three-dimensional localization unit; the three-dimensional localization unit uses the operative region coordinates together with the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit; the image visualization unit provides multiplanar reconstruction and volume rendering and displays, in 2D and 3D, the operative region organs, the relative position of the surgical instrument within the operative region, the planned lesion position and size, and the planned angle and depth of instrument entry through the entry point.
11. The image-guided lung interventional operation system according to claim 10, characterized in that the dynamic lesion spatial positioning and guidance module (202) further comprises a 2D-3D registration unit, which maps the 2D fluoroscopic image information onto the dynamic images and passes the mapped result to the three-dimensional localization unit; the three-dimensional localization unit uses the operative region coordinates, the dynamic images and the fluoroscopic image information displayed on the dynamic images to compute the spatial positions of the operative region organs and the surgical instrument, obtains real-time intraoperative guidance information, and passes it to the image visualization unit.
CN201110246692.4A 2011-08-26 2011-08-26 Image-guided lung interventional operation system Expired - Fee Related CN102949240B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110246692.4A CN102949240B (en) 2011-08-26 2011-08-26 Image-guided lung interventional operation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110246692.4A CN102949240B (en) 2011-08-26 2011-08-26 Image-guided lung interventional operation system

Publications (2)

Publication Number Publication Date
CN102949240A CN102949240A (en) 2013-03-06
CN102949240B true CN102949240B (en) 2014-11-26

Family

ID=47759011

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110246692.4A Expired - Fee Related CN102949240B (en) 2011-08-26 2011-08-26 Image-guided lung interventional operation system

Country Status (1)

Country Link
CN (1) CN102949240B (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104302241B (en) 2012-05-14 2018-10-23 直观外科手术操作公司 The registration arrangement and method of the Medical Devices of search space for using reduction
US10039473B2 (en) 2012-05-14 2018-08-07 Intuitive Surgical Operations, Inc. Systems and methods for navigation based on ordered sensor records
CN106456250B (en) 2013-08-13 2019-10-08 波士顿科学国际有限公司 Dissect the computer visualization of item
CN104434313B (en) * 2013-09-23 2019-03-01 中国科学院深圳先进技术研究院 A kind of abdominal surgery navigation methods and systems
EP3054862B1 (en) * 2013-10-07 2020-02-12 Technion Research & Development Foundation Ltd. Gripper for robotic image guided needle insertion
US20170000571A1 (en) * 2013-12-12 2017-01-05 Koninklijke Philips N.V. Method and system for respiratory monitoring during ct-guided interventional procedures
EP3114648B1 (en) 2014-03-04 2019-06-05 Xact Robotics Ltd. Dynamic planning method for needle insertion
CN104055520B (en) 2014-06-11 2016-02-24 清华大学 Human organ motion monitoring method and operation guiding system
CN104224126A (en) * 2014-09-17 2014-12-24 中国人民解放军空军总医院 Device and system for diagnosing skin symptoms
CN105559887B (en) * 2015-12-11 2018-01-30 哈尔滨工业大学 The surgical cut training system and method based on force feedback for operating robot
CN105796161A (en) * 2016-03-02 2016-07-27 赛诺威盛科技(北京)有限公司 Method for conducting puncture navigation in CT interventional therapy and puncture navigation device
CN106137394B (en) * 2016-06-15 2019-09-24 苏州铸正机器人有限公司 A method of obtaining pedicle of vertebral arch standard axle bitmap
CN106109016A (en) * 2016-08-17 2016-11-16 北京柏惠维康医疗机器人科技有限公司 Abdominal-cavity minimal-invasion surgery system and wherein play the determination method in pin moment
CN106236258B (en) * 2016-08-17 2019-03-12 北京柏惠维康科技有限公司 The method and device for planning of abdominal-cavity minimal-invasion surgery puncture path
CN114795479A (en) * 2016-10-28 2022-07-29 奥尔索夫特Ulc公司 Robotic cutting workflow
CN106529188B (en) * 2016-11-25 2019-04-19 苏州国科康成医疗科技有限公司 Image processing method applied to surgical navigational
CN106725852A (en) * 2016-12-02 2017-05-31 上海精劢医疗科技有限公司 The operation guiding system of lung puncture
WO2018127522A1 (en) 2017-01-03 2018-07-12 Koninklijke Philips N.V. Medical navigation system using shape-sensing device and method of operation thereof
CN107296645B (en) * 2017-08-03 2020-04-14 东北大学 Optimal path planning method for lung puncture operation and lung puncture operation navigation system
CN108175502B (en) * 2017-11-29 2021-08-17 苏州朗开医疗技术有限公司 Bronchoscope electromagnetic navigation system
CN110090076A (en) * 2018-01-30 2019-08-06 埃达技术股份有限公司 For to enhance with mixed reality the method and system through deflation lung shape for estimating to assist thoracic operation for video
EA202091648A1 (en) * 2018-03-09 2020-12-09 Парамевиа Пте. Лтд. DIAGNOSTIC SUPPORT PROGRAM
CN110403698B (en) * 2018-04-28 2020-10-30 北京柏惠维康科技有限公司 Instrument intervention device and system
CN108652720A (en) * 2018-05-15 2018-10-16 吴可知 Make the design method of the puncture guide plate of trigeminal neuralgia radio-frequency ablation procedure
CN109410170B (en) * 2018-09-14 2022-09-02 东软医疗系统股份有限公司 Image data processing method, device and equipment
CN109360219A (en) * 2018-10-23 2019-02-19 东北大学 A kind of augmented reality auxiliary operation method and system
WO2020107166A1 (en) * 2018-11-26 2020-06-04 苏州朗开医疗技术有限公司 Lung biopsy device and system
CN109793558A (en) * 2018-12-19 2019-05-24 江苏集萃智能制造技术研究所有限公司 Puncturing operation space mapping method based on binocular visual positioning
CN110613519B (en) * 2019-09-20 2020-09-15 真健康(北京)医疗科技有限公司 Dynamic registration positioning device and method
CN110731821B (en) * 2019-09-30 2021-06-01 艾瑞迈迪医疗科技(北京)有限公司 Method and guide bracket for minimally invasive tumor ablation based on CT/MRI
CN111067622B (en) * 2019-12-09 2023-04-28 天津大学 Respiratory motion compensation method for pulmonary percutaneous puncture
CN110840534B (en) * 2019-12-19 2022-05-17 上海钛米机器人科技有限公司 Puncture speed planning method and device, puncture equipment and computer storage medium
CN110974419B (en) * 2019-12-24 2021-07-06 武汉大学 Guide wire navigation method and system for portal stenosis in endoscopic biliary stent implantation
CN111513849B (en) * 2020-04-30 2022-04-19 京东方科技集团股份有限公司 Surgical system for puncture, control method and control device
CN112330603B (en) * 2020-10-19 2023-04-18 浙江省肿瘤医院 System and method for estimating motion of target in tissue based on soft tissue surface deformation
CN113133828B (en) * 2021-04-01 2023-12-01 上海复拓知达医疗科技有限公司 Interactive registration system, method, electronic device and readable storage medium for surgical navigation
CN113100935A (en) * 2021-04-13 2021-07-13 上海大学 Preoperative puncture path planning method and training system for lung puncture operation
CN113229936A (en) * 2021-05-06 2021-08-10 卫飞鹏 Method and system for improving liver intervention target positioning accuracy
CN114073581B (en) * 2021-06-29 2022-07-12 成都科莱弗生命科技有限公司 Bronchus electromagnetic navigation system
CN113893033B (en) * 2021-07-01 2023-05-12 中国科学院苏州生物医学工程技术研究所 Pulmonary percutaneous puncture navigation method and system
CN113425411B (en) * 2021-08-04 2022-05-10 成都科莱弗生命科技有限公司 Device of pathological change location navigation
CN113855239B (en) * 2021-09-24 2023-10-20 深圳高性能医疗器械国家研究院有限公司 Guide wire navigation system and method in vascular intervention operation
WO2023050307A1 (en) * 2021-09-30 2023-04-06 中国科学院深圳先进技术研究院 Ct-compatible lung biopsy system and method
CN114931435B (en) * 2022-06-02 2023-02-17 上海市胸科医院 Three-dimensional model processing method and device and electronic equipment
CN117274506B (en) * 2023-11-20 2024-02-02 华中科技大学同济医学院附属协和医院 Three-dimensional reconstruction method and system for interventional target scene under catheter

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7697972B2 (en) * 2002-11-19 2010-04-13 Medtronic Navigation, Inc. Navigation system for cardiac therapies

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1907233A (en) * 2005-08-05 2007-02-07 西门子公司 Device and method for automated planning of an access path for a percutaneous, minimally invasive intervention
EP2346398A1 (en) * 2008-10-23 2011-07-27 Koninklijke Philips Electronics N.V. Cardiac- and/or respiratory-gated image acquisition system and method for virtual anatomy enriched real-time 2d imaging in interventional radiofrequency ablation or pacemaker placement procedures
CN101474075A (en) * 2009-01-15 2009-07-08 复旦大学附属中山医院 Navigation system of minimal invasive surgery
CN101799935A (en) * 2009-12-31 2010-08-11 华中科技大学 Dynamic three-dimensional reconstruction method of single-arm X-ray angiogram maps
CN101862205A (en) * 2010-05-25 2010-10-20 中国人民解放军第四军医大学 Intraoperative tissue tracking method combined with preoperative image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhou Fugen et al. Application of the Demons algorithm in four-dimensional CT image registration. CT Theory and Applications, 2009, Vol. 18, No. 1, pp. 69-75. *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109199586A (en) * 2018-11-09 2019-01-15 山东大学 A kind of laser bone-cutting operation robot system and its path planning method

Also Published As

Publication number Publication date
CN102949240A (en) 2013-03-06

Similar Documents

Publication Publication Date Title
CN102949240B (en) Image-guided lung interventional operation system
US11547377B2 (en) System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US20210137351A1 (en) Apparatus and Method for Airway Registration and Navigation
AU2019203998B2 (en) System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11712213B2 (en) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
EP3097885B1 (en) Method and apparatus for registering a physical space to image space
CA2617313C (en) A method and a system for registering a 3d pre-acquired image coordinates system with a medical positioning system coordinate system and with a 2d image coordinate system
AU2018290995A1 (en) System and method for identifying, marking and navigating to a target using real-time two-dimensional fluoroscopic data
CN104055520A (en) Human organ motion monitoring method and human body navigation system
EP1727471A1 (en) System for guiding a medical instrument in a patient body
CN110123449A (en) The system and method for carrying out partial 3 d volume reconstruction using standard fluorescence mirror
CN111839727A (en) Prostate particle implantation path visualization method and system based on augmented reality
US11950951B2 (en) Systems and methods for C-arm fluoroscope camera pose refinement with secondary movement compensation
WO2019075074A1 (en) System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
WO2022165112A1 (en) Systems and methods for c-arm fluoroscope camera pose refinement with secondary movement compensation
US11864935B2 (en) Systems and methods for pose estimation of a fluoroscopic imaging device and for three-dimensional imaging of body structures
Li et al. The technology of image navigation for skull surgery
WO2024079639A1 (en) Systems and methods for confirming position or orientation of medical device relative to target
CN117179893A (en) Mammary gland puncture positioning path planning system

Legal Events

Date Code Title Description
DD01 Delivery of document by public notice

Addressee: Liu Haihong

Document name: Notification of Passing Preliminary Examination of the Application for Invention

C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141126

Termination date: 20180826