WO2022176874A1 - Medical image processing device, medical image processing method, and program - Google Patents
- Publication number
- WO2022176874A1 (PCT/JP2022/006046; JP2022006046W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- medical image
- distance information
- region
- resection
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
- A61B5/0037—Performing a preliminary scan, e.g. a prescan for identifying a region of interest
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/055—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
- A61B5/4222—Evaluating particular parts, e.g. particular organs
- A61B5/4255—Intestines, colon or appendix
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/004—Annotating, labelling
Definitions
- the present disclosure relates to a medical image processing device, a medical image processing method, and a program.
- Rectal cancer originates from the lumen of the intestinal tract and invades tissues outside the intestinal tract as it progresses.
- total mesorectal excision is performed to remove the entire mesorectum, the fatty tissue surrounding the rectum.
- Total mesorectal excision is abbreviated as TME, from its English name.
- Non-Patent Document 1 states that in function-preserving rectal cancer surgery, the dissection layer is selected according to the progression of the cancer so as to maintain curability. The document also states that nerve damage increases or decreases depending on the level of the dissection layer.
- Patent Document 1 describes an endoscopic image diagnosis support device that superimposes and displays the distance distribution from the bronchial wall to the region of interest outside the bronchus in a volume rendering image.
- the device described in the document uses three-dimensional images to generate a first projection image and a second projection image in which a region of interest is projected onto the bronchus inner wall. The first projection image and the second projection image are highlighted according to the projection distance.
- Patent Document 2 describes a blood flow visualization device that automatically analyzes vascular lesions using a computer.
- the device described in the document obtains a medical image, determines a vascular lesion using a vascular shape constructed from the medical image, and calculates geometric features of the vascular lesion.
- Patent Document 3 describes a medical image processing device that measures the thickness of a fat region existing around a coronary artery.
- the apparatus described in the same document measures the distribution and thickness of fat regions formed around coronary arteries based on volume data, and generates two-dimensional fat thickness distribution data.
- Patent Document 4 describes an image display technique used for surgical planning of breast cancer resection surgery.
- the image processing device described in the document generates a VR image or the like using reconstructed data, specifies the tumor range, calculates the distance on the body surface from a reference point to the tumor range, and displays the distance on the VR image.
- Patent Document 5 describes a medical image processing apparatus that divides an organ of a subject into a plurality of sections, corrects the edges and boundaries of the sections, and uses the division results to calculate the resection target area of the organ.
- Patent Documents 1 to 5, however, do not describe a display for evaluating the three-dimensional distance between the separation surface for separating the tissue to be excised and the tissue to be excised, or the three-dimensional distance between the separation surface and the tissue to be preserved.
- the present invention has been made in view of such circumstances, and an object thereof is to provide a medical image processing apparatus, a medical image processing method, and a program capable of supporting determination of a separation plane based on the three-dimensional distances between the separation plane and the tissue to be excised and the tissue to be preserved.
- a medical image processing apparatus according to the present disclosure includes a processor and a storage device that stores a program to be executed by the processor. By executing instructions of the program, the processor acquires a three-dimensional medical image, sets a separation plane for separating a first region specified in the medical image, measures a first distance that is the nearest distance between an arbitrary position on the surface of the first region and the separation plane, measures a second distance that is the nearest distance between an arbitrary position on the surface of a second region specified in the medical image and the separation plane, and, when displaying the medical image on a display device, displays at least one of first distance information representing the first distance and second distance information representing the second distance.
- At least one of the three-dimensional distance between the first region and the separation plane and the three-dimensional distance between the second region and the separation plane can be grasped at a glance, which can support determination of the separation plane.
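as a concrete illustration, the first and second distances described above can be sketched as nearest point-to-plane distances, assuming the separation plane is approximated by a point and a normal vector and each region surface by a list of coordinates (all names and values here are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def min_distance_to_plane(points, plane_point, plane_normal):
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    # signed distances of each surface point to the plane; nearest = min |d|
    d = (np.asarray(points, float) - plane_point) @ n
    return np.abs(d).min()

# separation plane z = 3; first-region (resection-side) surface points at z >= 4,
# second-region (preservation-side) surface points at z <= 1
plane_pt, plane_n = np.array([0.0, 0.0, 3.0]), np.array([0.0, 0.0, 1.0])
first_surface  = [(0, 0, 4), (1, 2, 5)]
second_surface = [(0, 0, 1), (3, 1, 0)]
print(min_distance_to_plane(first_surface, plane_pt, plane_n))   # 1.0
print(min_distance_to_plane(second_surface, plane_pt, plane_n))  # 2.0
```

in practice the separation surface may be curved, in which case the nearest distance would be taken over the vertices of the surface mesh rather than against a single plane equation.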
- the first region can specify the region containing the tissue to be separated.
- a second region may identify a region containing tissue to be preserved.
- a third region existing between the first region and the second region can be applied as the region in which the separation plane is specified.
- the processor, when causing the display device to display a cross-sectional image corresponding to an arbitrary cross section of the medical image, displays the first distance information in the outer edge region of the separation plane on the first-region side.
- obstruction of the visibility of the separation surface can be suppressed. Also, the side to which the first distance is applied with respect to the separation plane can be grasped at a glance.
- the edge region may be a region in contact with the separation surface or a region not in contact with the separation surface.
- the processor, when causing the display device to display a cross-sectional image corresponding to an arbitrary cross section of the medical image, displays the second distance information in the outer edge region of the separation plane on the second-region side.
- obstruction of the visibility of the separation surface can be suppressed. Also, the side to which the second distance is applied with respect to the separation plane can be grasped at a glance.
- the processor applies a color map representing distance using colors to display the first distance information and the second distance information.
- the first distance and the second distance can be grasped at a glance based on the information represented by the colors of the color map.
- color maps include a mode in which differences in color are applied to represent differences in distance, and a mode in which differences in density are applied to represent differences in distance.
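a minimal sketch of the two color-map modes mentioned above (difference in color versus difference in density), assuming distances in millimeters and a hypothetical saturation distance `d_max` (both the mapping and the values are illustrative, not the patent's color map):

```python
import numpy as np

def distance_to_color(d, d_max=10.0):
    """Color-difference mode: red (0 mm, near the plane) -> green (d_max mm, safe)."""
    t = np.clip(np.asarray(d, float) / d_max, 0.0, 1.0)
    return np.stack([1.0 - t, t, np.zeros_like(t)], axis=-1)  # RGB in [0, 1]

def distance_to_density(d, d_max=10.0):
    """Density-difference mode: opaque near the plane, transparent far away."""
    return 1.0 - np.clip(np.asarray(d, float) / d_max, 0.0, 1.0)

d = np.array([0.0, 5.0, 10.0])
print(distance_to_color(d))    # [[1, 0, 0], [0.5, 0.5, 0], [0, 1, 0]]
print(distance_to_density(d))  # [1.0, 0.5, 0.0]
```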
- the processor causes the display device to display a volume rendering image corresponding to the medical image, and displays at least one of the first distance information and the second distance information superimposed on the volume rendering image.
- At least one of the three-dimensional distance between the first region and the separation plane and the three-dimensional distance between the second region and the separation plane can be grasped at a glance, which can support determination of the separation plane.
- the processor changes the display aspect of the first distance information and the display aspect of the second distance information according to the user's viewpoint in the volume rendering image.
- the first distance and the second distance seen from the user's viewpoint can be grasped at a glance.
- the processor superimposes the first distance information on the first area in the volume rendering image when the first area exists on the user's viewpoint side of the separation plane.
- the first distance information can be displayed at a position visible from the user's viewpoint for the first area visible from the user's viewpoint.
- the processor superimposes and displays the second distance information on the separation plane when the second region exists on the side of the separation plane opposite to the user's viewpoint in the volume rendering image.
- the second distance information can be displayed at a position visible from the user's viewpoint for the second area invisible from the user's viewpoint.
- the processor superimposes and displays the first distance information on the separation plane when the first region exists on the side of the separation plane opposite to the user's viewpoint in the volume rendering image.
- the first distance information can be displayed at a position visible from the user's viewpoint for the first area invisible from the user's viewpoint.
- the processor superimposes the second distance information on the second area in the volume rendering image when the second area exists on the user's viewpoint side of the separation plane.
- the second distance information can be displayed at a position visible from the user's viewpoint for the second area visible from the user's viewpoint.
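the viewpoint-dependent choice of overlay target described in the preceding paragraphs can be sketched as a side test against the separation plane, assuming a planar separation surface and representing each region by a single representative point (a simplification of the patent's per-position display; all names are illustrative):

```python
import numpy as np

def overlay_target(region_point, viewpoint, plane_point, plane_normal):
    """Return where to draw the region's distance information."""
    n = np.asarray(plane_normal, float)
    side_region = np.sign((np.asarray(region_point, float) - plane_point) @ n)
    side_viewer = np.sign((np.asarray(viewpoint, float) - plane_point) @ n)
    # same side as the viewer: draw on the region itself (it is visible);
    # opposite side: the region is hidden, so draw on the separation plane instead
    return "region" if side_region == side_viewer else "plane"

plane_pt, plane_n = np.array([0, 0, 3.0]), np.array([0, 0, 1.0])
viewpoint = np.array([0, 0, 10.0])
print(overlay_target([0, 0, 5.0], viewpoint, plane_pt, plane_n))  # region
print(overlay_target([0, 0, 1.0], viewpoint, plane_pt, plane_n))  # plane
```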
- in the medical image processing method according to the present disclosure, a computer acquires a three-dimensional medical image, sets a separation plane for separating a first region specified in the medical image, measures a first distance that is the nearest distance between an arbitrary position on the surface of the first region and the separation plane, measures a second distance that is the nearest distance between an arbitrary position on the surface of a second region specified in the medical image and the separation plane, and displays at least one of first distance information representing the first distance and second distance information representing the second distance when displaying the medical image on a display device.
- the program according to the present disclosure causes a computer to realize: a function of acquiring a three-dimensional medical image; a function of setting a separation plane for separating a first region specified in the medical image; a function of measuring a first distance that is the nearest distance between an arbitrary position on the surface of the first region and the separation plane; a function of measuring a second distance that is the nearest distance between an arbitrary position on the surface of a second region specified in the medical image and the separation plane; and a function of displaying at least one of first distance information representing the first distance and second distance information representing the second distance when displaying the medical image on a display device.
- At least one of the three-dimensional distance between the first region and the separation surface and the three-dimensional distance between the second region and the separation surface can be grasped at a glance, and the determination of the separation surface can be supported.
- FIG. 1 is a diagram showing an example of measuring CRM from an MRI image of a rectal cancer patient.
- FIG. 2 illustrates a stereotaxic anterior resection.
- FIG. 3 is an image diagram showing the orientation of FIG. 2.
- FIG. 4 is a diagram showing a resection surface.
- FIG. 5 is a schematic diagram showing an arbitrary section of an MRI image.
- FIG. 6 is an image diagram showing the orientation of FIG. 5.
- FIG. 7 is a schematic diagram showing a first example of resection lines.
- FIG. 8 is a schematic diagram showing a second example of resection lines.
- FIG. 9 is a schematic diagram showing a third example of resection lines.
- FIG. 10 is a functional block diagram of the medical image processing apparatus according to the embodiment.
- FIG. 11 is a block diagram showing a configuration example of the medical image processing apparatus according to the embodiment.
- FIG. 12 is a flow chart showing the procedure of the medical image processing method according to the embodiment.
- FIG. 13 is an image diagram showing an example of displaying three-dimensional distance information superimposed on a two-dimensional cross section.
- FIG. 14 is an image diagram showing another example of displaying three-dimensional distance information superimposed on a two-dimensional cross section.
- FIG. 15 is a schematic diagram showing an example of a display screen for first distance measurement in the distance measurement process.
- FIG. 16 is a schematic diagram showing an example of a display screen for second distance measurement in the distance measurement process.
- FIG. 17 is a schematic diagram showing a display example of the first color map.
- FIG. 18 is a schematic diagram showing a display example of the second color map.
- FIG. 19 is a schematic diagram showing a display example of a color map according to the modification.
- FIG. 20 is a schematic diagram showing a display example of three-dimensional distance information in a volume rendering image.
- FIG. 21 is a schematic diagram of FIG. 20 viewed from the side of the tissue to be preserved.
- FIG. 22 is a schematic diagram of FIG. 20 viewed from the side of the tissue to be excised.
- FIG. 23 is a schematic diagram showing a display example of three-dimensional distance information applied to each tissue shown in FIG.
- FIG. 24 is a block diagram showing a configuration example of a medical information system including the medical image processing apparatus according to the embodiment.
- FIG. 1 is a diagram showing an example of measuring CRM from an MRI image of a rectal cancer patient.
- cancer TU, muscularis propria MP, and mesorectum ME are distinguished by different filling patterns.
- MRI image actually acquired is a three-dimensional image.
- MRI is an abbreviation for Magnetic Resonance Imaging.
- two-dimensional slice images are captured in succession to obtain three-dimensional data showing the three-dimensional morphology of the object.
- the term three-dimensional image can include the concept of a collection of two-dimensional slice images captured in succession.
- image may also include the meaning of image data.
- CT is an abbreviation for Computed Tomography.
- the arrowed line shown in FIG. 1 represents the closest part, where the distance from the cancer TU to the mesorectum ME is smallest.
- the closest distance which is the distance of the closest part, is one of the important parameters when determining the degree of cancer progression.
- the CRM (circumferential resection margin) is positive if the closest distance is 1 millimeter or less.
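under the assumption that the cancer and the mesorectal surface are given as binary voxel masks with known voxel spacing, the closest-distance measurement underlying this CRM evaluation can be sketched as follows (a brute-force illustration, not the patent's measurement method; masks and spacing are toy values):

```python
import numpy as np

def nearest_distance(mask_a, mask_b, spacing=(1.0, 1.0, 1.0)):
    """Brute-force closest distance (in mm) between two binary voxel masks."""
    pa = np.argwhere(mask_a) * np.asarray(spacing)
    pb = np.argwhere(mask_b) * np.asarray(spacing)
    # pairwise Euclidean distances between every voxel pair, then the minimum
    d = np.linalg.norm(pa[:, None, :] - pb[None, :, :], axis=-1)
    return d.min()

# toy 3D volume: one cancer voxel 2 mm away from the mesorectal boundary
vol = np.zeros((10, 10, 10), bool)
cancer = vol.copy(); cancer[5, 5, 5] = True
mesorectum = vol.copy(); mesorectum[5, 5, 7] = True
d = nearest_distance(cancer, mesorectum)
print(d)            # 2.0
print(d <= 1.0)     # CRM-positive test: False
```

for clinically sized volumes a distance transform over the mask would replace the quadratic pairwise computation.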
- FIG. 2 illustrates a stereotaxic anterior resection.
- FIG. 3 is an image diagram showing the orientation of FIG. 2.
- FIG. 2 is a view looking inside the pelvis from the side of the head.
- arrow lines are used to illustrate the positions on the belly side in FIG. 2.
- Fig. 2 shows a state in which the cancer is resected by applying stereotaxic anterior resection, in which the fat 2 around the rectum 1 where the cancer has occurred is peeled away and the entire cancer is resected.
- FIG. 4 is a diagram showing a resection surface.
- the cancer is resected at the resection plane 4. If the resection plane 4 is too far inside toward the rectum 1, the cancer will be exposed. On the other hand, if the resection plane 4 is too far outside the rectum 1, the nerve 3 may be damaged.
- FIG. 5 is a schematic diagram showing an arbitrary cross section of an MRI image.
- the figure shows a state in which a cancer 5 developed in the rectum 1 crosses the boundary 6 between the rectum 1 and the fat 2 and reaches the fat 2.
- FIG. 6 is an image diagram showing the orientation of FIG. 5. A region 7 shown in FIG. 6 represents the position of the rectum 1 etc. shown in FIG. 5.
- the rectum 1, fat 2, and cancer 5 shown in FIG. 5 respectively correspond to the muscularis propria MP, mesorectum ME, and cancer TU shown in FIG. 1.
- FIG. 7 is a schematic diagram showing a first example of resection lines.
- a resection line 4A shown in the figure is set when the cancer 5 is resected together with the fat 2 around the rectum 1 where the cancer 5 has developed. That is, the resection line 4A coincides with the outline of the fat 2.
- a resection line 4A in the cross-sectional image shown in FIG. 7 corresponds to the resection plane 4 shown in FIG. 4. The same applies to the resection line 4B shown in FIG. 8 and the resection line 4C shown in FIG. 9.
- FIG. 8 is a schematic diagram showing a second example of resection lines.
- a resection line 4B shown in the figure is set when the cancer 5 is resected while avoiding the nerve 3 and leaving a part of the fat 2. That is, a portion of the resection line 4B coincides with the contour of the fat 2 and a portion of the resection line 4B is inside the fat 2.
- FIG. 9 is a schematic diagram showing a third example of resection lines.
- the resection line 4C shown in the figure is set when the cancer 5 is resected while avoiding the nerve 3 and leaving a part of the fat 2.
- the margin from the nerve 3 and the margin from the cancer 5 are both sufficiently secured.
- a resection plane 4 that provides a sufficient margin from the tissue to be preserved and a sufficient margin from the tissue to be resected can thus be planned before surgery. Note that if it is found in advance that a sufficient margin cannot be secured, it is possible to give up preservation of the tissue to be preserved and select a surgical method that excises the nerve 3 and the blood vessel.
- FIG. 10 is a functional block diagram of the medical image processing apparatus according to the embodiment.
- the medical image processing apparatus 20 shown in the figure is implemented using computer hardware and software.
- Software is synonymous with program.
- the two Japanese terms for medical image (医用画像 and 医療画像) used in the embodiments described below are synonymous; both are rendered here as medical image.
- the medical image processing apparatus 20 includes an image acquisition section 222, a lesion area extraction section 224, a lesion area input reception section 226, a resection plane extraction section 228, a resection plane input reception section 230, a distance measurement section 232, and a display image generation section 234.
- An input device 214 and a display device 216 are also connected to the medical image processing apparatus 20.
- the image acquisition unit 222 acquires medical images to be processed from an image storage server or the like.
- an MRI image captured using an MRI apparatus is exemplified as a medical image to be processed.
- a medical image to be processed is not limited to an MRI image, and may be an image captured by another modality such as a CT apparatus.
- image diagnosis of cancer occurring in the gastrointestinal tract such as the large intestine is assumed.
- an example of the medical image to be processed is a three-dimensional image in which an area including the lesion region is captured as one image.
- the target image is preferably isotropic voxel high-resolution three-dimensional data.
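a hedged sketch of resampling anisotropic slice data to the isotropic voxels preferred above, using nearest-neighbour interpolation (real pipelines would typically use higher-order interpolation; the spacings and shapes are illustrative assumptions):

```python
import numpy as np

def resample_isotropic(vol, spacing, target=1.0):
    """Nearest-neighbour resample of an anisotropic volume to isotropic voxels."""
    spacing = np.asarray(spacing, float)
    new_shape = np.round(np.array(vol.shape) * spacing / target).astype(int)
    # for each output index, look up the nearest source index along each axis
    idx = [np.minimum((np.arange(n) * target / s).astype(int), dim - 1)
           for n, s, dim in zip(new_shape, spacing, vol.shape)]
    return vol[np.ix_(*idx)]

# 5 mm slice thickness, 1 mm in-plane: 4 slices become 20 isotropic slices
vol = np.arange(4 * 8 * 8).reshape(4, 8, 8)
iso = resample_isotropic(vol, spacing=(5.0, 1.0, 1.0))
print(iso.shape)  # (20, 8, 8)
```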
- the lesion area extraction unit 224 performs lesion area extraction processing to automatically extract, by applying image recognition, the lesion area included in the medical image acquired via the image acquisition unit 222.
- the lesion area in this embodiment is a cancer area.
- machine learning, typified by deep learning, is applied: image segmentation is performed on the input medical image using a trained model that has learned the segmentation task, and the lesion area is extracted.
- a convolutional neural network is an example of a learning model that performs image segmentation.
- a convolutional neural network may be abbreviated as CNN (Convolutional Neural Network).
- the lesion area extraction unit 224 may be configured to perform segmentation for classifying each of a plurality of areas around the lesion area, not limited to segmentation for classifying the lesion area and other areas.
- the lesion area extraction unit 224 may extract the lesion area and the preservation area using a trained model that receives a three-dimensional image as input, classifies each area (for example, a cancer area as the lesion area and surrounding organs as preservation areas), and outputs a segmentation-result image in which a mask pattern is attached to each area.
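- As a minimal sketch of handling such multi-class segmentation output (not the patent's actual trained model), the per-voxel class decision can be expressed as an argmax over class probability volumes; the class indices and array shapes below are assumptions for illustration.

```python
import numpy as np

def label_regions(class_probs):
    """Assign each voxel the class with the highest probability.

    class_probs: array of shape (num_classes, D, H, W), e.g. the
    softmax output of a segmentation network. Here we assume
    channel 0 = background, 1 = lesion area, 2 = preservation area.
    Returns an integer mask of shape (D, H, W).
    """
    return np.argmax(class_probs, axis=0)

# Toy 3-D volume with two foreground classes.
probs = np.zeros((3, 2, 2, 2))
probs[0] = 0.5       # background prior everywhere
probs[1, 0] = 0.9    # "lesion" dominates the first slice
probs[2, 1] = 0.9    # "preservation" dominates the second slice
mask = label_regions(probs)
print(mask[0].tolist())  # [[1, 1], [1, 1]]
print(mask[1].tolist())  # [[2, 2], [2, 2]]
```

- The mask pattern attached to each area described above corresponds to one label value per segmented region in such a label volume.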
- a lesion area input reception unit 226 receives input of information on a lesion area automatically extracted using the lesion area extraction unit 224 .
- the lesion area input reception unit 226 also receives input of information on the lesion area specified by the user using the input device 214 .
- a doctor who is a user can use the input device 214 to freely specify a region different from the lesion region automatically extracted by the lesion region extraction unit 224 .
- the doctor can use the input device 214 to specify a cancer-recognized area or a cancer-suspected area among the areas that were not extracted as cancer in the automatic extraction.
- the lesion area described in the embodiment is an example of the first area specified in the medical image.
- the preserved region described in the embodiment is an example of the second region specified in the medical image.
- the resection plane extraction unit 228 performs processing for extracting a candidate for the resection plane 4, which serves as a reference for resecting the lesion area from the medical image, based on the lesion area information obtained through the lesion area input reception unit 226. Since the spread of cancer changes depending on the degree of cancer progression, a candidate for the appropriate resection plane 4 is automatically extracted according to the degree of progression.
- a candidate for the resection plane 4 may be a curved surface of arbitrary shape.
- the resection plane input reception unit 230 receives input of information on candidates for the resection plane 4 automatically extracted using the resection plane extraction unit 228 .
- the resection plane input reception unit 230 also receives input of information on the resection plane specified by the user using the input device 214 .
- a doctor who is a user, can use the input device 214 to specify a resection plane 4 that is different from the candidates for the resection plane 4 automatically extracted using the resection plane extraction unit 228 .
- the resection plane 4 different from the automatically extracted candidate for the resection plane 4 may be a plane in which a part of the automatically extracted candidate for the resection plane 4 is corrected.
- the resection plane 4 is determined based on the information input via the resection plane input reception unit 230 .
- the resection surface 4 may be an arbitrary curved surface.
- when examining the resection plane before surgery, the doctor can specify not only an anatomically existing boundary surface but also a virtual surface within a tissue layer as a candidate for the resection plane 4.
- the resection plane 4 described in the embodiment is an example of a separation plane used when separating the first region specified in the medical image.
- the distance measurement unit 232 measures the three-dimensional distance from the cancer 5 to the resection surface 4 and the three-dimensional distance from the tissue to be preserved to the resection surface.
- the tissue to be preserved is the tissue located on the side of the resection surface 4 opposite to the cancer 5 .
- An example of the tissue to be preserved is the nerve 3 shown in FIG.
- a mode is also possible in which the automatically extracted lesion area and the lesion area specified by the user are grouped, regarded as one lesion area as a whole, and the distance between that combined lesion area and the resection plane 4 is measured. Preferably, the apparatus is configured so that these measurement modes can be selected.
- the following method can be applied for distance measurement: for each point on the contour of the lesion area to be measured, calculate the distance to the closest point on the resection plane 4, then select the pair of points whose distance is smallest among the calculated distances; that distance is taken as the nearest-neighbor distance between the resection plane 4 and the lesion area.
- the distance between the resection plane 4 and the preservation area is measured, and the closest distance between the resection plane 4 and the preservation area is obtained.
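- The nearest-neighbor measurement described above can be sketched as a brute-force search over all point pairs; the point lists and function name below are illustrative assumptions, and a real implementation over dense voxel surfaces would use a spatial index such as a k-d tree.

```python
from itertools import product
from math import dist

def nearest_pair(points_a, points_b):
    """Brute-force nearest-neighbor search between two sets of
    3-D points: returns (min_distance, point_a, point_b).
    Costs O(len(a) * len(b)) pair comparisons."""
    best = min(product(points_a, points_b), key=lambda pair: dist(*pair))
    return dist(*best), best[0], best[1]

# Toy contour samples: lesion surface points vs. resection-plane points.
lesion = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
plane = [(4.0, 0.0, 0.0), (1.0, 3.0, 0.0)]
d, pa, pb = nearest_pair(lesion, plane)
print(round(d, 3))  # 3.0
```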
- the plurality of points on the contour of the lesion area described in the embodiment are an example of arbitrary positions on the surface of the first area.
- a plurality of points on the contour of the preservation area described in the embodiment is an example of arbitrary positions on the surface of the second area.
- the display image generation unit 234 generates a display image to be displayed on the display device 216.
- the display image generator 234 generates a two-dimensional image for display from the three-dimensional image. Examples of two-dimensional images for display include cross-sectional images and volume rendering images.
- the display image generation unit 234 generates at least one cross-sectional image among an axial cross-sectional image, a coronal cross-sectional image, and a sagittal cross-sectional image.
- the display image generator 234 may generate all three types of cross-sectional images described above.
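- As an illustration of how the three orthogonal cross-sections relate to a 3-D voxel volume, the following sketch assumes the volume is stored in (z, y, x) order; axis conventions vary between datasets, so this ordering is an assumption.

```python
import numpy as np

def cross_sections(volume, z, y, x):
    """Extract the three orthogonal cross-sections through voxel
    (z, y, x), assuming (z, y, x) storage order.
    Returns (axial, coronal, sagittal) 2-D arrays."""
    axial = volume[z, :, :]      # fixed z: transverse slice
    coronal = volume[:, y, :]    # fixed y: front-to-back view
    sagittal = volume[:, :, x]   # fixed x: side view
    return axial, coronal, sagittal

vol = np.arange(2 * 3 * 4).reshape(2, 3, 4)
ax, co, sa = cross_sections(vol, 1, 0, 2)
print(ax.shape, co.shape, sa.shape)  # (3, 4) (2, 4) (2, 3)
```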
- the display image generation unit 234 generates an image representing the resection plane 4 determined using the resection plane extraction unit 228 .
- An image representing the resection plane 4 is superimposed on the cross-sectional image or the like.
- An example of an image representing the resection plane 4 superimposed on the cross-sectional image is a resection line 4C shown in FIG.
- the display image generation unit 234 generates first distance information representing the distance between the resection plane 4 and the lesion area measured using the distance measurement unit 232 . In addition, the display image generation unit 234 generates second distance information representing the distance between the resection plane 4 and the preservation region measured using the distance measurement unit 232 .
- the display image generation unit 234 can apply graphic elements such as color maps and color bars as the first distance information and the second distance information.
- the first distance information and the second distance information generated using the display image generation unit 234 are superimposed on the cross-sectional image.
- FIG. 11 is a block diagram showing a configuration example of the medical image processing apparatus according to the embodiment.
- the medical image processing apparatus 20 comprises a processor 202, a non-transitory tangible computer-readable medium 204, a communication interface 206 and an input/output interface 208.
- the form of the medical image processing apparatus 20 may be a server, a personal computer, a workstation, or a tablet terminal.
- the processor 202 includes a CPU (Central Processing Unit). Processor 202 may include a GPU (Graphics Processing Unit). Processor 202 is coupled to computer-readable media 204 , communication interface 206 , and input/output interface 208 via bus 210 . Input device 214 and display device 216 are connected to bus 210 via input/output interface 208 .
- the input device 214 can be a keyboard, a mouse, a multi-touch panel, another pointing device, a voice input device, or the like; any combination of these devices may be used.
- a liquid crystal display, an organic EL display, a projector, or the like can be applied to the display device 216 .
- Any combination of these devices may be used as the display device 216.
- EL in the organic EL display is an abbreviation for Electro-Luminescence.
- the computer-readable medium 204 includes a memory as a main memory and a storage as an auxiliary memory.
- a semiconductor memory, a hard disk device, a solid state drive device, or the like can be applied as the computer-readable medium 204; any combination of these devices may be used.
- a hard disk device may be abbreviated as HDD (Hard Disk Drive).
- a solid state drive device may be abbreviated as SSD (Solid State Drive).
- the medical image processing apparatus 20 is connected to a communication line via the communication interface 206, and is communicably connected to devices such as the DICOM server 40 and the viewer terminal 46 on the intra-medical-institution network. Illustration of the communication line is omitted. DICOM is an abbreviation for Digital Imaging and Communications in Medicine.
- the computer-readable medium 204 stores a plurality of programs including a medical image processing program 220 and a display control program 260.
- the computer-readable medium 204 stores various data, various parameters, and the like.
- the processor 202 executes commands of the medical image processing program 220, and performs an image acquisition section 222, a lesion area extraction section 224, a lesion area input reception section 226, a resection plane extraction section 228, a resection plane input reception section 230, and a distance measurement section 232. and functions as a display image generation unit 234 .
- the display control program 260 generates display signals necessary for display output to the display device 216 and controls the display of the display device 216 .
- FIG. 12 is a flow chart showing the procedure of the medical image processing method according to the embodiment.
- the processor 202 acquires a medical image to be processed from the DICOM server 40 or the like.
- the processor 202 automatically extracts a lesion area as tissue to be excised from the input medical image. Also, in the lesion area extraction step S14, the processor 202 extracts a preservation area as a tissue to be preserved from the input medical image.
- the membrane surrounding the organ with the lesion is excised together with the fat.
- in the lesion area extraction step S14, a fat area and a membrane surrounding the fat area, for which a resection plane can be set, are extracted.
- the tissue concerned may be an organ other than the organ including the lesion area, or may be an area other than the lesion area within the organ including the lesion area.
- the processor 202 can segment the medical image using the trained model and extract the lesion area and the preservation area.
- the processor 202 may determine the automatically extracted lesion area and sparing area as distance measurement targets.
- the processor 202 automatically extracts candidates for the resection plane 4 based on the identified lesion area.
- the processor 202 may extract the membrane surrounding the fat region as a candidate for the resection plane 4 .
- the processor 202 may determine the automatically extracted candidate resection plane 4 as the resection plane 4 if no user input representing a modification of the candidate resection plane 4 is obtained.
- the processor 202 can extract candidates for the resection plane 4 in consideration of a margin specified in advance by the user. For example, if a margin of 2 millimeters is specified, the circumference of the mesorectum is extracted as the candidate for the resection plane 4 in areas where the CRM is 2 millimeters or more, while in areas where the CRM is less than 2 millimeters, a surface 2 millimeters from the circumference of the cancer, extending beyond the mesorectum, can be extracted as the candidate.
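- The margin rule in the example above can be sketched per position as follows; representing each position by radial distances from the cancer surface is a deliberate 1-D simplification of the real curved surfaces, and the function and parameter names are hypothetical.

```python
def resection_candidate(crm_mm, mesorectum_mm, margin_mm=2.0):
    """Per-position candidate for the resection plane, expressed as a
    radial distance (mm) from the cancer surface. Where the
    circumferential resection margin (CRM) is at least `margin_mm`,
    the mesorectal circumference itself is the candidate; otherwise
    the candidate is pushed out to `margin_mm` from the cancer
    boundary, beyond the mesorectum."""
    if crm_mm >= margin_mm:
        return mesorectum_mm
    return margin_mm

# CRM of 3 mm: the mesorectum (3 mm out) is a safe candidate.
print(resection_candidate(3.0, mesorectum_mm=3.0))  # 3.0
# CRM of 1 mm: the candidate moves to 2 mm from the cancer boundary.
print(resection_candidate(1.0, mesorectum_mm=1.0))  # 2.0
```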
- the processor 202 measures the first distance, which is the closest distance between the resection plane 4 and the lesion area. Also, in the distance measurement step S18, the processor 202 measures a second distance, which is the closest distance between the resection surface 4 and the preservation area.
- in the display image generation step S20, the processor 202 generates a display image in which the first distance information representing the first distance and the second distance information representing the second distance are superimposed on an arbitrary cross-sectional image.
- the processor 202 causes the display device 216 to display the generated display image. This allows the user to grasp at a glance the distance between the resection plane 4 and the lesion area and the distance between the resection plane 4 and the preservation area in the cross-sectional image.
- the display image generation step S20 can generate a display image representing the processing results in the lesion area extraction step S14 and the resection plane determination step S16.
- the display step S22 can cause the display device 216 to display a display image representing the processing results in the lesion area extraction step S14 and the resection plane determination step S16.
- the processor 202 receives input of various instructions from the user.
- the user can perform various inputs such as specification of the lesion area and specification of the resection plane 4 from the input device 214 .
- the processor 202 determines whether or not there is an input specifying a lesion area and an input specifying a preservation area from the input device 214. If the user performs an operation of specifying a suspected lesion in the medical image, and there is input of information specifying a lesion area and information specifying a preservation area, the determination is YES. In the case of YES determination, the processor 202 proceeds to the lesion area extraction step S14, and determines the designated lesion area and preservation area as targets for distance measurement.
- if the processor 202 determines in the lesion area specification input determination step S26 that there is no input specifying a lesion area, the determination is NO. In the case of NO determination, the processor 202 proceeds to the resection plane specifying step S28.
- the processor 202 determines whether or not there is an input specifying the resection plane 4 from the input device 214.
- if there is input of information specifying the resection plane 4, the determination is YES.
- the processor 202 proceeds to the resection plane determination step S16, and determines the designated resection plane 4 as a reference for distance measurement.
- if the processor 202 determines in the resection plane specifying step S28 that there is no input of information specifying the resection plane 4, the determination is NO. In the case of NO determination, the processor 202 proceeds to the end determination step S30.
- a mode in which the edit is invalidated can be applied.
- An error message may be displayed on the display device 216 if the edit is invalidated.
- the processor 202 determines whether or not to terminate the display of the medical images.
- the termination condition of the display may be input of an end instruction by the user, or may be an end instruction based on the program. If the termination condition is not satisfied and the determination result of the termination determination step S30 is NO, the processor 202 proceeds to the display step S22 and continues the display.
- the termination determination step S30 if the termination condition is satisfied, such as the user performing an operation to close the display window, the determination is YES. If the determination is YES, processor 202 terminates the display and terminates the flowchart of FIG.
- processing functions of the medical image processing apparatus 20 can also be realized by sharing the processing using a plurality of computers.
- FIG. 13 is an image diagram showing an example of displaying three-dimensional distance information superimposed on a two-dimensional cross section.
- the figure shows a two-dimensional cross-sectional image of an arbitrary axial cross-section.
- the resection line 301 is adjusted based on the designation of the doctor who is the user in the region where the cancer 5 is close to the nerve 3 .
- a first color map 302 showing the distribution of the nearest neighbor distances between the resection line 301 and the cancer 5 is displayed in the outer edge region of the resection line 301 on the cancer 5 side.
- a second color map 304 showing the distribution of the nearest neighbor distances between the resection line 301 and the nerve 3 is displayed in the edge area on the nerve 3 side of the resection line 301 .
- the marginal region outside the resection line 301 may be a region in contact with the resection line 301, or a region not in contact with it.
- the distance from the resection line 301 to the first color map 302 is preferably less than or equal to the width of the resection line 301 or less than or equal to the width of the first color map 302 .
- the distance from the resection line 301 to the second color map 304 is preferably less than or equal to the width of the resection line 301 or less than or equal to the width of the second color map 304.
- the first color map 302 shows the distribution of the closest distance between the resection line 301 and the cancer 5 for each position on the resection line 301 .
- the resection line 301 shown in FIG. 13 has a narrower line width as the distance between the resection line 301 and the cancer 5 is shorter, and a wider line width as the distance between the resection line 301 and the cancer 5 is longer.
- FIG. 13 shows the first color map 302 having a uniform line width. The same is true for the second color map 304 as well.
- in the first color map 302, an arbitrary color is applied to regions where the nearest-neighbor distance between the resection line 301 and the cancer 5 is equal to or less than a specified threshold.
- the first color map 302 can be adapted to display the numerical value of the nearest neighbor distance between the resection line 301 and the cancer 5 when user input such as mouse over and click is acquired.
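- The thresholded coloring just described amounts to flagging the resection-line positions whose nearest-neighbor distance falls at or below the threshold; the sampled distances below are illustrative.

```python
def highlight(distances_mm, threshold_mm=2.0):
    """Flag positions along the resection line whose nearest-neighbor
    distance to the lesion is at or below the threshold, i.e. the
    positions the color map would color. Returns parallel booleans."""
    return [d <= threshold_mm for d in distances_mm]

# One sampled distance per position along the resection line.
flags = highlight([5.0, 2.0, 0.8, 3.1], threshold_mm=2.0)
print(flags)  # [False, True, True, False]
```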
- note that the first color map 302 shown in FIG. 13 is depicted without color.
- the colors of the first color map 302 and the second color map 304 may be different to distinguish between the first color map 302 and the second color map 304 .
- FIG. 13 shows a cross-sectional image 300 in which the first color map 302 and the second color map 304 are displayed for a partial area of the resection line 301. The first color map 302 and the second color map 304 may instead be shown in multiple regions of the resection line 301.
- the first color map 302 may be hidden for areas where the nearest neighbor distance between the resection line 301 and the cancer 5 exceeds a prescribed distance.
- the second color map 304 may be hidden for regions where the nearest neighbor distance between the resection line 301 and the nerve 3 exceeds a prescribed distance.
- At least one of the first color map 302 and the second color map 304 can be displayed for the area of the resection line 301 determined according to the designation of the doctor who is the user.
- FIG. 14 is an image diagram showing another example of superimposing and displaying three-dimensional distance information on a two-dimensional cross section.
- a first color map 312 and a second color map 314 are displayed for the entire area of the resection line 301 in the cross-sectional image 310 shown in FIG. 14.
- a configuration similar to that of the first color map 302 shown in FIG. 13 can be applied to the first color map 312 shown in FIG. 14. Likewise, the same configuration as the second color map 304 shown in FIG. 13 can be applied to the second color map 314 shown in FIG. 14.
- the display device 216 shown in FIG. 10 displays cross-sectional images representing the results of the processing performed in each step shown in FIG. 12.
- the display device 216 displays the segmentation results of the lesion area and the preserved area. For example, the display device 216 displays the cross-sectional image shown in FIG. 5 as the processing result of the lesion area extraction step S14.
- the display device 216 superimposes and displays, as candidates, the resection line 4A shown in FIG. 7, the resection line 4B shown in FIG. 8, and the resection line 4C shown in FIG. 9. The display device 216 may display the resection line 4C shown in FIG. 9 as the determined resection line.
- FIG. 15 is a schematic diagram showing an example of the display screen for the first distance measurement in the distance measurement process.
- a display window 320 shown in the figure is an example of a display mode in the distance measurement step S18.
- the display window 320 can also be applied when displaying the processing results in the lesion area extraction step S14 and the resection plane determination step S16.
- the display window 320 shown in FIG. 15 includes a main display area 322, a first sub-display area 324A, a second sub-display area 324B and a third sub-display area 324C.
- a main display area 322 is located in the center of the display window 320 .
- the first sub-display area 324A, the second sub-display area 324B and the third sub-display area 324C are arranged on the left side of the main display area 322 in the figure.
- the first sub-display area 324A, the second sub-display area 324B and the third sub-display area 324C are arranged side by side in the vertical direction in the figure.
- an axial cross-sectional image is displayed in the main display area 322, and an arrow line 332 indicating the distance between the resection line 301 and the cancer 5 at an arbitrary position on the resection line 301 is superimposed on it.
- a numerical value may be displayed instead of the arrow line 332 .
- FIG. 15 illustrates the cross-sectional image 310 shown in FIG. 14 as an axial cross-sectional image. The same applies to the axial cross-sectional images in each of FIGS. 16 to 19.
- FIG. 16 is a schematic diagram showing an example of the second distance measurement display screen in the distance measurement process.
- FIG. 16 shows an example in which an arrow line 334 indicating the distance between the resection line 301 and the nerve 3 is displayed.
- Arrow line 332 shown in FIG. 15 and arrow line 334 shown in FIG. 16 may be displayed on the same display screen.
- the arrow lines 332 shown in FIG. 15 and the arrow lines 334 shown in FIG. 16 may be displayed for all distance measurement positions, or may be displayed for some distance measurement positions such as representative measurement positions.
- FIG. 17 is a schematic diagram showing a display example of the first color map.
- This figure shows the first color map 312 shown in FIG.
- the first color map 312 shown in FIG. 17 combines display of the nearest-neighbor distance by line width with display by color density: the density becomes darker as the nearest-neighbor distance between the resection line 301 and the cancer 5 becomes relatively shorter, and lighter as it becomes relatively longer.
- the first color region 312A in the first color map 312 has a higher density than the second color region 312B and the third color region 312C, indicating that the resection line 301 and the cancer 5 are closer there than in the regions 312B and 312C.
- the first color map 312 shown in FIG. 17 represents stepwise changes in the nearest-neighbor distance between the resection line 301 and the cancer 5, but a first color map that represents the changes continuously may be applied instead.
- the first color map 312 shown in FIG. 17 applies density to represent the difference in nearest-neighbor distance between the resection line 301 and the cancer 5, but hue, saturation of the same color, or brightness of the same color may be applied instead.
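- The continuous alternative mentioned above can be sketched as a linear mapping from nearest-neighbor distance to density (darker for shorter distances); the range parameter `max_mm` is an assumption, and the same mapping could drive saturation or brightness instead.

```python
def density(distance_mm, max_mm=10.0):
    """Continuous density for the color map: 1.0 (darkest) at
    distance 0, fading linearly to 0.0 at `max_mm` and beyond."""
    clipped = min(max(distance_mm, 0.0), max_mm)
    return 1.0 - clipped / max_mm

print(density(0.0))   # 1.0
print(density(5.0))   # 0.5
print(density(20.0))  # 0.0
```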
- FIG. 18 is a schematic diagram showing a display example of the second color map. This figure shows the second color map 314 shown in FIG.
- the second colormap 314 is configured similarly to the first colormap 312 shown in FIG. A detailed description of the second color map 314 is omitted here.
- FIG. 19 is a schematic diagram showing a display example of a color map according to a modification.
- FIG. 19 shows an example in which a first color map 312 and a second color map 314 are displayed superimposed on the cross-sectional image 310 .
- Either the first color map 312 or the second color map 314 may be selectively displayed superimposed on the cross-sectional image 310 .
- a mode of displaying only the first color map 312, a mode of displaying only the second color map 314, and a mode of displaying both the first color map 312 and the second color map 314 are selectively switched according to user input. good too.
- FIGS. 15 to 19 illustrate the mode of displaying the first color map 312 and the second color map 314 shown in FIG. 14; the first color map 302 and the second color map 304 shown in FIG. 13 may be displayed instead.
- the display image generation unit 234 shown in FIG. 10 may generate a volume rendering image from a group of multiple cross-sectional images.
- the display image generator 234 can generate a plurality of volume rendering images from different viewpoints of the user.
- Display image generator 234 may generate a color map corresponding to the volume rendering image.
- FIG. 20 is a schematic diagram showing a display example of three-dimensional distance information in a volume rendering image.
- FIG. 20 shows a volume rendering image 400 in which a resection surface 402, a tumor 404 that is a tissue to be resected, and a blood vessel 410 that is a tissue to be preserved are viewed from above.
- An arrow line 420 indicates the direction of the viewpoint when viewing the cut surface 402 from the blood vessel 410 side.
- An arrow line 422 indicates the direction of the viewpoint when viewing the resection plane 402 from the side of the tumor 404 .
- a first color map 406A is superimposed on the tumor 404 at a position facing the resection plane 402 .
- a first color map 406A represents information on the closest distance between the resection plane 402 and the tumor 404 .
- the first color map 406A is applied to the volume rendering image viewed from the tumor 404 side of the resection plane 402.
- FIG. 21 is a schematic diagram of FIG. 20 viewed from the side of the tissue to be preserved.
- in FIG. 21, the first color map 406B is displayed on the blood vessel 410 side surface of the resection plane 402.
- the position of the first color map 406B on the resection plane 402 corresponds to the position of the tumor 404 when the tumor 404 is projected onto the resection plane 402.
- the position of the centroid of the tumor 404 may be used as the position of the tumor 404.
- the shape of first color map 406B corresponds to the shape of tumor 404 when tumor 404 is projected onto resection plane 402 .
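- Placing the tumor marker on the resection plane as described can be sketched, for the special case of a planar resection surface, as an orthogonal projection of the tumor centroid; the patent's surface may be an arbitrary curved surface, so this planar form is a simplifying assumption.

```python
def project_onto_plane(point, plane_point, normal):
    """Orthogonally project a 3-D `point` onto the plane through
    `plane_point` with unit `normal` (all 3-tuples): subtract the
    signed distance along the normal."""
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, normal))
    return tuple(p - d * n for p, n in zip(point, normal))

# Tumor centroid 3 units above the plane z = 0.
proj = project_onto_plane((1.0, 2.0, 3.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(proj)  # (1.0, 2.0, 0.0)
```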
- a first color map 406B is on the side of the resection plane 402 opposite the blood vessel 410 side and indicates the closest distance between the resection plane 402 and the tumor 404 that is not visible in the volume rendering image 424 .
- the first color map 406B is displayed in a color distinguishable from the resection plane 402 and the blood vessel 410.
- the density according to the closest distance between the tumor 404 and the resection surface 402 is applied to the first color map 406B. For example, if the nearest neighbor distance between tumor 404 and resection plane 402 is relatively short, the density of first color map 406B is relatively dark.
- the volume rendering image 424 may be superimposed with a second color map representing the closest distance between the resection plane 402 and the blood vessel 410 .
- illustration of the second color map is omitted.
- FIG. 22 is a schematic diagram of FIG. 20 viewed from the tissue to be excised.
- the second color map 412B is superimposed on the tumor 404 side of the resection plane 402 and displayed.
- a second color map 412B is on the side of the resection plane 402 opposite to the tumor 404 side and indicates the closest distance between the resection plane 402 and the blood vessel 410 that is not visible in the volume rendering image 430 .
- the second color map 412B is displayed in a color distinguishable from the resection plane 402 and the tumor 404.
- the volume rendering image 430 may be superimposed with a first color map representing the closest distance between the resection plane 402 and the tumor 404 .
- illustration of the first color map is omitted.
- the position of the second color map 412B on the resection plane 402, the shape of the second color map 412B, and the display mode of the nearest-neighbor distance between the resection plane 402 and the blood vessel 410 are the same as those of the first color map 406B shown in FIG. 21.
- FIG. 23 is a schematic diagram showing a display example of three-dimensional distance information applied to each tissue shown in FIG.
- a first color map 406A representing the nearest-neighbor distance between the resection plane 402 and the tumor 404, and a second color map 412A representing the nearest-neighbor distance between the resection plane 402 and the blood vessel 410, are superimposed.
- the first color map 406A is displayed on the tissue on the tumor 404 side of the resection plane 402, between the resection plane 402 and the tumor 404.
- the second color map 412A is displayed on the tissue on the blood vessel 410 side of the resection plane 402, between the resection plane 402 and the blood vessel 410.
- in a volume rendering image viewed from the viewpoint on the tissue-to-be-preserved side, the tissue to be resected lies on the opposite side of the resection plane and is not visible, so the first color map is displayed on the resection plane.
- also, the second color map is displayed on the tissue between the resection plane and the tissue to be preserved.
- conversely, in a volume rendering image viewed from the viewpoint on the tissue-to-be-resected side, the tissue to be preserved lies on the opposite side of the resection plane and is not visible, so the second color map is displayed on the resection plane. Also, the first color map is displayed on the tissue between the resection plane and the tissue to be resected.
- FIG. 24 is a block diagram showing a configuration example of a medical information system including the medical image processing apparatus according to the embodiment.
- a medical information system 100 is a computer network built in a medical institution such as a hospital.
- the medical information system 100 includes a modality 30 that captures medical images, a DICOM server 40, a medical image processing device 20, an electronic chart system 44, and a viewer terminal 46.
- The components of the medical information system 100 are connected via a communication line 48.
- The communication line 48 may be a local communication line within the medical institution, and part of the communication line 48 may be a wide-area communication line.
- Examples of the modality 30 include a CT device 31, an MRI device 32, an ultrasonic diagnostic device 33, a PET device 34, an X-ray diagnostic device 35, an X-ray fluoroscopic diagnostic device 36, an endoscope device 37, and the like.
- the types of modalities 30 connected to the communication line 48 can be combined in various ways for each medical institution.
- PET is an abbreviation for Positron Emission Tomography.
- the DICOM server 40 is a server that operates according to the DICOM specifications.
- the DICOM server 40 is a computer that stores and manages various data including images captured using the modality 30, and has a large-capacity external storage device and a database management program.
- the DICOM server 40 communicates with other devices via the communication line 48 to send and receive various data including image data.
- the DICOM server 40 receives image data generated using the modality 30 and various other data via a communication line 48, and stores and manages them in a recording medium such as a large-capacity external storage device.
- the storage format of the image data and the communication between each device via the communication line 48 are based on the DICOM protocol.
- the medical image processing apparatus 20 can acquire data from the DICOM server 40 or the like via the communication line 48.
- The medical image processing apparatus 20 can also send the processing results to the DICOM server 40 and the viewer terminal 46.
- the processing functions of the medical image processing apparatus 20 may be installed in the DICOM server 40 or the viewer terminal 46 .
- Various data stored in the database of the DICOM server 40 and various information including the processing results generated by the medical image processing apparatus 20 can be displayed on the viewer terminal 46.
- the viewer terminal 46 is a terminal for viewing images called a PACS viewer or DICOM viewer.
- A plurality of viewer terminals 46 can be connected to the communication line 48.
- the form of the viewer terminal 46 is not particularly limited, and may be a personal computer, a workstation, a tablet terminal, or the like.
- the input device of the viewer terminal 46 may be configured such that designation of the lesion area, designation of the measurement reference plane, and the like can be performed.
- A program that causes a computer to realize the processing functions of the medical image processing apparatus 20 can be stored in a computer-readable medium, that is, a tangible non-transitory information storage medium such as an optical disk, a magnetic disk, or a semiconductor memory, and the program can be provided through the information storage medium.
- The various processors include a CPU, which is a general-purpose processor that executes a program to function as various processing units; a GPU, which is a processor specialized for image processing; a programmable logic device (PLD) such as an FPGA, which is a processor whose circuit configuration can be changed after manufacturing; and a dedicated electric circuit such as an ASIC, which is a processor having a circuit configuration designed exclusively to execute specific processing.
- FPGA is an abbreviation for Field Programmable Gate Array.
- ASIC is an abbreviation for Application Specific Integrated Circuit.
- A programmable logic device may be abbreviated as PLD.
- a single processing unit may be composed of one of these various processors, or may be composed of two or more processors of the same type or different types.
- one processing unit may be configured by a plurality of FPGAs, a combination of CPU and FPGA, or a combination of CPU and GPU.
- a plurality of processing units may be configured by a single processor.
- First, there is a form in which a single processor is configured by a combination of one or more CPUs and software, and this processor functions as multiple processing units.
- Second, as typified by a System on Chip (SoC), there is a form in which a processor that realizes the functions of an entire system, including multiple processing units, with a single IC chip is used.
- IC is an abbreviation for Integrated Circuit.
- the hardware structure of these various processors is, more specifically, an electric circuit that combines circuit elements such as semiconductor elements.
- Such an electric circuit may also be referred to as circuitry.
- the medical image processing apparatus 20 according to the embodiment has the following effects.
- When resecting tissue to be resected, such as a lesion region, a resection plane is specified between the tissue to be resected and the tissue to be preserved, and the nearest-neighbor distance between the resection plane and the tissue to be resected and the nearest-neighbor distance between the resection plane and the tissue to be preserved are measured. Based on the measurement results, first distance information representing the nearest-neighbor distance between the resection plane and the tissue to be resected and second distance information representing the nearest-neighbor distance between the resection plane and the tissue to be preserved are superimposed on a two-dimensional cross-sectional image. This may assist in determining the resection plane when resecting the tissue to be resected.
- The first distance information displayed on the cross-sectional image lies on the resected-tissue side of the resection plane and is displayed in the edge region outside the resection plane.
- The second distance information displayed on the cross-sectional image lies on the preserved-tissue side of the resection plane and is displayed in the edge region outside the resection plane.
- The first distance information displayed on the volume rendering image is displayed on the preserved-tissue side of the resection plane. This allows the first distance information corresponding to the tissue to be resected that is hidden behind the resection plane to be visually recognized.
- The second distance information displayed on the volume rendering image is displayed on the resected-tissue side of the resection plane. This allows the second distance information corresponding to the tissue to be preserved that is hidden behind the resection plane to be visually recognized.
- A color map is applied to the first distance information and the second distance information, so that a user viewing them can grasp their content at a glance.
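The distance measurement described in the effects above reduces to a nearest-neighbor search between points sampled on the separation (resection) plane and points on each region's surface. The following is a minimal sketch under toy geometry; the function name, the brute-force search, and the sample coordinates are illustrative assumptions, not taken from the publication.

```python
import math

def closest_distance(points_a, points_b):
    """Nearest-neighbor distance between two 3-D point sets (brute force).

    points_a / points_b are iterables of (x, y, z) coordinates, e.g. voxels
    sampled on the separation plane and on a region's surface.
    """
    return min(math.dist(a, b) for a in points_a for b in points_b)

# Toy geometry: the separation plane sampled on z = 0, the first region's
# (e.g. tumor's) nearest surface point at z = 3, and the second region's
# (e.g. blood vessel's) nearest surface point at z = -2.
plane = [(x, y, 0.0) for x in range(5) for y in range(5)]
first_distance = closest_distance(plane, [(2.0, 2.0, 3.0)])    # -> 3.0
second_distance = closest_distance(plane, [(2.0, 2.0, -2.0)])  # -> 2.0
```

A practical implementation over full volumes would replace the O(N·M) loop with a KD-tree query or a Euclidean distance transform.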
Abstract
Description
FIG. 1 is a diagram showing an example in which the CRM is measured from an MRI image of a patient with rectal cancer. In FIG. 1, the cancer TU, the muscularis propria MP, and the mesorectum ME are distinguished from one another by different fill patterns. Although FIG. 1 shows a two-dimensional image, the MRI image actually acquired is a three-dimensional image. Note that MRI is an abbreviation for Magnetic Resonance Imaging.
FIG. 2 is a diagram showing a low anterior resection. FIG. 3 is a conceptual diagram showing the orientation of FIG. 2. FIG. 2 is a view looking into the pelvis from the head side. In FIG. 3, an arrow line indicates the position of the abdominal side in FIG. 2.
FIG. 10 is a functional block diagram of the medical image processing apparatus according to the embodiment. The medical image processing apparatus 20 shown in the figure is realized using computer hardware and software. Software is synonymous with program. The term "medical image" (医療画像) used in the embodiments described below is synonymous with the term "medical image" (医用画像).
FIG. 12 is a flowchart showing the procedure of the medical image processing method according to the embodiment. In an image acquisition step S12, the processor 202 acquires a medical image to be processed from the DICOM server 40 or the like.
FIG. 13 is a conceptual diagram showing an example of superimposing three-dimensional distance information on a two-dimensional cross section. The figure shows a two-dimensional cross-sectional image at an arbitrary axial cross section. In the cross-sectional image 300 shown in the figure, a resection line 301 has been adjusted, based on a designation by the user, a physician, in a region where the cancer 5 is close to the nerve 3.
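A measured closest distance can then drive the color of the superimposed map. The sketch below maps a distance to one of three color regions, echoing the first, second, and third color regions of the first color map 312; the band thresholds and the helper name are assumptions for illustration only, not values from the publication.

```python
def distance_to_color_region(distance_mm: float) -> str:
    """Map a closest distance to one of three color regions.

    The 1 mm and 5 mm thresholds are illustrative assumptions; the
    publication defines the color regions without committing to these
    values.
    """
    if distance_mm < 1.0:
        return "first color region"   # e.g. red: margin critically small
    if distance_mm < 5.0:
        return "second color region"  # e.g. yellow: caution
    return "third color region"       # e.g. green: ample margin

# A resection line passing 0.8 mm from the nerve falls in the innermost band.
band = distance_to_color_region(0.8)
```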
Next, a specific example of the user interface will be described. The display device 216 shown in FIG. 10 displays cross-sectional images representing the results of the processing carried out in each of the steps shown in FIG. 12.
As with the cross-sectional image 300 and the like shown in FIG. 13, the first distance information corresponding to the first color map 302 and the second distance information corresponding to the second color map 304 shown in FIG. 13 and elsewhere can also be superimposed on a volume rendering image.
FIG. 24 is a block diagram showing a configuration example of a medical information system including the medical image processing apparatus according to the embodiment. The medical information system 100 is a computer network built in a medical institution such as a hospital.
A program that causes a computer to realize the processing functions of the medical image processing apparatus 20 can be stored in a computer-readable medium, that is, a tangible non-transitory information storage medium such as an optical disk, a magnetic disk, or a semiconductor memory, and the program can be provided through the information storage medium.
The hardware structure of the processing units that execute various kinds of processing in the medical image processing apparatus 20, such as the image acquisition unit 222, the lesion region extraction unit 224, the lesion region input reception unit 226, the resection plane extraction unit 228, the resection plane input reception unit 230, the distance measurement unit 232, and the display image generation unit 234, is, for example, the various processors described below. Note that these may also be referred to by the English terms "processing unit" and "processor."
The medical image processing apparatus 20 according to the embodiment provides the following effects.
When resecting tissue to be resected, such as a lesion region, a resection plane is specified between the tissue to be resected and the tissue to be preserved, and the nearest-neighbor distance between the resection plane and the tissue to be resected and the nearest-neighbor distance between the resection plane and the tissue to be preserved are measured. Based on the measurement results, first distance information representing the nearest-neighbor distance between the resection plane and the tissue to be resected and second distance information representing the nearest-neighbor distance between the resection plane and the tissue to be preserved are superimposed on a two-dimensional cross-sectional image. This may assist in determining the resection plane when resecting the tissue to be resected.
The first distance information displayed on the cross-sectional image lies on the resected-tissue side of the resection plane and is displayed in the edge region outside the resection plane. This suppresses degradation of the visibility of the resection plane, while the visibility of the first distance information and the side to which the first distance applies can be grasped at a glance.
The second distance information displayed on the cross-sectional image lies on the preserved-tissue side of the resection plane and is displayed in the edge region outside the resection plane. This suppresses degradation of the visibility of the resection plane, while the visibility of the second distance information and the side to which the second distance applies can be grasped at a glance.
The first distance information displayed on the volume rendering image is displayed on the preserved-tissue side of the resection plane. This allows the first distance information corresponding to the tissue to be resected that is hidden behind the resection plane to be visually recognized.
The second distance information displayed on the volume rendering image is displayed on the resected-tissue side of the resection plane. This allows the second distance information corresponding to the tissue to be preserved that is hidden behind the resection plane to be visually recognized.
A color map is applied to the first distance information and the second distance information, so that a user viewing them can grasp their content at a glance.
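The viewpoint-dependent placement described above (a region's distance information is drawn on the tissue itself when the region faces the viewer, and on the resection plane when the region is hidden behind it) can be sketched as a sign test against the plane. The helper names and the simplified geometry below are illustrative assumptions, not the claimed implementation.

```python
def _signed_side(plane_point, plane_normal, point):
    """Signed side of `point` relative to the separation plane."""
    return sum(n * (c - p) for n, p, c in zip(plane_normal, plane_point, point))

def overlay_target(plane_point, plane_normal, viewpoint, region_point):
    """Return where a region's distance information is superimposed.

    "region" when the region lies on the viewer's side of the separation
    plane (the tissue itself is visible); "separation plane" when it lies
    on the opposite side and is hidden. A geometric sketch only.
    """
    same_side = (_signed_side(plane_point, plane_normal, viewpoint)
                 * _signed_side(plane_point, plane_normal, region_point)) > 0
    return "region" if same_side else "separation plane"

# Viewer at z = 10 and tumor at z = 3 relative to the plane z = 0:
# the tumor is visible, so its color map is drawn on the tumor itself.
target = overlay_target((0, 0, 0), (0, 0, 1), (0, 0, 10), (0, 0, 3))
```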
2 fat
3 nerve
4 resection plane
4A resection line
4B resection line
4C resection line
5 cancer
6 boundary
7 region
20 medical image processing apparatus
30 modality
31 CT device
32 MRI device
33 ultrasonic diagnostic device
34 PET device
35 X-ray diagnostic device
36 X-ray fluoroscopic diagnostic device
37 endoscope device
40 DICOM server
44 electronic chart system
46 viewer terminal
48 communication line
100 medical information system
202 processor
204 computer-readable medium
206 communication interface
208 input/output interface
210 bus
214 input device
216 display device
220 medical image processing program
222 image acquisition unit
224 lesion region extraction unit
226 lesion region input reception unit
228 resection plane extraction unit
230 resection plane input reception unit
232 distance measurement unit
234 display image generation unit
260 display control program
300 cross-sectional image
302 first color map
304 second color map
310 cross-sectional image
312 first color map
312A first color region
312B second color region
312C third color region
314 second color map
320 display window
322 main display area
324A first sub display area
324B second sub display area
324C third sub display area
332 arrow line
334 arrow line
402 resection plane
404 tumor
406A first color map
406B first color map
410 blood vessel
412A second color map
412B second color map
420 arrow line
422 arrow line
424 volume rendering image
430 volume rendering image
440 volume rendering image
ME mesorectum
MP muscularis propria
TU cancer
S12 to S30 steps of the medical image processing method
Claims (13)
- A medical image processing apparatus comprising: a processor; and a storage device in which a program executed using the processor is stored, wherein the processor executes instructions of the program to: acquire a three-dimensional medical image; set a separation plane for separating a first region specified in the medical image; measure a first distance, which is the closest distance between the separation plane and an arbitrary position on a surface of the first region; measure a second distance, which is the closest distance between the separation plane and an arbitrary position on a surface of a second region specified in the medical image; and, when displaying the medical image on a display device, display at least one of first distance information representing the first distance and second distance information representing the second distance.
- The medical image processing apparatus according to claim 1, wherein, when displaying a cross-sectional image corresponding to an arbitrary cross section of the medical image on the display device, the processor displays the first distance information in an outer edge region of the separation plane on the first region side.
- The medical image processing apparatus according to claim 1 or 2, wherein, when displaying a cross-sectional image corresponding to an arbitrary cross section of the medical image on the display device, the processor displays the second distance information in an outer edge region of the separation plane on the second region side.
- The medical image processing apparatus according to any one of claims 1 to 3, wherein the processor displays the first distance information and the second distance information by applying a color map that represents distance using colors.
- The medical image processing apparatus according to claim 1, wherein the processor displays a volume rendering image corresponding to the medical image on the display device and superimposes at least one of the first distance information and the second distance information on the volume rendering image.
- The medical image processing apparatus according to claim 5, wherein the processor changes the display mode of the first distance information and the display mode of the second distance information according to a user's viewpoint in the volume rendering image.
- The medical image processing apparatus according to claim 6, wherein, in the volume rendering image, when the first region is present on the user's viewpoint side of the separation plane, the processor superimposes the first distance information on the first region.
- The medical image processing apparatus according to claim 6 or 7, wherein, in the volume rendering image, when the second region is present on the side of the separation plane opposite to the user's viewpoint, the processor superimposes the second distance information on the separation plane.
- The medical image processing apparatus according to claim 6, wherein, in the volume rendering image, when the first region is present on the side of the separation plane opposite to the user's viewpoint, the processor superimposes the first distance information on the separation plane.
- The medical image processing apparatus according to claim 6 or 9, wherein, in the volume rendering image, when the second region is present on the user's viewpoint side of the separation plane, the processor superimposes the second distance information on the second region.
- A medical image processing method in which a computer: acquires a three-dimensional medical image; sets a separation plane for separating a first region specified in the medical image; measures a first distance, which is the closest distance between the separation plane and an arbitrary position on a surface of the first region; measures a second distance, which is the closest distance between the separation plane and an arbitrary position on a surface of a second region specified in the medical image; and, when displaying the medical image on a display device, displays at least one of first distance information representing the first distance and second distance information representing the second distance.
- A program causing a computer to realize: a function of acquiring a three-dimensional medical image; a function of setting a separation plane for separating a first region specified in the medical image; a function of measuring a first distance, which is the closest distance between the separation plane and an arbitrary position on a surface of the first region; a function of measuring a second distance, which is the closest distance between the separation plane and an arbitrary position on a surface of a second region specified in the medical image; and a function of displaying, when displaying the medical image on a display device, at least one of first distance information representing the first distance and second distance information representing the second distance.
- A non-transitory computer-readable recording medium on which the program according to claim 12 is recorded.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22756192.5A EP4295762A1 (en) | 2021-02-22 | 2022-02-16 | Medical image processing device, medical image processing method, and program |
JP2023500869A JPWO2022176874A1 (ja) | 2021-02-22 | 2022-02-16 | |
US18/452,568 US20230394672A1 (en) | 2021-02-22 | 2023-08-21 | Medical image processing apparatus, medical image processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021026334 | 2021-02-22 | ||
JP2021-026334 | 2021-02-22 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/452,568 Continuation US20230394672A1 (en) | 2021-02-22 | 2023-08-21 | Medical image processing apparatus, medical image processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022176874A1 true WO2022176874A1 (ja) | 2022-08-25 |
Family
ID=82931776
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/006046 WO2022176874A1 (ja) | 2021-02-22 | 2022-02-16 | 医療画像処理装置、医療画像処理方法及びプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230394672A1 (ja) |
EP (1) | EP4295762A1 (ja) |
JP (1) | JPWO2022176874A1 (ja) |
WO (1) | WO2022176874A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024122027A1 (ja) * | 2022-12-08 | 2024-06-13 | 国立研究開発法人国立がん研究センター | 画像処理装置、手術支援システム、画像処理方法及びプログラム |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10309281A (ja) * | 1997-05-13 | 1998-11-24 | Olympus Optical Co Ltd | 蛍光診断装置 |
JP2009022369A (ja) | 2007-07-17 | 2009-02-05 | Toshiba Corp | 医用画像撮影装置、医用画像処理装置および医用画像処理プログラム |
JP2011218037A (ja) | 2010-04-13 | 2011-11-04 | Toshiba Corp | 医用画像処理装置及び脂肪領域計測用制御プログラム |
JP2012165910A (ja) * | 2011-02-15 | 2012-09-06 | Fujifilm Corp | 手術支援装置、手術支援方法および手術支援プログラム |
JP2016039874A (ja) | 2014-08-13 | 2016-03-24 | 富士フイルム株式会社 | 内視鏡画像診断支援装置、システム、方法およびプログラム |
WO2017047820A1 (ja) | 2015-09-18 | 2017-03-23 | イービーエム株式会社 | 病変血流特徴量可視化装置、その方法、及びそのコンピュータソフトウェアプログラム |
US20180055575A1 (en) * | 2016-09-01 | 2018-03-01 | Covidien Lp | Systems and methods for providing proximity awareness to pleural boundaries, vascular structures, and other critical intra-thoracic structures during electromagnetic navigation bronchoscopy |
JP2020074926A (ja) * | 2018-11-07 | 2020-05-21 | ソニー株式会社 | 医療用観察システム、信号処理装置及び医療用観察方法 |
JP2020120828A (ja) | 2019-01-29 | 2020-08-13 | ザイオソフト株式会社 | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム |
- 2022-02-16: WO application PCT/JP2022/006046 filed (WO2022176874A1, active, Application Filing)
- 2022-02-16: JP application 2023500869 filed (JPWO2022176874A1, active, Pending)
- 2022-02-16: EP application 22756192.5 filed (EP4295762A1, active, Pending)
- 2023-08-21: US application 18/452,568 filed (US20230394672A1, active, Pending)
Non-Patent Citations (1)
Title |
---|
YUSUKE KINUGASA: "A dissection layer for the operation of the perirectal fascia and function-preserving rectal cancer as viewed from surgeons", JAPANESE RESEARCH SOCIETY OF CLINICAL ANATOMY, 8 September 2012 (2012-09-08) |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022176874A1 (ja) | 2022-08-25 |
US20230394672A1 (en) | 2023-12-07 |
EP4295762A1 (en) | 2023-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Fang et al. | Consensus recommendations of three-dimensional visualization for diagnosis and management of liver diseases | |
US10083515B2 (en) | Method and system for segmenting medical imaging data according to a skeletal atlas | |
WO2015161728A1 (zh) | 三维模型的构建方法及装置、图像监控方法及装置 | |
US20120249546A1 (en) | Methods and systems for visualization and analysis of sublobar regions of the lung | |
JP4688361B2 (ja) | 臓器の特定領域抽出表示装置及びその表示方法 | |
US9443161B2 (en) | Methods and systems for performing segmentation and registration of images using neutrosophic similarity scores | |
US10524823B2 (en) | Surgery assistance apparatus, method and program | |
US20130257910A1 (en) | Apparatus and method for lesion diagnosis | |
US20200242776A1 (en) | Medical image processing apparatus, medical image processing method, and system | |
US9649167B2 (en) | Pattern and surgery support set, apparatus, method and program | |
KR20220026534A (ko) | 딥러닝 기반의 조직 절제 계획 방법 | |
JP7164345B2 (ja) | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム | |
US20230394672A1 (en) | Medical image processing apparatus, medical image processing method, and program | |
JP4193201B2 (ja) | 臓器の切除領域抽出表示装置 | |
US9123163B2 (en) | Medical image display apparatus, method and program | |
JP2007014483A (ja) | 医用診断装置及び診断支援装置 | |
JP6066197B2 (ja) | 手術支援装置、方法およびプログラム | |
WO2022176873A1 (ja) | 医療画像処理装置、医療画像処理方法およびプログラム | |
JP2019165923A (ja) | 診断支援システム及び診断支援方法 | |
JP5992853B2 (ja) | 手術支援装置、方法およびプログラム | |
Paolucci et al. | Ultrasound based planning and navigation for non-anatomical liver resections–an Ex-Vivo study | |
US11941808B2 (en) | Medical image processing device, medical image processing method, and storage medium | |
US11657547B2 (en) | Endoscopic surgery support apparatus, endoscopic surgery support method, and endoscopic surgery support system | |
Sun et al. | Digital kidney for planning in laparoscopic partial nephrectomy | |
WO2022270150A1 (ja) | 画像処理装置、方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22756192 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023500869 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022756192 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022756192 Country of ref document: EP Effective date: 20230922 |