WO2013141067A1 - Image Generation Device - Google Patents
Image Generation Device
- Publication number
- WO2013141067A1 (PCT/JP2013/056697)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- image generation
- unit
- information
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/42—Arrangements for detecting radiation specially adapted for radiation diagnosis
- A61B6/4208—Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector
- A61B6/4233—Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector using matrix detectors
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
- A61B6/467—Arrangements for interfacing with the operator or the patient characterised by special input means
- A61B6/469—Arrangements for interfacing with the operator or the patient characterised by special input means for selecting a region of interest [ROI]
- A61B6/48—Diagnostic techniques
- A61B6/486—Diagnostic techniques involving generating temporal series of image data
- A61B6/50—Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/507—Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications for determination of haemodynamic parameters, e.g. perfusion CT
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10116—X-ray image
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
Definitions
- The present invention relates to an image generation device for dynamic images in which a predetermined part of a human or animal body is photographed.
- A semiconductor image sensor such as an FPD (flat panel detector) can be used to capture a dynamic image of a subject area including the target region, and it has become possible to carry out diagnosis by analyzing the movement of that region. For example, the extraction of ventilation information in the lung field from chest X-ray dynamic images, and support for diagnosis and treatment (CAD for X-ray moving images) through quantitative analysis of dynamic function based on changes in lung-field density and movement, have been studied.
- The present invention has been made in view of such circumstances, and an object of the present invention is to provide an image generation apparatus with good workability and visibility when displaying a dynamic image of a predetermined part of a human or animal body.
- An image generation apparatus includes a dynamic image acquisition unit that acquires, in time order, a dynamic image capturing a predetermined part of a human or animal body, and a detection unit that detects a temporal change in a physical state of the predetermined part;
- a diagnosis support information generation unit that performs analysis based on the temporal change in the physical state of the predetermined part detected by the detection unit and generates the analysis result as diagnosis support information; and a display image generation unit that generates a display image in which the diagnosis support information is associated with the dynamic image.
- The display image includes a dynamic image display portion for displaying the dynamic image, and a list display portion for displaying a first analysis result and a second analysis result of the diagnosis support information so that they can be viewed as a list in the time-axis direction.
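The units enumerated above (dynamic image acquisition, detection, diagnosis support information generation, and display image generation) can be sketched as a minimal pipeline. This is only an illustrative sketch: all function names and the synthetic frame data are assumptions for the example, not part of the patent.

```python
import numpy as np

def acquire_dynamic_image(num_frames=10, height=4, width=4, seed=0):
    """Dynamic image acquisition unit: a dynamic image is a time-ordered
    stack of frame images, shaped (frames, height, width)."""
    rng = np.random.default_rng(seed)
    t = np.arange(num_frames)
    # Simulated periodic density change (e.g. lung-field density over breathing).
    base = 100.0 + 10.0 * np.sin(2 * np.pi * t / 5.0)
    return base[:, None, None] + rng.normal(0.0, 0.1, (num_frames, height, width))

def detect_state_change(frames):
    """Detection unit: reduce each frame to one physical-state value
    (here, mean density) to obtain its temporal change."""
    return frames.mean(axis=(1, 2))

def generate_diagnosis_support_info(signal):
    """Diagnosis support information generation unit: analyze the temporal
    change (here, frame-to-frame difference as a crude state indicator)."""
    return np.diff(signal, prepend=signal[0])

def generate_display_image(frames, support_info):
    """Display image generation unit: pair each frame index with its analysis
    result so results can be listed along the time axis."""
    return list(zip(range(len(frames)), support_info))

frames = acquire_dynamic_image()
signal = detect_state_change(frames)
info = generate_diagnosis_support_info(signal)
display = generate_display_image(frames, info)
print(len(display))
```

In this sketch the "list display portion" is simply the time-indexed pairing of frames with their analysis results; the actual apparatus renders this graphically.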
- FIG. 1 is a diagram showing an overall configuration of a radiation dynamic image capturing system according to each embodiment.
- FIG. 2 is a diagram illustrating an example of a dynamic image captured by radiological image capturing.
- FIG. 3 is a diagram illustrating a display image in the reference example.
- FIG. 4 is a block diagram illustrating a functional configuration of the image generation apparatus according to the first embodiment.
- FIG. 5 is a block diagram illustrating a functional configuration of the image generation apparatus according to the first embodiment.
- FIG. 6 is a block diagram illustrating a functional configuration of the image generation apparatus according to the sixth embodiment.
- FIG. 7 is a schematic view illustrating a part of a waveform measured by an electrocardiograph.
- FIG. 8 is a schematic diagram illustrating the setting state of coordinate axes for an image.
- FIG. 9 is a schematic view illustrating heart wall fluctuation.
- FIG. 10 is a schematic view illustrating the fluctuation cycle of the lateral width of the heart related to the pulmonary dynamic image during respiratory arrest.
- FIG. 11 is a diagram illustrating a display relating to blood flow.
- FIG. 12 is a diagram exemplifying a waveform showing a time change of a blood flow signal value in a pulmonary blood vessel region.
- FIG. 13 is a schematic view illustrating contour extraction of a lung field region.
- FIG. 14 is a schematic view illustrating the positions of feature points in the lung field region.
- FIG. 15 is a diagram schematically illustrating generation of respiratory phase diagnosis support information.
- FIG. 16 is a diagram illustrating a display image according to the first embodiment.
- FIG. 17 is a diagram illustrating a display image according to the second embodiment and the third embodiment.
- FIG. 18 is a flowchart for explaining the basic operation of the image generation apparatus realized in the first embodiment.
- FIG. 19 is a block diagram illustrating a functional configuration of an image generation apparatus according to the fourth embodiment.
- FIG. 20 is a schematic diagram showing waveform data of heartbeat (blood flow) in time series.
- FIG. 21 is a schematic diagram showing waveform data of respiratory information in time series.
- FIG. 22 is a flowchart for explaining the basic operation of the image generation apparatus realized in the fourth embodiment.
- FIG. 23 is a block diagram illustrating a functional configuration of an image generation apparatus according to the fifth embodiment.
- FIG. 24 is a flowchart for explaining the basic operation of the image generating apparatus realized in the fifth embodiment.
- FIG. 25 is a diagram illustrating the joint angle in joint flexion and extension.
- FIG. 26 is a diagram schematically illustrating the generation of the bending / extension phase diagnosis support information.
- FIG. 27 is a diagram illustrating a display image when blood flow information is displayed as diagnosis support information.
- FIG. 28 is a block diagram illustrating a part of the functional configuration of the image generation apparatus according to this embodiment.
- The radiation dynamic image capturing system captures radiographic images of a subject, a human or animal body, and generates a desired display image.
- FIG. 1 is a diagram showing an overall configuration of a radiation dynamic image capturing system according to the first embodiment.
- the radiation dynamic imaging system 100 includes an imaging device 1, an imaging control device 2 (imaging console), an image generation device 3 (diagnosis console), and an electrocardiograph 4.
- the imaging device 1 and the electrocardiograph 4 are connected to the imaging control device 2 via a communication cable or the like, and the imaging control device 2 and the image generation device 3 are connected via a communication network NT such as a LAN (Local Area Network).
- Each device constituting the radiation dynamic image capturing system 100 conforms to the DICOM (Digital Imaging and Communications in Medicine) standard, and communication between the devices is performed according to the DICOM standard.
- The imaging apparatus 1 is configured by, for example, an X-ray imaging apparatus, and captures the chest dynamics of the subject M accompanying breathing. Dynamic imaging is performed by sequentially acquiring a plurality of images over time while repeatedly irradiating the chest of the subject M with radiation such as X-rays. The series of images obtained by this continuous imaging is called a dynamic image, and each of the plurality of images constituting the dynamic image is called a frame image.
- The imaging apparatus 1 includes an irradiation unit (radiation source) 11, a radiation irradiation control device 12, an imaging unit (radiation detection unit) 13, a reading control device 14, a cycle detection sensor 15, and a cycle detection device 16.
- the irradiation unit 11 irradiates the subject M with radiation (X-rays) according to the control of the radiation irradiation control device 12.
- the illustrated example is a system for the human body, and the subject M corresponds to the person to be inspected.
- the subject M is also referred to as a “subject”.
- the radiation irradiation control device 12 is connected to the imaging control device 2 and performs radiation imaging by controlling the irradiation unit 11 based on the radiation irradiation conditions input from the imaging control device 2.
- the imaging unit 13 is configured by a semiconductor image sensor such as an FPD, and converts the radiation irradiated from the irradiation unit 11 and transmitted through the subject M into an electrical signal (image information).
- The reading control device 14 is connected to the imaging control device 2.
- the reading control device 14 controls the switching unit of each pixel of the imaging unit 13 based on the image reading condition input from the imaging control device 2, and switches the reading of the electric signal accumulated in each pixel.
- the image data is acquired by reading the electrical signal accumulated in the imaging unit 13.
- the reading control device 14 outputs the acquired image data (frame image) to the imaging control device 2.
- the image reading conditions are, for example, a frame rate, a frame interval, a pixel size, an image size (matrix size), and the like.
- the frame rate is the number of frame images acquired per second and matches the pulse rate.
- the frame interval is the time from the start of one frame image acquisition operation to the start of the next frame image acquisition operation in continuous shooting, and coincides with the pulse interval.
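The arithmetic relation between these two reading conditions can be sketched as follows; the 15 fps rate and 10-second duration are illustrative values for the sketch, not figures taken from the patent.

```python
# Illustrative values only; the actual conditions are input from the imaging console.
frame_rate_fps = 15.0                    # frame images acquired per second
frame_interval_s = 1.0 / frame_rate_fps  # start-to-start time between consecutive frames

acquisition_time_s = 10.0                # length of one dynamic acquisition
num_frames = round(acquisition_time_s * frame_rate_fps)
print(frame_interval_s, num_frames)
```

Because irradiation and reading are synchronized (as described below), the same interval governs the radiation pulse timing.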
- the radiation irradiation control device 12 and the reading control device 14 are connected to each other, and exchange synchronization signals with each other to synchronize the radiation irradiation operation and the image reading operation.
- the cycle detection device 16 detects the respiratory cycle of the subject M and outputs cycle information to the control unit 21 of the imaging control device 2.
- The cycle detection device 16 includes the cycle detection sensor 15, which detects the movement of the chest of the subject M (the respiratory cycle of the subject M) by laser irradiation, and a timing unit (not shown) that measures the time of the respiratory cycle detected by the cycle detection sensor 15 and outputs it to the control unit 21.
- The imaging control device 2 outputs the radiation irradiation conditions and image reading conditions to the imaging device 1 to control the radiation imaging and radiographic image reading operations of the imaging device 1, and also displays the dynamic images acquired by the imaging device 1 so that it can be confirmed whether positioning is correct and whether the images are suitable for diagnosis.
- The imaging control device 2 includes a control unit 21, a storage unit 22, an operation unit 23, a display unit 24, and a communication unit 25, and each unit is connected by a bus 26.
- the control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like.
- The CPU of the control unit 21 reads the system program and various processing programs stored in the storage unit 22 in accordance with operation of the operation unit 23, expands them in the RAM, and executes various processes, including the imaging control process described later, according to the expanded programs, thereby centrally controlling the operation of each part of the imaging control device 2 and the operation of the imaging device 1.
- the storage unit 22 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
- the storage unit 22 stores various programs executed by the control unit 21 and data such as parameters necessary for execution of processing by the programs or processing results.
- the operation unit 23 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
- The operation unit 23 outputs instruction signals, input via keyboard key operations, mouse operations, or a touch panel, to the control unit 21.
- the display unit 24 is configured by a monitor such as a color LCD (Liquid Crystal Display), and displays an input instruction, data, and the like from the operation unit 23 in accordance with an instruction of a display signal input from the control unit 21.
- the communication unit 25 includes a LAN adapter, a modem, a TA (Terminal Adapter), and the like, and controls data transmission / reception with each device connected to the communication network NT.
- the image generation device 3 acquires the dynamic image transmitted from the imaging device 1 via the imaging control device 2, and displays an image for a doctor or the like to perform an interpretation diagnosis.
- The image generation device 3 includes a control unit 31, a storage unit 32, an operation unit 33, a display unit 34, and a communication unit 35, and each unit is connected by a bus 36.
- the control unit 31 includes a CPU, a RAM, and the like.
- The CPU of the control unit 31 reads the system program and various processing programs stored in the storage unit 32 in accordance with operation of the operation unit 33, expands them in the RAM, and executes various processes according to the expanded programs, thereby centrally controlling the operation of each part of the image generation device 3 (details will be described later).
- the storage unit 32 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
- the storage unit 32 stores various programs executed by the control unit 31 and data such as parameters necessary for execution of processing by the programs or processing results.
- the storage unit 32 stores an image generation processing program for executing an image generation process described later.
- These various programs are stored in the form of readable program codes, and the control unit 31 sequentially executes operations according to the program codes.
- the operation unit 33 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
- The operation unit 33 outputs instruction signals, input via keyboard key operations, mouse operations, or a touch panel, to the control unit 31.
- the display unit 34 is composed of a monitor such as a color LCD, and displays an input instruction from the operation unit 33, data, and a display image to be described later in accordance with an instruction of a display signal input from the control unit 31.
- the communication unit 35 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
- The electrocardiograph 4 includes a phase detection unit 41; in response to a control signal from the CPU of the control unit 21, the phase detection unit 41 detects the phase of the heartbeat (blood flow) of the subject M in synchronization with the imaging operation of the imaging device 1.
- the heartbeat is also treated as a kind of blood flow.
- the phase detector 41 can also be provided in the imaging control device 2.
- FIG. 2 illustrates a dynamic image obtained by radiographic imaging of the chest dynamics of the subject M accompanying breathing.
- FIG. 3 is a diagram illustrating a display image generated in the reference example.
- In the display image IG0 of FIG. 3, the frame image MI is displayed together with, on one side of it and in synchronization with it, a graph G1 indicating the position of the diaphragm, a graph G2 indicating the width of the rib cage, a graph G3 indicating respiratory information, and a graph G4 indicating heart rate information (blood flow information).
- The frame images M1 to M10 are obtained by continuously photographing one respiratory cycle (details will be described later) at a constant imaging timing.
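If, as assumed here purely for illustration, the ten frames evenly span one respiratory cycle, each frame can be assigned a phase fraction within that cycle (the even spacing is an assumption of this sketch, not something stated numerically in the text):

```python
# Illustrative mapping of frame index to respiratory phase for frames M1..M10,
# assuming they evenly span one respiratory cycle (assumption of this sketch).
num_frames = 10
phases = [i / num_frames for i in range(num_frames)]  # 0.0 = cycle start
print(phases)
```

A phase value like this is one simple way to label frames so that, for example, M7 to M9 can be located by phase rather than by scrubbing a progress bar.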
- the frame images M7, M8, and M9 in FIG. 2 are important frame images for diagnosis by a user who is an expert such as a doctor, and the region IR (see FIGS. 2 and 3) is the diagnosis target.
- the phase position of the graph G1 at this moment is indicated by a line LT (see FIG. 3).
- In the reference example, a desired frame image MI is searched for by operating the progress bar PB, which corresponds to the reproduction progress of the dynamic image, and that frame image MI is then observed in detail to make a diagnosis. Because the user must repeatedly shift their gaze between the detailed diagnostic data (here, the graph G1 and the line LT) and the moving image, the diagnostic efficiency is extremely poor.
- It is therefore desirable that the operator be able to easily select and operate the desired frame image MI, and that gaze movement be reduced by lessening the need to look at the diagnostic data, so that attention can remain focused on the dynamic image that is actually to be diagnosed. This would shorten the diagnosis time required for moving-image interpretation.
- Therefore, the image generation device 3 of the radiation dynamic image capturing system 100 uses state changes based on periodic temporal changes of the heart or lungs (the predetermined part) of the subject M as diagnosis support information, and generates a display image in which this information is easy to view along the time-axis direction, thereby reducing the work of searching for the desired frame image MI for diagnosis.
- the control unit 31 mainly includes a dynamic image acquisition unit 110, a detection unit 120, a diagnosis support information generation unit 150, and a display image generation unit 160.
- the display unit 34 includes a reproduction time adjustment unit 341 that displays the display image generated by the display image generation unit 160 and allows the user to refer to the display image by changing the reproduction time.
- the control unit 31 shown in FIGS. 4 and 5 is described as being realized by executing a preinstalled program, but it may also be realized with a dedicated hardware configuration.
- the dynamic image acquisition unit 110 acquires a dynamic image that captures a target part (predetermined part) of a human body or an animal photographed by the reading control device 14 of the imaging apparatus 1 in time order.
- the target region is assumed to be a heart region in FIG. 4 and a lung region in FIG.
- in practice, the imaging control device 2 is interposed, and the data is temporarily stored in the storage unit 22 of the imaging control device 2.
- the processed data is output to the communication unit 35 of the image generation device 3 via the communication unit 25.
- the detection unit 120 includes a predetermined part period specifying part 130, and detects a temporal change in the physical state of the target part (see FIGS. 4 and 5).
- the term “physical state” here refers to a geometric state such as the outer shape of the heart or lungs, and also includes the concentration (presence or absence) of blood flow.
- periodic changes in the heart or lungs of the subject M, that is, phase information and frequency (cycle) information of blood flow (including heartbeat) and respiration, are detected.
- the detection of a temporal change in this case means detection of a temporal change in a periodic physical state, such as the external shape of an organ or the blood flow concentration.
- the predetermined part period specifying unit 130 specifies a target part period (predetermined part period) that is a periodic time change of the physical state of the target part.
- the first blood flow information detection method and the first and second respiration information detection methods used in the present embodiment will be described as methods for calculating the phase information by blood flow and respiration.
- FIG. 7 is a diagram illustrating an electrocardiogram waveform of the subject M.
- the horizontal axis indicates time
- the vertical axis indicates the magnitude (voltage) of the electric signal
- curves showing changes in the electrical signal are shown, including the so-called P, Q, R, S, T, and U waves, whose characteristic points are denoted Pp, Qp, Rp, Sp, Tp, and Up.
- the predetermined part period specifying unit 130 analyzes the above points (Pp, Qp, Rp, Sp, Tp, and Up) based on the detection result acquired from the phase detection unit 41, thereby specifying the heartbeat (blood flow) cycle.
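To make the cycle specification concrete, the following sketch shows one plausible way to derive the heartbeat (blood flow) cycle from digitized electrocardiogram samples by locating R-wave peaks (the points Rp above) and averaging the R-R intervals. The function name, threshold, and refractory period are illustrative assumptions, not part of the patent:

```python
import numpy as np

def heartbeat_cycle(ecg, fs, threshold=0.6, refractory_s=0.3):
    """Estimate the heartbeat (blood flow) cycle from an ECG trace.

    ecg: 1-D signal (volts), fs: sampling rate in Hz.
    R peaks are taken as local maxima above `threshold` times the
    signal maximum, separated by at least `refractory_s` seconds.
    """
    min_gap = int(refractory_s * fs)
    level = threshold * np.max(ecg)
    peaks = []
    for i in range(1, len(ecg) - 1):
        if ecg[i] >= level and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    rr = np.diff(peaks) / fs  # R-R intervals in seconds
    return float(np.mean(rr)) if len(rr) else None
```

In practice the phase detection unit 41 supplies the detection result directly; the sketch only illustrates the kind of analysis involved.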
- the detection operation by the phase detection unit 41 is performed in synchronization with the imaging operation by the imaging device 1 (see FIG. 1).
- FIG. 4 shows the electrocardiograph 4 and the image generation device 3 as directly connected, but in reality the imaging control device 2 is interposed, and the detection data is stored in the storage unit 22 of the imaging control device 2.
- the detection data is output to the communication unit 35 of the image generation device 3 via the communication unit 25.
- the predetermined part period specifying unit 130 is configured so that the heartbeat (blood flow) period can be set from the outside, so that it is possible to automatically acquire a periodic time change of the target part.
- with the first blood flow information detection method, it is possible to obtain the information for generating the graph G4 indicating the blood flow information described above.
- First respiration information detection method (measurement by another apparatus): the first respiration information detection method is implemented by measurement with another apparatus.
- as a method of measuring with another device, for example, an apparatus as described in Japanese Patent No. 3793102 can be used.
- alternatively, a method implemented by monitoring with a sensor composed of a laser beam and a CCD camera can be used (for example, "Study on sleep monitoring of sleepers using FG visual sensor", Hiroto Aoki, Masato Nakajima, IEICE Society Conference Lectures 2001, Information and Systems Society Conference Proceedings, pp. 320-321, 2001-08-29).
- the cycle detection sensor 15 of the cycle detection device 16 can be used in the detection unit 120 (predetermined part period specifying unit 130).
- as other methods for detecting the respiratory cycle, it is also possible to apply a method of detecting the movement of the subject's chest using a respiration monitor belt, or a method of detecting the respiratory airflow using an anemometer.
- with the first respiration information detection method, it is possible to obtain the information for generating the graph G3 indicating the respiration information described above and the color bar C1 described later.
- the predetermined part cycle specifying unit 130 is configured to be able to set the respiratory cycle from the outside, so that it is possible to automatically acquire a periodic time change of the target part.
- Second respiration information detection method (area value or distance between feature points)
- the area value of the lung field is calculated using the captured images acquired by the dynamic image acquisition unit 110, and this serves as the respiration information.
- the lung field area can be obtained by extracting the contour of the lung field and counting the number of pixels in the region surrounded by the contour. Respiration information can likewise be obtained by detecting the position of the diaphragm and the width of the thorax.
- with the second respiration information detection method, it is possible to obtain the information for generating the graph G1 indicating the position of the diaphragm and the graph G2 indicating the width of the rib cage.
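As an illustration of the area-value computation described above, the following minimal sketch counts the pixels inside an already-extracted lung-field contour; the function and the binary-mask input are assumptions for illustration, since the patent does not specify an implementation:

```python
import numpy as np

def lung_field_area(mask, pixel_spacing_mm=1.0):
    """Lung-field area as the pixel count inside the extracted contour.

    mask: 2-D boolean array, True inside the lung-field contour OL.
    Returns the area in mm^2 given the detector pixel spacing.
    """
    return int(np.count_nonzero(mask)) * pixel_spacing_mm ** 2

# Applying this to each frame of the dynamic image yields the
# respiration waveform:
# areas = [lung_field_area(m) for m in masks_per_frame]
```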
- FIG. 13 is a schematic diagram illustrating the contour extraction of the lung field.
- the lung field may be extracted for each of the left and right sides, or may be extracted as an outline OL including the heart and spine regions.
- conventional techniques can be used for the contour extraction, for example, "Image feature analysis and computer-aided diagnosis: Accurate determination of ribcage boundary in chest radiographs", Xin-Wei Xu and Kunio Doi, Medical Physics, Volume 22 (1995), May, pp. 617-626.
- the predetermined part period specifying unit 130 extracts the contour OL of the lung field using the acquired captured image, and detects the number of pixels in the extracted region as the feature amount, that is, the area of the lung field.
- as another respiration information detection method, it is also possible to calculate the distance between feature points in the lung field region using the captured images acquired by the dynamic image acquisition unit 110 and use it as respiration information. That is, lung field extraction is performed in the same manner as described above, two feature points are obtained from the extracted region, and the feature amount is calculated as the distance between the two points.
- FIG. 14 is a diagram illustrating positions of feature points in the lung field region.
- FIG. 14A illustrates an example in which the lung apex is extracted as the upper end LT of the lung region, and the intersection of a straight line drawn from the lung apex in the body axis direction with the diaphragm is extracted as the lower end LB of the lung region.
- FIG. 14B illustrates an example in which the lung apex is extracted as the upper end LT of the lung region and the lateral angle is extracted as the lower end LB of the lung region.
- the predetermined part period specifying unit 130 extracts the contour OL of the lung field region using the acquired captured image, and obtains the distance between the feature points from the extracted region.
- the respiratory cycle is then specified by detecting the temporal change of this distance.
- since the predetermined part cycle specifying unit 130 detects the respiratory cycle based on the temporal change in the area value of the lung field region or in the distance between feature points (the change in the shape of the predetermined part) captured in the dynamic image, the respiratory cycle can be acquired automatically.
- the predetermined part cycle specifying unit 130 preferably detects the respiratory cycle using frequency analysis or the like based on the temporal change in the area value of the lung field region or in the distance between feature points (the change in the shape of the predetermined part) captured in the dynamic image. As a result, the desired fluctuation component from which the noise component has been removed can be extracted automatically, so that the temporal change in the area value of the lung field region or in the distance between feature points (the state in which the predetermined part changes over time) can be grasped more accurately.
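The frequency-analysis step can be sketched as follows: the per-frame area values (or feature-point distances) are transformed with an FFT, components outside an assumed quiet-breathing band are zeroed to remove noise, and the strongest in-band component gives the respiratory cycle. The band limits and function name are illustrative assumptions:

```python
import numpy as np

def dominant_cycle(values, fps, band_hz=(0.1, 0.5)):
    """Estimate the respiratory cycle by frequency analysis.

    values: per-frame lung-field area (or feature-point distance).
    fps: frame rate of the dynamic image.
    band_hz: assumed search band for quiet breathing.
    Returns (cycle_seconds, denoised_signal).
    """
    x = np.asarray(values, dtype=float) - np.mean(values)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    in_band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    # strongest component inside the band gives the cycle ...
    peak = np.argmax(np.where(in_band, np.abs(spec), 0.0))
    # ... and zeroing everything outside the band removes noise
    clean = np.fft.irfft(np.where(in_band, spec, 0.0), n=len(x)) + np.mean(values)
    return 1.0 / freqs[peak], clean
```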
- the diagnosis support information generation unit 150 performs analysis based on the temporal change in the physical state of the target site, such as the heart or lungs, detected by the above-described detection unit 120 (predetermined site period specifying unit 130), and generates the analysis result as diagnosis support information.
- the diagnosis support information includes a first analysis result and a second analysis result based on the analysis.
- the first analysis result indicates expiration and the second analysis result indicates inspiration.
- the diagnosis support information may include a third analysis result in addition to the first analysis result and the second analysis result.
- the third analysis result includes a respiratory stop state.
- the analysis results may be held as metadata of the dynamic image; that is, the diagnosis support information generated in the diagnosis support information generating unit 150 may be temporally associated with the dynamic image (frame images MI) as shown in FIG. 28 and retained in the holding unit 32, or it may be managed as a separate database. This is desirable because measurements and image feature quantities then need not be recalculated during diagnosis. Further, only important state change information that may be displayed as diagnosis support information may be held.
- diagnosis support information in respiratory information and blood flow information will be described.
- one respiratory cycle B consists of one expiration and one inspiration.
- during inspiration, the area of the lung field in the thorax increases as the diaphragm lowers and air is breathed in.
- the maximum inhalation time B1 is when the maximum amount of breath is inhaled (the conversion point between inspiration and expiration).
- during expiration, the area of the lung field decreases as the diaphragm rises and air is exhaled; the point of maximum exhalation (the conversion point between expiration and inspiration) is the maximum exhalation time B2 (see FIG. 21).
- FIG. 15 is a diagram schematically illustrating the generation of the respiratory phase diagnosis support information with respect to the respiratory information cycle specified by the predetermined part cycle specifying unit 130.
- the first analysis result indicates expiration and the second analysis result indicates inspiration; by associating the respiratory phase with the temporal change, diagnosis support information in which expiration and inspiration can be identified within the respiratory cycle is generated.
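A minimal way to generate such expiration/inspiration labels from the lung-field area waveform is to use the sign of the frame-to-frame change, since the area increases during inspiration and decreases during expiration. The labeling scheme below is an illustrative assumption, not the patent's implementation:

```python
import numpy as np

def respiratory_phases(area_per_frame):
    """Label each frame of the dynamic image with an analysis result.

    'EX' = first analysis result (expiration, area decreasing),
    'IN' = second analysis result (inspiration, area increasing).
    """
    diff = np.diff(area_per_frame)
    labels = np.where(diff < 0, 'EX', 'IN').tolist()
    return labels + [labels[-1]]  # last frame keeps the previous label
```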
- display is performed using diagnosis support information in respiratory information, but display is also possible using diagnosis support information in blood flow information (see FIG. 27).
- as for the diagnosis support information in blood flow information, in the blood flow information shown in FIG. 11 the first analysis result indicates the presence of blood flow and the second analysis result indicates its absence; by associating the blood flow phase with the temporal change, diagnosis support information in which the two can be identified within the blood flow cycle is generated.
- when the diagnosis contents are predetermined, it is efficient to fix the diagnosis support information accordingly.
- for a patient suspected of having a pulmonary embolism, a pulmonary blood flow phase effective for diagnosing pulmonary embolism may be employed.
- for patients with suspected respiratory abnormalities, in addition to adopting a respiratory phase effective for respiratory diagnosis, diagnosis support information covering multiple state changes may be employed if several abnormal patterns of the respiratory system can be analyzed.
- the display image generation unit 160 generates a display image for displaying the frame image MI (dynamic image) and diagnosis support information. That is, the display image is generated by associating the phase change of the target region with the temporally corresponding frame image MI.
- the display image generation unit 160 generates a display image provided with a dynamic image display portion 161 that displays the dynamic image, a list display portion 162 that displays the first and second analysis results of the diagnosis support information so that they can be identified and listed in the time axis direction, and a reproduction time display portion 163 that displays reproduction time information corresponding to the display of the dynamic image display portion 161 (see FIGS. 4 and 5). That is, the display image includes the list display portion 162 and an index indicating a specific position of the list display portion 162 in the time axis direction. The display image generation unit 160 then generates the display image so that the dynamic image at the time corresponding to the specific position indicated by the index is displayed on the dynamic image display portion 161.
- the dynamic image (frame image) displayed on the dynamic image display portion 161 and the specific position in the time axis direction of the list display portion 162 indicated by the index are in correspondence; even while the dynamic image is played back as a moving image on the dynamic image display portion 161, the diagnosis support information on the list display portion 162 remains a still image, and only the index of the list display portion 162 moves in the time axis direction.
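The correspondence between the index position on the list display portion 162 and the frame shown on the dynamic image display portion 161 can be sketched as a simple linear mapping; the pixel-based interface below is a hypothetical illustration, not part of the patent:

```python
def frame_for_index(index_px, bar_width_px, n_frames):
    """Map the index position on the list display portion 162 to the
    frame shown on the dynamic image display portion 161.

    The bar spans the whole dynamic image in the time axis direction,
    so the pixel offset is scaled linearly to a frame number.
    """
    frac = index_px / float(bar_width_px)
    return min(n_frames - 1, int(frac * n_frames))
```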
- the dynamic image display portion 161 is rectangular, and the list display portion 162 is an elongated region along one side of the dynamic image display portion 161, whose longitudinal direction corresponds to the time axis direction of the diagnosis support information.
- the detection unit 120 detects the temporal change in the physical state of each of a plurality of target sites, the diagnosis support information generation unit 150 performs analysis based on the temporal change in the physical state of each target site and generates the analysis results for the plurality of target sites as a plurality of pieces of diagnosis support information, and the list display portion 162 displays the plurality of pieces of diagnosis support information.
- FIG. 16 is an example in which the display image IG generated by the display image generation unit 160 is displayed on the display screen of the display unit 34.
- the display image IG displays in parallel a frame image MI taken of the subject M, the graphs G1 to G4 common to FIG. 3, a color bar C1 corresponding to the respiratory phase of the lung field, a progress bar PB, and a reproduction time display part TM.
- the graph G3 indicating the respiratory information and the color bar C1 are information obtained from the first respiratory information detection method
- the graph G1 indicating the position of the diaphragm and the graph G2 indicating the width of the thorax are information obtained from the second respiration information detection method.
- the graph G4 showing the blood flow information is information obtained from the first blood flow information detection method.
- the portion for displaying the frame image MI corresponds to the dynamic image display portion 161
- the portion of the color bar C1 corresponds to the list display portion 162
- the reproduction time display part TM corresponds to the reproduction time display portion 163.
- the progress bar PB and the color bar C1 are displayed in an integrated manner.
- the list display portion 162 may display the first and second analysis results, such as exhalation and inhalation, in different colors (for example, a simple two-color display) or in shades (so-called gradation).
- as methods of expressing a difference in state change, differences in luminance, hue, saturation, and the like can be used, and a plurality of state changes may be expressed with combinations such as luminance and hue, or the R-G and B-Y color axes.
- in this way the degree of the phase can be expressed more clearly and grasped finely as necessary; referring to the display image IG therefore makes the diagnostic content of the target part easy to recognize visually, further improving diagnostic efficiency.
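As one possible realization of such a gradation, the sketch below maps a normalized respiratory phase to an RGB entry of the color bar, shading between two hues so that the degree of the phase is visible. The specific colors and normalization are illustrative assumptions:

```python
def phase_to_rgb(phase):
    """Map a normalized respiratory phase to a color-bar entry.

    phase: 0.0 at maximum exhalation, 1.0 at maximum inhalation.
    Inhalation is shaded toward blue, exhalation toward red, with a
    gradation expressing the degree of the phase in between.
    """
    level = int(round(255 * phase))
    return (255 - level, 0, level)  # (R, G, B)

# bar = [phase_to_rgb(p) for p in phases]  # one entry per frame
```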
- the reproduction time display portion 163 (that is, the reproduction time display part TM) adopts a display form in which it is placed close to the list display portion 162 (that is, the color bar C1) so that the two can be recognized together. This makes it possible to visually grasp both the diagnosis support information and the reproduction time during playback.
- the list display portion 162 may be provided with a reproduction time adjustment interface portion (corresponding to the progress bar PB) for adjusting the reproduction time of the dynamic image displayed on the dynamic image display portion 161 (see FIGS. 4, 5, and 16).
- FIG. 18 is a flowchart for explaining a basic operation realized in the image generation apparatus 3 according to this embodiment. Since the individual functions of each unit have already been described (see FIGS. 4 and 5), only the overall flow will be described here.
- in step S1, the dynamic image acquisition unit 110 of the control unit 31 acquires the dynamic image captured by the reading control device 14 of the imaging device 1 via the imaging control device 2.
- in step S2, the detection unit 120 detects the temporal changes in physical states such as those of the heart and lungs, and the predetermined part cycle specifying unit 130 specifies cycles such as those of blood flow and respiration. Specifically, the temporal change of the blood flow information is detected by the detection unit 120 (predetermined part period specifying unit 130) from the result acquired from the phase detection unit 41 of the electrocardiograph 4 (first blood flow information detection method; see FIG. 4). The temporal change of the respiration information is detected either from the result acquired from the cycle detection sensor 15 (first respiration information detection method; see FIG. 5) or from the image feature amount of the frame images MI in the dynamic image (second respiration information detection method; see FIGS. 7 to 14).
- in step S3, the diagnosis support information generation unit 150 performs analysis based on the temporal change of the physical state of the heart and lungs acquired in step S2, and generates diagnosis support information in which the analysis results are associated with the temporal change (see FIGS. 11 and 15). The diagnosis support information is then temporally associated with the dynamic image and held in the holding unit 32.
- step S4 the display image generation unit 160 generates a display image IG for displaying the frame image MI (dynamic image) and diagnosis support information held in step S3 (see FIG. 16).
- in step S5, the display image generation unit 160 outputs the display image IG generated in step S4 to the display unit 34, thereby displaying it on the monitor of the display unit 34, and the operation flow ends.
- as described above, the image generation apparatus 3 acquires a dynamic image capturing, in time order, a target part such as the heart or a lung of a human body or animal, detects the temporal change in the physical state of the target part, performs analysis based on that temporal change, and generates the analysis results as diagnosis support information.
- the diagnosis support information includes a first analysis result and a second analysis result based on the analysis.
- the display image includes the dynamic image display portion 161 that displays the dynamic image and the list display portion 162 (the color bar C1 in FIG. 16).
- Second Embodiment: the second embodiment is described below. The difference from the first embodiment is the display image generated by the display image generation unit 160; the remaining configuration is the same as that of the image generation apparatus 3 of the first embodiment.
- FIG. 17A is a diagram showing a display image IG in the second embodiment, which is an example displayed on the display screen of the display unit 34.
- the display image IG displays in parallel a frame image MI taken of the subject M, the graphs G1 to G4 common to FIG. 3, and, as in FIG. 16, a color bar C1 corresponding to the respiratory phase of the lung field, a progress bar PB, a reproduction time display part TM, a waveform graph F showing the phase change, and a reproduction time adjustment unit 341.
- the color bar C1 and the waveform graph F are information obtained from the first respiratory information detection method, and the remaining graphic elements are the same as those in the first embodiment.
- in the display image IG of the second embodiment, the waveform graph F showing the temporal change in the physical state of the target part is displayed integrated on the color bar C1.
- the list display portion 162 is provided with a playback time adjustment interface portion (corresponding to the progress bar PB) for adjusting the playback time of the dynamic image display portion 161, and the display unit 34 is provided with a reproduction time adjustment unit 341 that allows the user, via the operation unit 35, to refer to the display image IG while changing the reproduction time using this interface (see FIG. 17A). As a result, the user can access the desired reproduction time of the display image IG by changing the reproduction time with the progress bar PB.
- FIG. 17B is a diagram showing a display image IG in the third embodiment, which is an example displayed on the display screen of the display unit 34.
- the display image IG displays in parallel a frame image MI taken of the subject M, the graphs G1 to G4 common to FIG. 3, and, as in FIGS. 16 and 17A, a progress bar PB, a reproduction time display part TM, a reproduction time adjustment unit 341, a color bar C1 corresponding to the respiratory phase of the right lung field, and a color bar C2 corresponding to the respiratory phase of the left lung field.
- the color bars C1 and C2 are information obtained by separately detecting the right lung field and the left lung field with the second respiration information detection method; the remaining graphic elements are the same as in the first embodiment.
- unlike the display image IG0 of the reference example shown in FIG. 3, the display image IG in the third embodiment displays the progress bar PB and the color bars C1 and C2 in an integrated manner.
- the diagnosis support information may, for example, be information in which each of a plurality of analysis results, such as those for the left and right lung fields, is associated in time, and the color bars C1 and C2 may be displayed close to each other. As a result, the respiratory phases of the left and right lungs can be shown simultaneously, locations where the state changes of the left and right lungs differ are clearly shown, and abnormal points become easy to understand, which simplifies the frame selection operation on the moving image.
- when the diagnosis support information is information indicating the temporal changes of a plurality of analysis results corresponding to a plurality of parts, referring to the display image IG makes it possible to view the plurality of analysis results simultaneously.
- when the plurality of parts are the left lung field and the right lung field, the analysis results of both lung fields can be viewed simultaneously by referring to the display image IG.
- the reproduction time display portion 163 (that is, the reproduction time display part TM) adopts a display form in which it is placed close to the list display portion 162 (that is, the color bars C1 and C2) so that the two can be recognized together. This makes it possible to visually grasp both the diagnosis support information and the reproduction time during playback.
- FIG. 19 is a diagram showing a functional configuration of the control unit 31A used in the image generating apparatus 3A configured as the fourth embodiment of the present invention.
- the control unit 31A is used as an alternative to the control unit 31 (see FIG. 4) in the image generation device 3 of the first embodiment.
- the difference from the first embodiment is that the detection unit 120A further includes a feature point calculation unit 140, and the remaining configuration is the same as that of the image generation device 3 of the first embodiment.
- the feature point calculation unit 140 calculates feature points in the temporal change of the physical state of the target part. Note that the diagnosis support information generation unit 150A and the display image generation unit 160A generate diagnosis support information including information indicating the feature points.
- FIG. 20 is a schematic diagram showing, in time series, the waveform data of the blood flow (heart rate) information detected by the detection unit 120A, and FIG. 21 is a schematic diagram showing, in time series, the waveform data of the respiration information detected by the detection unit 120A.
- FIG. 20 shows the result of monitoring the electrocardiographic waveform in the time direction when the first blood flow information detection method is adopted, and FIG. 21 illustrates the result of calculating the temporal change of the area value of the lung field region or the distance between feature points when the second respiration information detection method is adopted.
- the feature points of the blood flow (heart rate) information include the maximum points P1 and P4 (corresponding to point Rp in FIG. 7), the minimum points P2 and P5 (corresponding to point Sp in FIG. 7), and the local maximum points P3 and P6 (corresponding to point Tp in FIG. 7).
- the feature points of the respiration information include the maximum points B1 and B3, the minimum point B2, and the like.
- the feature point calculation unit 140 may be provided with setting conditions so as to calculate change points such as, for example, maximum points, minimum points, or the local maximum and minimum points of the first or second derivative.
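A minimal sketch of such a feature point calculation, assuming the waveform is available as a list of samples, is to scan for local maxima and minima (sign changes of the first difference); derivative-based conditions could be added in the same way:

```python
def feature_points(waveform):
    """Calculate feature points (local maxima and minima) in the
    waveform of a temporal state change, as in FIGS. 20 and 21.

    Returns two lists of sample indices: (maxima, minima).
    """
    maxima, minima = [], []
    for i in range(1, len(waveform) - 1):
        if waveform[i - 1] < waveform[i] >= waveform[i + 1]:
            maxima.append(i)
        elif waveform[i - 1] > waveform[i] <= waveform[i + 1]:
            minima.append(i)
    return maxima, minima
```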
- the diagnosis support information generation unit 150 generates the display image IG so that the feature points calculated by the feature point calculation unit 140 are superimposed on the color bar C1 (C2) described above. That is, for the blood flow (heart rate) information, as shown in FIG. 20, the maximum points P1 and P4 are displayed as lines LP1 and LP4, the minimum points P2 and P5 as lines LP2 and LP5, and the local maximum points P3 and P6 as lines LP3 and LP6. Similarly, as shown in FIG. 21, the maximum points B1 and B3 are displayed as lines LB1 and LB3, and the minimum point B2 as a line LB2.
- in FIG. 20, the color bar C1 (C2) is shown blank in order to make the lines LP1 to LP6 clear, but in reality the "systole" and "diastole" of the heart (the "state" change of the target site) are displayed so as to be visually identifiable.
- the important diagnostic points can be easily seen and the diagnostic efficiency is further improved.
- when the diagnosis support information indicates the temporal changes of a plurality of analysis results corresponding to a plurality of parts, for example when the respiratory phases of the left and right lung fields are shown separately as in FIG. 17B, superimposing the lines indicating the above feature points makes it clear, and thus useful, when abnormal points (feature points) corresponding to the temporal state changes of the left and right lung fields appear at different locations.
- it is also possible to display, on the color bar C1 (C2), the frame images MI diagnosed as suspected of an abnormal state change so that they are identifiable by color display or the like. Furthermore, displaying feature points on the color bar C1 (C2) so that they are identifiable by color display or the like, even in situations where the reliability of the first and second analysis results (such as expiration or inspiration) is low, makes abnormalities easy to find.
- FIG. 22 is a diagram illustrating an operation flow of the image generation apparatus 3A according to the fourth embodiment.
- steps ST1, ST2, ST4, ST5, and ST6 are the same as steps S1 to S5 in FIG. 18.
- the following step is added because of the feature point calculation unit 140, which did not exist in the first embodiment.
- after steps ST1 and ST2, as shown in FIG. 22, in step ST3 the feature point calculation unit 140 in the detection unit 120A calculates the feature points determined under the setting conditions in the temporal change of the target region detected in step ST2 (see FIGS. 20 and 21).
- in step ST6, the display image generation unit 160A displays, on the monitor of the display unit 34, the display image IG including the information indicating the feature points generated in step ST5, and this operation flow ends.
- the diagnosis support information includes information indicating the feature point, and thus the feature point in the time change of the target part becomes clear, and the diagnosis efficiency is further improved.
- FIG. 23 is a diagram showing a functional configuration of the control unit 31B used in the image generating apparatus 3B configured as the fifth embodiment of the present invention.
- the control unit 31B and the display unit 34B are respectively used as an alternative to the control unit 31 (31A) (see FIGS. 4 and 19) in the image generation apparatus 3 (3A) of the first (fourth) embodiment.
- the difference from the first (fourth) embodiment is that the detection unit 120B further includes a notification point calculation unit 145, and the image generation device 3B further includes a notification unit 342.
- the remaining configuration is the same as that of the image generating apparatus 3A of the fourth embodiment.
- the detection unit 120B in FIG. 23 is configured to include the feature point calculation unit 140 according to the fourth embodiment, but may not include the feature point calculation unit 140.
- The notification point calculation unit 145 calculates a point for notification (hereinafter referred to as a "notification point") determined under the setting conditions desired by the user in the temporal change of the target part, and outputs it to the notification unit 342. The setting condition is a condition specified by the user: for example, when the temporal change of the physical state of the target part is the respiratory information shown in FIG. 21 and the user specifies that the maximum points be notified, the notification point calculation unit 145 detects points P3 and P6 as notification points. The diagnosis support information generation unit 150B and the display image generation unit 160B generate the diagnosis support information so that it includes information indicating the notification points.
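As a sketch of the kind of rule the notification point calculation unit 145 might apply, the following snippet marks local maxima such as P3 and P6 in a sampled respiration waveform (the function name and the three-point test are illustrative, not from the patent):

```python
def find_maximum_points(signal):
    """Return indices of local maxima in a sampled waveform.

    A frame i counts as a maximum when its value exceeds both
    neighbours -- a minimal stand-in for a "notification point
    determined under the set condition".
    """
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] < signal[i] > signal[i + 1]]

# Example: a respiration-like waveform with two peaks.
resp = [0.0, 0.5, 1.0, 0.5, 0.0, 0.6, 1.2, 0.6, 0.1]
print(find_maximum_points(resp))  # -> [2, 6]
```

A real implementation would also smooth the waveform before the peak search to avoid reporting noise as notification points.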
- The notification unit 342 notifies the user that the setting condition (predetermined condition) is satisfied when the analysis result from the diagnosis support information generation unit 150 satisfies it. That is, the notification unit 342 notifies the user of the notification point detected by the notification point calculation unit 145 by visual, auditory, or tactile means.
- For visual information, the notification unit 342 instructs the display unit 34B to display it.
- The visual information is a visual representation of the time remaining from the current moment until the notification point, such as an indicator, a progress bar, a numerical display, a model diagram, or a periodic diagram.
- For auditory information, a buzzer, a timing sound, a voice, or the like may be used: for example, a synthesized voice may announce the notification point a few seconds in advance, or a buzzer may sound at the notification point.
- This allows the operation to be performed without looking at the progress bar PB at all.
- Since the user receives the notification information during the rewinding operation of the frame images MI according to the elapsed time, the user can reach and select a frame image MI useful for diagnosis while keeping their gaze on the moving image.
- FIG. 24 is a diagram illustrating the operation flow of the image generation device 3B according to the fifth embodiment. It is assumed here that the detection unit 120B does not include the feature point calculation unit 140. In FIG. 24, steps SP1, SP2, SP4, SP5 and SP6 are the same as steps S1 to S5 in FIG. 18, so their description is omitted.
- The following steps are added by the notification point calculation unit 145 and the notification unit 342, which did not exist in the first embodiment.
- In step SP3, the notification point calculation unit 145 in the detection unit 120B calculates, from the temporal change detected in step SP2, the notification point determined under the set condition.
- In step SP6, the display image generation unit 160B generates the display image IG taking the timing of the notification point into account, based on the diagnosis support information generated in step SP5; the display image IG is output to the display unit 34B and displayed on its monitor (when the timing of the notification point is notified by auditory or tactile information, the output is made by sound or touch), and this operation flow ends.
- With the image generation device 3B, when the analysis result of the target part satisfies a desired setting condition, the user is notified that the condition is satisfied, which enables even a doctor with little diagnostic experience to recognize the corresponding diagnostic content.
- FIG. 6 is a diagram showing a functional configuration of the control unit 31 ′ used in the image generating device 3 ′ in the dynamic radiographic imaging system 100 ′ configured as the sixth embodiment of the present invention.
- This control unit 31 ′ is used as an alternative to the control unit 31 (FIG. 4) in the system 100 of the first embodiment.
- the difference from the first embodiment is that the detection method of blood flow information in the detection unit 120 ′ is different.
- the remaining configuration is the same as that of the image generation apparatus 3 of the first embodiment.
- The target part in this embodiment is the heart region or the lung region. The second blood flow information detection method used in this embodiment is described below.
- Second blood flow information detection method: heart wall motion amount
- In the second blood flow information detection method, the detection unit 120′ (predetermined part period specifying unit 130′) obtains heartbeat (blood flow) information by calculating the amount of heart wall motion from the captured images acquired by the dynamic image acquisition unit 110. The precondition is that the heart is captured together with the lungs, the imaging target part, in both the lung dynamic image during breathing and the lung dynamic image during breath-holding.
- Specifically, by detecting the fluctuation of the heart wall from the breathing and breath-holding lung dynamic images, the phase of the heartbeat at the timing when each breathing frame image and each breath-holding frame image was captured is detected; the heart wall is thus detected as the phase of the heartbeat.
- FIG. 8 shows, for each breathing frame image and each breath-holding frame image, a coordinate system in which a predetermined point (for example, the upper-left point) is the reference point (for example, the origin), the rightward direction is the X-axis direction, and the downward direction is the Y-axis direction.
- FIG. 9 is a schematic diagram illustrating the fluctuation of the heart wall captured in the pulmonary dynamic image during respiratory arrest.
- Here, the fluctuation of the lateral width of the heart is employed as the heart wall motion.
- FIGS. 9A to 9C illustrate the lateral width of the heart, taken as the heart wall motion, increasing from w1 to w3 as the heart expands.
- The predetermined part period specifying unit 130′ specifies the heartbeat (blood flow) cycle by detecting the lateral width of the heart in each breathing frame image and each breath-holding frame image.
- As a method for detecting the lateral width of the heart, detecting the outline of the heart can be cited, for example, and various known methods can be employed for the outline detection, such as matching feature points in the X-ray image against a model indicating the shape of the heart (for example, "Image feature analysis and computer-aided diagnosis in digital radiography: Automated analysis of sizes of heart and lung in chest images", Nobuyuki Nakamori et al., Medical Physics, Volume 17, Issue 3, May 1990, pp. 342-350).
- FIG. 10 is a schematic view illustrating the relationship between the capture time and the lateral width of the heart for the plurality of frame images constituting a lung dynamic image.
- The horizontal axis indicates time, the vertical axis indicates the lateral width of the heart, and each circle marks a detected heart-width value.
- Let the lateral width of the heart captured at time t be Hwt, and the lateral width captured at time t+1 be Hwt+1. If (Hwt+1 − Hwt) ≥ 0 holds, the frame image captured at time t is classified as taken while the heart is dilating; if (Hwt+1 − Hwt) < 0 holds, the frame image captured at time t is classified as taken while the heart is contracting.
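This classification rule can be sketched as follows (a minimal illustration; the function name and sample widths are invented for the example):

```python
def classify_cardiac_phase(widths):
    """Label each frame by the sign of the heart-width difference.

    widths[t] is the lateral heart width Hw_t measured in frame t.
    Frame t is labelled 'dilation' when Hw_{t+1} - Hw_t >= 0 and
    'contraction' otherwise; the last frame has no successor and
    is skipped.
    """
    return ['dilation' if widths[t + 1] - widths[t] >= 0 else 'contraction'
            for t in range(len(widths) - 1)]

# Heart widths (in pixels) over six consecutive frames.
print(classify_cardiac_phase([30, 32, 35, 33, 31, 32]))
# -> ['dilation', 'dilation', 'contraction', 'contraction', 'dilation']
```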
- Thus, since the predetermined part period specifying unit 130′ detects the heartbeat (blood flow) period based on the heart wall motion (the change in shape of the predetermined part) captured in the dynamic image, the period can be acquired automatically.
- Alternatively, the predetermined part period specifying unit 130′ may detect the blood flow period using frequency analysis or the like based on the heart wall motion (the change in shape of the predetermined part) captured in the dynamic image. Since the desired fluctuation component, with the noise component removed, can then be extracted automatically, the amount of heart wall motion (the state in which the predetermined part changes with time) can be grasped more accurately.
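A minimal sketch of such a frequency analysis, assuming a plain FFT peak search stands in for whatever analysis the patent intends (the sampling rate and the synthetic heart-width trace are invented for the example):

```python
import numpy as np

def dominant_period(signal, fps):
    """Estimate the period (in seconds) of the strongest fluctuation
    in a heart-width sequence via the discrete Fourier transform."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                # drop the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    peak_freq = freqs[1:][np.argmax(spectrum[1:])]  # skip the 0 Hz bin
    return 1.0 / peak_freq

# Synthetic heart-width trace: a 1.25 Hz beat sampled at 15 frames/s.
fps = 15
t = np.arange(120) / fps
widths = 35 + 2 * np.sin(2 * np.pi * 1.25 * t)
print(round(dominant_period(widths, fps), 2))  # -> 0.8
```

Restricting the peak search to a plausible heart-rate band would make the estimate robust against residual respiratory components.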
- a seventh embodiment will be described.
- the difference from the first embodiment is that the blood flow information detection method in the detection unit 120 is different.
- the remaining configuration is the same as that of the image generation apparatus 3 of the first embodiment.
- The blood flow information detection method of the seventh embodiment differs from that of the sixth embodiment; however, as shown in FIG. 6, they have in common that the blood flow information (the temporal change of the physical state of the target part) is detected based on the dynamic image. The third blood flow information detection method used in this embodiment is described below.
- Third blood flow information detection method: blood flow phase analysis
- blood flow phase analysis is performed using a captured image acquired by the dynamic image acquisition unit 110 to obtain blood flow information.
- the blood flow phase is phase information indicating the presence or absence of blood flow according to the position where the blood flow is flowing.
- For the blood flow phase analysis processing (blood flow information generation processing), for example, the method of Japanese Patent Application No. 2011-115601 (filing date: May 24, 2011), filed by the present applicant, can be employed.
- FIG. 11 is a diagram illustrating an analysis result of a spatiotemporal change associated with the presence or absence of blood flow in the entire lung.
- Lung blood vessels dilate when blood is rapidly ejected from the right ventricle through the pulmonary artery by the contraction of the heart, and this dilation is extracted by analyzing the dynamic image and output as diagnosis support information regarding the presence or absence of blood flow in the entire lung. That is, when a blood vessel dilates in the lung field, the amount of radiation transmitted through the region where the pulmonary blood vessel has dilated becomes relatively smaller than the amount transmitted through the rest of the lung field (alveolar region), so the output signal value of the radiation detection unit 13 decreases there.
- Between the series of frame images MI constituting the dynamic image, pixels of the radiation detection unit 13, or small regions (pixel blocks) composed of a plurality of pixels, are associated with one another; for each pixel or small region, the frame image MI having the lowest signal value is found, and the corresponding region of that frame image MI is colored as a signal indicating the timing at which the pulmonary blood vessels are dilated by the blood flow.
- a series of frame images MI after coloring are sequentially displayed on the display unit 34, so that a doctor or the like can visually recognize the blood flow state. Note that the white portion shown in FIG. 11 is actually colored in red or the like.
- The signal indicating the timing at which a pulmonary blood vessel is dilated by blood flow (referred to as the blood flow signal) can be obtained by finding the minimum values of the waveform (referred to as the output signal waveform) that represents the temporal change of the signal value of each pixel (small region).
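The minimum-value search can be sketched as follows (illustrative only; the function name and sample values are not from the patent):

```python
def blood_flow_timings(signal):
    """Return frame indices where a per-region signal value is a
    local minimum -- candidate moments of vessel dilation, since
    dilated vessels transmit less radiation and lower the signal."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] > signal[i] < signal[i + 1]]

# Output signal of one small region over nine frames: dips mark
# the timings at which the pulmonary vessel dilates.
values = [100, 98, 95, 97, 100, 99, 94, 96, 101]
print(blood_flow_timings(values))  # -> [2, 6]
```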
- This blood flow signal appears at the same interval as the heartbeat cycle; however, if there is an abnormality such as an arrhythmia, a minimum value may appear at an interval different from the heartbeat cycle regardless of blood vessel dilation. Therefore, in the third blood flow information detection method, the blood flow signal is extracted accurately by obtaining the correlation coefficient between the pulsation signal waveform indicating the heartbeat and the output signal waveform of each small region.
- FIG. 12 is a diagram illustrating a waveform showing a temporal change in a blood flow signal value in a pulmonary blood vessel region.
- FIG. 12A shows the position of the pulmonary blood vessel region IR2 corresponding to the diagnosis target region in a series of frame images MI acquired sequentially in time.
- Also shown is a graph in which the signal value (representative value) of the pulmonary vascular region IR2 of each frame image MI is plotted in a coordinate space whose horizontal axis is the elapsed time (frame number) from the start frame and whose vertical axis is the signal value (representative value) in the pulmonary vascular region IR2.
- Since the phases of respiration and blood flow are mixed in this signal, their mutual effects are eliminated by the following filtering process: low-frequency signal changes due to breathing and the like are removed, and the temporal change of the signal value due to blood flow is extracted. For example, the time change of the signal value of each small region is high-pass filtered with a low-frequency cutoff of 0.7 Hz for quiet-breathing image groups and 0.5 Hz for deep-breathing image groups. Alternatively, filtering may be performed with a band-pass filter that additionally cuts off high frequencies at a cutoff of 2.5 Hz in order to remove higher-frequency noise components.
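A sketch of this filtering step, assuming a SciPy Butterworth band-pass stands in for the unspecified filter design (the filter order is an arbitrary choice; the 0.7 Hz and 2.5 Hz cutoffs are the values quoted above, and the test signal is synthetic):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_blood_flow_component(signal, fps, low_cut=0.7, high_cut=2.5):
    """Band-pass a per-region signal: remove slow respiratory drift
    below low_cut and high-frequency noise above high_cut (in Hz)."""
    nyq = fps / 2.0
    b, a = butter(3, [low_cut / nyq, high_cut / nyq], btype='band')
    return filtfilt(b, a, signal)  # zero-phase, so timings stay aligned

# 0.3 Hz "respiration" plus a 1.2 Hz "blood flow" ripple at 15 fps:
fps = 15
t = np.arange(300) / fps
mixed = 5 * np.sin(2 * np.pi * 0.3 * t) + np.sin(2 * np.pi * 1.2 * t)
flow = extract_blood_flow_component(mixed, fps)
# Away from the edges, only the 1.2 Hz component survives.
```

Zero-phase filtering (filtfilt) matters here: a causal filter would shift the signal minima and therefore the detected dilation timings.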
- Specifically, the systolic time and the diastolic (relaxation) time of the heart are calculated from the signal changes in the heart region across the series of frame images MI (see FIG. 12A and the second blood flow information detection method).
- A value obtained by multiplying the reciprocal of the diastolic time by a predetermined coefficient is set as the low-frequency cutoff of the high-pass or band-pass filter; in the case of a band-pass filter, a value obtained by multiplying the reciprocal of the systolic time by a predetermined coefficient is set as the high-frequency cutoff.
- For the low-frequency cutoff, taking the frequency component due to respiration into account, the position of the diaphragm, the area value of the lung field region, or the distance between feature points is analyzed from the series of frame images MI (see the second respiratory information detection method described later); the frame images MI at the resting expiratory position and the resting inspiratory position are detected, and the inspiratory period is determined from the number of frames between the resting expiratory frame and the next resting inspiratory frame.
- A value obtained by multiplying the average of the reciprocal of the inspiratory period and the reciprocal of the diastolic time by a predetermined coefficient may then be set as the low-frequency cutoff frequency.
- It is preferable to limit the automatically set cutoff frequencies to a range of 0.2 to 1.0 Hz for the low-frequency cutoff and to 2.0 Hz or more for the high-frequency cutoff.
- Alternatively, vital signs such as the resting respiratory rate and pulse rate measured separately (see the first blood flow information detection method above and the first respiratory information detection method described later) may be input as patient information, and the cutoff frequencies may be calculated from these values.
- For example, the respiratory rate per minute input as patient information may be converted into a respiratory rate per second and multiplied by a predetermined coefficient to give the low-frequency cutoff, and the input pulse rate per minute may be converted into a pulse rate per second and multiplied by a predetermined coefficient to give the high-frequency cutoff. Further, a value obtained by multiplying the average of the per-second respiratory rate and the per-second heart rate by a predetermined coefficient may be set as the low-frequency cutoff.
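The vital-sign-based cutoff calculation can be sketched as follows (the coefficient values are illustrative assumptions; the clamping ranges are the 0.2-1.0 Hz and 2.0 Hz limits stated above):

```python
def cutoff_frequencies(resp_per_min, pulse_per_min,
                       low_coef=1.5, high_coef=2.0):
    """Derive filter cutoff frequencies (Hz) from vital signs entered
    as patient information: per-minute rates are converted to
    per-second rates and scaled by predetermined coefficients (the
    coefficient values here are invented), then clamped to the
    preferred ranges given in the text."""
    low = (resp_per_min / 60.0) * low_coef
    high = (pulse_per_min / 60.0) * high_coef
    low = min(max(low, 0.2), 1.0)   # low cutoff limited to 0.2-1.0 Hz
    high = max(high, 2.0)           # high cutoff kept at 2.0 Hz or more
    return low, high

# 18 breaths/min and 72 beats/min at rest:
low, high = cutoff_frequencies(18, 72)
print(round(low, 2), round(high, 2))  # -> 0.45 2.4
```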
- Since the predetermined part cycle specifying unit of this embodiment specifies the blood flow cycle based on the blood flow phase change (the change in the state of the predetermined part) captured in the dynamic image, the blood flow cycle can be acquired automatically.
- Although the image generation devices 3, 3′, 3A, and 3B have been described separately for each embodiment, their individual functions may be combined with one another as long as no contradiction arises.
- The body part to be imaged, which is subject to phase detection, may be any part whose physical state changes periodically in time. This includes not only the heart and lungs but also other organs performing involuntary motion such as peristalsis, as well as parts performing voluntary motion such as muscles and joints. In the latter case, dynamic imaging is performed while the subject repeatedly performs the same motion.
- In the above embodiments, respiratory information and blood flow information in chest radiography are targeted; however, joint bending/extension direction information in joint radiography may also be targeted.
- FIG. 25 is a diagram exemplifying the joint angle in joint flexion and extension.
- The bending/extension direction information is calculated from the movement of the joint angle θ by the detection unit 120 (120A, 120B).
- Specifically, the contour extraction method of the second respiratory information detection method, threshold processing, or the like is employed to extract the bone contour regions; the axes AX1 and AX2 are extracted from the contour regions, and the bending/extension direction is calculated from the change in the angle θ at which the two axes AX1 and AX2 intersect. The axes AX1 and AX2 may pass through the centers of the contour regions or may be lines along the edges of the contour regions.
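The angle between the two extracted axes can be computed as follows (a minimal sketch; the axis vectors and function name are invented for the example):

```python
import math

def axis_angle(axis1, axis2):
    """Angle theta (degrees) at which two bone axes intersect.

    Each axis is a direction vector (dx, dy) extracted from a bone
    contour region; the bending/extension direction follows from how
    theta changes across the frame sequence."""
    (x1, y1), (x2, y2) = axis1, axis2
    dot = x1 * x2 + y1 * y2
    norm = math.hypot(x1, y1) * math.hypot(x2, y2)
    return math.degrees(math.acos(dot / norm))

# Upper-arm axis fixed, forearm axis rotating: theta grows -> extension.
upper = (0.0, 1.0)
for forearm in [(0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]:
    print(round(axis_angle(upper, forearm), 1))
# -> 0.0, 45.0, 90.0
```

In practice the dot/norm ratio should be clipped to [-1, 1] before acos to guard against floating-point overshoot on nearly parallel axes.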
- FIG. 26 is a diagram schematically illustrating the generation of the bending / stretching phase diagnosis support information with respect to the period of the bending / stretching direction information specified by the predetermined part cycle specifying unit 130.
- When the first and second analysis results refer to the extension direction and the bending direction, the bending/extension phase corresponds to the temporal change, and identifiable diagnosis support information is generated for the bending/extension phase period as well.
- In FIGS. 16 and 17, as the analysis result of the diagnosis support information, respiratory information is displayed in the list display portion 162 (color bars C1, C2) of the display image IG; however, the blood flow information shown in FIG. 11 may be displayed instead.
- Further, the display image generation unit 160 may generate the display image IG so that, in addition to showing the blood flow phase on the color bar C1 (C2), the image processing result RT (FIG. 11) of the third blood flow information detection method (blood flow phase analysis) described above is superimposed on, or displayed close to, the frame image MI.
- FIG. 27 illustrates a display image IG when blood flow information is displayed as an analysis result of diagnosis support information.
- Specifically, the detection unit 120 outputs, based on the frame images MI (dynamic image), the image processing result RT regarding the presence or absence of blood flow in the target region (see FIG. 11), and this result is stored in the holding unit 32 in temporal association with the dynamic image. The display image generation unit 160 then generates the display image IG so that the image processing result RT is displayed in synchronization with the dynamic image.
- The display image IG is preferably provided with an image processing result display portion 164 for displaying the image processing result RT as visually recognizable display content. That is, it is desirable to adopt a display form in which the image processing result display portion 164 (the display portion of the image processing result RT) is superimposed on the dynamic image display portion 161 (the display portion of the frame image MI) (not shown) or displayed close to it (see FIG. 27). This makes it possible to visually grasp the frame image MI, the diagnosis support information, and the image processing result RT at the same time, enabling more efficient medical diagnosis.
- For example, in diagnosing pulmonary embolism, the blood flow phase state in the lung field is checked on the color bar C1 while the blood flow state of the pulmonary blood vessels suspected of embolism is watched on the image processing result RT (or on the frame image MI when superimposed); using the progress bar PB, the user can then adjust the reproduction time they want to view based on the phase information.
- Although the phases of the right lung field and the left lung field are shown in the color bars C1 and C2 in FIG. 17, the phases of respiratory information and blood flow information may instead be displayed simultaneously. This is because blood flow diagnosis is most easily recognized in the state of maximum inspiration, when the lungs are largest.
- Alternatively, the detection unit 120 may detect the temporal change of only a region of interest and display the phase information of that region on the color bar C1 (C2). This is desirable when the phase information should be limited to the area where pulmonary embolism is clearly suspected.
- The blood flow phase may be expressed not only in two phases according to the presence or absence of blood flow in the designated area, but also in three phases by dividing the lung field into a main blood vessel region and a peripheral blood vessel region and indicating which of them has blood flow. This facilitates designating and selecting the reproduction time according to whether the main or peripheral blood vessels are suspected of pulmonary embolism.
- The display image IG generated by the display image generation units 160, 160A, 160B is not limited to the examples of these embodiments. By enabling user customization, the display image IG can be generated to suit any diagnostic viewpoint. For example, when the user clicks specific movement (state change) information displayed in a graph and performs a frame selection operation, the color display may be switched with that movement information as the target; or, by designating a certain pixel area, the movement of that area in the time direction may be analyzed and the color display switched based on the analysis result.
- In the above embodiments, when the position of the frame image MI is expressed by moving the progress bar PB in the horizontal direction, the color bar C1 (C2) also changes color along the horizontal coordinate of the progress bar PB (see FIGS. 16 and 17); however, this is not restrictive. If the progress bar PB moves in another direction, the color bar C1 (C2) may likewise change color along that direction.
- the subject may be an animal body as well as a human body.
Abstract
Description
The radiation dynamic imaging system according to the first embodiment of the present invention is described below.
The radiation dynamic imaging system according to the first embodiment captures radiographic images of a human or animal body as a subject and generates a desired display image.
The imaging apparatus 1 is composed of, for example, an X-ray imaging apparatus and captures the dynamics of the chest of the subject M accompanying respiration. Dynamic imaging is performed by acquiring a plurality of images sequentially in time while repeatedly irradiating the chest of the subject M with radiation such as X-rays. The series of images obtained by this continuous imaging is called a dynamic image, and each of the plurality of images constituting the dynamic image is called a frame image.
The imaging control apparatus 2 outputs radiation irradiation conditions and image reading conditions to the imaging apparatus 1 to control the radiography and radiographic image reading operations of the imaging apparatus 1, and displays the dynamic image acquired by the imaging apparatus 1 so that the radiographer can confirm the positioning and check whether the image is suitable for diagnosis.
The image generation device 3 acquires the dynamic image transmitted from the imaging apparatus 1 via the imaging control apparatus 2 and displays an image for interpretive diagnosis by a doctor or the like.
Although the electrocardiograph 4 is shown apart from the subject M in FIG. 1, each electrode terminal of the electrocardiograph 4 is in practice attached to the subject M, and the subject M's electrocardiographic waveform is output as a digital signal.
As a premise for explaining the details of the image generation device 3 in this embodiment, the problems of moving-image diagnosis in a reference example are described first.
The image generation device 3 of the radiation dynamic imaging system 100 according to the first embodiment of the present invention generates a display image in which state changes based on periodic temporal changes of the heart, lungs, or other predetermined part of the subject M are displayed, as diagnosis support information, in an easily understandable manner along the time axis, thereby reducing the work of searching for a desired frame image MI relevant to the diagnosis.
FIGS. 4 and 5 show, together with other components, the functional configuration realized by the control unit 31 when a CPU or the like operates according to various programs in the image generation device 3 of the radiation dynamic imaging system 100. The image generation device 3 of this embodiment mainly uses dynamic images in which the chest, including the heart and both lungs, has been imaged.
The dynamic image acquisition unit 110 acquires a dynamic image in which a target part (predetermined part) of a human or animal body, imaged by the reading control device 14 of the imaging apparatus 1, is captured sequentially in time. The target part is the heart region in FIG. 4 and the lung region in FIG. 5.
The detection unit 120 includes the predetermined part cycle specifying unit 130 and detects the temporal change of the physical state of the target part (see FIGS. 4 and 5). The term "physical state" here refers not only to the geometric shape of the heart, lungs, and the like but also encompasses blood flow density (the presence or absence of blood flow). The periodic temporal changes of the heart or lungs of the subject M, that is, the phase information and frequency (cycle) information of blood flow (including heartbeat) and respiration, are detected. Detecting a temporal change here means detecting the temporal change of the periodic physical state of the part, such as the outer shape of an organ or the blood flow density.
In the first blood flow information detection method, as shown in FIG. 4, the detection unit 120 (predetermined part cycle specifying unit 130) uses the result acquired from the phase detection unit 41 of the electrocardiograph 4. FIG. 7 illustrates the electrocardiographic waveform of the subject M; the horizontal axis indicates time, the vertical axis indicates the magnitude (voltage) of the electrical signal, and the curve shows changes in the electrical signal, including the curves Pp, Qp, Rp, Sp, Tp and Up representing the shapes of the so-called P, Q, R, S, T and U waves.
The first respiratory information detection method uses measurement by separate equipment; for example, a device as described in Japanese Patent No. 3793102 can be used. A method based on monitoring with a sensor composed of laser light and a CCD camera can also be adopted (see, for example, "Study on respiratory monitoring of sleeping persons using an FG visual sensor", Hirooki Aoki and Masato Nakajima, Proceedings of the 2001 IEICE Society Conference, Information and Systems Society, 320-321, 2001-08-29).
In the second respiratory information detection method, on the other hand, respiratory information is obtained by calculating the area value of the lung field from the captured images acquired by the dynamic image acquisition unit 110. The lung field area can be determined by extracting the contour of the lung field and defining the number of pixels in the region surrounded by the contour as the lung field region. Respiratory information can likewise be obtained by detecting the position of the diaphragm or the width of the rib cage.
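The pixel-count definition of the lung field area can be sketched as follows (assuming the contour extraction has already produced a binary mask; the mask shapes and names are invented for the example):

```python
import numpy as np

def lung_field_area(lung_mask):
    """Area of the lung field as a pixel count.

    lung_mask is a binary image (1 inside the extracted lung contour,
    0 elsewhere). Tracking this area across frames yields respiratory
    information: it grows on inspiration and shrinks on expiration.
    """
    return int(np.count_nonzero(lung_mask))

# Toy masks for two frames: the lung region grows on inspiration.
expiration = np.zeros((6, 6), dtype=np.uint8)
expiration[2:4, 2:4] = 1               # 2x2 lung region -> area 4
inspiration = np.zeros((6, 6), dtype=np.uint8)
inspiration[1:5, 1:5] = 1              # 4x4 lung region -> area 16
print(lung_field_area(expiration), lung_field_area(inspiration))  # -> 4 16
```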
診断支援情報生成部150では、上述の検出部120(所定部位周期特定部130)によって検出された心臓または肺などの対象部位の物理的状態の時間変化に基づいて解析を行い、その解析結果を診断支援情報として生成する。
表示画像生成部160では、フレーム画像MI(動態画像)及び診断支援情報を表示するための表示用画像を生成する。すなわち、対象部位の位相変化と時間的に対応するフレーム画像MIとを対応付けて、表示用画像が生成される。
・肺野の呼吸位相に応じたカラーバーC1,
・プログレスバーPB,
・再生時刻表示部TM
が並列的に表示されている。
FIG. 18 is a flowchart explaining the basic operation realized in the image generation device 3 according to this embodiment. Since the individual functions of each unit have already been described (see FIGS. 4 and 5), only the overall flow is described here.
The second embodiment is described below. The difference from the first embodiment is the display image generated by the display image generation unit 160. The remaining configuration is the same as that of the image generation device 3 of the first embodiment.
While, in the first embodiment,
・a color bar C1 corresponding to the respiratory phase of the lung field,
・a progress bar PB, and
・a reproduction time display portion TM
are displayed, in the display image IG of the second embodiment,
・a waveform graph F showing the phase change, and
・a reproduction time adjustment unit 341
are displayed in parallel.
The third embodiment is described below. The difference from the first embodiment is the display image generated by the display image generation unit 160; the display image of the third embodiment also differs from that of the second embodiment. The remaining configuration is the same as that of the image generation device 3 of the first embodiment.
While
・a progress bar PB,
・a reproduction time display portion TM, and
・a reproduction time adjustment unit 341
are displayed, in the display image IG of the third embodiment,
・a color bar C1 corresponding to the respiratory phase of the right lung field, and
・a color bar C2 corresponding to the respiratory phase of the left lung field
are displayed in parallel.
Since the user wants to keep their gaze on the moving image, a simplified display showing only the important points is effective for the frame selection operation. Therefore, in the fourth embodiment, feature points determined under set conditions are calculated and added to the display image IG. Details of the feature points are described later; note, however, that they differ from the feature points used in the second respiratory information detection method above and the second blood flow information detection method described later.
The feature point calculation unit 140 calculates feature points in the temporal change of the physical state of the target part. The diagnosis support information generation unit 150A and the display image generation unit 160A generate the diagnosis support information so that it includes information indicating the feature points.
Next, FIG. 22 illustrates the operation flow of the image generation device 3A according to the fourth embodiment. In FIG. 22, steps ST1, ST2, ST4, ST5 and ST6 are the same as steps S1 to S5 in FIG. 18, so their description is omitted.
Notifying the user when the reproduction time of a frame image MI satisfying the user's desired condition arrives, or has arrived, is particularly effective for inexperienced users. Therefore, the fifth embodiment includes means for notifying the user of the timing at which a desired condition is satisfied.
The notification point calculation unit 145 calculates a point for notification (hereinafter, a "notification point") determined under setting conditions desired by the user in the temporal change of the target part, and outputs it to the notification unit 342. That is, the setting condition is a condition specified by the user; for example, when the temporal change of the physical state of the target part is the respiratory information shown in FIG. 21 and the user specifies that the maximum points be notified, the notification point calculation unit 145 detects points P3 and P6 as notification points. The diagnosis support information generation unit 150B and the display image generation unit 160B generate the diagnosis support information so that it includes information indicating the notification points.
Next, FIG. 24 illustrates the operation flow of the image generation device 3B according to the fifth embodiment. It is assumed that the detection unit 120B does not include the feature point calculation unit 140. In FIG. 24, steps SP1, SP2, SP4, SP5 and SP6 are the same as steps S1 to S5 in FIG. 18, so their description is omitted.
FIG. 6 shows the functional configuration of the control unit 31′ used in the image generation device 3′ of the radiation dynamic imaging system 100′ configured as the sixth embodiment of the present invention. This control unit 31′ is used as an alternative to the control unit 31 (FIG. 4) in the system 100 of the first embodiment. The difference from the first embodiment is the method of detecting blood flow information in the detection unit 120′. The remaining configuration is the same as that of the image generation device 3 of the first embodiment. The target part in this embodiment is the heart region or the lung region. The second blood flow information detection method used in this embodiment is described below.
In the second blood flow information detection method, as shown in FIG. 6, the detection unit 120′ (predetermined part cycle specifying unit 130′) obtains heartbeat (blood flow) information by calculating the amount of heart wall motion from the captured images acquired by the dynamic image acquisition unit 110. The precondition is that the heart is captured together with the lungs, the imaging target part, in both the lung dynamic image during breathing and the lung dynamic image during breath-holding. Specifically, by detecting the fluctuation of the heart wall from these dynamic images, the phase of the heartbeat at the timing when each breathing frame image and each breath-holding frame image was captured is detected; the heart wall is thus detected as the phase of the heartbeat.
The seventh embodiment is described below. The difference from the first embodiment is the method of detecting blood flow information in the detection unit 120. The remaining configuration is the same as that of the image generation device 3 of the first embodiment. The blood flow information detection method of the seventh embodiment differs from that of the sixth embodiment; however, as shown in FIG. 6, they have in common that the blood flow information (the temporal change of the physical state of the target part) is detected based on the dynamic image. The third blood flow information detection method used in this embodiment is described below.
In the third blood flow information detection method, blood flow information is obtained by performing blood flow phase analysis using the captured images acquired by the dynamic image acquisition unit 110. The blood flow phase is phase information indicating the presence or absence of blood flow according to the position where blood is flowing. For the blood flow phase analysis processing (blood flow information generation processing) used in the present invention, for example, the method of Japanese Patent Application No. 2011-115601 (filing date: May 24, 2011), filed by the present applicant, can be employed.
Although embodiments of the present invention have been described above, the present invention is not limited to the above embodiments, and various modifications are possible.
2 imaging control apparatus
3, 3′, 3A, 3B image generation device
4 electrocardiograph
31, 31′, 31A, 31B control unit
34, 34B display unit
41 phase detection unit
100, 100′, 100A, 100B radiation dynamic imaging system
110 dynamic image acquisition unit
120 detection unit
130, 130′ predetermined part cycle specifying unit
140 feature point calculation unit
145 notification point calculation unit
150, 150A, 150B diagnosis support information generation unit
160, 160A, 160B display image generation unit
161 dynamic image display portion
162 list display portion
163 reproduction time display portion
164 image processing result display portion
341 reproduction time adjustment unit
342 notification unit
C1, C2 color bar
PB progress bar
M subject (examinee)
MI frame image
TM reproduction time display unit
RT image processing result
Claims (14)
- A dynamic image acquisition unit that acquires a dynamic image in which a predetermined part of a human or animal body is captured sequentially in time;
a detection unit that detects a temporal change of a physical state of the predetermined part;
a diagnosis support information generation unit that performs analysis based on the temporal change of the physical state of the predetermined part detected by the detection unit and generates the analysis result as diagnosis support information;
a holding unit that holds the diagnosis support information in temporal association with the dynamic image; and
a display image generation unit that generates a display image for displaying the dynamic image and the diagnosis support information,
wherein the diagnosis support information includes a first analysis result and a second analysis result based on the analysis, and
the display image is an image including:
a dynamic image display portion that displays the dynamic image, and
a list display portion that displays the first analysis result and the second analysis result of the diagnosis support information so that they can be viewed as a list along the time axis direction in a mutually identifiable manner.
An image generation device. - The image generation device according to claim 1, wherein
the display image
includes an indicator provided in association with the list display portion and indicating a specific position of the list display portion in the time axis direction, and
the display image generation unit
generates the display image so that the dynamic image at the time point corresponding to the specific position indicated by the indicator is displayed in the dynamic image display portion.
An image generation device. - The image generation device according to claim 1 or 2, wherein
the detection unit includes
a predetermined part cycle specifying unit that specifies a predetermined part cycle, which is a periodic temporal change of the physical state of the predetermined part.
An image generation device. - The image generation device according to claim 3, wherein
the predetermined part is a lung field,
the first analysis result indicates expiration, and
the second analysis result indicates inspiration.
An image generation device. - The image generation device according to claim 1 or 2, wherein
the predetermined part includes a plurality of predetermined parts,
the detection unit detects temporal changes of the physical states of the plurality of predetermined parts,
the diagnosis support information generation unit performs analysis based on the temporal changes of the physical states of the plurality of predetermined parts and generates the analysis results for the plurality of predetermined parts as a plurality of pieces of diagnosis support information, and
the list display portion displays the plurality of pieces of diagnosis support information.
An image generation device. - The image generation device according to claim 5, wherein
the plurality of parts include a left lung field and a right lung field.
An image generation device. - The image generation device according to claim 1 or 2, wherein
the detection unit includes
a feature point calculation unit that calculates a feature point in the temporal change of the physical state of the predetermined part.
An image generation device. - The image generation device according to any one of claims 1 to 7, wherein
the detection unit
detects the temporal change of the physical state of the predetermined part based on the dynamic image.
An image generation device. - The image generation device according to any one of claims 1 to 8, wherein
the list display portion displays the first analysis result and the second analysis result in different colors or shades.
An image generation device. - The image generation device according to any one of claims 1 to 9, wherein
the dynamic image display portion is rectangular, and
the list display portion is an elongated region along one side of the dynamic image display portion, with its longitudinal direction corresponding to the time axis direction of the diagnosis support information.
An image generation device. - The image generation device according to any one of claims 1 to 10, wherein
the display image further includes
a reproduction time display portion that displays reproduction time information corresponding to the display in the dynamic image display portion.
An image generation device. - The image generation device according to any one of claims 1 to 11, further comprising
a reproduction time adjustment interface for adjusting the reproduction time of the dynamic image displayed in the dynamic image display portion.
An image generation device. - The image generation device according to any one of claims 1 to 12, further comprising
a notification unit that, when the analysis result from the diagnosis support information generation unit satisfies a predetermined condition, notifies a user that the predetermined condition has been satisfied.
An image generation device. - The image generation device according to claim 8, wherein
the detection unit
outputs, based on the dynamic image, an image processing result regarding the presence or absence of blood flow in the predetermined part, and
the display image generation unit
generates the display image so that the image processing result is displayed in synchronization with the dynamic image.
An image generation device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013539064A JP5408399B1 (ja) | 2012-03-23 | 2013-03-12 | Image generation device |
EP13764244.3A EP2829231B1 (en) | 2012-03-23 | 2013-03-12 | Image-generating apparatus |
US14/387,179 US20150042677A1 (en) | 2012-03-23 | 2013-03-12 | Image-generating apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-067186 | 2012-03-23 | ||
JP2012067186 | 2012-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013141067A1 true WO2013141067A1 (ja) | 2013-09-26 |
Family
ID=49222527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/056697 WO2013141067A1 (ja) | 2012-03-23 | 2013-03-12 | Image generation device |
Country Status (4)
Country | Link |
---|---|
US (1) | US20150042677A1 (ja) |
EP (1) | EP2829231B1 (ja) |
JP (1) | JP5408399B1 (ja) |
WO (1) | WO2013141067A1 (ja) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014091977A1 (ja) * | 2012-12-12 | 2014-06-19 | Konica Minolta, Inc. | Image processing device and program |
JP2017018317A (ja) * | 2015-07-10 | 2017-01-26 | Konica Minolta, Inc. | Chest image display system and image processing device |
JP2017023296A (ja) * | 2015-07-17 | 2017-02-02 | Konica Minolta, Inc. | Radiographic imaging system and radiographic imaging control device |
JP2018064848A (ja) * | 2016-10-21 | 2018-04-26 | Konica Minolta, Inc. | Dynamic analysis system |
JP2018078974A (ja) * | 2016-11-15 | 2018-05-24 | Konica Minolta, Inc. | Dynamic image processing system |
JP2018532515A (ja) * | 2015-11-09 | 2018-11-08 | Koninklijke Philips N.V. | X-ray image inhalation quality monitoring |
JP2019171105A (ja) * | 2019-05-31 | 2019-10-10 | Shimadzu Corporation | Image processing device and radiographic imaging device |
JP2020010786A (ja) * | 2018-07-17 | 2020-01-23 | Konica Minolta, Inc. | Radiographic image display control device, radiographic image analysis device, and radiographic imaging system |
JP2020089611A (ja) * | 2018-12-07 | 2020-06-11 | Konica Minolta, Inc. | Image display device, image display method, and image display program |
JP2020092949A (ja) * | 2018-12-14 | 2020-06-18 | Konica Minolta, Inc. | Medical image display device and medical image display system |
JP2020195589A (ja) * | 2019-06-03 | 2020-12-10 | Canon Inc. | Image processing device, image processing method, and program |
JP7463923B2 (ja) | 2020-09-15 | 2024-04-09 | Konica Minolta, Inc. | X-ray dynamic image display device, program, X-ray dynamic image display method, and X-ray dynamic image display system |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015137131A1 (ja) * | 2014-03-12 | 2015-09-17 | Furuno Electric Co., Ltd. | Ultrasonic diagnostic device and ultrasonic diagnostic method |
JP6318739B2 (ja) * | 2014-03-17 | 2018-05-09 | Konica Minolta, Inc. | Image processing device and program |
US9691433B2 (en) * | 2014-04-18 | 2017-06-27 | Toshiba Medical Systems Corporation | Medical image diagnosis apparatus and medical image processing apparatus |
KR102449249B1 (ko) * | 2015-05-27 | 2022-09-30 | Samsung Electronics Co., Ltd. | Magnetic resonance imaging apparatus and method |
JP6638729B2 (ja) * | 2015-07-22 | 2020-01-29 | Konica Minolta, Inc. | Console and dynamic image capturing diagnosis system |
JP6540348B2 (ja) | 2015-08-07 | 2019-07-10 | Konica Minolta, Inc. | Radiographic imaging system |
DE102015216115B4 (de) | 2015-08-24 | 2023-08-10 | Siemens Healthcare Gmbh | Method and system for determining a trigger signal |
CN105335981B (zh) * | 2015-10-29 | 2018-06-29 | Chongqing Telecom *** Integration Co., Ltd. | Image-based cargo monitoring method |
JP2017176202A (ja) * | 2016-03-28 | 2017-10-05 | Konica Minolta, Inc. | Dynamic analysis system |
JP2018000281A (ja) * | 2016-06-28 | 2018-01-11 | Konica Minolta, Inc. | Dynamic analysis system |
JP6812815B2 (ja) * | 2017-01-31 | 2021-01-13 | Shimadzu Corporation | X-ray imaging device and X-ray image analysis method |
EP3631808A1 (en) * | 2017-05-31 | 2020-04-08 | Koninklijke Philips N.V. | Machine learning on raw medical imaging data for clinical decision support |
JP7183563B2 (ja) * | 2018-04-11 | 2022-12-06 | Konica Minolta, Inc. | Radiographic image display device and radiographic imaging system |
JP7047574B2 (ja) * | 2018-04-26 | 2022-04-05 | Konica Minolta, Inc. | Dynamic image analysis device, dynamic image analysis system, dynamic image analysis program, and dynamic image analysis method |
JP7135759B2 (ja) * | 2018-11-16 | 2022-09-13 | Konica Minolta, Inc. | Image processing device and program |
JP7143747B2 (ja) * | 2018-12-07 | 2022-09-29 | Konica Minolta, Inc. | Image display device, image display method, and image display program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01179078A (ja) * | 1988-01-06 | 1989-07-17 | Hitachi Ltd | Moving picture display system |
JPH0819543A (ja) * | 1994-07-08 | 1996-01-23 | Ken Ishihara | Ultrasonic diagnostic device |
JP3793102B2 (ja) | 2002-02-22 | 2006-07-05 | Canon Inc. | Dynamic X-ray imaging method and control device for performing dynamic X-ray imaging |
WO2006137294A1 (ja) | 2005-06-21 | 2006-12-28 | National University Corporation Kanazawa University | X-ray diagnosis support device, program, and recording medium |
JP2009508567A (ja) | 2005-09-15 | 2009-03-05 | Given Imaging Ltd. | System and method for displaying a data stream |
JP2011115601A (ja) | 2001-10-05 | 2011-06-16 | Surmodics Inc | Particle-immobilized coatings and uses thereof |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6673018B2 (en) * | 2001-08-31 | 2004-01-06 | Ge Medical Systems Global Technology Company Llc | Ultrasonic monitoring system and method |
WO2007078012A1 (ja) * | 2006-01-05 | 2007-07-12 | National University Corporation Kanazawa University | Continuous X-ray image screening inspection device, program, and recording medium |
US8571288B2 (en) * | 2007-12-07 | 2013-10-29 | Kabushiki Kaisha Toshiba | Image display apparatus and magnetic resonance imaging apparatus |
US8170312B2 (en) * | 2008-06-17 | 2012-05-01 | Siemens Medical Solutions Usa, Inc. | Respiratory motion compensated cardiac wall motion determination system |
US8208703B2 (en) * | 2008-11-05 | 2012-06-26 | Toshiba Medical Systems Corporation | Medical image analysis apparatus and image analysis control program |
WO2010086776A1 (en) * | 2009-01-30 | 2010-08-05 | Koninklijke Philips Electronics N.V. | System for providing lung ventilation information |
JP5874636B2 (ja) * | 2010-08-27 | 2016-03-02 | Konica Minolta, Inc. | Diagnosis support system and program |
WO2012026146A1 (ja) * | 2010-08-27 | 2012-03-01 | Konica Minolta Medical & Graphic, Inc. | Chest diagnosis support system and program |
JP2012110400A (ja) * | 2010-11-22 | 2012-06-14 | Konica Minolta Medical & Graphic Inc | Dynamic diagnosis support information generation system |
EP2769675B1 (en) * | 2011-10-17 | 2017-10-25 | Konica Minolta, Inc. | Dynamic radiographic imaging system and program |
WO2013150911A1 (ja) * | 2012-04-04 | 2013-10-10 | Konica Minolta, Inc. | Image generation device and program |
US9836842B2 (en) * | 2012-10-04 | 2017-12-05 | Konica Minolta, Inc. | Image processing apparatus and image processing method |
US20150305650A1 (en) * | 2014-04-23 | 2015-10-29 | Mark Hunter | Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue |
2013
- 2013-03-12 EP EP13764244.3A patent/EP2829231B1/en not_active Not-in-force
- 2013-03-12 US US14/387,179 patent/US20150042677A1/en not_active Abandoned
- 2013-03-12 JP JP2013539064A patent/JP5408399B1/ja active Active
- 2013-03-12 WO PCT/JP2013/056697 patent/WO2013141067A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01179078A (ja) * | 1988-01-06 | 1989-07-17 | Hitachi Ltd | Moving picture display system |
JPH0819543A (ja) * | 1994-07-08 | 1996-01-23 | Ken Ishihara | Ultrasonic diagnostic device |
JP2011115601A (ja) | 2001-10-05 | 2011-06-16 | Surmodics Inc | Particle-immobilized coatings and uses thereof |
JP3793102B2 (ja) | 2002-02-22 | 2006-07-05 | Canon Inc. | Dynamic X-ray imaging method and control device for performing dynamic X-ray imaging |
WO2006137294A1 (ja) | 2005-06-21 | 2006-12-28 | National University Corporation Kanazawa University | X-ray diagnosis support device, program, and recording medium |
JP2009508567A (ja) | 2005-09-15 | 2009-03-05 | Given Imaging Ltd. | System and method for displaying a data stream |
Non-Patent Citations (4)
Title |
---|
HIROOKI AOKI; MASATO NAKAJIMA: "A Study on respiration monitoring of a sleeping person with FG vision sensor", THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS, SOCIETY CONFERENCE, PROCEEDINGS 2001, INFORMATION, SYSTEM SOCIETY CONFERENCE REPORT, 29 August 2001 (2001-08-29), pages 320 - 321 |
NOBUYUKI NAKAMORI ET AL.: "Image feature analysis and computer-aided diagnosis in digital radiography: Automated analysis of sizes of heart and lung in chest images", MEDICAL PHYSICS, vol. 17, no. 3, May 1990 (1990-05-01), pages 342 - 350, XP000136509, DOI: doi:10.1118/1.596513 |
See also references of EP2829231A4 |
XIN-WEI XU; KUNIO DOI: "Image feature analysis and computer-aided diagnosis: Accurate determination of ribcage boundary in chest radiographs", MEDICAL PHYSICS, vol. 22, no. 5, May 1995 (1995-05-01), pages 617 - 626, XP000518760, DOI: doi:10.1118/1.597549 |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9639952B2 (en) | 2012-12-12 | 2017-05-02 | Konica Minolta, Inc. | Image-processing apparatus and storage medium |
WO2014091977A1 (ja) * | 2012-12-12 | 2014-06-19 | Konica Minolta, Inc. | Image processing device and program |
JP2017018317A (ja) * | 2015-07-10 | 2017-01-26 | Konica Minolta, Inc. | Chest image display system and image processing device |
JP2017023296A (ja) * | 2015-07-17 | 2017-02-02 | Konica Minolta, Inc. | Radiographic imaging system and radiographic imaging control device |
JP2018532515A (ja) * | 2015-11-09 | 2018-11-08 | Koninklijke Philips N.V. | X-ray image inhalation quality monitoring |
JP7004648B2 (ja) | 2015-11-09 | 2022-01-21 | Koninklijke Philips N.V. | X-ray image inhalation quality monitoring |
JP2018064848A (ja) * | 2016-10-21 | 2018-04-26 | Konica Minolta, Inc. | Dynamic analysis system |
JP2018078974A (ja) * | 2016-11-15 | 2018-05-24 | Konica Minolta, Inc. | Dynamic image processing system |
JP2020010786A (ja) * | 2018-07-17 | 2020-01-23 | Konica Minolta, Inc. | Radiographic image display control device, radiographic image analysis device, and radiographic imaging system |
JP7047643B2 (ja) | 2018-07-17 | 2022-04-05 | Konica Minolta, Inc. | Image processing device, radiographic imaging system, image processing program, and image processing method |
JP2020089611A (ja) * | 2018-12-07 | 2020-06-11 | Konica Minolta, Inc. | Image display device, image display method, and image display program |
JP2020092949A (ja) * | 2018-12-14 | 2020-06-18 | Konica Minolta, Inc. | Medical image display device and medical image display system |
US11348289B2 (en) | 2018-12-14 | 2022-05-31 | Konica Minolta, Inc. | Medical image display device and medical image display system for superimposing analyzed images |
JP7119974B2 (ja) | 2018-12-14 | 2022-08-17 | Konica Minolta, Inc. | Medical image display device and medical image display system |
JP2019171105A (ja) * | 2019-05-31 | 2019-10-10 | Shimadzu Corporation | Image processing device and radiographic imaging device |
JP2020195589A (ja) * | 2019-06-03 | 2020-12-10 | Canon Inc. | Image processing device, image processing method, and program |
JP7345282B2 (ja) | 2019-06-03 | 2023-09-15 | Canon Inc. | Image processing device, image processing method, and program |
JP7463923B2 (ja) | 2020-09-15 | 2024-04-09 | Konica Minolta, Inc. | X-ray dynamic image display device, program, X-ray dynamic image display method, and X-ray dynamic image display system |
Also Published As
Publication number | Publication date |
---|---|
EP2829231A4 (en) | 2015-10-28 |
EP2829231A1 (en) | 2015-01-28 |
JP5408399B1 (ja) | 2014-02-05 |
EP2829231B1 (en) | 2018-08-01 |
US20150042677A1 (en) | 2015-02-12 |
JPWO2013141067A1 (ja) | 2015-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5408399B1 (ja) | Image generation device | |
JP6512338B2 (ja) | Image processing device and program | |
JP5408400B1 (ja) | Image generation device and program | |
JP6597548B2 (ja) | Dynamic analysis system | |
JP6418091B2 (ja) | Chest image display system and image processing device | |
CN104853677B (zh) | Image processing device and image processing method | |
JP6217241B2 (ja) | Chest diagnosis support system | |
JP7073961B2 (ja) | Dynamic image analysis device, dynamic image analysis method, and program | |
JP5200656B2 (ja) | Dynamic imaging system | |
WO2014192505A1 (ja) | Image processing device and program | |
JP2018148964A (ja) | Dynamic analysis system | |
JP2018078974A (ja) | Dynamic image processing system | |
JP2009153678A (ja) | Dynamic image processing system | |
JP2014079312A (ja) | Image processing device and program | |
JP2013172792A (ja) | Medical image diagnostic device | |
US20190298290A1 (en) | Imaging support apparatus and radiographic imaging system | |
JP6740910B2 (ja) | Dynamic image processing system | |
JP7255725B2 (ja) | Dynamic image analysis system and dynamic image analysis program | |
JP7435242B2 (ja) | Dynamic image analysis device, dynamic image analysis method, and program | |
WO2013038896A1 (ja) | Radiation dynamic imaging system and program | |
JP7487566B2 (ja) | Program, image processing device, and image processing method | |
JP2020171475A (ja) | Dynamic image analysis device, dynamic image analysis method, and program | |
JP6073558B2 (ja) | Medical image diagnostic device | |
JP2020062394A (ja) | Image processing device | |
JP2021131742A (ja) | Image processing device, radiographic image system, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013539064 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13764244 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14387179 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013764244 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |