WO2013037702A1 - Method and a system for medical imaging - Google Patents

Method and a system for medical imaging

Info

Publication number
WO2013037702A1
Authority
WO
WIPO (PCT)
Prior art keywords
anatomical structure
medical data
unit
score
context features
Prior art date
Application number
PCT/EP2012/067468
Other languages
French (fr)
Inventor
Rahul THOTA
Amit Kale
Original Assignee
Siemens Aktiengesellschaft
Priority date
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Priority to DE112012003818.5T priority Critical patent/DE112012003818T5/en
Publication of WO2013037702A1 publication Critical patent/WO2013037702A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B5/48 Other medical applications
    • A61B5/4887 Locating particular structures in or on the body
    • A61B5/4893 Nerves
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/46 Apparatus for radiation diagnosis with special arrangements for interfacing with the operator or the patient
    • A61B6/467 Apparatus for radiation diagnosis with special arrangements for interfacing with the operator or the patient characterised by special input means
    • A61B6/468 Apparatus for radiation diagnosis with special input means allowing annotation or message recording
    • A61B6/469 Apparatus for radiation diagnosis with special input means for selecting a region of interest [ROI]
    • A61B6/50 Clinical applications
    • A61B6/501 Clinical applications involving diagnosis of head, e.g. neuroimaging, craniography
    • A61B6/506 Clinical applications involving diagnosis of nerves
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833 Detecting organic movements or changes involving detecting or locating foreign bodies or organic structures
    • A61B8/085 Detecting organic movements or changes for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/467 Ultrasonic, sonic or infrasonic diagnostic devices characterised by special input means
    • A61B8/468 Ultrasonic, sonic or infrasonic diagnostic devices with special input means allowing annotation or message recording
    • A61B8/469 Ultrasonic, sonic or infrasonic diagnostic devices with special input means for selection of a region of interest
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/143 Segmentation; Edge detection involving probabilistic approaches, e.g. Markov random field [MRF] modelling
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10132 Ultrasound image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20081 Training; Learning
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30016 Brain
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to a method and a system for detecting an anatomical structure in a medical data
  • Medical imaging data or medical data of a subject comprises one or more multidimensional medical images of the subject, such as 2D, 3D images, and the like.
  • the medical imaging data can relate to 4D ultrasound, computer tomography (CT) or magnetic resonance (MR).
  • a clinician or a radiologist is required to browse through many 2D slices of the medical imaging data.
  • the slices can be browsed using a cine feature, wherein the slices of the medical imaging data can be browsed slice by slice or like a video.
  • the cine feature comprises features like play/pause or next/previous depending on whether the slices are browsed slice by slice or like a video.
  • the object of the present invention is to improve the detection of an anatomical structure in a medical data.
  • the above object is achieved by a method of detecting an anatomical structure in a medical data. The method comprises receiving the medical data associated with the anatomical structure; comparing a test value of context features of each unit point of the medical data with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from a training medical data; assigning a score to each unit point as a function of the comparison; and detecting the presence of the anatomical structure based on the score of the unit points.
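As an illustration, the compare/score/detect steps of the method can be sketched as follows. This is a minimal NumPy sketch, not the patented implementation: the feature layout (one feature vector per unit point), the per-feature [low, high] ranges, and the helper names are all hypothetical.

```python
import numpy as np

def score_unit_points(feature_map, ranges, weights=None):
    """Score each unit point by how many of its context features fall
    inside the range assigned to the reference context features.

    feature_map : (H, W, F) array - one F-dim feature vector per unit point
    ranges      : (F, 2) array    - [low, high] per reference context feature
    weights     : (F,) array      - optional per-feature weights
    """
    lo, hi = ranges[:, 0], ranges[:, 1]
    inside = (feature_map >= lo) & (feature_map <= hi)  # (H, W, F) booleans
    if weights is None:
        weights = np.ones(ranges.shape[0])
    return inside.astype(float) @ weights               # (H, W) score map

def detect_structure(score_map, threshold):
    """Presence is reported when any unit point scores above the threshold."""
    positive = score_map > threshold
    return positive.any(), positive

# Toy example: 3 features, structure present where all features are near 1.
features = np.zeros((4, 4, 3))
features[1:3, 1:3, :] = 1.0
ranges = np.array([[0.5, 1.5]] * 3)
present, mask = detect_structure(score_unit_points(features, ranges), 2.5)
```

In this toy run, only the central 2x2 block satisfies all three feature ranges, so only those unit points exceed the threshold.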
  • the context features of a unit point of the medical data comprise information of a neighborhood of the unit point.
  • the reference context features can be obtained from training medical data for a respective anatomical structure.
  • the use of context features enables information from the surroundings of the anatomical structure to be used.
  • the comparison of the information of the surroundings of the region of the medical data with the information of the surroundings of the unit point enables the presence of the anatomical structure to be detected.
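One plausible way to gather such neighborhood information is to sample small patches at fixed offsets around a unit point and use their mean intensities as the context features. The offsets and patch size below are hypothetical choices for illustration, not specified by the patent.

```python
import numpy as np

def context_features(image, point, offsets, patch=3):
    """Context features of a unit point: mean intensities of small patches
    at fixed offsets in its neighborhood (a hypothetical feature choice)."""
    r, c = point
    h = patch // 2
    feats = []
    for dr, dc in offsets:
        rr, cc = r + dr, c + dc
        # NumPy slicing clamps the upper bound at the image border.
        win = image[max(rr - h, 0):rr + h + 1, max(cc - h, 0):cc + h + 1]
        feats.append(float(win.mean()))
    return np.array(feats)

img = np.arange(100, dtype=float).reshape(10, 10)
f = context_features(img, (5, 5), offsets=[(0, 0), (-3, 0), (0, 3)])
```

Each offset contributes one feature, so the surroundings of the unit point, not just its own intensity, enter the comparison.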
  • the detection of the anatomical structure includes determining unit points having a score greater than a threshold score to obtain positive unit points. This helps identify the unit points that most likely represent the anatomical structure.
  • the detection includes detecting appearance features of the positive unit points, and detecting a shape of the anatomical structure in the medical data using the appearance features of the positive unit points. The appearance features provide additional information about the positive unit points, which enables the shape of the anatomical structure to be detected.
  • the comparison of the test value of context features includes receiving an input indicating the anatomical structure to be detected, and obtaining the reference context features corresponding to the input received.
  • the corresponding reference context features relating to the anatomical structure to be detected can be obtained.
  • the method further comprises assigning a unique identification to a portion of the medical data comprising the anatomical structure.
  • the method further comprises displaying the portion of the medical data comprising the anatomical structure.
  • the detected anatomical structure in the medical data can be displayed.
  • Another embodiment includes a medical imaging system for detecting an anatomical structure in a medical data. The medical imaging system comprises an acquisition device for acquiring the medical data of a subject associated with the anatomical structure, and a processing unit configured to: compare a test value of context features of each unit point of the medical data with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from a training medical data; assign a score to each unit point as a function of the comparison; and detect the presence of the anatomical structure based on the score of the unit points.
  • FIG 1 illustrates an exemplary block diagram of a medical imaging system for detecting an anatomical structure in a medical data according to an embodiment herein,
  • FIG 2 illustrates an example of a medical data of a subject according to an embodiment herein,
  • FIG 3 illustrates an example of a score map for a medical data, wherein each unit point is assigned a score based on a comparison,
  • FIG 4 illustrates an example of a shape of an anatomical structure detected using a set of positive points from a score map,
  • FIG 5 illustrates an example of a cine application comprising the embodiments described herein,
  • FIG 6 is a flow diagram illustrating a method of deriving reference context features for an anatomical structure from one or more training medical data according to an embodiment herein,
  • FIG 7 is a flow diagram illustrating a method of detecting an anatomical structure in a medical data according to an embodiment herein, and
  • FIG 8 illustrates a representative hardware environment for practicing the embodiments described herein.
  • FIG 1 illustrates an exemplary block diagram of a medical imaging system 10 for detecting an anatomical structure in a medical data according to an embodiment herein.
  • the system 10 comprises an acquisition device 15, a processing unit 20 and a memory device 25.
  • the acquisition device 15 acquires medical data of a subject.
  • the term "medical data” used herein refers to multidimensional medical images such as 2D, 3D and the like.
  • the acquired medical data is provided by the acquisition device 15 to the processing unit 20 for processing.
  • the processing unit 20 is configured to process the medical data to detect an anatomical structure of the subject.
  • FIG 2 illustrates an example of a medical data of a subject according to an embodiment herein.
  • the anatomical structure 35 to be detected is a right optic nerve.
  • Generally, the appearance of the right optic nerve is feeble and it is very hard to detect by normal human observation.
  • the right optic nerve can be seen inside the rectangle 36, and is designated as the anatomical structure 35.
  • the processing unit 20 is configured to access the memory device 25 and obtain reference context features stored at the memory device 25 corresponding to the right optic nerve 35.
  • the reference context features are derived using information from a neighborhood of the anatomical structure from one or more training medical data.
  • the reference context features can be derived in a training phase for the anatomical structure from the one or more training medical data. The process of deriving the reference context features will be explained in detail later.
  • a "processing unit" as used herein is a device for executing machine-readable instructions stored on a computer-readable medium, for performing tasks, and may comprise any one or a combination of hardware and firmware.
  • the processing unit may be implemented using a microcontroller, microprocessor, electronic devices, or other electronic units to perform the functions described herein or a combination thereof.
  • the machine-readable instructions may be stored within the processing unit or external to the processing unit.
  • the memory device 25 can be deployed using a volatile or a nonvolatile memory.
  • the processing unit 20, on obtaining the reference context features, is configured to compare a test value of the context features of each unit point of the medical data 30 with a range assigned to the reference context features.
  • the term "unit point" used herein refers to the smallest unit of the medical data that can be represented or controlled, such as a pixel of a 2D image or a voxel of a 3D image.
  • the medical data can comprise a plurality of slices.
  • the processing unit 20 can be configured to compare the test value of the context features of each unit point in each slice of the medical data.
  • the use of context features enables surrounding information of the anatomical structures, obtained from training medical data, to be used for detecting the anatomical structures in the medical data. This enables detecting an anatomical structure having a feeble appearance or an anatomical structure that is geometrically constrained.
  • the processing unit 20 is configured to assign a score to each unit point of the medical data as a function of the comparison.
  • the processing unit 20 can be configured to generate a score map using the scores assigned to each unit point of the medical data.
  • the comparison and assignment of the score to each unit point based on the comparison can be performed using an adaptive boosting algorithm.
  • the processing unit 20 can be configured to perform the functions of the adaptive boosting algorithm.
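An adaptive boosting (AdaBoost) classifier built from decision stumps fits this description naturally, since each stump thresholds a single context feature, which plays the role of the assigned range, and the boosted margin can serve as the unit-point score. The sketch below uses scikit-learn's `AdaBoostClassifier` on synthetic feature vectors; the data, dimensions, and score interpretation are assumptions for illustration, not the patent's implementation.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
# Hypothetical training data: context-feature vectors for unit points that
# do (label 1) or do not (label 0) lie on the anatomical structure.
X_pos = rng.normal(loc=1.0, scale=0.2, size=(200, 5))
X_neg = rng.normal(loc=0.0, scale=0.2, size=(200, 5))
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 200 + [0] * 200)

# Decision stumps (the default base learner) each threshold a single
# context feature against a learned cut-off.
booster = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

# The signed margin serves as the score assigned to each unit point;
# here we score 10 new unit points drawn from the positive region.
scores = booster.decision_function(rng.normal(1.0, 0.2, size=(10, 5)))
```

Applying `decision_function` to every unit point of a slice and reshaping the result to the slice dimensions yields the score map described above.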
  • FIG 3 illustrates an example of a score map for the medical data 30, wherein each unit point is assigned a score based on the comparison.
  • the unit points corresponding to the right optic nerve 35 are assigned higher scores than the other unit points.
  • the score map 40 illustrated in the example of FIG 3 is a 2D score map.
  • the score map can also be a 3D score map. If the medical data comprises a plurality of 2D slices, a plurality of respective 2D score maps can be generated, one for each 2D slice. Similarly, if the medical data comprises a plurality of 3D slices, a plurality of respective 3D score maps can be generated, one for each 3D slice.
  • the processing unit 20 is configured to detect the presence of the right optic nerve 35. To detect the right optic nerve 35, in an aspect, the processing unit 20 is configured to determine unit points having the score greater than a threshold score to obtain positive unit points.
  • the term "positive unit points" used herein refers to the unit points having a score greater than the threshold score, and thus the unit points that most likely represent the anatomical structure.
  • the positive unit points, designated as 50, are shown as enclosed within the rectangle 45.
  • the threshold score can be selected as per the accuracy desired. Determining the unit points whose score is greater than the threshold score substantially removes false positives.
  • the threshold score for each type of the anatomical structure can be determined as per the accuracy desired and stored in the memory device 25.
  • the processing unit 20 can be configured to access the memory device and obtain the threshold score.
  • the threshold score can be provided as an input to the processing unit 20.
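The thresholding step itself is a one-line operation on the score map. A minimal sketch, with a hypothetical score map and threshold:

```python
import numpy as np

def positive_unit_points(score_map, threshold):
    """Return coordinates of unit points whose score exceeds the threshold.
    These are the unit points most likely to represent the structure."""
    rows, cols = np.nonzero(score_map > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

score_map = np.array([[0.1, 0.2, 0.1],
                      [0.2, 0.9, 0.8],
                      [0.1, 0.7, 0.2]])
pts = positive_unit_points(score_map, threshold=0.5)
```

Raising the threshold trades recall for precision: fewer unit points survive, but the survivors are less likely to be false positives.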
  • the processing unit 20 can be configured to detect appearance features of the positive unit points 50 representing the anatomical structure 35.
  • the positive unit points 50 representing the anatomical structure 35 will be neighboring unit points.
  • a score map can comprise multiple sets of positive unit points representing a plurality of anatomical structures.
  • the anatomical structures can be of the same type if one anatomical structure can be present at different locations within the body of the subject.
  • the different sets of positive unit points can represent different anatomical structures if the different anatomical structures desired to be detected can be detected in the same slice of the medical data.
  • a set of positive unit points will be neighboring unit points because the positive unit points will most likely be the unit points representing the anatomical structure.
  • the processing unit 20 is configured to determine the appearance features of the positive unit points 50.
  • the appearance features can be detected by applying an appearance detection algorithm over the positive unit points of each slice of the medical data.
  • For example, a Gabor-based linear discriminant analysis (LDA) classifier can be used for detecting the appearance features of the positive unit points 50.
  • the appearance detection algorithm can determine one or more of the appearance features, such as intensity.
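A Gabor-based LDA appearance detector can be sketched as a Gabor filter bank whose responses at each positive unit point form a feature vector for an LDA classifier. The kernel parameters, orientations, and the toy image below are assumptions chosen for illustration, not values from the patent.

```python
import numpy as np
from scipy.ndimage import convolve
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gabor_kernel(theta, lam=4.0, sigma=2.0, size=9):
    """Real part of a Gabor kernel at orientation theta (standard form)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(xr**2 + (y * np.cos(theta) - x * np.sin(theta))**2)
                  / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(image, points, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Appearance feature vector per point: Gabor filter-bank responses."""
    responses = [convolve(image.astype(float), gabor_kernel(t)) for t in thetas]
    return np.array([[resp[r, c] for resp in responses] for r, c in points])

rng = np.random.default_rng(1)
img = rng.normal(size=(32, 32))
img[10:22, 15:17] += 3.0                  # a faint vertical structure
pos = [(r, 15) for r in range(12, 20)]    # points on the structure
neg = [(r, 5) for r in range(12, 20)]     # background points
X = gabor_features(img, pos + neg)
y = np.array([1] * len(pos) + [0] * len(neg))
lda = LinearDiscriminantAnalysis().fit(X, y)
```

The oriented Gabor responses separate the faint vertical structure from the background noise, which is what makes the subsequent LDA classification of appearance tractable.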
  • the processing unit 20 is configured to detect the shape of the anatomical structure 35.
  • the appearance features of the positive unit points 50 provide additional information and thus enable the shape of the anatomical structure to be detected.
  • FIG 4 illustrates an example of a shape of the anatomical structure 35 detected using the set of positive points 50 from the score map 40 of FIG 3.
  • the shape of the anatomical structure 35 can be seen within the rectangle 55.
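Since a set of positive unit points representing one structure consists of neighboring unit points, one simple way to recover a shape estimate is connected-component grouping. This sketch uses `scipy.ndimage`; taking the largest component and its bounding region is an illustrative choice, not the patent's stated method.

```python
import numpy as np
from scipy import ndimage

def shape_from_positive_points(mask):
    """Group positive unit points into connected sets and return the
    bounding slices of the largest set as a rough shape estimate."""
    labels, n = ndimage.label(mask)
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    largest = int(np.argmax(sizes)) + 1
    return ndimage.find_objects((labels == largest).astype(int))[0]

mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 3:6] = True          # a 3x3 blob of positive unit points
mask[7, 0] = True              # an isolated false positive
sl = shape_from_positive_points(mask)
```

The isolated point is discarded because it forms its own, smaller connected set; only the coherent blob contributes to the estimated shape.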
  • the system 10 comprises an input device 30 operably coupled to the processing unit 20.
  • a user such as a radiologist can provide an indication to the processing unit 20 as to which anatomical structure is to be detected.
  • the processing unit 20 is configured to access the memory device 25 to obtain the reference context features corresponding to the anatomical structure for which the indication was provided via the input device 30.
  • the processing unit 20 can access the memory device 25 to obtain reference context features of different anatomical structures stored therein and detect the respective anatomical structures.
  • the detected anatomical structure can be displayed to the user using a display device 35 operably coupled to the processing unit 20. Additionally, the input device 30 can be used by the radiologist to provide the threshold score for an anatomical structure as an input to the processing unit 20.
  • the processing unit 20 is configured to assign a unique identification to a portion of the medical data comprising the detected anatomical structure 35.
  • the unique identification can be such that the portion of the medical data comprising the detected anatomical structure 35 can be identified uniquely.
  • the unique identification can be a unique number, a character string or a combination of numbers and characters.
  • the anatomical structure 35 is detected in a single 2D slice of the medical data.
  • the processing unit 20 will assign a unique identification to the slice of the medical data comprising the anatomical structure 35.
  • the unique identification assigned to the slice comprising the anatomical structure 35, i.e., the right optic nerve, can be associated with an input received from a user corresponding to the right optic nerve.
  • the slice of the medical data comprising the right optic nerve 35 can be easily accessed using the unique identification and then displayed. This also simplifies report generation, as the respective portions of the medical data comprising one or more respective anatomical structures can easily be accessed using the unique identification, and the complete medical data is not required to be browsed to access the portion of the medical data comprising the anatomical structure 35. This is described in more detail in the description of FIG 5 below.
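The unique-identification scheme amounts to an index from structure to portion. A minimal sketch, in which the identifier format, the dictionary index, and both function names are hypothetical:

```python
# Hypothetical index mapping each anatomical structure to the unique
# identification of the portion (slice) of the medical data containing it.
detection_index = {}

def register_detection(structure, slice_number):
    """Assign a unique identification to the slice containing a structure."""
    uid = f"{structure.replace(' ', '_')}-{slice_number:04d}"
    detection_index[structure] = uid
    return uid

def display_portion(structure):
    """Anatomy-wise browsing: jump straight to the registered portion
    instead of browsing the complete medical data slice by slice."""
    return detection_index.get(structure)

register_detection("right optic nerve", 42)
uid = display_portion("right optic nerve")
```

A lookup miss simply returns nothing, which mirrors the case where a requested structure has not been detected in the medical data.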
  • FIG 5 illustrates an example of a cine application comprising the embodiments described herein.
  • the cine application, apart from comprising conventional browsing features such as play/pause, next/previous, and the like, comprises anatomy-wise browsing.
  • the input tab for play is designated as 106 and the input tab for pause is designated as 107.
  • the input tab for next is designated as 108 and the input tab for previous is designated as 109.
  • the six input tabs 110 to 115 of the cine application can be assigned to respective anatomical structures.
  • a radiologist desiring to look at a particular anatomical structure can click or press the input tab corresponding to the anatomical structure.
  • the processing unit 20 is configured to display the anatomical structure for which the input tab was clicked by accessing the portion of the medical data comprising the particular anatomical structure using the unique identification assigned to the portion of the medical data.
  • This enables the radiologist to view the desired anatomical structure easily without manually scanning each slice of the medical data.
  • the portion of the medical data corresponding to the right optic nerve 35 is displayed on the radiologist clicking the input tab 110.
  • the respective portion of the medical data comprising other anatomical structures designated by the respective input tabs 111 to 115 can be displayed on clicking the corresponding input tabs 111 to 115.
  • the portions of medical data comprising diagnostically relevant anatomical structures can be inserted into the report easily.
  • FIG 6 is a flow diagram illustrating a method of deriving reference context features for an anatomical structure from one or more training medical data according to an embodiment herein.
  • a region is marked on an anatomical structure of one or more training medical data for which reference context features are desired to be generated.
  • the region is marked at the center of the anatomical structure.
  • a neighborhood of the region of the anatomical structure is sampled to obtain a plurality of context features for the anatomical structure.
  • the plurality of context features are evaluated to determine the context features having discriminative information for the anatomical structure and are herein referred to as reference context features.
  • the range for a reference context feature can be determined with respect to values of positive and negative training examples of the anatomical structure.
  • the range assists in identifying positive context features from the medical data to be tested.
  • the steps of blocks 130 and 135 can be performed using a feature selection algorithm such as an adaptive boosting algorithm.
  • FIG 7 is a flow diagram illustrating a method of detecting an anatomical structure in a medical data according to an embodiment herein.
  • the medical data associated with the anatomical structure is received.
  • a test value of context features of each unit point of the medical data is compared with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from a training medical data.
  • a score is assigned to each unit point as a function of the comparison.
  • the presence of the anatomical structure 35 is detected based on the score of the unit points.
  • the embodiments herein can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment including both hardware and software elements.
  • the embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc.
  • the embodiments herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device), or a propagation medium.
  • examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
  • a data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • I/O devices, including but not limited to keyboards, displays, pointing devices, etc., can be coupled to the system either directly or through intervening I/O controllers.
  • Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks.
  • Modems, cable modems and Ethernet cards are just a few of the currently available types of network adapters.
  • FIG 8 depicts a representative hardware environment for practicing the embodiments described herein. This schematic drawing illustrates a hardware configuration of an information handling/computer system 160 according to the embodiments herein.
  • the system 160 comprises at least one processor or central processing unit (CPU) 165.
  • the CPU 165 is interconnected via bus 170 to various devices such as a memory 175, input/output (I/O) controller 180, and user interface controller 185.
  • I/O input/output
  • user interface controller 185 user interface controller
  • the memory 175 may be volatile (such as random access memory (RAM) etc., nonvolatile (read only memory (ROM), flash memory devices etc.,) or a combination of the two.
  • RAM random access memory
  • ROM read only memory
  • flash memory devices etc.
  • the memory 175 is used to store instructions and data for use by the CPU 165.
  • controller 180 can connect to peripheral devices, such as CD drives 190 and hard drives 195, or other program storage devices that are readable by the system 160.
  • peripheral devices such as CD drives 190 and hard drives 195, or other program storage devices that are readable by the system 160.
  • an operating system for the computer system 160 as well as an application program is stored onto the hard drive 195.
  • the operating system runs on the CPU 165 and is used to
  • system 160 can read the inventive system 160
  • the user interface controller 185 can connect to a keyboard 200, mouse 205, speaker 210, microphone 215, display device 220 and/or other user interface devices such as a touch screen device (not shown) to the bus 170 to gather user input and also to provide system output to the user.
  • a touch screen device not shown
  • anatomical structure Manually identifying the anatomical structure is time consuming as all the slices will have to be browsed. Additionally, in cases where the appearance of the anatomical structure is feeble, manual detection is difficult and may not be correct .
  • the use of context features achieve in detection of the feeble and geometrically constrained anatomical structures.
  • the embodiments can also be used for report generation where portions of medical data comprising diagnostically relevant anatomical structures can be inserted into the report automatically.

Abstract

The present invention relates to a method and a system for detecting an anatomical structure (35) in medical data (30), wherein the method comprises receiving the medical data (30) associated with the anatomical structure (35), comparing a test value of context features of each unit point of the medical data (30) with a range assigned to reference context features of the medical data (30), wherein the reference context features are derived using information from a neighborhood of the anatomical structure from training medical data, assigning a score to each unit point as a function of the comparison, and detecting the presence of the anatomical structure (35) in the medical data (30) based on the score of the unit points.

Description

Description
Method and a system for medical imaging

The present invention relates to a method and a system for detecting an anatomical structure in medical data.
Medical imaging data, or medical data, of a subject comprises one or more multidimensional medical images of the subject, such as 2D and 3D images. For example, the medical imaging data can relate to 4D ultrasound, computed tomography (CT) or magnetic resonance (MR) imaging. To detect a particular anatomical structure in the medical imaging data, a clinician or a radiologist is required to browse through many 2D slices of the medical imaging data. For example, the slices can be browsed using a cine feature, wherein the slices of the medical imaging data can be browsed slice by slice, or multiple slices can be browsed like a video. The cine feature comprises features such as play/pause or next/previous, depending on whether the slices are browsed slice by slice or like a video.
On detecting the particular anatomical structure, the clinician generally marks a region of interest around the anatomical structure. However, to identify the anatomical structure and to mark the region of interest, the clinician has to spend a considerable amount of time.
The object of the present invention is to improve the detection of an anatomical structure in medical data.
The above object is achieved by a method of detecting an anatomical structure in medical data, the method comprising receiving the medical data associated with the anatomical structure, comparing a test value of context features of each unit point of the medical data with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from training medical data, assigning a score to each unit point as a function of the comparison, and detecting the presence of the anatomical structure based on the score of the unit points. The context features of a unit point of the medical data comprise information of a neighborhood of the unit point. The reference context features can be obtained from training medical data for a respective anatomical structure. The use of context features enables the use of information from the surroundings of the anatomical structure. Comparing the information of the surroundings of the region of the medical data with the information of the surroundings of the unit point makes it easy to detect the presence of the anatomical structure in the medical data. This improves the accuracy of detection, as even feeble anatomical structures can be detected.
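By way of an illustrative sketch only (the patent does not prescribe any particular encoding; the function name, the offset-based context features and the NumPy representation below are all assumptions), the comparison and scoring of unit points could look as follows, with each context feature taken to be the image intensity at a fixed offset from the unit point:

```python
import numpy as np

def score_unit_points(image, offsets, ranges):
    """Assign each unit point (pixel) a score by counting how many of
    its context features fall inside the range learned for the
    corresponding reference context feature.

    offsets: list of (dy, dx) neighborhood offsets (hypothetical
             encoding of a context feature)
    ranges:  list of (low, high) value ranges, one per offset
    """
    pad = max(max(abs(dy), abs(dx)) for dy, dx in offsets)
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    score = np.zeros((h, w))
    for (dy, dx), (lo, hi) in zip(offsets, ranges):
        # Intensity at the given offset from every pixel at once.
        shifted = padded[pad + dy:pad + dy + h, pad + dx:pad + dx + w]
        score += ((shifted >= lo) & (shifted <= hi)).astype(float)
    return score
```

Each unit point's score then counts how many of its context features fall inside the learned ranges, from which a score map such as the one in FIG 3 can be built.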
According to an embodiment, the detection of the anatomical structure includes determining unit points having a score greater than a threshold score to obtain positive unit points. This helps identify the unit points that most likely represent the anatomical structure. According to yet another embodiment, the detection includes detecting appearance features of the positive unit points, and detecting a shape of the anatomical structure in the medical data using the appearance features of the positive unit points. The appearance features of the positive unit points provide additional information, which enables the shape of the anatomical structure to be detected from the positive unit points.
According to yet another embodiment, the comparison of the test value of context features includes receiving an input indicating the anatomical structure to be detected, and obtaining the reference context features corresponding to the input received. Thus, the corresponding reference context features relating to the anatomical structure to be detected can be obtained.
According to yet another embodiment, the method further comprises assigning a unique identification number to a portion of the medical data comprising the anatomical structure, and associating the unique identification number with a user input received corresponding to the anatomical structure. This enables the portion of the medical data comprising the anatomical structure to be accessed easily and quickly, as the complete medical data does not have to be browsed to access the portion comprising the desired anatomical structure.
According to yet another embodiment, the method further comprises displaying the portion of the medical data comprising the anatomical structure. Thus, the detected anatomical structure in the medical data can be displayed.
Another embodiment includes a medical imaging system for detecting an anatomical structure in a medical data, the medical imaging system comprising an acquisition device for acquiring the medical data of a subject associated with the anatomical structure, a processing unit configured to compare a test value of context features of each unit point of the medical data with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from a training medical data, assign a score to each unit point as a function of the comparison, and detect the presence of the anatomical structure based on the score of the unit points.
The present invention is further described hereinafter with reference to illustrated embodiments shown in the accompanying drawings, in which:

FIG 1 illustrates an exemplary block diagram of a medical imaging system for detecting an anatomical structure in medical data according to an embodiment herein,

FIG 2 illustrates an example of medical data of a subject according to an embodiment herein,

FIG 3 illustrates an example of a score map for medical data, wherein each unit point is assigned a score based on a comparison,

FIG 4 illustrates an example of a shape of an anatomical structure detected using a set of positive points from a score map,

FIG 5 illustrates an example of a cine application comprising the embodiments described herein,

FIG 6 is a flow diagram illustrating a method of deriving reference context features for an anatomical structure from one or more training medical data according to an embodiment herein,

FIG 7 is a flow diagram illustrating a method of detecting an anatomical structure in medical data according to an embodiment herein, and

FIG 8 illustrates a representative hardware environment for practicing the embodiments described herein.
Various embodiments are described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purpose of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments. It may be evident that such embodiments may be practiced without these specific details.
FIG 1 illustrates an exemplary block diagram of a medical imaging system 10 for detecting an anatomical structure in medical data according to an embodiment herein. The system 10 comprises an acquisition device 15, a processing unit 20 and a memory device 25. The acquisition device 15 acquires medical data of a subject. The term "medical data" used herein refers to multidimensional medical images, such as 2D, 3D and the like. The acquisition device 15 provides the acquired medical data to the processing unit 20, which is configured to process the medical data. According to an aspect herein, the processing unit 20 is configured to process the medical data to detect an anatomical structure of the subject.
FIG 2 illustrates an example of medical data of a subject according to an embodiment herein. In the illustrated medical data 30, for example, it is assumed that the anatomical structure 35 to be detected is a right optic nerve. Generally, the appearance of the right optic nerve is feeble, and it is very hard to detect by normal human observation. In the illustrated medical data 30, the right optic nerve can be seen inside the rectangle 36, and is designated as the anatomical structure 35.
Referring now to FIG 1 and FIG 2, to detect the feeble right optic nerve 35 in the medical data 30, the processing unit 20 is configured to access the memory device 25 and obtain the reference context features stored at the memory device 25 corresponding to the right optic nerve 35. The reference context features are derived using information from a neighborhood of the anatomical structure from one or more training medical data of the anatomical structure. For example, the reference context features can be derived in a training phase for the anatomical structure from the one or more training medical data. The process of deriving the reference context features is explained in detail later.
A "processing unit" as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. For example, the processing unit may be implemented using a microcontroller, microprocessor, electronic devices, or other electronic units to perform the functions described herein or a combination thereof. The machine-readable instructions may be stored within the processing unit or external to the processor. The memory device 25 can be deployed using a volatile or a nonvolatile memory.
Referring still to FIG 1 and FIG 2, on obtaining the reference context features, the processing unit 20 is configured to compare a test value of the context features of each unit point of the medical data 30 with a range assigned to the reference context features. The term "unit point" used herein refers to the smallest unit of the medical data that can be represented or controlled, such as a pixel of a 2D image or a voxel of a 3D image. The medical data can comprise one or more 2D or 3D slices. Thus, the processing unit 20 can be configured to compare the test value of the context features of each unit point in each slice of the medical data. The use of context features enables surrounding information of the anatomical structures, obtained from training medical data, to be used for detecting the anatomical structures in the medical data. This enables detection of an anatomical structure whose appearance is feeble or whose geometric relation with other anatomical structures or its surroundings is well constrained. Based on the comparison, the processing unit 20 is configured to assign a score to each unit point of the medical data as a function of the comparison. For example, the processing unit 20 can be configured to generate a score map using the scores assigned to the unit points of the medical data. In an aspect, the comparison and the assignment of the score to each unit point based on the comparison can be performed using an adaptive boosting algorithm. The processing unit 20 can be configured to perform the functions of the adaptive boosting algorithm.
FIG 3 illustrates an example of a score map for the medical data 30, wherein each unit point is assigned a score based on the comparison. In the illustrated score map 40, in the region within the rectangle 45, it can be seen that the unit points corresponding to the right optic nerve 35 are distinctly represented. The score map 40 illustrated in the example of FIG 3 is a 2D score map. However, the score map can also be a 3D score map. If the medical data comprises a plurality of 2D slices, a respective 2D score map can be generated for each 2D slice. Similarly, if the medical data comprises a plurality of 3D slices, a respective 3D score map can be generated for each 3D slice.

Referring now to FIGS 1 through 3, using the score assigned to each unit point of the medical data 30 in the score map 40, the processing unit 20 is configured to detect the presence of the right optic nerve 35. To detect the right optic nerve 35, in an aspect, the processing unit 20 is configured to determine unit points having a score greater than a threshold score to obtain positive unit points. The term "positive unit points" used herein refers to the unit points having a score greater than the threshold score, which are thus the unit points that most likely represent the anatomical structure. In the shown example of FIG 3, the positive unit points, designated as 50, are shown enclosed within the rectangle 45. The threshold score can be selected as per the accuracy desired. Determining the unit points whose score is greater than the threshold score substantially removes false positives. In an aspect, the threshold score for each type of anatomical structure can be determined as per the accuracy desired and stored in the memory device 25. The processing unit 20 can be configured to access the memory device and obtain the threshold score. In another aspect, for each type of anatomical structure, the threshold score can be provided as an input to the processing unit 20.
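A minimal sketch of this thresholding step (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def positive_unit_points(score_map, threshold):
    """Return the coordinates of unit points whose score exceeds the
    threshold; these are the points most likely to belong to the
    anatomical structure."""
    ys, xs = np.nonzero(score_map > threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```

Raising the threshold trades recall for precision, matching the text's remark that the threshold score can be selected as per the accuracy desired.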
Referring now to FIGS 1 through 3, according to an embodiment herein, to detect the shape of the anatomical structure 35, the processing unit 20 can be configured to detect appearance features of the positive unit points 50 representing the anatomical structure 35. Generally, the positive unit points 50 representing the anatomical structure 35 will be neighboring unit points, as depicted within the rectangle 45 in the score map 40, because the positive unit points most likely represent the anatomical structure and will thus lie in the region of the medical data where the anatomical structure 35 is present. The positive unit points 50 representing the anatomical structure 35 are referred to herein as a set of positive unit points. In the shown example of FIG 3, only one set of positive unit points 50 is illustrated, for example purposes only. In some aspects, a score map can comprise multiple sets of positive unit points representing a plurality of anatomical structures. The anatomical structures can be of the same type if one anatomical structure can be present at different locations within the body of the subject. Additionally, the different sets of positive unit points can represent different anatomical structures if the different anatomical structures desired to be detected can be detected in the same slice of the medical data. The processing unit 20 is configured to determine the appearance features of each positive unit point in the respective score map of each slice of the medical data. The appearance features can be detected by applying an appearance detection algorithm over the positive unit points of each slice of the medical data. For example, a Gabor-based linear discriminant analysis (LDA) classifier can be used for detecting the appearance features of the positive unit points 50. The appearance detection algorithm can determine one or more appearance features, such as intensity, textures, edge responses, colour histograms and other features used for object detection, for each of the positive unit points 50 of the score map 40. From the detected appearance features of the positive unit points 50, the processing unit 20 is configured to detect the shape of the anatomical structure 35. The appearance features of the positive unit points 50 provide additional information and thus enable the shape of the anatomical structure to be detected.
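As a hedged sketch of the appearance-feature step (the kernel parameters, function names and three-orientation filter bank below are assumptions; the patent only names a Gabor-based LDA classifier without specifying its configuration), Gabor responses at each positive unit point could be computed as:

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lam=4.0):
    """Real part of a Gabor filter; parameter choices are illustrative."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def appearance_features(image, points, thetas=(0.0, np.pi / 4, np.pi / 2)):
    """One Gabor response per orientation for each positive unit point;
    vectors like these could feed the LDA classifier named in the text."""
    pad = 4  # half of ksize=9
    padded = np.pad(image.astype(float), pad, mode="edge")
    kernels = [gabor_kernel(theta=t) for t in thetas]
    feats = []
    for (y, x) in points:
        # 9x9 patch centered on the unit point, correlated with each kernel.
        patch = padded[y:y + 2 * pad + 1, x:x + 2 * pad + 1]
        feats.append([float((patch * k).sum()) for k in kernels])
    return feats
```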
FIG 4 illustrates an example of a shape of the anatomical structure 35 detected using the set of positive points 50 from the score map 40 of FIG 3. The shape of the anatomical structure 35 can be seen within the rectangle 55.
Referring now to FIG 1, according to an aspect, the system 10 comprises an input device 30 operably coupled to the processing unit 20. A user, such as a radiologist, can provide an indication to the processing unit 20 as to which anatomical structure is required to be detected. Responsive to the input, the processing unit 20 is configured to access the memory device 25 to obtain the reference context features corresponding to the anatomical structure for which the indication was provided via the input device 30. Alternatively, the processing unit 20 can access the memory device 25 to obtain reference context features of different anatomical structures stored therein and detect the respective anatomical structures in the medical data. The detected anatomical structure can be displayed to the user using a display device 35 operably coupled to the processing unit 20. Additionally, the input device 30 can be used by the radiologist for providing the threshold score for an anatomical structure as an input to the processing unit 20.
Referring now to FIG 1 and FIG 4, according to an aspect, the processing unit 20 is configured to assign a unique identification to a portion of the medical data comprising the detected anatomical structure 35. The unique identification can be such that the portion of the medical data comprising the detected anatomical structure 35 can be identified uniquely. For example, the unique identification can be a unique number, a character string or a combination of numbers and characters. In the example of FIG 4, the anatomical structure 35 is detected in a single 2D slice of the medical data. Thus, the processing unit 20 will assign a unique identification to the slice of the medical data comprising the anatomical structure 35. The unique identification assigned to the slice comprising the anatomical structure 35, i.e., the right optic nerve, can be associated with an input received from a user corresponding to the right optic nerve. For example, if a plurality of different anatomical structures are detected in the medical data, and a user is interested in displaying the right optic nerve 35, the slice of the medical data comprising the right optic nerve 35 can be easily accessed using the unique identification and then displayed. This also facilitates easy report generation, as the respective portions of the medical data comprising one or more respective anatomical structures can easily be accessed using the unique identification, and the complete medical data does not have to be browsed to access the portion of the medical data comprising the anatomical structure 35. This is described in more detail in the description of FIG 5 below.
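The mapping from a detected anatomical structure to the unique identification of its slice can be sketched as follows (class and method names are hypothetical; the patent does not define a data structure for this):

```python
class SliceIndex:
    """Associate each detected anatomical structure with the unique
    identification of the portion of the medical data containing it,
    so that slice can be retrieved without browsing the full data."""

    def __init__(self):
        self._index = {}

    def register(self, structure_name, slice_id):
        # Record the unique identification assigned at detection time.
        self._index[structure_name] = slice_id

    def lookup(self, structure_name):
        # Return the slice id, or None if the structure was not detected.
        return self._index.get(structure_name)
```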
FIG 5 with reference to FIG 1 illustrates an example of a cine application comprising the embodiments described herein. In the illustrated example of FIG 5, the cine application, apart from comprising conventional browsing features, such as, play/pause, next/previous, and the like, comprises anatomy wise browsing. The input tab for play is designated as 106 and the input tab for pause is designated as 107.
Similarly, the input tab for next is designated as 108 and the input tab for previous is designated as 109. For example, in a head and neck CT medical data, the six input tabs 110 to 115 of the cine application can be assigned to anatomical structures as follows:
110 - Right optic nerve
111 - Left optic nerve
112 - Right eye
113 - Left eye
114 - Right parotid
115 - Left parotid
A radiologist desiring to look at a particular anatomical structure can click or press the input tab corresponding to that anatomical structure. The processing unit 20 is configured to display the anatomical structure for which the input tab was clicked by accessing the portion of the medical data comprising the particular anatomical structure using the unique identification assigned to that portion of the medical data. This enables the radiologist to view the desired anatomical structure easily, without manually scanning each slice of the medical data. In the example of FIG 5, the portion of the medical data 115 corresponding to the right optic nerve 35 is displayed when the radiologist clicks the input tab 110. Similarly, the respective portions of the medical data comprising the other anatomical structures designated by the input tabs 111 to 115 can be displayed by clicking the corresponding input tabs. Additionally, for report generation purposes, the portions of the medical data comprising diagnostically relevant anatomical structures can be inserted into the report easily.
FIG 6 is a flow diagram illustrating a method of deriving reference context features for an anatomical structure from one or more training medical data according to an embodiment herein. At block 120, a region is marked on an anatomical structure of one or more training medical data for which the reference context features are desired to be generated. Advantageously, the region is marked at the center of the anatomical structure. Next, at block 125, a neighborhood of the region of the anatomical structure is sampled to obtain a plurality of context features for the anatomical structure. Moving next to block 130, the plurality of context features are evaluated to determine the context features having discriminative information for the anatomical structure, herein referred to as the reference context features. Thus, a reduced set of context features having discriminative information of the anatomical structure is obtained from the complete set of context features. This reduces the processing required during the detection of the anatomical structure in test medical data. At block 135, a value in the form of a range is assigned to the reference context features. For example, the range for a reference context feature can be determined with respect to the values of positive and negative training examples of the anatomical structure. The range assists in identifying positive context features in the medical data to be tested. For example, the steps of blocks 130 and 135 can be performed using a feature selection algorithm such as an adaptive boosting algorithm.
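As a toy stand-in for blocks 130 and 135 (the patent names an adaptive boosting algorithm; the crude range-based ranking below is only an assumed simplification, and all names are hypothetical), feature selection and range assignment could be sketched as:

```python
import numpy as np

def learn_reference_features(pos_samples, neg_samples, keep=5):
    """pos_samples / neg_samples: arrays of shape (n_examples,
    n_features), one context-feature value per column, from positive
    and negative training examples. A feature's range is taken from
    the positive examples (block 135); features are then ranked by how
    rarely negatives fall inside that range, a crude proxy for the
    boosting-based selection of block 130."""
    lows = pos_samples.min(axis=0)
    highs = pos_samples.max(axis=0)
    inside = (neg_samples >= lows) & (neg_samples <= highs)
    discrim = 1.0 - inside.mean(axis=0)  # higher = more discriminative
    order = np.argsort(-discrim)[:keep]
    return [(int(i), (float(lows[i]), float(highs[i]))) for i in order]
```

The returned (feature index, range) pairs play the role of the reference context features with their assigned ranges.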
FIG 7, with reference to FIGS 1 through 6, is a flow diagram illustrating a method of detecting an anatomical structure in medical data according to an embodiment herein. At block 140, the medical data associated with the anatomical structure 35, acquired by the acquisition device 15, is received. Next, at block 145, a test value of context features of each unit point of the medical data is compared with a range assigned to reference context features of the medical data, wherein the reference context features are derived using information from a neighborhood of the anatomical structure from training medical data. At block 150, a score is assigned to each unit point as a function of the comparison. Next, at block 155, the presence of the anatomical structure 35 is detected based on the score of the unit points.
The embodiments herein can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment including both hardware and software elements. The embodiments that are implemented in software include, but are not limited to, firmware, resident software, microcode, etc. Furthermore, the embodiments herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.

A data processing system suitable for storing and/or executing program code will include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems and Ethernet cards are just a few of the currently available types of network adapters.
FIG 8 depicts a representative hardware environment for practicing the embodiments described herein. This schematic drawing illustrates a hardware configuration of an information handling/computer system 160 in accordance with the embodiments herein. The system 160 comprises at least one processor or central processing unit (CPU) 165. The CPU 165 is interconnected via a bus 170 to various devices such as a memory 175, an input/output (I/O) controller 180, and a user interface controller 185. Depending on the type and configuration of the system 160, the memory 175 may be volatile (such as random access memory (RAM)), nonvolatile (such as read-only memory (ROM), flash memory devices, etc.) or a combination of the two. The memory 175 is used to store instructions and data for use by the CPU 165. The I/O controller 180 can connect to peripheral devices, such as CD drives 190 and hard drives 195, or other program storage devices that are readable by the system 160. Typically, an operating system for the computer system 160 as well as an application program is stored on the hard drive 195. The operating system runs on the CPU 165 and is used to coordinate and provide control of various components within the system 160. The system 160 can read the inventive instructions on the hard drive 195 and load them onto the memory 175 for execution by the CPU 165. The user interface controller 185 can connect a keyboard 200, mouse 205, speaker 210, microphone 215, display device 220 and/or other user interface devices, such as a touch screen device (not shown), to the bus 170 to gather user input and also to provide system output to the user.

The embodiments described herein enable an anatomical structure of a subject to be detected from medical data easily, without requiring a radiologist to browse all the slices of the medical data manually to identify the anatomical structure. Manually identifying the anatomical structure is time consuming, as all the slices have to be browsed. Additionally, in cases where the appearance of the anatomical structure is feeble, manual detection is difficult and may not be correct. The use of context features enables detection of feeble and geometrically constrained anatomical structures. Moreover, the embodiments can also be used for report generation, where portions of the medical data comprising diagnostically relevant anatomical structures can be inserted into the report automatically.
While this invention has been described in detail with reference to certain preferred embodiments, it should be appreciated that the present invention is not limited to those precise embodiments. Rather, in view of the present disclosure, which describes the current best mode for practicing the invention, many modifications and variations would present themselves to those skilled in the art without departing from the scope and spirit of this invention. The scope of the invention is, therefore, indicated by the following claims rather than by the foregoing description. All changes, modifications, and variations coming within the meaning and range of equivalency of the claims are to be considered within their scope.

Claims

Patent claims
1. A method of detecting an anatomical structure (35) in a medical data (30), the method comprising:
- receiving the medical data (30) associated with the anatomical structure (35),
- comparing a test value of context features of each unit point of the medical data (30) with a range assigned to reference context features of the medical data (30) , wherein the reference context features are derived using information from a neighborhood of the anatomical structure from a training medical data,
- assigning a score to each unit point as a function of the comparison, and
- detecting the presence of the anatomical structure (35) in the medical data (30) based on the score of the unit points.
2. The method according to claim 1, wherein the detection of the presence of the anatomical structure (35) includes determining unit points having the score greater than a threshold score to obtain positive unit points (50).
3. The method according to anyone of the claims 1 to 2 , wherein the detection includes:
- detecting appearance features of the positive units points (50) , and
- detecting a shape of the anatomical structure (35) in the medical data (30) using appearance features of the positive unit points (50) .
4. The method according to anyone of the claims 1 to 3 , wherein the comparison of the test value of context features includes :
- receiving an input indicating the anatomical structure (35) to be detected, and
- obtaining the reference context features corresponding to the input received.
5. The method according to anyone of the claims 1 to 4 , further comprising:
- assigning a unique identification number to a portion of the medical data (115) comprising the anatomical structure (35) , and
- associating the unique identification number to a user input received corresponding to the anatomical structure (35) .
6. The method according to claim 5, further comprising displaying the portion of the medical data (115) comprising the anatomical structure (35) .
7. A medical imaging system (10) for detecting an anatomical structure (35) in medical data (30), the medical imaging system (10) comprising:
- an acquisition device (15) for acquiring the medical data (30) of a subject associated with the anatomical structure (35),
- a processing unit (20) configured to:
- compare a test value of context features of each unit point of the medical data (30) with a range assigned to reference context features of the medical data (30), wherein the reference context features are derived using information from a neighborhood of the anatomical structure in training medical data,
- assign a score to each unit point as a function of the comparison, and
- detect the presence of the anatomical structure (35) based on the score of the unit points.
8. The medical imaging system (10) according to claim 7, wherein the processing unit (20) is configured to determine unit points having the score greater than a threshold score to obtain positive unit points (50).
9. The medical imaging system (10) according to any one of claims 7 and 8, wherein the processing unit (20) is configured to:
- identify a region (107) around the positive unit points, and
- detect the anatomical structure (35) within the region (107) using appearance features of the anatomical structure (35).
10. The medical imaging system (10) according to any one of claims 7 to 9, wherein the processing unit (20) is configured to:
- receive an input indicating the anatomical structure (35) to be detected, and
- obtain the reference context features corresponding to the input received.
11. The medical imaging system (10) according to any one of claims 7 to 10, wherein the processing unit (20) is further configured to:
- assign a unique identification number to a portion of the medical data (115) comprising the anatomical structure (35), and
- associate the unique identification number with a user input received corresponding to the anatomical structure (35).
12. The medical imaging system (10) according to claim 7, further comprising a display device operably coupled to the processing unit (20) and configured to display the portion of the medical data (115) comprising the anatomical structure (35).
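The second stage of the claimed pipeline, identifying a region around the positive unit points and then applying an appearance test only inside it, might look as follows. The bounding-box region, the margin, and all names are illustrative assumptions; `appearance_test` stands in for whatever appearance-feature classifier the processing unit uses.

```python
import numpy as np

def candidate_region(points, margin=2):
    """Bounding box (with a margin) around the positive unit points:
    the region within which appearance-based detection is run.
    The box shape and margin are illustrative choices."""
    pts = np.asarray(points)
    lo = pts.min(axis=0) - margin
    hi = pts.max(axis=0) + margin
    return lo, hi

def detect_in_region(image, lo, hi, appearance_test):
    """Apply an appearance test only inside the candidate region,
    clamped to the image bounds; returns coordinates that pass.
    Restricting the search this way avoids scanning the full volume."""
    lo = np.maximum(lo, 0)
    hi = np.minimum(hi, np.array(image.shape) - 1)
    hits = []
    for idx in np.ndindex(*(hi - lo + 1)):
        coord = tuple(np.array(idx) + lo)
        if appearance_test(image[coord]):
            hits.append(coord)
    return hits
```

The same two-function split mirrors the claims: context features select a small candidate region cheaply, and the more expensive appearance features are evaluated only there.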
PCT/EP2012/067468 2011-09-14 2012-09-07 Method and a system for medical imaging WO2013037702A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112012003818.5T DE112012003818T5 (en) 2011-09-14 2012-09-07 Method and system for medical imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1205/KOL/2011 2011-09-14
IN1205KO2011 2011-09-14

Publications (1)

Publication Number Publication Date
WO2013037702A1 true WO2013037702A1 (en) 2013-03-21

Family

ID=46875756

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/067468 WO2013037702A1 (en) 2011-09-14 2012-09-07 Method and a system for medical imaging

Country Status (2)

Country Link
DE (1) DE112012003818T5 (en)
WO (1) WO2013037702A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050228272A1 (en) * 2004-04-09 2005-10-13 Daphne Yu System and method for automatically segmenting bones in computed tomography angiography data
US20060257031A1 (en) * 2005-03-31 2006-11-16 Michael Abramoff Automatic detection of red lesions in digital color fundus photographs
US20070116338A1 (en) * 2005-11-23 2007-05-24 General Electric Company Methods and systems for automatic segmentation of biological structure
US20070189590A1 (en) * 2006-02-11 2007-08-16 General Electric Company Systems, methods and apparatus of handling structures in three-dimensional images
US20070230795A1 (en) * 2006-04-03 2007-10-04 Abramoff Michael D Methods and Systems for Optic Nerve Head Segmentation
US20090228299A1 (en) * 2005-11-09 2009-09-10 The Regents Of The University Of California Methods and apparatus for context-sensitive telemedicine
US20100054563A1 (en) * 2008-09-02 2010-03-04 General Electric Company Tissue classification in medical images

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2529888A (en) * 2014-09-05 2016-03-09 Apical Ltd A method of image analysis
US9858677B2 (en) 2014-09-05 2018-01-02 Apical Ltd. Method of image analysis
GB2529888B (en) * 2014-09-05 2020-09-23 Apical Ltd A method of image analysis
JP2018146491A (en) * 2017-03-08 2018-09-20 横浜ゴム株式会社 Analysis method of composite material and computer program for analyzing composite material
CN109758178A (en) * 2017-11-10 2019-05-17 美国西门子医疗解决公司 Machine back work stream in ultrasonic imaging
WO2019138772A1 (en) * 2018-01-10 2019-07-18 富士フイルム株式会社 Image processing apparatus, processor apparatus, image processing method, and program
JPWO2019138772A1 (en) * 2018-01-10 2020-12-10 富士フイルム株式会社 Image processing equipment, processor equipment, image processing methods, and programs
JP7122328B2 (en) 2018-01-10 2022-08-19 富士フイルム株式会社 Image processing device, processor device, image processing method, and program
CN111833293A (en) * 2019-03-26 2020-10-27 西门子医疗有限公司 Method and data processing system for providing lymph node information

Also Published As

Publication number Publication date
DE112012003818T5 (en) 2014-08-07

Similar Documents

Publication Publication Date Title
US10846853B2 (en) Medical image processing apparatus, medical image processing method, and medical image processing program
KR102043130B1 (en) The method and apparatus for computer aided diagnosis
RU2595766C2 (en) Image identification and distortion reduction
US10762630B2 (en) System and method for structures detection and multi-class image categorization in medical imaging
US8958614B2 (en) Image-based detection using hierarchical learning
US20150002538A1 (en) Ultrasound image display method and apparatus
US20140089000A1 (en) Similar case searching apparatus, relevance database generating apparatus, similar case searching method, and relevance database generating method
US20140122515A1 (en) Apparatus and method for aiding diagnosis
WO2013018363A1 (en) Similar case search device and similar case search method
EP2761515B1 (en) Medical image system and method
US9767562B2 (en) Image processing apparatus, image processing method and storage medium
CN105249922A (en) Tomograms capturing device and tomograms capturing method
US9886781B2 (en) Image processing device and region extraction method
US10188361B2 (en) System for synthetic display of multi-modality data
EP2235652B2 (en) Navigation in a series of images
WO2013037702A1 (en) Method and a system for medical imaging
Fletcher et al. Observer performance for detection of pulmonary nodules at chest CT over a large range of radiation dose levels
US9483705B2 (en) Image processing device, image processing method, and image processing program
CN114202516A (en) Foreign matter detection method and device, electronic equipment and storage medium
JP5599683B2 (en) Medical image processing apparatus and medical image processing method
CN111144506B (en) Liver bag worm identification method based on ultrasonic image, storage medium and ultrasonic equipment
CN106204623A (en) Many contrasts image synchronization shows and the method and device of positioning and demarcating
JP7426908B2 (en) Medical information processing device, medical information processing system, medical information processing method, and program
WO2021103316A1 (en) Method, device, and system for determining target region of image
Kockelkorn et al. Semi-automatic classification of textures in thoracic CT scans

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12759669

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112012003818

Country of ref document: DE

Ref document number: 1120120038185

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12759669

Country of ref document: EP

Kind code of ref document: A1