CN101719274B - Three-dimensional texture analyzing method of medicinal image data - Google Patents

Three-dimensional texture analyzing method of medicinal image data

Info

Publication number
CN101719274B
CN101719274B (application CN2009102191360A)
Authority
CN
China
Prior art keywords
gray
area
directions
gray level
occurrence matrixes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009102191360A
Other languages
Chinese (zh)
Other versions
CN101719274A (en)
Inventor
张国鹏
卢虹冰
王天
焦纯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fourth Military Medical University FMMU
Original Assignee
Fourth Military Medical University FMMU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fourth Military Medical University FMMU filed Critical Fourth Military Medical University FMMU
Priority to CN2009102191360A priority Critical patent/CN101719274B/en
Publication of CN101719274A publication Critical patent/CN101719274A/en
Application granted granted Critical
Publication of CN101719274B publication Critical patent/CN101719274B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Magnetic Resonance Imaging Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a three-dimensional texture analysis method for medical image volume data, which comprises the following four sequential steps: 1. delimiting the region to be processed in the medical image volume data; 2. obtaining the parameters of the three-dimensional gray-level co-occurrence matrix within the delimited region, namely the gray-level range of the image data to be processed, the generation directions of the gray-level co-occurrence matrix, and the pixel spacing used to generate the gray-level co-occurrence matrix; 3. after normalization, computing the gray-level co-occurrence matrices in all directions according to the direction and pixel spacing parameters set in step 2; and 4. computing feature values based on the gray-level co-occurrence matrices in all directions. A feature vector is finally obtained for the gray-level co-occurrence matrix in each direction. The feature values in all directions are averaged to obtain a feature value vector for the region, and this vector is the three-dimensional texture analysis result of the region's volume data. With this algorithm, texture analysis can be carried out on a specified three-dimensional spatial region in image volume data generated by CT or MRI, a feature vector is finally obtained, and this vector expresses the three-dimensional texture of the specified region in the medical image volume data.

Description

A three-dimensional texture analysis method for medical image volume data
Technical field
The invention belongs to the fields of digital image processing and pattern recognition. In particular, in the field of medical image analysis, the texture analysis method of the present invention can be used to characterize the intrinsic texture pattern of a target of interest, thereby enabling computer-aided identification of the target.
Background technology
Texture is a common and important basis for object recognition. Current texture feature extraction methods fall roughly into two classes: statistical analysis methods and structural analysis methods. The former describe the texture from statistical properties of the image, according to the texture elements and their arrangement rules, and reflect the spatial gray-level dependence between pixels; the latter aim to identify texture primitives and then explore the rules by which textures are composed from them, or search directly for the structural laws that form the texture. In general, statistical methods are suited to analyzing fine and irregular textures, while structural methods are suited to images in which the texture primitives are arranged regularly.
The main purpose of texture feature extraction is to convert differences in the spatial structure of random or geometric textures into differences in feature values, describing the texture information of an image with mathematical models, including the smoothness, sparseness, and regularity of image regions. In general, texture features are related to the position, orientation, size, and shape of the texture, but are independent of the average gray level (brightness). With the improvement of computing power, computer-aided lesion detection with texture analysis as its core method has gradually become a research focus in recent years. A classic texture analysis algorithm is the gray-level co-occurrence matrix proposed by Haralick in 1973 [Haralick, R.M., K. Shanmugam, and Itshak Dinstein, "Textural Features for Image Classification," IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-3, no. 6, Nov. 1973, pp. 610-621]. To explore the optimal scanning window size for texture analysis, Susomboon et al. scanned ex vivo artery, fat, kidney, liver, lung, muscle, spleen, and bone with different window sizes and performed two-dimensional texture analysis, finally obtaining representative Haralick feature vectors for these normal tissues [R. Susomboon, D.S. Raicu, and J.D. Furst, "Pixel-Based Texture Classification of Tissues in Computed Tomography," CTI Research Symposium, April 2006]. S. Gordon et al. used texture analysis to analyze and classify cervigram images in PACS [S. Gordon, G. Zimmerman, H. Greenspan, "Image segmentation of Uterine Cervix images for indexing in PACS," Proceedings of the 17th IEEE Symposium on Computer-Based Medical Systems, 2004]. In imaging examination of lung nodules, using texture analysis to reduce manual reading errors is already a relatively mature technique [Bram van Ginneken, Bart M. ter Haar Romeny, and Max A. Viergever, "Computer-Aided Diagnosis in Chest Radiography: A Survey," IEEE Transactions on Medical Imaging, 2001; 20:1228-1240]. Methods based on two-dimensional texture analysis have also been applied to the automatic classification of liver diseases [Ahmadian, Mostafa, Abolhassani, Salimpour, "A texture classification method for diffused liver diseases using Gabor wavelets," Engineering in Medicine and Biology Society, 27th Annual International Conference of the IEEE-EMBS, 17-18 Jan. 2006: 1567-1570] and to computer-aided detection of breast cancer in mammography [Juliette S., Kathy, Jeffrey et al., "Detection of Breast Cancer with Full-Field Digital Mammography and Computer-Aided Detection," Am. J. Roentgenol., 2009; 192:337-340]. In summary, two-dimensional texture analysis has been applied very widely in CAD technology and has achieved good results.
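For orientation, the classic two-dimensional gray-level co-occurrence matrix discussed above can be computed with scikit-image; the following small example is purely illustrative and is not part of the patent:

import numpy as np
from skimage.feature import graycomatrix, graycoprops

# A tiny 4-level test image
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 2, 2, 2],
                  [2, 2, 3, 3]], dtype=np.uint8)

# Co-occurrence matrix for pixel pairs at distance 1 and angle 0, normalized to probabilities
glcm = graycomatrix(image, distances=[1], angles=[0], levels=4,
                    symmetric=True, normed=True)
print(graycoprops(glcm, 'contrast'), graycoprops(glcm, 'homogeneity'))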
The applicant observed that previous texture analysis focused on the surface information of the analyzed object, whereas in medical image volume data the three-dimensionally reconstructed surface of a human organ contains only part of the information. Although some small muscles, nerves, and blood vessels inside an organ may not be fully reconstructible in three dimensions at current hardware resolution, their approximate courses and their fixed mutual positional relationships can constitute certain three-dimensional texture information, and this texture information may change with local pathology.
Summary of the invention
The purpose of the present invention is to provide a three-dimensional texture analysis method for medical image volume data. The method can perform texture analysis on a specified three-dimensional spatial region in image volume data produced by CT or MRI and obtain a feature vector that characterizes the three-dimensional texture of the specified volume data region.
The present invention consists of the following four sequential steps:
The first step: delimit the region to be processed in the medical image volume data.
The second step: obtain the parameters of the three-dimensional gray-level co-occurrence matrix within the delimited region: the gray-level range of the volume data to be processed, the generation directions of the gray-level co-occurrence matrix, and the pixel spacing used to generate the gray-level co-occurrence matrix.
The third step: after normalization, compute the gray-level co-occurrence matrices in all directions according to the direction and pixel spacing parameters set in the second step.
The fourth step: compute feature values based on the gray-level co-occurrence matrices in all directions. A feature vector is finally obtained for the gray-level co-occurrence matrix in each direction. The feature values in all directions are averaged to obtain a feature value vector for the region; this vector is the three-dimensional texture analysis result of the region's volume data.
Based on the idea of previous two-dimensional texture analysis methods, the present invention provides a three-dimensional texture analysis method for medical image volume data, used to obtain the intrinsic distribution relationships and pattern rules inside the volume data. The method uses CT or MRI image volume data to perform three-dimensional texture analysis on a target region and obtain a feature vector; the obtained feature vectors can, to a certain extent, distinguish structurally altered tissue from normal tissue in a classifier, or be used for other purposes.
Description of drawings
Fig. 1 shows an example of the gray-level co-occurrence matrix direction parameters.
Embodiment
The present invention consists of the following four sequential steps:
The first step: specify the region. Specifying the region means delimiting, in the medical image volume data, the range over which the computation is to be performed; the method of specification usually depends on the application field of the algorithm. The region may be outlined manually or selected by other automated methods.
The second step: determine the following three parameters (they may be obtained in any order):
(1) The gray-level range of the volume data to be processed.
Traverse the region to be processed and record the maximum gray level MaxGrey and the minimum gray level MinGrey.
(2) The generation directions of the gray-level co-occurrence matrix.
When texture analysis is extended to three dimensions, the number of candidate direction parameters needed to show the texture variation of the volume data in all directions increases geometrically compared with the two-dimensional case. To describe the variation of texture in three-dimensional space comprehensively, Table 1 enumerates 13 generation directions of the gray-level co-occurrence matrix, where θ denotes the angle between the projection of the generation direction on the XOY plane and the X axis, and φ denotes the angle of the generation direction with respect to the Z axis. Considering computational efficiency, some main directions may be chosen from these in actual computation; the specific choice can be determined according to the texture characteristics of the practical problem, but it must at least include the three principal axis directions (0°, 90°), (90°, 90°), and (perpendicular to the XOY plane, 0°). For example, the 7 representative direction parameters (0°, 90°), (45°, 45°), (45°, 135°), (90°, 90°), (135°, 45°), (135°, 135°), and (perpendicular to the XOY plane, 0°) may be chosen; these 7 directions are shown on the 3D image in Fig. 1. (One possible discrete voxel-offset reading of these directions is sketched after parameter (3) below.)
Table 1. Gray-level co-occurrence matrix generation directions
Sequence number    θ                             φ
1                  0°                            45°
2                  0°                            90°
3                  0°                            135°
4                  45°                           45°
5                  45°                           90°
6                  45°                           135°
7                  90°                           45°
8                  90°                           90°
9                  90°                           135°
10                 135°                          45°
11                 135°                          90°
12                 135°                          135°
13                 Perpendicular to XOY plane    0°
(3) The pixel spacing used to generate the gray-level co-occurrence matrix.
A different pixel spacing D_i may be used in each direction (1 ≤ D_i ≤ D_max, where D_max is the number of pixels between the two farthest pixels in the selected region), or a single uniform pixel spacing D may be used for all directions. In practical applications, the specific value of the pixel spacing can be chosen empirically.
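One way to turn the (θ, φ) entries of Table 1 into discrete voxel steps is to map each direction to a unit offset (dx, dy, dz) that is then multiplied by the pixel spacing D. The mapping below is an assumed interpretation (the patent does not spell out voxel offsets), with θ taken as the in-plane angle from the X axis and φ as the angle from the Z axis:

# Assumed mapping from Table 1's (theta, phi) pairs to unit voxel offsets (dx, dy, dz)
DIRECTION_OFFSETS = {
    (0, 45):    (1, 0, 1),
    (0, 90):    (1, 0, 0),
    (0, 135):   (1, 0, -1),
    (45, 45):   (1, 1, 1),
    (45, 90):   (1, 1, 0),
    (45, 135):  (1, 1, -1),
    (90, 45):   (0, 1, 1),
    (90, 90):   (0, 1, 0),
    (90, 135):  (0, 1, -1),
    (135, 45):  (-1, 1, 1),
    (135, 90):  (-1, 1, 0),
    (135, 135): (-1, 1, -1),
    "z_axis":   (0, 0, 1),   # direction 13: perpendicular to the XOY plane
}

Under this reading, the seven representative directions mentioned in parameter (2) correspond to the entries (0, 90), (45, 45), (45, 135), (90, 90), (135, 45), (135, 135), and "z_axis".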
The third step: compute the three-dimensional gray-level co-occurrence matrices.
Because texture analysis is independent of the average gray level (brightness), the gray values of the selected region are first normalized before the gray-level co-occurrence matrices are computed. That is, every data value in the selected region becomes:
Grey_new = Grey_old - MinGrey
where Grey_old is the gray value before normalization, Grey_new is the gray value after normalization, and MinGrey is the parameter determined in the second step.
The value range of the selected region is thus normalized to between 0 and MaxGrey - MinGrey. For all data points of the selected region, one gray-level co-occurrence matrix is computed for each direction specified in the second step, using the specified pixel spacing D. During the computation, if moving the specified D pixels from the current point along the selected direction lands on a pixel outside the selected region, that point is ignored and the computation proceeds to the next point. (A minimal sketch of this computation is given below.)
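A minimal NumPy sketch of the third step, under assumptions not dictated by the patent text: the volume is an integer array indexed as volume[z, y, x], the delimited region is a boolean mask of the same shape, and offset is one (dx, dy, dz) entry from the assumed table above; the resulting matrix is normalized to probabilities:

import numpy as np

def glcm_3d(volume, mask, offset, spacing):
    # Gray-level range of the region, as recorded in the second step
    region = volume[mask]
    min_grey, max_grey = int(region.min()), int(region.max())
    levels = max_grey - min_grey + 1
    glcm = np.zeros((levels, levels), dtype=np.float64)

    dx, dy, dz = [spacing * o for o in offset]
    zs, ys, xs = np.nonzero(mask)                  # coordinates of the region's voxels
    for z, y, x in zip(zs, ys, xs):
        z2, y2, x2 = z + dz, y + dy, x + dx
        # Ignore the point if the displaced voxel leaves the selected region
        if not (0 <= z2 < mask.shape[0] and 0 <= y2 < mask.shape[1]
                and 0 <= x2 < mask.shape[2] and mask[z2, y2, x2]):
            continue
        i = volume[z, y, x] - min_grey             # normalized gray levels start at 0
        j = volume[z2, y2, x2] - min_grey
        glcm[i, j] += 1

    total = glcm.sum()
    return glcm / total if total > 0 else glcm     # probability-normalized matrix

For example, glcm_3d(volume, mask, DIRECTION_OFFSETS[(0, 90)], spacing=1) would give the matrix for direction 2 of Table 1 under these assumptions.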
The fourth step: compute the feature values that can be used for classification.
Based on the gray-level co-occurrence matrices in all directions, compute all 14 feature values suggested by Haralick, or compute other feature values with other new algorithms. Finally, a feature vector can be computed for the gray-level co-occurrence matrix of each direction. Assuming that N feature values are computed in total, the feature vectors of the directions are as follows:
V1=[Direction1_feature1,Direction1_feature2,...,Direction1_featureN]
V2=[Direction2_feature1,Direction2_feature2,...,Direction2_featureN]
...
VM=[DirectionM_feature1,DirectionM_feature2,...,DirectionM_featureN]
Considering that rotation of the selected region's volume data should not affect the computed result, the feature values in all directions are averaged:
Feature1=(Direction1_feature1+Direction2_feature1+..+DirectionM_feature1)/M
Feature2=(Direction1_feature2+Direction2_feature2+..+DirectionM_feature2)/M
...
FeatureN=(Direction1_featureN+Direction2_featureN+..+DirectionM_featureN)/M
Finally, a feature value vector for the region is obtained:
V=[Feature1,Feature2,...,FeatureN]
This vector is the three-dimensional texture analysis result of the region's volume data. (A minimal sketch of this feature computation and averaging follows.)
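As an illustration of the fourth step, the sketch below computes a small subset of Haralick's feature values (energy, contrast, entropy, and homogeneity; the patent permits any one or more of the 14, or other feature values) from each probability-normalized directional matrix and then averages them over the M directions; the function names are illustrative:

import numpy as np

def haralick_subset(glcm):
    # glcm is a probability-normalized co-occurrence matrix from the third step
    i, j = np.indices(glcm.shape)
    energy = np.sum(glcm ** 2)
    contrast = np.sum(((i - j) ** 2) * glcm)
    nonzero = glcm[glcm > 0]
    entropy = -np.sum(nonzero * np.log(nonzero))
    homogeneity = np.sum(glcm / (1.0 + (i - j) ** 2))
    return np.array([energy, contrast, entropy, homogeneity])

def region_feature_vector(directional_glcms):
    # One feature vector V1..VM per direction, then the per-feature average V
    vectors = [haralick_subset(g) for g in directional_glcms]
    return np.mean(vectors, axis=0)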

Claims (1)

1. A three-dimensional texture analysis method for medical image volume data, characterized in that it consists of the following 4 sequential steps:
The first step: specify a three-dimensional spatial region composed of the medical image volume data, the shape of this region being unrestricted;
The second step: compute the upper and lower bounds of the gray values of the specified region; set the directions in which the gray-level co-occurrence matrices of this spatial region are computed, where the directions must be a combination of one or more of the three principal axis directions X, Y, Z and the 13 directions listed below; and set the pixel spacing for computing the gray-level co-occurrence matrix in each direction, or use the same pixel spacing in all directions;
Gray-level co-occurrence matrix generation directions
Sequence number    θ                             φ
1                  0°                            45°
2                  0°                            90°
3                  0°                            135°
4                  45°                           45°
5                  45°                           90°
6                  45°                           135°
7                  90°                           45°
8                  90°                           90°
9                  90°                           135°
10                 135°                          45°
11                 135°                          90°
12                 135°                          135°
13                 Perpendicular to XOY plane    0°
where θ denotes the angle between the projection of the generation direction on the XOY plane and the X axis, and φ denotes the angle of the generation direction with respect to the Z axis;
The third step: compute the gray-level co-occurrence matrix in each direction; before the computation, map the volume data values of the delimited region to gray levels so that the data values of every region to be computed start from the same fixed value; during the computation, if moving from the current point along the selected direction by the pixel spacing specified in the second step leads outside the selected region, that point is ignored and the computation proceeds to the next point;
The fourth step: based on the gray-level co-occurrence matrices in all the directions computed in the third step, compute one or more feature values; compute a feature vector for the gray-level co-occurrence matrix in each direction; average each feature value over all directions; and finally obtain, for the specified region, a feature value vector that is the result of the three-dimensional texture analysis.
CN2009102191360A 2009-11-25 2009-11-25 Three-dimensional texture analyzing method of medicinal image data Expired - Fee Related CN101719274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009102191360A CN101719274B (en) 2009-11-25 2009-11-25 Three-dimensional texture analyzing method of medicinal image data


Publications (2)

Publication Number Publication Date
CN101719274A CN101719274A (en) 2010-06-02
CN101719274B true CN101719274B (en) 2011-11-02

Family

ID=42433844

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009102191360A Expired - Fee Related CN101719274B (en) 2009-11-25 2009-11-25 Three-dimensional texture analyzing method of medicinal image data

Country Status (1)

Country Link
CN (1) CN101719274B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013205278A1 (en) 2013-03-26 2014-10-02 Siemens Aktiengesellschaft Method for displaying signal values of a combined magnetic resonance positron emission tomography device and correspondingly designed magnetic resonance positron emission tomography device
CN116645349B (en) * 2023-05-29 2024-03-19 沈阳工业大学 Image processing method and system for improving three-dimensional display effect of blood vessel


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1604014A (en) * 2003-09-30 2005-04-06 佳能株式会社 Image display apparatus and method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JP 2000-244945 A (laid-open) 2000.09.08
JP 2008-97624 A (laid-open) 2008.04.24
JP 2009-140247 A (laid-open) 2009.06.25

Also Published As

Publication number Publication date
CN101719274A (en) 2010-06-02

Similar Documents

Publication Publication Date Title
Cascio et al. Automatic detection of lung nodules in CT datasets based on stable 3D mass–spring models
US9025858B2 (en) Method and apparatus for automatically generating optimal 2-dimensional medical image from 3-dimensional medical image
Pu et al. Shape “break-and-repair” strategy and its application to automated medical image segmentation
US8135189B2 (en) System and method for organ segmentation using surface patch classification in 2D and 3D images
US20080225044A1 (en) Method and Apparatus for Editing Three-Dimensional Images
Beichel et al. Liver segmentation in contrast enhanced CT data using graph cuts and interactive 3D segmentation refinement methods
AlZu’bi et al. Multi-orientation geometric medical volumes segmentation using 3d multiresolution analysis
EP3444781B1 (en) Image processing apparatus and image processing method
CA2617227C (en) Deformable model for segmenting patient contours versus support structures in medical images
JP6385318B2 (en) Transform 3D objects to segment objects in 3D medical images
Kim et al. The recent progress in quantitative medical image analysis for computer aided diagnosis systems
Peng et al. A study of T2-weighted MR image texture features and diffusion-weighted MR image features for computer-aided diagnosis of prostate cancer
Ciecholewski Automatic liver segmentation from 2D CT images using an approximate contour model
Widodo et al. Lung diseases detection caused by smoking using support vector machine
Soza et al. Non-rigid registration with use of hardware-based 3D Bézier functions
CN101719274B (en) Three-dimensional texture analyzing method of medicinal image data
CN1836258B (en) Method and system for using structure tensors to detect lung nodules and colon polyps
Erdt et al. Computer aided segmentation of kidneys using locally shape constrained deformable models on CT images
Hopp et al. Registration of 3D ultrasound computer tomography and MRI for evaluation of tissue correspondences
RU2478337C2 (en) Method of determining heart contour on fluorogpaphy images
Al-Qunaieer et al. Multi-resolution level sets with shape priors: a validation report for 2D segmentation of prostate gland in T2W MR images
Kerr et al. “True” color surface anatomy: mapping the Visible Human to patient-specific CT data
Liu et al. Study and application of medical image visualization technology
Xie Design and Development of Medical Image Processing Experiment System Based on IDL Language.
Khan et al. Achieving enhanced accuracy and strength performance with parallel programming for invariant affine point cloud registration

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111102

Termination date: 20171125

CF01 Termination of patent right due to non-payment of annual fee