CN104966100A - A benign and malignant image lump classification method based on texture primitives - Google Patents

A benign and malignant image lump classification method based on texture primitives

Info

Publication number
CN104966100A
Authority
CN
China
Prior art keywords
image
lump
dic
outer peripheral
peripheral areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510337202.XA
Other languages
Chinese (zh)
Inventor
李艳凤
陈后金
魏学业
李居朋
彭亚辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiaotong University
Original Assignee
Beijing Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiaotong University filed Critical Beijing Jiaotong University
Priority to CN201510337202.XA priority Critical patent/CN104966100A/en
Publication of CN104966100A publication Critical patent/CN104966100A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24137Distances to cluster centroïds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a benign and malignant image lump classification method based on texture primitives. The method mainly comprises the following steps: dividing a normalized lump region image into a central region and a peripheral region; obtaining the feature vector of the lump region image from the features of the central region and the peripheral region, combining the feature vectors into a training sample feature matrix, and training a K-nearest-neighbor classifier; extracting the feature vector F_q of the normalized lump region image to be identified; and inputting the feature vector F_q into the trained K-nearest-neighbor classifier to obtain the benign/malignant classification result of the lump region image to be identified. Taking into account the feature differences between the central region and the peripheral region of the lump region, the method builds texture dictionaries and extracts features for the central region and the peripheral region separately, introduces linear discriminant analysis into the construction of the texture primitive dictionary, and thereby achieves effective benign/malignant classification of image lumps based on texture primitives.

Description

Benign and malignant image lump classification method based on texture primitives
Technical field
The present invention relates to the field of image lump recognition, and in particular to a benign and malignant image lump classification method based on texture primitives.
Background art
Breast cancer is the most common malignant tumor among women. China has become one of the countries with the fastest-growing breast cancer incidence, and the age of onset of breast cancer in China is trending younger. Among the many breast imaging techniques, mammography is one of the most commonly used clinical screening methods for breast cancer. A lump is one of the most common signs of breast cancer, and benign and malignant breast lumps are differentiated by their shape, margin and texture characteristics. Benign/malignant lump classification based on image processing and pattern recognition can provide objective auxiliary advice for physicians and reduce unnecessary biopsies, and is therefore of great significance for the early clinical diagnosis of breast cancer.
Among existing lump classification methods, morphological features and texture features are the most common feature representations. Morphological features place high demands on the segmentation accuracy of the lump margin; in clinical images, however, lumps overlap with glandular tissue, so automatic and accurate segmentation of the lump margin is difficult, and such features are hard to apply clinically. Compared with morphological features, texture features are more robust and less demanding on the segmentation accuracy of the lump margin.
The shortcomings of existing texture-based lump classification methods are that texture primitives do not take the spatial distribution of pixels into account, and that the class information of the training images is lost when the texture dictionary is built.
Summary of the invention
The embodiments of the present invention provide a benign and malignant image lump classification method based on texture primitives, to achieve effective benign/malignant classification of image lumps based on texture primitives.
To achieve this goal, the invention adopts the following technical solution.
A benign and malignant image lump classification method based on texture primitives, comprising:
normalizing training lump region images, dividing each normalized lump region image into a central region and a peripheral region, building a discriminative texture dictionary for the central region and for the peripheral region respectively, and then obtaining the features of the central region and of the peripheral region respectively;
merging the features of the central region and the peripheral region to obtain the feature vector of the normalized lump region image;
combining the feature vectors of the normalized training lump region images into a training sample feature matrix and training a K-nearest-neighbor classifier;
normalizing a lump region image to be identified, extracting the feature vector F_q of the normalized lump region image, and inputting the feature vector F_q into the trained K-nearest-neighbor classifier to obtain the benign/malignant classification result of the lump region image to be identified.
Preferably, normalizing the training lump region images, dividing each normalized training image into a central region and a peripheral region, building a discriminative texture dictionary for the central region and for the peripheral region of the training images respectively, and then obtaining the features of the central region and of the peripheral region of the training images respectively, comprises:
performing brightness normalization on the training image of the image lump, the normalized image being denoted I_n; building a rectangular template with two coaxial layers, Mask_in and Mask_ex, the length and width of Mask_in and Mask_ex being set multiples of the length and width of the lump training image; and using the rectangular template to divide the lump training image into a central region R_in and a peripheral region R_ex, where R_in = I_n × Mask_in and R_ex = I_n × Mask_ex;
extracting the M × M gray-level neighborhood of each pixel in the central region R_in of the training image, reshaping each M × M neighborhood into a row vector of length M², all row vectors forming the central-region neighborhood matrix G_in; extracting the M × M gray-level neighborhood of each pixel in the peripheral region R_ex, reshaping each M × M neighborhood into a row vector of length M², all row vectors forming the peripheral-region neighborhood matrix G_ex;
applying linear discriminant analysis to the central-region neighborhood matrix G_in and its corresponding class-label matrix to obtain the central-region neighborhood weight row vector ω_in; applying linear discriminant analysis to the peripheral-region neighborhood matrix G_ex and its corresponding class-label matrix to obtain the peripheral-region neighborhood weight row vector ω_ex;
multiplying each row of the central-region neighborhood matrix G_in element-wise by the central-region neighborhood weight row vector ω_in to obtain the central-region discriminative neighborhood matrix Gd_in; multiplying each row of the peripheral-region neighborhood matrix G_ex element-wise by the peripheral-region neighborhood weight row vector ω_ex to obtain the peripheral-region discriminative neighborhood matrix Gd_ex;
applying k-means clustering to Gd_in and Gd_ex respectively, to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex;
obtaining the central-region feature F_in of the training image from the central-region neighborhood matrix G_in, ω_in and Dic_in; and obtaining the peripheral-region feature F_ex of the training image from the peripheral-region neighborhood matrix G_ex, ω_ex and Dic_ex.
Preferably, applying k-means clustering to Gd_in and Gd_ex of the training images respectively to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex comprises:
applying k-means clustering to the Gd_in corresponding to the benign lump training images to obtain cluster centers Cb_in, the number of cluster centers being the number of primitives K_dic contained in the discriminative texture primitive dictionary, each center being a row vector of length M²; applying k-means clustering to the Gd_in corresponding to the malignant lump training images to obtain cluster centers Cm_in, the number of cluster centers being K_dic, each center being a row vector of length M²; combining Cb_in and Cm_in into a matrix [Cb_in; Cm_in] with 2K_dic rows and M² columns; removing the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; and taking the remaining K_dic centers as the central-region texture dictionary Dic_in;
applying k-means clustering to the Gd_ex corresponding to the benign lump training images to obtain cluster centers Cb_ex, the number of cluster centers being the number of primitives K_dic contained in the discriminative texture primitive dictionary, each center being a row vector of length M²; applying k-means clustering to the Gd_ex corresponding to the malignant lump training images to obtain cluster centers Cm_ex, the number of cluster centers being K_dic, each center being a row vector of length M²; combining Cb_ex and Cm_ex into a matrix [Cb_ex; Cm_ex] with 2K_dic rows and M² columns; removing the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; and taking the remaining K_dic centers as the peripheral-region texture dictionary Dic_ex.
Preferably, obtaining the central-region feature F_in from G_in, ω_in and Dic_in comprises:
multiplying each row G_in^i of G_in element-wise by ω_in to obtain the central-region discriminative neighborhood vector Gd_in^i, where Gd_in^i(k) = ω_in(k)·G_in^i(k), k = 1, ..., M²; quantizing Gd_in^i to the quantization label of the texture primitive in Dic_in that is nearest to it in Euclidean distance, obtaining the quantization label L_in(Gd_in^i), computed as:
L_in(Gd_in^i) = argmin_j { d(Gd_in^i, Dic_in^j) },  j = 1, ..., K_dic,
where d is the Euclidean distance and K_dic is the number of texture primitives contained in the discriminative texture dictionary;
after the quantization label of every pixel in the central region has been obtained, computing the histogram of all quantization labels as the central-region feature F_in:
F_in(j) = Σ_{i=1}^{N_in} δ[L_in(Gd_in^i), j] / N_in,
where N_in is the number of pixels in the central region and the δ function is
δ[L_in(Gd_in^i), j] = 1 if L_in(Gd_in^i) = j, and 0 if L_in(Gd_in^i) ≠ j.
Preferably, obtaining the peripheral-region feature F_ex from G_ex, ω_ex and Dic_ex comprises:
multiplying each row G_ex^i of G_ex element-wise by ω_ex to obtain the peripheral-region discriminative neighborhood vector Gd_ex^i, where Gd_ex^i(k) = ω_ex(k)·G_ex^i(k), k = 1, ..., M²;
quantizing Gd_ex^i to the quantization label of the texture primitive in Dic_ex that is nearest to it, obtaining the quantization label L_ex(Gd_ex^i) = argmin_j { d(Gd_ex^i, Dic_ex^j) }, j = 1, ..., K_dic;
after the quantization label of every pixel in the peripheral region has been obtained, computing the histogram of all quantization labels as the peripheral-region feature F_ex, where F_ex(j) = Σ_{i=1}^{N_ex} δ[L_ex(Gd_ex^i), j] / N_ex,
where N_ex is the number of pixels in the peripheral region.
Preferably, merging the features of the central region and the peripheral region of the training image to obtain the feature vector of the training image comprises:
multiplying the central-region feature F_in by a factor λ and concatenating it with the peripheral-region feature F_ex, giving the feature vector of the training image F = [λF_in, F_ex].
Preferably, training the K-nearest-neighbor classifier with the feature vectors of the training images comprises:
combining the feature vectors of all training images into a training sample feature matrix, combining the training sample feature matrix with its class labels, and training the K-nearest-neighbor classifier, the distance dis(F_1, F_2) used as input to the K-nearest-neighbor classifier being computed as:
dis(F_1, F_2) = (1/2) Σ_{j=1}^{K_dic} [F_1(j) − F_2(j)]² / [F_1(j) + F_2(j)] + (1/2) Σ_{i=K_dic+1}^{2K_dic} [F_1(i) − F_2(i)]² / [F_1(i) + F_2(i)],
where F_1 and F_2 are the feature vectors of two training images.
Preferably, inputting the feature vector of the normalized lump region image to be identified into the trained K-nearest-neighbor classifier to obtain the benign/malignant classification result of the lump region image to be identified comprises:
extracting the feature vector F_q of the image to be identified with the training-image feature extraction method described above, and inputting F_q into the trained K-nearest-neighbor classifier; among the feature vectors of the training images in the K-nearest-neighbor classifier, determining the K lump images nearest to F_q; counting the number L_1 of benign lumps and the number L_2 of malignant lumps among the K lump images; if L_1 ≥ L_2, the lump image to be identified is judged benign, and if L_1 < L_2, it is judged malignant.
As can be seen from the technical solution provided by the above embodiments of the invention, to solve the problem that texture primitives do not take the spatial distribution of pixels into account, the embodiments start from the growth pattern of tumors, design a template with multiple coaxial layers, and divide the lump region into a central region and a peripheral region. Taking into account the feature differences between the central region and the peripheral region, a texture dictionary is built and features are extracted for each of the two regions separately. To solve the problem that texture primitives lose class information when the texture dictionary is built, the invention introduces linear discriminant analysis into the construction of the texture primitive dictionary, proposes sub-region discriminative texture primitives for extracting benign/malignant lump features, and thereby achieves effective benign/malignant classification of image lumps based on texture primitives.
Additional aspects and advantages of the invention are given in part in the following description; they will become apparent from the description, or may be learned by practice of the invention.
Brief description of the drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of the benign and malignant image lump classification method based on texture primitives according to an embodiment of the present invention;
Fig. 2 is a sample lump region image used for training in step 1 of the method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of dividing a training image into the central region R_in and the peripheral region R_ex with the above rectangular template, as provided by an embodiment of the present invention;
Fig. 4 is the lump region image to be identified in step 14 of the method according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below; examples of the embodiments are shown in the drawings, in which the same or similar reference numerals throughout denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary and serve only to explain the present invention; they shall not be construed as limiting the present invention.
Those skilled in the art will understand that, unless expressly stated otherwise, the singular forms "a", "an", "the" and "said" used herein may also include the plural forms. It should be further understood that the word "comprising" used in the specification of the present invention means that the stated features, integers, steps, operations, elements and/or components are present, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It should be understood that when an element is said to be "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intermediate elements may be present. In addition, "connected" or "coupled" as used herein may include a wireless connection or coupling. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Those skilled in the art will understand that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have meanings consistent with their meanings in the context of the prior art and, unless defined as herein, shall not be interpreted in an idealized or overly formal sense.
The present invention provides a benign and malignant image lump classification method based on texture primitives. The flowchart is shown in Fig. 1; the method comprises a classifier training process and a benign/malignant lump classification process using the trained classifier.
1. Classifier training process, comprising the following steps:
Step 1: perform brightness normalization on each training lump region image I to reduce the influence of contrast variations of image I. Specifically:
I_n(x, y) = [I(x, y) − I_min] / (I_max − I_min),
where I(x, y) is the gray value of pixel (x, y), I_min and I_max are the minimum and maximum gray values of image I, and I_n(x, y) is the normalized gray value of pixel (x, y).
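As an illustration only, a minimal NumPy sketch of this per-image brightness normalization; the function name and the guard for a constant-valued image are assumptions, not part of the patent:

```python
import numpy as np

def normalize_brightness(image):
    """Step 1: rescale the gray values of a lump region image to [0, 1]."""
    image = image.astype(np.float64)
    i_min, i_max = image.min(), image.max()
    if i_max == i_min:          # flat image; avoid division by zero (assumption)
        return np.zeros_like(image)
    return (image - i_min) / (i_max - i_min)
```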
Step 2: for each training image, build a binary rectangular template with two coaxial layers, Mask_in and Mask_ex; the length and width of Mask_in and Mask_ex are α times the length and width of the training image, where α is a preset coefficient. Use this rectangular template to divide the training image into the central region R_in and the peripheral region R_ex, specifically R_in = I_n × Mask_in and R_ex = I_n × Mask_ex.
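A sketch of one possible reading of the two-layer template: Mask_in covers the centered α-fraction rectangle and Mask_ex is its complement, consistent with Fig. 3 and α = 0.5 in embodiment one; the exact geometry of Mask_ex is an assumption.

```python
import numpy as np

def coaxial_masks(height, width, alpha=0.5):
    """Step 2: binary rectangular template with two coaxial layers.

    Assumption: Mask_in is the centered (alpha*height) x (alpha*width)
    rectangle and Mask_ex is its complement (the outer ring).
    """
    mask_in = np.zeros((height, width))
    h_in, w_in = int(round(alpha * height)), int(round(alpha * width))
    top, left = (height - h_in) // 2, (width - w_in) // 2
    mask_in[top:top + h_in, left:left + w_in] = 1.0
    mask_ex = 1.0 - mask_in
    return mask_in, mask_ex

def split_regions(i_n, alpha=0.5):
    """Divide the normalized image I_n into R_in and R_ex by point-wise masking."""
    mask_in, mask_ex = coaxial_masks(*i_n.shape, alpha=alpha)
    return i_n * mask_in, i_n * mask_ex
```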
Step 3: for each training image, extract the M × M gray-level neighborhood of each pixel in the central region of the image and reshape it into a row vector of length M²; all row vectors form the central-region neighborhood matrix G_in. If the training image is a benign lump, the class label of every pixel in the central region of this image is 1; if the training image is a malignant lump, the class label of every pixel in the central region is 2. The class labels of all pixels form the central-region class-label matrix.
Step 4: for each training image, extract the M × M gray-level neighborhood of each pixel in the peripheral region and reshape it into a row vector of length M²; all row vectors form the peripheral-region neighborhood matrix G_ex. If the training image is a benign lump, the class label of every pixel in the peripheral region of this image is 1; if the training image is a malignant lump, the class label of every pixel in the peripheral region is 2. The class labels of all pixels form the peripheral-region class-label matrix.
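A sketch of the neighborhood matrices of steps 3–4; skipping pixels whose M × M window extends outside the image is an assumption, since the patent does not specify boundary handling:

```python
import numpy as np

def neighborhood_matrix(i_n, region_mask, label, m=5):
    """Steps 3-4: stack the M x M gray-level neighborhoods of a region's pixels.

    Returns G (one row of length M^2 per pixel) and the per-pixel class labels
    (1 = benign, 2 = malignant).  Assumption: pixels whose window would leave
    the image are skipped.
    """
    r = m // 2
    rows, labels = [], []
    for y, x in zip(*np.nonzero(region_mask)):
        if y < r or x < r or y + r >= i_n.shape[0] or x + r >= i_n.shape[1]:
            continue
        rows.append(i_n[y - r:y + r + 1, x - r:x + r + 1].ravel())
        labels.append(label)
    return np.asarray(rows), np.asarray(labels)
```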
Step 5: apply LDA (Linear Discriminant Analysis) to the central-region neighborhood matrix G_in and its corresponding class-label matrix to obtain the central-region neighborhood weight row vector ω_in.
Step 6: apply LDA to the peripheral-region neighborhood matrix G_ex and its corresponding class-label matrix to obtain the peripheral-region neighborhood weight row vector ω_ex.
Step 7: multiply each row of the neighborhood matrix G_in element-wise by the central-region neighborhood weight row vector ω_in to obtain the central-region discriminative neighborhood matrix Gd_in.
Step 8: multiply each row of the neighborhood matrix G_ex element-wise by the peripheral-region neighborhood weight row vector ω_ex to obtain the peripheral-region discriminative neighborhood matrix Gd_ex.
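A sketch of steps 5–8 using scikit-learn. The patent only states that LDA is applied to the neighborhood matrix and its class labels to yield a weight row vector; taking the absolute value of the fitted LDA coefficient vector as ω is an assumption.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def lda_weight_vector(g, labels):
    """Steps 5-6: derive a neighborhood weight row vector of length M^2 from G
    and its per-pixel class labels (1 = benign, 2 = malignant)."""
    lda = LinearDiscriminantAnalysis().fit(g, labels)
    return np.abs(lda.coef_.ravel())      # assumption: |LDA coefficients| as weights

def discriminative_neighborhoods(g, omega):
    """Steps 7-8: multiply every row of G element-wise by the weight vector."""
    return g * omega[np.newaxis, :]
```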
Step 9: apply k-means clustering to Gd_in and Gd_ex respectively to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex. The initial centers of k-means are chosen at random; to reduce the influence of this random choice, k-means clustering is run several times, and the centers giving the highest training accuracy are taken as Dic_in and Dic_ex. Each computation of Dic_in and Dic_ex proceeds as follows:
1) apply k-means clustering to the Gd_in corresponding to the benign lump training images to obtain cluster centers Cb_in; the number of cluster centers is K_dic, and each center is a row vector of length M².
2) apply k-means clustering to the Gd_in corresponding to the malignant lump training images to obtain cluster centers Cm_in; the number of cluster centers is K_dic, and each center is a row vector of length M².
3) combine Cb_in and Cm_in into a matrix [Cb_in; Cm_in] with 2K_dic rows and M² columns; remove the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; the remaining K_dic centers are taken as the central-region texture dictionary Dic_in.
4) apply k-means clustering to the Gd_ex corresponding to the benign lump training images to obtain cluster centers Cb_ex; the number of cluster centers is K_dic, and each center is a row vector of length M².
5) apply k-means clustering to the Gd_ex corresponding to the malignant lump training images to obtain cluster centers Cm_ex; the number of cluster centers is K_dic, and each center is a row vector of length M².
6) combine Cb_ex and Cm_ex into a matrix [Cb_ex; Cm_ex] with 2K_dic rows and M² columns; remove the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; the remaining K_dic centers are taken as the peripheral-region texture dictionary Dic_ex.
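A sketch of steps 9.1–9.6 for one region. The outer loop that repeats k-means and keeps the centers with the highest training accuracy is omitted; interpreting "close to the other row vectors" as the smallest average Euclidean distance follows the text above.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

def build_texture_dictionary(gd_benign, gd_malignant, k_dic=100):
    """Step 9: discriminative texture primitive dictionary for one region.

    Cluster the benign and malignant discriminative neighborhoods separately,
    stack the 2*K_dic centers, and drop the K_dic centers whose average
    Euclidean distance to the other centers is smallest.
    """
    cb = KMeans(n_clusters=k_dic, n_init=10).fit(gd_benign).cluster_centers_
    cm = KMeans(n_clusters=k_dic, n_init=10).fit(gd_malignant).cluster_centers_
    centers = np.vstack([cb, cm])                     # 2*K_dic rows of length M^2
    d = cdist(centers, centers)                       # pairwise Euclidean distances
    avg_dist = d.sum(axis=1) / (len(centers) - 1)     # mean distance to the others
    keep = np.argsort(avg_dist)[k_dic:]               # discard the K_dic closest centers
    return centers[keep]
```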
Step 10: for each training image, use G_in, ω_in and Dic_in to obtain the central-region feature F_in, as follows:
1) multiply each row G_in^i of G_in element-wise by ω_in to obtain the central-region discriminative neighborhood vector Gd_in^i, specifically Gd_in^i(k) = ω_in(k)·G_in^i(k), k = 1, ..., M².
2) quantize Gd_in^i to the texture primitive in Dic_in that is nearest to it, obtaining the quantization label L_in(Gd_in^i), computed as:
L_in(Gd_in^i) = argmin_j { d(Gd_in^i, Dic_in^j) },  j = 1, ..., K_dic,
where d is the Euclidean distance and K_dic is the number of texture primitives contained in the discriminative texture dictionary, the same K_dic as in step 9.
3) after the quantization label of every pixel in the central region has been obtained, compute the histogram of all quantization labels as the central-region feature F_in:
F_in(j) = Σ_{i=1}^{N_in} δ[L_in(Gd_in^i), j] / N_in,
where N_in is the number of pixels in the central region and the δ function is
δ[L_in(Gd_in^i), j] = 1 if L_in(Gd_in^i) = j, and 0 if L_in(Gd_in^i) ≠ j.
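A sketch of the quantization and histogram of steps 10–11 (the same routine serves the central and peripheral regions with their respective ω and Dic); the function name is illustrative.

```python
import numpy as np
from scipy.spatial.distance import cdist

def texture_histogram(g, omega, dictionary):
    """Steps 10-11: assign each weighted neighborhood to its nearest texture
    primitive and return the normalized histogram of quantization labels."""
    gd = g * omega[np.newaxis, :]                      # discriminative neighborhoods Gd
    labels = np.argmin(cdist(gd, dictionary), axis=1)  # L(Gd^i): nearest primitive index
    k_dic = dictionary.shape[0]
    hist = np.bincount(labels, minlength=k_dic).astype(np.float64)
    return hist / len(labels)                          # F(j) = count of label j / N
```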
Step 11: for each training image, use G_ex, ω_ex and Dic_ex to obtain the peripheral-region feature F_ex, as follows:
1) multiply each row G_ex^i of G_ex element-wise by ω_ex to obtain the peripheral-region discriminative neighborhood vector Gd_ex^i, Gd_ex^i(k) = ω_ex(k)·G_ex^i(k), k = 1, ..., M².
2) quantize Gd_ex^i to the texture primitive in Dic_ex that is nearest to it, obtaining the quantization label L_ex(Gd_ex^i) = argmin_j { d(Gd_ex^i, Dic_ex^j) }, j = 1, ..., K_dic.
3) after the quantization label of every pixel in the peripheral region has been obtained, compute the histogram of all quantization labels as the peripheral-region feature F_ex:
F_ex(j) = Σ_{i=1}^{N_ex} δ[L_ex(Gd_ex^i), j] / N_ex,
where N_ex is the number of pixels in the peripheral region.
Step 12: multiply the central-region feature F_in by a factor λ and concatenate it with the peripheral-region feature F_ex, giving the feature vector of the training image F = [λF_in, F_ex].
Step 13: combine the feature vectors of all training images into the training sample feature matrix, combine the training sample feature matrix with its class labels, and train the K-nearest-neighbor classifier. The distance dis(F_1, F_2) used as input to the K-nearest-neighbor classifier is computed as:
dis(F_1, F_2) = (1/2) Σ_{j=1}^{K_dic} [F_1(j) − F_2(j)]² / [F_1(j) + F_2(j)] + (1/2) Σ_{i=K_dic+1}^{2K_dic} [F_1(i) − F_2(i)]² / [F_1(i) + F_2(i)],
where F_1 and F_2 are the feature vectors of two training images.
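A sketch of steps 12–13. Because both sums in dis(F_1, F_2) carry the same 1/2 weight, the distance equals one half of the chi-square distance over the full 2K_dic-dimensional feature; the small eps that guards empty histogram bins is an assumption.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def combine_features(f_in, f_ex, lam=0.8):
    """Step 12: weight the central-region histogram by lambda and concatenate."""
    return np.concatenate([lam * f_in, f_ex])

def chi2_distance(f1, f2, eps=1e-12):
    """Step 13: chi-square-type distance between two 2*K_dic feature vectors."""
    return 0.5 * np.sum((f1 - f2) ** 2 / (f1 + f2 + eps))

def train_knn(features, y, k=3):
    """Step 13: KNN classifier over the training sample feature matrix."""
    knn = KNeighborsClassifier(n_neighbors=k, metric=chi2_distance, algorithm="brute")
    return knn.fit(features, y)
```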
2. Benign/malignant lump classification using the trained classifier, comprising the following steps:
Step 14: apply the normalization described in step 1 to the lump region image to be identified.
Step 15: obtain the feature vector F_q of this lump region image following the procedure described in steps 10 to 12.
Step 16: input the feature vector F_q of the image to be identified into the trained K-nearest-neighbor classifier; among the feature vectors of the training images in the K-nearest-neighbor classifier, determine the K lump images nearest to F_q; count the number L_1 of benign lumps and the number L_2 of malignant lumps among the K lump images; if L_1 ≥ L_2, the lump image to be identified is judged benign, and if L_1 < L_2, it is judged malignant.
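A sketch of steps 14–16, assuming the trained classifier and the ω and Dic obtained from the training sketches above; with K = 3 and two classes, scikit-learn's majority vote coincides with the L_1 ≥ L_2 rule.

```python
import numpy as np

def classify_lump(knn, omega_in, dic_in, omega_ex, dic_ex, image,
                  alpha=0.5, m=5, lam=0.8):
    """Steps 14-16: normalize the unknown lump region image, extract F_q with the
    sub-region texture primitives, and return 1 (benign) or 2 (malignant)."""
    i_n = normalize_brightness(image)                            # step 14
    mask_in, mask_ex = coaxial_masks(*i_n.shape, alpha=alpha)
    g_in, _ = neighborhood_matrix(i_n, mask_in, label=0, m=m)    # label unused here
    g_ex, _ = neighborhood_matrix(i_n, mask_ex, label=0, m=m)
    f_q = combine_features(texture_histogram(g_in, omega_in, dic_in),
                           texture_histogram(g_ex, omega_ex, dic_ex), lam=lam)  # step 15
    return int(knn.predict(f_q[np.newaxis, :])[0])               # step 16
```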
To facilitate understanding of the embodiments of the present invention, a specific embodiment is further explained below with reference to the drawings; this embodiment does not limit the embodiments of the present invention.
Embodiment one
This embodiment provides a benign/malignant lump classification method that combines multi-coaxial-template region division with discriminative texture primitives, implemented by the following steps:
Step 1: perform brightness normalization on the training lump region images, so that the gray values of each image are normalized to [0, 1]. A sample training lump region image of this embodiment is shown in Fig. 2.
Step 2: for each training image, build the rectangular template with two coaxial layers, as shown in Fig. 3, and use this template to divide the training image into the central region and the peripheral region. In this example, α = 0.5.
Step 3: for each training image, extract the M × M gray-level neighborhood of each pixel in the central region, reshape each neighborhood into a row vector of length M², let all row vectors form the central-region neighborhood matrix G_in, and then compute the central-region class-label matrix. In this example, M = 5.
Step 4: for each training image, extract the M × M gray-level neighborhood of each pixel in the peripheral region, reshape each neighborhood into a row vector of length M², let all row vectors form the peripheral-region neighborhood matrix G_ex, and then compute the peripheral-region class-label matrix.
Step 5: apply LDA to the central-region neighborhood matrix G_in and its corresponding class-label matrix to obtain the central-region neighborhood weight row vector ω_in.
Step 6: apply LDA to the peripheral-region neighborhood matrix G_ex and its corresponding class-label matrix to obtain the peripheral-region neighborhood weight row vector ω_ex.
Step 7: multiply each row of the neighborhood matrix G_in element-wise by the central-region neighborhood weight row vector ω_in to obtain the central-region discriminative neighborhood matrix Gd_in.
Step 8: multiply each row of the neighborhood matrix G_ex element-wise by the peripheral-region neighborhood weight row vector ω_ex to obtain the peripheral-region discriminative neighborhood matrix Gd_ex.
Step 9: apply k-means clustering to Gd_in and Gd_ex respectively to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex. In this example, the number of texture primitives contained in each discriminative texture dictionary is K_dic = 100.
Step 10: for each training image, use G_in, ω_in and Dic_in to obtain the central-region feature F_in.
Step 11: for each training image, use G_ex, ω_ex and Dic_ex to obtain the peripheral-region feature F_ex.
Step 12: multiply the central-region feature F_in by a factor λ and concatenate it with the peripheral-region feature F_ex as the feature vector F of this training image. In this example, λ = 0.8.
Step 13: combine the feature vectors of all training images into the training sample feature matrix, combine the training sample feature matrix with its class labels, and train the K-nearest-neighbor classifier. In this example, the parameter of the K-nearest-neighbor classifier is K = 3.
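Putting the embodiment together, a training sketch with the parameters of this example (α = 0.5, M = 5, K_dic = 100, λ = 0.8, K = 3), assuming the helper functions sketched in steps 1–13 above and lists train_images / train_labels with labels 1 = benign, 2 = malignant; variable names are illustrative.

```python
import numpy as np

ALPHA, M, K_DIC, LAM, K = 0.5, 5, 100, 0.8, 3

def region_data(images, labels, region):
    """Collect per-image neighborhood matrices for one region ('in' or 'ex')."""
    per_image, all_rows, all_labels = [], [], []
    for img, lbl in zip(images, labels):
        i_n = normalize_brightness(img)                               # step 1
        mask_in, mask_ex = coaxial_masks(*i_n.shape, alpha=ALPHA)     # step 2
        mask = mask_in if region == "in" else mask_ex
        g, g_lbls = neighborhood_matrix(i_n, mask, lbl, m=M)          # steps 3-4
        per_image.append(g)
        all_rows.append(g)
        all_labels.append(g_lbls)
    return per_image, np.vstack(all_rows), np.concatenate(all_labels)

def train(images, labels):
    feats, omegas, dicts = {}, {}, {}
    for region in ("in", "ex"):
        per_image, g_all, lbl_all = region_data(images, labels, region)
        omegas[region] = lda_weight_vector(g_all, lbl_all)            # steps 5-6
        gd_all = discriminative_neighborhoods(g_all, omegas[region])  # steps 7-8
        dicts[region] = build_texture_dictionary(gd_all[lbl_all == 1],
                                                 gd_all[lbl_all == 2],
                                                 k_dic=K_DIC)         # step 9
        feats[region] = [texture_histogram(g, omegas[region], dicts[region])
                         for g in per_image]                          # steps 10-11
    features = np.array([combine_features(f_in, f_ex, lam=LAM)        # step 12
                         for f_in, f_ex in zip(feats["in"], feats["ex"])])
    return train_knn(features, np.asarray(labels), k=K), omegas, dicts  # step 13
```

An unknown lump region image can then be classified with classify_lump(knn, omegas["in"], dicts["in"], omegas["ex"], dicts["ex"], image), as sketched after step 16 above.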
Step 14: apply the normalization described in step 1 to the lump region image to be identified; the lump region image to be identified in this embodiment is shown in Fig. 4.
Step 15: obtain the feature vector F_q of this lump region image following the procedure described in steps 10 to 12.
Step 16: input the feature vector F_q of the image to be identified into the trained K-nearest-neighbor classifier to obtain the classification result.
In summary, to solve the problem that texture primitives do not take the spatial distribution of pixels into account, the embodiments of the present invention start from the growth pattern of tumors, design a template with multiple coaxial layers, and divide the lump region into a central region and a peripheral region. Taking into account the feature differences between the central region and the peripheral region, a texture dictionary is built and features are extracted for each of the two regions separately. To solve the problem that texture primitives lose class information when the texture dictionary is built, the invention introduces linear discriminant analysis into the construction of the texture primitive dictionary, proposes sub-region discriminative texture primitives for extracting benign/malignant lump features, and thereby achieves effective benign/malignant classification of image lumps based on texture primitives.
The embodiments of the present invention perform benign/malignant identification directly on the region containing the lump, without requiring accurate segmentation of the lump region margin. In clinical application they can provide objective auxiliary advice for physicians and reduce unnecessary biopsies.
Those of ordinary skill in the art will understand that the drawings are merely schematic diagrams of one embodiment, and that the modules or processes in the drawings are not necessarily required for implementing the present invention.
From the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software plus a necessary general hardware platform. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as a ROM/RAM, magnetic disk or optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in certain parts of the embodiments, of the present invention.
The embodiments in this specification are described progressively; for identical or similar parts the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, device and system embodiments, being substantially similar to the method embodiments, are described relatively briefly; for the relevant parts, refer to the description of the method embodiments. The device and system embodiments described above are merely schematic; the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the objective of the solution of the embodiment, which those of ordinary skill in the art can understand and implement without creative effort.
The above are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily occur to those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A benign and malignant image lump classification method based on texture primitives, characterized by comprising:
normalizing training lump region images, dividing each normalized lump region image into a central region and a peripheral region, building a discriminative texture dictionary for the central region and for the peripheral region respectively, and then obtaining the features of the central region and of the peripheral region respectively;
merging the features of the central region and the peripheral region to obtain the feature vector of the normalized lump region image;
combining the feature vectors of the normalized training lump region images into a training sample feature matrix and training a K-nearest-neighbor classifier;
normalizing a lump region image to be identified, extracting the feature vector F_q of the normalized lump region image, and inputting the feature vector F_q into the trained K-nearest-neighbor classifier to obtain the benign/malignant classification result of the lump region image to be identified.
2. The benign and malignant image lump classification method based on texture primitives according to claim 1, characterized in that normalizing the training lump region images, dividing each normalized training image into a central region and a peripheral region, building a discriminative texture dictionary for the central region and for the peripheral region of the training images respectively, and then obtaining the features of the central region and of the peripheral region of the training images respectively, comprises:
performing brightness normalization on the training image of the image lump, the normalized image being denoted I_n; building a rectangular template with two coaxial layers, Mask_in and Mask_ex, the length and width of Mask_in and Mask_ex being set multiples of the length and width of the lump training image; and using the rectangular template to divide the lump training image into a central region R_in and a peripheral region R_ex, where R_in = I_n × Mask_in and R_ex = I_n × Mask_ex;
extracting the M × M gray-level neighborhood of each pixel in the central region R_in of the training image, reshaping each M × M neighborhood into a row vector of length M², all row vectors forming the central-region neighborhood matrix G_in; extracting the M × M gray-level neighborhood of each pixel in the peripheral region R_ex, reshaping each M × M neighborhood into a row vector of length M², all row vectors forming the peripheral-region neighborhood matrix G_ex;
applying linear discriminant analysis to the central-region neighborhood matrix G_in and its corresponding class-label matrix to obtain the central-region neighborhood weight row vector ω_in; applying linear discriminant analysis to the peripheral-region neighborhood matrix G_ex and its corresponding class-label matrix to obtain the peripheral-region neighborhood weight row vector ω_ex;
multiplying each row of the central-region neighborhood matrix G_in element-wise by the central-region neighborhood weight row vector ω_in to obtain the central-region discriminative neighborhood matrix Gd_in; multiplying each row of the peripheral-region neighborhood matrix G_ex element-wise by the peripheral-region neighborhood weight row vector ω_ex to obtain the peripheral-region discriminative neighborhood matrix Gd_ex;
applying k-means clustering to Gd_in and Gd_ex respectively, to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex;
obtaining the central-region feature F_in of the training image from the central-region neighborhood matrix G_in, ω_in and Dic_in; and obtaining the peripheral-region feature F_ex of the training image from the peripheral-region neighborhood matrix G_ex, ω_ex and Dic_ex.
3. The benign and malignant image lump classification method based on texture primitives according to claim 2, characterized in that applying k-means clustering to Gd_in and Gd_ex of the training images respectively to obtain the central-region discriminative texture primitive dictionary Dic_in and the peripheral-region discriminative texture primitive dictionary Dic_ex comprises:
applying k-means clustering to the Gd_in corresponding to the benign lump training images to obtain cluster centers Cb_in, the number of cluster centers being the number of primitives K_dic contained in the discriminative texture primitive dictionary, each center being a row vector of length M²; applying k-means clustering to the Gd_in corresponding to the malignant lump training images to obtain cluster centers Cm_in, the number of cluster centers being K_dic, each center being a row vector of length M²; combining Cb_in and Cm_in into a matrix [Cb_in; Cm_in] with 2K_dic rows and M² columns; removing the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; and taking the remaining K_dic centers as the central-region texture dictionary Dic_in;
applying k-means clustering to the Gd_ex corresponding to the benign lump training images to obtain cluster centers Cb_ex, the number of cluster centers being the number of primitives K_dic contained in the discriminative texture primitive dictionary, each center being a row vector of length M²; applying k-means clustering to the Gd_ex corresponding to the malignant lump training images to obtain cluster centers Cm_ex, the number of cluster centers being K_dic, each center being a row vector of length M²; combining Cb_ex and Cm_ex into a matrix [Cb_ex; Cm_ex] with 2K_dic rows and M² columns; removing the K_dic row vectors whose average Euclidean distance to the other row vectors is smallest; and taking the remaining K_dic centers as the peripheral-region texture dictionary Dic_ex.
4. The benign and malignant image lump classification method based on texture primitives according to claim 2, characterized in that obtaining the central-region feature F_in from G_in, ω_in and Dic_in comprises:
multiplying each row G_in^i of G_in element-wise by ω_in to obtain the central-region discriminative neighborhood vector Gd_in^i, where Gd_in^i(k) = ω_in(k)·G_in^i(k), k = 1, ..., M²; quantizing Gd_in^i to the quantization label of the texture primitive in Dic_in that is nearest to it in Euclidean distance, obtaining the quantization label L_in(Gd_in^i), computed as:
L_in(Gd_in^i) = argmin_j { d(Gd_in^i, Dic_in^j) },  j = 1, ..., K_dic,
where d is the Euclidean distance and K_dic is the number of texture primitives contained in the discriminative texture dictionary;
after the quantization label of every pixel in the central region has been obtained, computing the histogram of all quantization labels as the central-region feature F_in:
F_in(j) = Σ_{i=1}^{N_in} δ[L_in(Gd_in^i), j] / N_in,
where N_in is the number of pixels in the central region and the δ function is
δ[L_in(Gd_in^i), j] = 1 if L_in(Gd_in^i) = j, and 0 if L_in(Gd_in^i) ≠ j.
5. The benign and malignant image lump classification method based on texture primitives according to claim 2, characterized in that obtaining the peripheral-region feature F_ex from G_ex, ω_ex and Dic_ex comprises:
multiplying each row G_ex^i of G_ex element-wise by ω_ex to obtain the peripheral-region discriminative neighborhood vector Gd_ex^i, where Gd_ex^i(k) = ω_ex(k)·G_ex^i(k), k = 1, ..., M²;
quantizing Gd_ex^i to the quantization label of the texture primitive in Dic_ex that is nearest to it, obtaining the quantization label L_ex(Gd_ex^i) = argmin_j { d(Gd_ex^i, Dic_ex^j) }, j = 1, ..., K_dic;
after the quantization label of every pixel in the peripheral region has been obtained, computing the histogram of all quantization labels as the peripheral-region feature F_ex, where F_ex(j) = Σ_{i=1}^{N_ex} δ[L_ex(Gd_ex^i), j] / N_ex;
where N_ex is the number of pixels in the peripheral region.
6. The benign and malignant image lump classification method based on texture primitives according to claim 1, characterized in that merging the features of the central region and the peripheral region of the training image to obtain the feature vector of the training image comprises:
multiplying the central-region feature F_in by a factor λ and concatenating it with the peripheral-region feature F_ex, giving the feature vector of the training image F = [λF_in, F_ex].
7. The benign and malignant image lump classification method based on texture primitives according to claim 1, characterized in that training the K-nearest-neighbor classifier with the feature vectors of the training images comprises:
combining the feature vectors of all training images into a training sample feature matrix, combining the training sample feature matrix with its class labels, and training the K-nearest-neighbor classifier, the distance dis(F_1, F_2) used as input to the K-nearest-neighbor classifier being computed as:
dis(F_1, F_2) = (1/2) Σ_{j=1}^{K_dic} [F_1(j) − F_2(j)]² / [F_1(j) + F_2(j)] + (1/2) Σ_{i=K_dic+1}^{2K_dic} [F_1(i) − F_2(i)]² / [F_1(i) + F_2(i)],
where F_1 and F_2 are the feature vectors of two training images.
8. The benign and malignant image lump classification method based on texture primitives according to any one of claims 1 to 7, characterized in that inputting the feature vector of the normalized lump region image to be identified into the trained K-nearest-neighbor classifier to obtain the benign/malignant classification result of the lump region image to be identified comprises:
extracting the feature vector F_q of the image to be identified with the training-image feature extraction method, and inputting the feature vector F_q into the trained K-nearest-neighbor classifier; among the feature vectors of the training images in the K-nearest-neighbor classifier, determining the K lump images nearest to F_q; counting the number L_1 of benign lumps and the number L_2 of malignant lumps among the K lump images; if L_1 ≥ L_2, the lump image to be identified is judged benign, and if L_1 < L_2, it is judged malignant.
CN201510337202.XA 2015-06-17 2015-06-17 A benign and malignant image lump classification method based on texture primitives Pending CN104966100A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510337202.XA CN104966100A (en) 2015-06-17 2015-06-17 A benign and malignant image lump classification method based on texture primitives

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510337202.XA CN104966100A (en) 2015-06-17 2015-06-17 A benign and malignant image lump classification method based on texture primitives

Publications (1)

Publication Number Publication Date
CN104966100A true CN104966100A (en) 2015-10-07

Family

ID=54220135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510337202.XA Pending CN104966100A (en) 2015-06-17 2015-06-17 A benign and malignant image lump classification method based on texture primitives

Country Status (1)

Country Link
CN (1) CN104966100A (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108280835A (en) * 2018-01-23 2018-07-13 研靖信息科技(上海)有限公司 A kind of method and apparatus determining lesion property based on imaging agent concentration
CN108830835A (en) * 2018-05-25 2018-11-16 北京长木谷医疗科技有限公司 It identifies the method for spinal sagittal bit image exception and calculates equipment
CN108830835B (en) * 2018-05-25 2021-12-03 北京长木谷医疗科技有限公司 Method and computing equipment for identifying spine sagittal position image abnormity
CN109191424A (en) * 2018-07-23 2019-01-11 哈尔滨工业大学(深圳) A kind of detection of breast lump and categorizing system, computer readable storage medium
CN109191424B (en) * 2018-07-23 2022-04-22 哈尔滨工业大学(深圳) Breast mass detection and classification system and computer-readable storage medium
CN111340135A (en) * 2020-03-12 2020-06-26 广州领拓医疗科技有限公司 Renal mass classification method based on random projection
CN111340135B (en) * 2020-03-12 2021-07-23 甄鑫 Renal mass classification method based on random projection

Similar Documents

Publication Publication Date Title
Lladó et al. A textural approach for mass false positive reduction in mammography
Wajid et al. Local energy-based shape histogram feature extraction technique for breast cancer diagnosis
CN106683076B (en) The method of train wheel tread damage detection based on textural characteristics cluster
CN105138970B (en) Classification of Polarimetric SAR Image method based on spatial information
US20080166035A1 (en) Computer-Aided Pathological Diagnosis System
Pham et al. A comparative study for classification of skin cancer
Li et al. Texton analysis for mass classification in mammograms
CN102509286B (en) Target region sketching method for medical image
CN105913086A (en) Computer-aided mammary gland diagnosing method by means of characteristic weight adaptive selection
CN102163281B (en) Real-time human body detection method based on AdaBoost frame and colour of head
CN104376147A (en) Image-based risk score-a prognostic predictor of survival and outcome from digital histopathology
Hua et al. Multimodal brain tumor segmentation using cascaded V-Nets
Xu et al. Adjustable adaboost classifier and pyramid features for image-based cervical cancer diagnosis
CN106326916B (en) Object detection method based on Analysis On Multi-scale Features estimation and high-order BING feature
CN104966100A (en) A benign and malignant image lump classification method based on texture primitives
Casanova et al. Texture analysis using fractal descriptors estimated by the mutual interference of color channels
Palma et al. Detection of masses and architectural distortions in digital breast tomosynthesis images using fuzzy and a contrario approaches
Georgiou et al. Multi-scaled morphological features for the characterization of mammographic masses using statistical classification schemes
CN102509104A (en) Confidence map-based method for distinguishing and detecting virtual object of augmented reality scene
CN104517116A (en) Device and method for confirming object region in image
CN108427913A (en) The Hyperspectral Image Classification method of combined spectral, space and hierarchy information
CN103871066A (en) Method for constructing similarity matrix in ultrasound image Ncut segmentation process
CN101551854A (en) A processing system of unbalanced medical image and processing method thereof
CN103678552A (en) Remote-sensing image retrieving method and system based on salient regional features
Gardezi et al. Fusion of completed local binary pattern features with curvelet features for mammogram classification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20151007

RJ01 Rejection of invention patent application after publication