CN106503727B - Method and device for hyperspectral image classification - Google Patents
Method and device for hyperspectral image classification
- Publication number
- CN106503727B CN106503727B CN201610872312.0A CN201610872312A CN106503727B CN 106503727 B CN106503727 B CN 106503727B CN 201610872312 A CN201610872312 A CN 201610872312A CN 106503727 B CN106503727 B CN 106503727B
- Authority
- CN
- China
- Prior art keywords
- matrix
- sample
- category
- laplacian
- similarity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
Landscapes
- Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
The present invention provides a method and device for hyperspectral image classification. The method comprises: determining training samples from a hyperspectral image in advance, and assigning a corresponding class label to each training sample; extracting at least two image features of the hyperspectral image; determining the similarity matrix of the nearest-neighbor graph of each image feature; determining the graph Laplacian matrix of each similarity matrix; projecting each graph Laplacian matrix into the Grassmann manifold, and determining, in the Grassmann manifold, the distance between every two graph Laplacian matrices; determining the label matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the class labels of the training samples; and determining, according to the label matrix, the class label of each sample of the hyperspectral image. Because classification is based on at least two image features, the method and device of the present invention can improve the accuracy of classification.
Description
Technical field
The present invention relates to the technical field of image processing, and in particular to a method and device for hyperspectral image classification.
Background technique
With the rapid development of image processing techniques, they have been widely applied in fields such as military affairs, surveillance, agriculture, and chemistry. Among these applications, classifying a target to be sorted by means of image processing has received increasing attention.
In the prior art, a target to be sorted is classified mainly according to the texture features of the various parts of its two-dimensional image: parts whose texture features are similar are assigned to the same class. For example, if the target to be sorted is a geographic area containing rivers, wheat fields, houses, and so on, the prior art distinguishes these parts according to the texture features of the two-dimensional image of the geographic area.
As can be seen from the above description, the prior art classifies a two-dimensional image according to the texture features of its parts. However, parts of a two-dimensional image that belong to different classes may still have very similar texture features, so parts of different classes are easily assigned to the same class during classification. The classification accuracy of the prior art is therefore low.
Summary of the invention
The embodiments of the present invention provide a method and device for hyperspectral image classification that can improve the accuracy of classification.
In one aspect, an embodiment of the present invention provides a method for hyperspectral image classification, comprising:
S1: determining training samples from a hyperspectral image in advance, and assigning a corresponding class label to each training sample;
S2: extracting at least two image features of the hyperspectral image;
S3: determining the similarity matrix of the nearest-neighbor graph of each image feature;
S4: determining the graph Laplacian matrix of each similarity matrix;
S5: projecting each graph Laplacian matrix into the Grassmann manifold, and determining, in the Grassmann manifold, the distance between every two graph Laplacian matrices;
S6: determining the label matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the class labels of the training samples;
S7: determining, according to the label matrix, the class label of each sample of the hyperspectral image.
Further, S6 comprises:
A1: determining the initial label matrix of the hyperspectral image according to the class label of each training sample;
A2: determining, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
A3: determining the initial value of the fusion weight of each graph Laplacian matrix in the objective function, wherein the objective function is:
where Q is the label matrix, Y is the initial label matrix of the hyperspectral image, L(m) is the m-th graph Laplacian matrix, U is a preset matrix, γα is a first preset constraint parameter, γs is a second preset constraint parameter, S(m) is the sum of the distances between the m-th graph Laplacian matrix and the other graph Laplacian matrices, α(m) is the fusion weight of the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
A4: updating the label matrix according to the fusion weight of each graph Laplacian matrix and formula one;
A5: updating the fusion weight of each graph Laplacian matrix according to the label matrix and formula two, where g(m) = Tr((Q − Y)^T U(Q − Y));
A6: judging, according to the fusion weight of each graph Laplacian matrix and the label matrix, whether the value of the objective function converges; if so, outputting the label matrix; otherwise, returning to A4.
Further, after S1 and before S6, the method further comprises:
B1: judging whether the number of training samples is less than a first preset value; if so, executing B2 to B9 in sequence;
B2: performing principal component analysis on the spectra of the hyperspectral image to extract the principal components of the spectra of the hyperspectral image;
B3: determining the gradient matrix of each principal component;
B4: superposing the gradient matrices of the principal components to generate the gradient matrix of the hyperspectral image;
B5: performing watershed segmentation on the gradient matrix of the hyperspectral image to obtain at least one segmented region;
B6: clustering each segmented region, determining the cluster center of each segmented region, and taking the cluster center of each segmented region as an auxiliary sample;
B7: determining, for each auxiliary sample, its most similar training sample;
B8: assigning the class label of the most similar training sample to the corresponding auxiliary sample;
B9: taking each auxiliary sample that has been assigned a class label as a training sample, and returning to B1.
Further, S5 comprises:
performing eigenvalue decomposition on each graph Laplacian matrix to obtain the eigenvalues of each graph Laplacian matrix;
for each graph Laplacian matrix, selecting from its eigenvalues the largest eigenvalues, a second preset value in number, as space eigenvalues;
generating, from the eigenvectors of the space eigenvalues of each graph Laplacian matrix, the feature space corresponding to each graph Laplacian matrix;
determining the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
taking the geodesic distance corresponding to every two graph Laplacian matrices as the distance between the two graph Laplacian matrices.
Further, the image features comprise a spectral feature;
S2 comprises:
clustering the hyperspectral image according to the mutual information between the spectra in the hyperspectral image to generate the spectrum blocks of the hyperspectral image;
taking each spectrum block as one spectral feature.
Further, S4 comprises:
determining the graph Laplacian matrix of each similarity matrix according to formula three, wherein formula three is:
L = I − D^(−1/2) W D^(−1/2)
where L is the graph Laplacian matrix, W is the similarity matrix, I is the identity matrix, and D is a diagonal matrix whose i-th diagonal element is ∑j Wij.
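With W, D, and I defined as above, formula three is the symmetric normalized graph Laplacian, which can be sketched in a few lines of numpy (the small guard against empty rows is an implementation assumption, not part of the patent):

```python
import numpy as np

def graph_laplacian(W):
    """Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2},
    where D is diagonal with D_ii = sum_j W_ij (formula three)."""
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))  # guard against empty rows
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(W.shape[0]) - D_inv_sqrt @ W @ D_inv_sqrt

# tiny symmetric similarity matrix for illustration
W = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 0.2],
              [0.5, 0.2, 0.0]])
L = graph_laplacian(W)
```

Because W has a zero diagonal, the diagonal of L is 1, and L is symmetric positive semi-definite, as required for the eigenvalue decomposition in S5.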
Further, the image features comprise a spectral feature;
S3 comprises:
determining the spectral value of each sample according to the spectral feature of the hyperspectral image;
determining, according to the spectral values of the samples, the Euclidean distance between every two samples in the spectral feature;
determining the initial spectral similarity between every two samples from the Euclidean distance between them, using a Gaussian kernel function;
selecting for each sample, according to the initial spectral similarities, the first nearest-neighbor samples with the largest initial spectral similarity, a third preset value in number;
generating the neighbor sample set of each sample from its third-preset-value first nearest-neighbor samples;
determining the neighbor-set similarity between the neighbor sample sets of every two samples;
multiplying the initial spectral similarity of every two samples by their neighbor-set similarity to obtain the first final similarity of the two samples;
obtaining the similarity matrix of the nearest-neighbor graph of the spectral feature from the third-preset-value largest first final similarities of each sample.
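The construction above can be sketched as follows. The patent does not specify how the neighbor-set similarity is computed, so Jaccard overlap of the neighbor sets is used here purely as an illustrative assumption:

```python
import numpy as np

def spectral_knn_similarity(X, k=2, sigma=1.0):
    """Sketch of the spectral nearest-neighbor graph: Gaussian-kernel
    initial similarity, refined by the similarity of k-nearest-neighbor
    sets. Neighbor-set similarity is ASSUMED to be Jaccard overlap."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared Euclidean
    S0 = np.exp(-d2 / (2 * sigma ** 2))                   # initial spectral similarity
    np.fill_diagonal(S0, 0.0)
    nbrs = [set(np.argsort(-S0[i])[:k]) for i in range(n)]  # k nearest per sample
    W = np.zeros((n, n))
    for i in range(n):
        for j in nbrs[i]:
            overlap = len(nbrs[i] & nbrs[j]) / len(nbrs[i] | nbrs[j])
            W[i, j] = S0[i, j] * overlap   # final similarity, kept for k nearest only
    return np.maximum(W, W.T)              # symmetrize

X = np.array([[0.0], [0.1], [1.0], [1.1]])   # toy 1-band spectra
W = spectral_knn_similarity(X, k=2, sigma=0.5)
```

Samples with close spectra and overlapping neighborhoods end up with the largest final similarities, which is the effect the two-stage construction aims at.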
Further, the image features comprise a spatial-position feature;
S3 comprises:
determining the spatial position of each sample according to the spatial-position feature of the hyperspectral image;
determining the distance between every two samples of the hyperspectral image according to their spatial positions;
determining the initial spatial-position similarity between every two samples from the distance between them, using a Gaussian kernel function;
selecting for each sample, according to the initial spatial-position similarities, the second nearest-neighbor samples with the largest initial spatial-position similarity, a fourth preset value in number;
determining, with the spectra of the samples as features and using a Gaussian kernel function, the intermediate similarity between each sample and each of its second nearest-neighbor samples;
multiplying the initial spatial-position similarity of each sample and each of its second nearest-neighbor samples by the corresponding intermediate similarity to obtain the second final similarity of each sample and each of its second nearest-neighbor samples;
obtaining the similarity matrix of the nearest-neighbor graph of the spatial-position feature from the second final similarities of each sample and its second nearest-neighbor samples.
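The spatial-position graph differs from the spectral one in that neighbors are selected spatially and then reweighted spectrally; a minimal sketch, with the kernel bandwidths as free assumptions:

```python
import numpy as np

def spatial_spectral_similarity(coords, spectra, k=2, sigma_s=1.0, sigma_f=1.0):
    """Sketch of the spatial-position graph: Gaussian similarity on pixel
    coordinates selects the k nearest spatial neighbors; each retained edge
    is reweighted by a Gaussian similarity on the spectra."""
    n = coords.shape[0]
    ds = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    Ss = np.exp(-ds / (2 * sigma_s ** 2))        # initial spatial similarity
    np.fill_diagonal(Ss, 0.0)
    df = ((spectra[:, None, :] - spectra[None, :, :]) ** 2).sum(-1)
    Sf = np.exp(-df / (2 * sigma_f ** 2))        # intermediate spectral similarity
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(-Ss[i])[:k]:         # k nearest spatial neighbors
            W[i, j] = Ss[i, j] * Sf[i, j]        # second final similarity
    return np.maximum(W, W.T)

coords = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
spectra = np.array([[1.0], [1.0], [0.0]])
W = spatial_spectral_similarity(coords, spectra)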
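```

The multiplication ensures that spatially adjacent pixels are connected strongly only when their spectra also agree.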
Further, the image features comprise a texture feature;
S3 comprises:
determining the texture feature of each sample according to the texture feature of the hyperspectral image;
determining the texture similarity between every two samples from the texture features of the samples, using a Gaussian kernel function;
obtaining the similarity matrix of the nearest-neighbor graph of the texture feature from the fifth-preset-value largest texture similarities of each sample.
In another aspect, an embodiment of the present invention provides a device for hyperspectral image classification, comprising:
a sample determination unit, configured to determine training samples from a hyperspectral image and to assign a corresponding class label to each training sample;
an extraction unit, configured to extract at least two image features of the hyperspectral image;
a similarity matrix determination unit, configured to determine the similarity matrix of the nearest-neighbor graph of each image feature;
a graph Laplacian matrix determination unit, configured to determine the graph Laplacian matrix of each similarity matrix;
a distance determination unit, configured to project each graph Laplacian matrix into the Grassmann manifold, and to determine, in the Grassmann manifold, the distance between every two graph Laplacian matrices;
a label matrix determination unit, configured to determine the label matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the training samples;
a label determination unit, configured to determine the class label of each sample of the hyperspectral image according to the label matrix.
Further, the label matrix determination unit comprises:
a first determination subunit, configured to determine the initial label matrix of the hyperspectral image according to the class label of each training sample;
a second determination subunit, configured to determine, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
a third determination subunit, configured to determine the initial value of the fusion weight of each graph Laplacian matrix in the objective function, wherein the objective function is:
where Q is the label matrix, Y is the initial label matrix of the hyperspectral image, L(m) is the m-th graph Laplacian matrix, U is a preset matrix, γα is a first preset constraint parameter, γs is a second preset constraint parameter, S(m) is the sum of the distances between the m-th graph Laplacian matrix and the other graph Laplacian matrices, α(m) is the fusion weight of the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
a label matrix update subunit, configured to update the label matrix according to the fusion weight of each graph Laplacian matrix and formula one;
a fusion weight update subunit, configured to update the fusion weight of each graph Laplacian matrix according to the label matrix determined by the label matrix update subunit and formula two, where g(m) = Tr((Q − Y)^T U(Q − Y));
a judgment subunit, configured to judge, according to the fusion weights determined by the fusion weight update subunit and the label matrix determined by the label matrix update subunit, whether the value of the objective function converges; if so, to output the label matrix; otherwise, to return to the processing of the label matrix update subunit.
Further, the distance determination unit comprises:
an eigenvalue decomposition subunit, configured to perform eigenvalue decomposition on each graph Laplacian matrix to obtain the eigenvalues of each graph Laplacian matrix;
a space eigenvalue determination subunit, configured to select, for each graph Laplacian matrix, the largest eigenvalues, a second preset value in number, from its eigenvalues as space eigenvalues;
a feature space generation subunit, configured to generate the feature space corresponding to each graph Laplacian matrix from the eigenvectors of its space eigenvalues;
a geodesic distance determination subunit, configured to determine the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
a matrix distance determination subunit, configured to take the geodesic distance corresponding to every two graph Laplacian matrices as the distance between the two graph Laplacian matrices.
In the embodiments of the present invention, at least two image features of the hyperspectral image are extracted, the graph Laplacian matrix corresponding to each image feature is determined, the label matrix of the hyperspectral image is obtained in combination with the class labels of the training samples on the hyperspectral image, and the class label of each sample of the hyperspectral image is determined according to the label matrix. Because the embodiments of the present invention classify on the basis of at least two image features, the accuracy of classification is improved.
Detailed description of the invention
In order to more clearly explain the embodiment of the invention or the technical proposal in the existing technology, to embodiment or will show below
There is attached drawing needed in technical description to be briefly described, it should be apparent that, the accompanying drawings in the following description is the present invention
Some embodiments for those of ordinary skill in the art without creative efforts, can also basis
These attached drawings obtain other attached drawings.
Fig. 1 is a flowchart of a method for hyperspectral image classification provided by an embodiment of the present invention;
Fig. 2 is a flowchart of another method for hyperspectral image classification provided by an embodiment of the present invention;
Fig. 3 is a flowchart of another method for hyperspectral image classification provided by an embodiment of the present invention;
Fig. 4 is a flowchart of another method for hyperspectral image classification provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a device for hyperspectral image classification provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of another device for hyperspectral image classification provided by an embodiment of the present invention.
Specific embodiments
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art on the basis of the embodiments of the present invention without creative work shall fall within the protection scope of the present invention.
As shown in Fig. 1, an embodiment of the present invention provides a method for hyperspectral image classification, which may comprise the following steps:
S1: determining training samples from a hyperspectral image in advance, and assigning a corresponding class label to each training sample;
S2: extracting at least two image features of the hyperspectral image;
S3: determining the similarity matrix of the nearest-neighbor graph of each image feature;
S4: determining the graph Laplacian matrix of each similarity matrix;
S5: projecting each graph Laplacian matrix into the Grassmann manifold, and determining, in the Grassmann manifold, the distance between every two graph Laplacian matrices;
S6: determining the label matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the class labels of the training samples;
S7: determining, according to the label matrix, the class label of each sample of the hyperspectral image.
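The steps above can be condensed into a toy, heavily simplified end-to-end sketch. This is NOT the patented method itself: the Grassmann-distance-driven fusion of S5–S6 is replaced by uniform weights over the two Laplacians, each "image feature" is just a toy coordinate set, and the closed-form update Q = (L + U)^(−1) U Y is an assumption consistent with the stationarity condition discussed later in the description:

```python
import numpy as np

# Toy data: two well-separated clusters of 3 samples each.
rng = np.random.default_rng(0)
n = 6
spectra = np.vstack([rng.normal(0, .1, (3, 2)), rng.normal(3, .1, (3, 2))])
coords  = np.vstack([rng.normal(0, .1, (3, 2)), rng.normal(3, .1, (3, 2))])
features = [spectra, coords]                      # S2: two image features

def laplacian_from_feature(X, sigma=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))            # S3: Gaussian similarity
    np.fill_diagonal(W, 0.0)
    Dm = np.diag(1 / np.sqrt(W.sum(1)))
    return np.eye(len(X)) - Dm @ W @ Dm           # S4: normalized Laplacian

Ls = [laplacian_from_feature(X) for X in features]

# S5 is skipped here: uniform fusion weights stand in for the
# Grassmann-distance-driven weights. S6: propagate the training labels.
Y = np.zeros((n, 2)); Y[0, 0] = 1; Y[3, 1] = 1    # S1: samples 0 and 3 labeled
U = np.eye(n)                                     # preset weighting matrix
L = sum(Ls) / len(Ls)
Q = np.linalg.solve(L + U, U @ Y)                 # S6: label matrix
labels = Q.argmax(axis=1)                         # S7: per-sample class label
```

On this toy data the labels of the two training samples propagate to the rest of their clusters, which is the qualitative behavior the method relies on.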
In the embodiments of the present invention, at least two image features of the hyperspectral image are extracted, the graph Laplacian matrix corresponding to each image feature is determined, the label matrix of the hyperspectral image is obtained in combination with the class labels of the training samples on the hyperspectral image, and the class label of each sample of the hyperspectral image is determined according to the label matrix. Because the embodiments of the present invention classify on the basis of at least two image features, the accuracy of classification is improved.
In order to further improve the accuracy of classification, based on the method for hyperspectral image classification shown in Fig. 1, in an embodiment of the present invention, as shown in Fig. 2, S6 comprises:
Step 201: determining the initial label matrix of the hyperspectral image according to the class label of each training sample;
Step 202: determining, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
Step 203: determining the initial value of the fusion weight of each graph Laplacian matrix in the objective function, wherein the objective function is:
where Q is the label matrix, Y is the initial label matrix of the hyperspectral image, L(m) is the m-th graph Laplacian matrix, U is a preset matrix, γα is a first preset constraint parameter, γs is a second preset constraint parameter, S(m) is the sum of the distances between the m-th graph Laplacian matrix and the other graph Laplacian matrices, α(m) is the fusion weight of the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
Step 204: updating the label matrix according to the fusion weight of each graph Laplacian matrix and formula one;
Step 205: updating the fusion weight of each graph Laplacian matrix according to the label matrix and formula two, where g(m) = Tr((Q − Y)^T U(Q − Y));
Step 206: judging, according to the fusion weight of each graph Laplacian matrix and the label matrix, whether the value of the objective function converges; if so, executing step 207; otherwise, returning to step 204.
Step 207: outputting the label matrix.
In the embodiments of the present invention, the rows of the label matrix index the samples and the columns index the class labels; each element indicates whether the sample of the current row belongs to the class of the current column. For the initial label matrix, because the class labels of the training samples are known, the element at the intersection of each training sample's row and the column of its class is 1, and the other elements of that row are 0. For example, suppose the hyperspectral image has 4 samples, of which sample 1 and sample 3 are training samples, and the samples are divided into 3 classes; sample 1 belongs to class 1 and sample 3 belongs to class 2. The rows of the initial label matrix then correspond to samples 1 to 4 and the columns to classes 1 to 3, so the initial label matrix is:
Y = [[1, 0, 0],
     [0, 0, 0],
     [0, 1, 0],
     [0, 0, 0]]
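The construction of the initial label matrix is mechanical and can be sketched directly (zero-based indices are used in the code, while the text counts samples and classes from 1):

```python
import numpy as np

def initial_label_matrix(n_samples, n_classes, labeled):
    """Rows index samples, columns index class labels; for each training
    sample the entry in its class column is 1, all other entries are 0."""
    Y = np.zeros((n_samples, n_classes))
    for sample, cls in labeled.items():
        Y[sample, cls] = 1.0
    return Y

# The example from the text: 4 samples, 3 classes,
# sample 1 -> class 1 and sample 3 -> class 2 (1-based in the text).
Y = initial_label_matrix(4, 3, {0: 0, 2: 1})
```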
Here S(m) can be expressed as S(m) = ∑i d(i,m), where d(i,m) is the distance between the i-th graph Laplacian matrix and the m-th graph Laplacian matrix. For example, suppose M, the number of graph Laplacian matrices, is 3. Then S(1) is the sum of the distances between the 1st graph Laplacian matrix and the other graph Laplacian matrices; specifically, the distance between the 1st and the 2nd graph Laplacian matrices is d(2,1), the distance between the 1st and the 3rd graph Laplacian matrices is d(3,1), and S(1) is the sum of d(2,1) and d(3,1).
In one possible implementation, the initial value of the fusion weight of each graph Laplacian matrix is set to 1/M. U can be used to control the importance of the samples. In this embodiment, α(m) and Q change continuously during the loop iterations. Formula one can be obtained by taking the partial derivative of the objective function with respect to Q and setting the derivative to 0; formula one follows from this stationarity condition. Formula two can be obtained by applying the method of Lagrange multipliers to the objective function, which yields a function φ(α, β), where β is the Lagrange multiplier; taking the partial derivatives of φ(α, β) with respect to α(m) and β, and setting each to 0, derives formula two.
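The alternating structure of A4–A6 can be sketched as follows. Formulas one and two are not reproduced in the source, so the closed-form updates below are a plausible reconstruction from the derivation just described (Q from the stationarity condition in Q, α from a Lagrange-multiplier step under the simplex constraint ∑α(m) = 1, with g'(m) assumed to combine Tr(Q^T L(m) Q) and the γs S(m) penalty), not the patent's exact formulas:

```python
import numpy as np

def fuse_and_propagate(Ls, S_sums, Y, U, gamma_a=1.0, gamma_s=0.1,
                       n_iter=50, tol=1e-8):
    """Alternating optimization in the spirit of A4-A6 (reconstructed
    updates, see lead-in): update Q with alpha fixed, then alpha with
    Q fixed, until the objective value stops changing."""
    M = len(Ls)
    alpha = np.full(M, 1.0 / M)                  # A3: uniform initial weights
    prev = np.inf
    for _ in range(n_iter):
        L = sum(a * Lm for a, Lm in zip(alpha, Ls))
        Q = np.linalg.solve(L + U, U @ Y)        # A4: update label matrix
        g = np.array([np.trace(Q.T @ Lm @ Q) + gamma_s * Sm
                      for Lm, Sm in zip(Ls, S_sums)])
        alpha = 1.0 / M + (g.mean() - g) / (2 * gamma_a)  # A5: update weights
        obj = (alpha @ g + np.trace((Q - Y).T @ U @ (Q - Y))
               + gamma_a * (alpha ** 2).sum())
        if abs(prev - obj) < tol:                # A6: convergence check
            break
        prev = obj
    return Q, alpha

# toy inputs: two diagonal "Laplacians", two labeled samples
n = 4
Ls = [np.eye(n) * 0.5, np.eye(n) * 0.25]
S_sums = [1.0, 1.0]
Y = np.zeros((n, 2)); Y[0, 0] = 1; Y[1, 1] = 1
Q, alpha = fuse_and_propagate(Ls, S_sums, Y, np.eye(n))
```

Note that the α update keeps ∑α(m) = 1 by construction, matching the role of the multiplier β.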
After the label matrix is determined, step S7 may comprise: taking, for each sample, the class label of the column that contains the largest element of the sample's row as the class label of that sample. For example, if sample A corresponds to the i-th row of the label matrix, the element in column j of row i is the largest, and column j corresponds to class n, then the class label of sample A is n. If the row of the current sample contains more than one maximal element, one maximal element can be chosen at random, and the class label of the current sample is determined from the randomly chosen maximal element.
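This per-row argmax with random tie-breaking can be sketched directly:

```python
import numpy as np

def labels_from_matrix(Q, rng=None):
    """S7: each sample takes the class of the largest entry in its row of
    the label matrix; ties are broken by picking one maximal column at
    random, as described in the text."""
    rng = rng or np.random.default_rng(0)
    labels = np.empty(Q.shape[0], dtype=int)
    for i, row in enumerate(Q):
        best = np.flatnonzero(row == row.max())  # all maximal columns
        labels[i] = rng.choice(best)             # random tie-break
    return labels

Q = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.2, 0.6],
              [0.4, 0.4, 0.2]])   # last row ties between columns 0 and 1
labels = labels_from_matrix(Q)
```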
In order to obtain a better classification effect, in an embodiment of the present invention, after S1 and before S6, the method further comprises:
B1: judging whether the number of training samples is less than a first preset value; if so, executing B2 to B9 in sequence;
B2: performing principal component analysis on the spectra of the hyperspectral image to extract the principal components of the spectra of the hyperspectral image;
B3: determining the gradient matrix of each principal component;
B4: superposing the gradient matrices of the principal components to generate the gradient matrix of the hyperspectral image;
B5: performing watershed segmentation on the gradient matrix of the hyperspectral image to obtain at least one segmented region;
B6: clustering each segmented region, determining the cluster center of each segmented region, and taking the cluster center of each segmented region as an auxiliary sample;
B7: determining, for each auxiliary sample, its most similar training sample;
B8: assigning the class label of the most similar training sample to the corresponding auxiliary sample;
B9: taking each auxiliary sample that has been assigned a class label as a training sample, and returning to B1.
When the number of training samples is less than the first preset value, the method provided by the embodiments of the present invention can increase the number of training samples, and through the additional training samples the accuracy of classification is improved and a better classification effect is obtained. This realizes high-precision classification with only a small number of training samples and reduces the difficulty of manually labeling hyperspectral images for classification.
The first preset value in this embodiment may be 5. By performing principal component analysis on the hyperspectral image and extracting the principal components of its spectra, subsequent processing operates on the principal components; this reduces the dimensionality of the hyperspectral image and thus reduces the amount of computation and increases classification speed without affecting classification accuracy. Superposing the gradient matrices of the principal components specifically means adding the elements of the gradient matrices of the principal components element-wise. When performing the watershed segmentation in B5, the segmentation parameters can be determined according to the value range of the spectral bands of the hyperspectral image; different segmentation parameters realize segmentations of the hyperspectral image at different scales. Specifically, larger segmentation parameters yield larger segmented regions and smaller segmentation parameters yield smaller segmented regions. The clustering in B6 can be performed by the K-means method, specifically with K = 1. The resulting cluster center is a sample, the most representative sample of the segmented region, and this sample is taken as the auxiliary sample. In B7, the most similar training sample for each auxiliary sample can be determined by computing the Euclidean distance between each auxiliary sample and each training sample; specifically, the smaller the Euclidean distance, the more similar the samples. Furthermore, a training sample set can be maintained to store the training samples, and the auxiliary samples that have been assigned class labels are added to the training sample set.
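Steps B2–B8 can be sketched on a toy cube. The PCA is done with an SVD and the gradient magnitudes of the principal-component images are summed (B2–B4); the watershed segmentation of B5 is assumed to be supplied externally (e.g. by an off-the-shelf implementation), so the `regions` label map below is a stand-in. K-means with K = 1 reduces to the mean, giving one auxiliary sample per region (B6), which then copies the label of its nearest training sample in Euclidean distance (B7–B8):

```python
import numpy as np

rng = np.random.default_rng(1)
cube = rng.random((8, 8, 5))                     # toy hyperspectral image, H x W x bands
flat = cube.reshape(-1, 5)

# B2: PCA via SVD; keep the first two principal-component images.
flat_c = flat - flat.mean(0)
_, _, Vt = np.linalg.svd(flat_c, full_matrices=False)
pcs = (flat_c @ Vt[:2].T).reshape(8, 8, 2)

# B3-B4: superpose the gradient magnitudes of the principal components.
grad = np.zeros((8, 8))
for k in range(2):
    gy, gx = np.gradient(pcs[:, :, k])
    grad += np.hypot(gy, gx)

# B5 stand-in: two fake regions in place of a real watershed label map.
regions = np.arange(64).reshape(8, 8) // 32

# B6: K-means with K = 1 per region is just the region mean.
aux = np.array([flat[(regions == r).ravel()].mean(0) for r in (0, 1)])

# B7-B8: each auxiliary sample copies its nearest training sample's label.
train = flat[[0, 40]]; train_labels = np.array([3, 7])
dists = ((aux[:, None, :] - train[None, :, :]) ** 2).sum(-1)
aux_labels = train_labels[dists.argmin(1)]
```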
Based on a kind of method of classification hyperspectral imagery shown in FIG. 1, in an embodiment of the present invention, as shown in figure 3, institute
State S5, comprising:
Step 301: Eigenvalues Decomposition being carried out to each figure Laplacian Matrix, obtains each figure Laplce
The characteristic value of matrix;
Step 302: each figure Laplacian Matrix is directed to, from each feature of each figure Laplacian Matrix
In value, the second maximum space eigenvalues of preset value numerical value are chosen;
Specifically, for the 1st figure Laplacian Matrix, the 1st figure Laplacian Matrix has 5 characteristic values, is respectively
1,2,3,4,5;Second preset value is 3, then 3 maximum space eigenvalues, that is, selection spy are chosen from this 5 characteristic values
5,4,3 these three characteristic values of value indicative are as space eigenvalues.
In addition, the second preset value can be made to meet: the sky determined by the second preset value when determining the second preset value
Between the sum of characteristic value account for the 90% of the sum of all characteristic values.
Step 303: generating, from the eigenvectors corresponding to the spatial eigenvalues of each graph Laplacian matrix, the feature space corresponding to that graph Laplacian matrix;
Specifically, for the 1st graph Laplacian matrix, the feature space of the 1st graph Laplacian matrix is generated from the eigenvectors corresponding to the spatial eigenvalues 5, 4 and 3.
Step 304: determining the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
Specifically, the farther the geodesic distance between a pair of feature spaces, the poorer the expressive ability.
Step 305: taking the geodesic distance corresponding to every two graph Laplacian matrices as the distance between the two graph Laplacian matrices.
Specifically, the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices serves as the distance between the two graph Laplacian matrices.
In an embodiment of the present invention, the image features include spectral features;
the S2 includes:
performing clustering processing on the hyperspectral image according to the mutual information between the spectra in the hyperspectral image, to generate the spectrum blocks of the hyperspectral image;
taking each spectrum block as one spectral feature.
Different samples behave differently across the spectrum: a sample may be distinctive in some spectral features and indistinct in others. In this embodiment, by dividing the hyperspectral image into spectrum blocks, each sample can be described from different spectrum blocks, and each spectrum block is used for classification as an individual spectral feature. Spectrum blocks are more discriminative and allow samples to be classified from different spectral ranges, making the classification results more accurate. The at least two image features may include at least two spectral features. In addition, to improve classification accuracy, the spectrum blocks produced by clustering that contain few bands may be removed; specifically, a preset number of spectrum blocks containing the most bands are chosen and used as the spectral features.
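The patent only states that the bands are clustered "according to mutual information"; the sketch below is one plausible reading, not the patented procedure: mutual information between bands is estimated from a joint histogram, and adjacent bands are greedily grouped into a block while consecutive bands remain highly dependent. Both function names and the greedy grouping rule are this sketch's assumptions.

```python
import numpy as np

def band_mutual_information(x, y, bins=32):
    """Mutual information (in nats) between two spectral bands,
    estimated from a joint histogram of their pixel values."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of band x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of band y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def group_bands_into_blocks(cube, threshold=0.5):
    """Greedily group adjacent bands of an H x W x B cube into spectrum
    blocks: a band joins the current block while its mutual information
    with the previous band stays above the threshold."""
    n_bands = cube.shape[2]
    blocks, current = [], [0]
    for b in range(1, n_bands):
        if band_mutual_information(cube[:, :, b - 1], cube[:, :, b]) >= threshold:
            current.append(b)
        else:
            blocks.append(current)
            current = [b]
    blocks.append(current)
    return blocks
```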
In an embodiment of the present invention, the image features include spectral features;
the S3 includes:
determining, according to the spectral features of the hyperspectral image, the spectral value of each sample;
Specifically, the spectral value here may be the values on the bands.
determining, according to the spectral value of each sample, the Euclidean distance between every two samples in the spectral feature;
Specifically, the Euclidean distance is calculated between every pair of samples.
determining, using a Gaussian kernel function and according to the Euclidean distance between every two samples, the initial spectral similarity between every two samples;
selecting for each sample, according to the initial spectral similarities between every two samples, a third preset number of first nearest samples with the largest initial spectral similarity;
Specifically, when choosing the first nearest samples of the current sample, the other samples are sorted in descending order of their initial spectral similarity to the current sample, and the top third-preset-value samples are selected.
generating, according to the third preset number of first nearest samples corresponding to each sample, the nearest-neighbor sample set of each sample;
determining the neighbor-set similarity between the nearest-neighbor sample sets of every two samples;
Specifically, the nearest-neighbor sample set of each sample is formed into a neighbor matrix, and the correlation between every two neighbor matrices is calculated to obtain the neighbor-set similarity between every two sample sets.
multiplying the initial spectral similarity of every two samples by their neighbor-set similarity, to obtain the first final similarity of the two samples;
For example, if the initial spectral similarity of sample A and sample B is 0.5, and the neighbor-set similarity between the nearest-neighbor sample sets of sample A and sample B is 0.6, then the first final similarity of sample A and sample B is 0.5 × 0.6 = 0.3.
obtaining, according to the third preset number of largest first final similarities of each sample, the similarity matrix of the nearest-neighbor graph of the spectral feature.
Specifically, the third preset number of largest first final similarities of each sample are retained and the other first final similarities are set to 0, forming the similarity matrix of the nearest-neighbor graph of the spectral feature. The rows of this similarity matrix are samples and its columns are samples; each element is the first final similarity between the sample corresponding to the current row and the sample corresponding to the current column. The implementation provided by the embodiment of the present invention realizes the construction of the spectral-feature graph model. When the spectral features are spectrum blocks, each spectrum block corresponds to one similarity matrix.
In an embodiment of the present invention, the image features include a spatial position feature;
the S3 includes:
determining, according to the spatial position feature of the hyperspectral image, the spatial position of each sample;
Specifically, for the spatial position feature, the S2 may include: taking the coordinates of each sample in the hyperspectral image as the spatial position feature of the hyperspectral image. Specifically, a rectangular coordinate system may be established on the two-dimensional image of the hyperspectral image, and the coordinates of each sample in the rectangular coordinate system are determined; the coordinates of each sample serve as its spatial position.
determining, according to the spatial position of each sample, the distance between every two samples of the hyperspectral image;
Specifically, the distance between samples is determined from the coordinates of each sample.
determining, using a Gaussian kernel function and according to the distance between every two samples, the initial spatial position similarity between every two samples;
selecting for each sample, according to the initial spatial position similarities between every two samples, a fourth preset number of second nearest samples with the largest initial spatial position similarity;
determining, using a Gaussian kernel function and taking the spectrum of each sample as the feature, the intermediate similarity between each sample and each of its second nearest samples;
Specifically, the principal components of the spectrum of each sample may also serve as the feature.
multiplying the initial spatial position similarity of each sample and each of its second nearest samples by their intermediate similarity, to obtain the second final similarity of each sample and each of its second nearest samples;
obtaining, according to the second final similarities of each sample and its second nearest samples, the similarity matrix of the nearest-neighbor graph of the spatial position feature.
Specifically, the rows of the similarity matrix of the nearest-neighbor graph of the spatial position feature are samples and its columns are samples; each element is the second final similarity between the sample corresponding to the current row and the sample corresponding to the current column. The implementation provided by the embodiment of the present invention realizes the construction of the spatial-position-feature graph model.
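The spatial graph above weights each edge by the product of a spatial similarity and a spectral ("intermediate") similarity; a minimal sketch follows. Function name, the brute-force neighbor search, and the symmetrization are this sketch's assumptions, not reproduced from the patent:

```python
import numpy as np

def spatial_spectral_similarity(coords, spectra, k=4, sigma_s=1.0, sigma_f=1.0):
    """Second final similarity: Gaussian similarity of spatial coordinates,
    restricted to each sample's k spatially nearest neighbors, multiplied
    by the Gaussian spectral (intermediate) similarity of those pairs."""
    n = coords.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        ds = np.sum((coords - coords[i]) ** 2, axis=1)  # spatial sq. distances
        ds[i] = np.inf                                  # exclude the sample itself
        nbrs = np.argsort(ds)[:k]                       # k spatially nearest samples
        for j in nbrs:
            s_spatial = np.exp(-ds[j] / (2 * sigma_s ** 2))
            df = np.sum((spectra[i] - spectra[j]) ** 2)
            s_spectral = np.exp(-df / (2 * sigma_f ** 2))
            W[i, j] = s_spatial * s_spectral            # product of the two
    return np.maximum(W, W.T)                           # symmetrize
```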
In an embodiment of the present invention, the image features include texture features;
the S3 includes:
determining, according to the texture features of the hyperspectral image, the texture features of each sample;
Specifically, for the texture features, the S2 may include: performing principal component analysis on each spectrum of the hyperspectral image to extract the principal components of each spectrum; extracting texture features from the principal components of each spectrum using a two-dimensional Gabor filter; and stacking the texture features of the principal components of each spectrum to generate the texture features of the hyperspectral image. The texture features of the hyperspectral image form a three-dimensional texture feature block.
determining, using a Gaussian kernel function and according to the texture features of each sample, the texture similarity between every two samples;
obtaining, according to the fifth preset number of largest texture similarities of each sample, the similarity matrix of the nearest-neighbor graph of the texture features.
Specifically, the fifth preset number of largest texture similarities of each sample are retained and the other texture similarities are set to 0, forming the similarity matrix of the nearest-neighbor graph of the texture features. The rows of this similarity matrix are samples and its columns are samples; each element is the texture similarity between the sample corresponding to the current row and the sample corresponding to the current column. The implementation provided by the embodiment of the present invention realizes the construction of the texture-feature graph model.
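The two-dimensional Gabor filtering of a principal-component image can be sketched as follows. The kernel parameters, the 4-orientation bank, and the NumPy-only convolution are this sketch's choices for illustration; the patent does not specify them:

```python
import numpy as np

def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, lam=6.0, gamma=0.5):
    """Real part of a 2-D Gabor filter: a Gaussian envelope modulating
    a cosine wave at orientation theta and wavelength lam."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam)

def _convolve_same(img, ker):
    """Same-size 2-D convolution with zero padding (NumPy only)."""
    kh, kw = ker.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    flipped = ker[::-1, ::-1]
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

def gabor_texture_features(pc_image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Filter one principal-component image with a 4-orientation Gabor
    bank and stack the responses into a 3-D texture feature block."""
    responses = [_convolve_same(pc_image, gabor_kernel(theta=t)) for t in thetas]
    return np.stack(responses, axis=2)              # H x W x orientations
```

Stacking the feature blocks of all principal components then yields the three-dimensional texture feature block described above.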
In an embodiment of the present invention, the S4 includes:
determining, according to formula three, the graph Laplacian matrix of each similarity matrix, wherein formula three is:
L = I − D^(−1/2) W D^(−1/2)
where L is the graph Laplacian matrix, W is the similarity matrix, I is the identity matrix, and D is a diagonal matrix whose i-th diagonal element is Σ_j W_ij.
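The symbols listed for formula three (I, W, and the degree matrix D) match the symmetric normalized graph Laplacian; a sketch under that assumption (the guard for isolated vertices is this sketch's addition):

```python
import numpy as np

def normalized_graph_laplacian(W):
    """Formula three as the symmetric normalized Laplacian
    L = I - D^{-1/2} W D^{-1/2}, with D_ii = sum_j W_ij."""
    d = W.sum(axis=1)
    d_safe = np.where(d > 0, d, 1.0)
    d_inv_sqrt = 1.0 / np.sqrt(d_safe)
    d_inv_sqrt[d == 0] = 0.0            # isolated vertices contribute nothing
    return np.eye(W.shape[0]) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
```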
In embodiments of the invention, the at least two image features may be: at least two spectral features; or the spatial position feature and the texture features; or one or more of the spatial position feature and the texture features together with at least one spectral feature.
As shown in FIG. 4, a method of hyperspectral image classification provided by an embodiment of the present invention includes:
Step 401: determining training samples from the hyperspectral image in advance, and setting a corresponding category for each training sample;
For example, suppose the hyperspectral image covers a geographic area containing wheat fields, rivers and houses. The samples in the hyperspectral image can then be divided into three classes: wheat field, river and house. When setting the training samples, at least one training sample may be set for each class.
It should be noted that each sample in the hyperspectral image corresponds to a pixel of the two-dimensional image of the hyperspectral image; that is, a sample of the hyperspectral image is exactly one pixel in its two-dimensional image.
Step 402: extracting multiple spectral features, the spatial position feature and the texture features of the hyperspectral image.
Step 403: determining, respectively, the similarity matrix of the nearest-neighbor graph of each spectral feature, the similarity matrix of the nearest-neighbor graph of the spatial position feature, and the similarity matrix of the nearest-neighbor graph of the texture features;
Step 404: determining, respectively, the graph Laplacian matrix of the similarity matrix of the nearest-neighbor graph of each spectral feature, the graph Laplacian matrix of the similarity matrix of the nearest-neighbor graph of the spatial position feature, and the graph Laplacian matrix of the similarity matrix of the nearest-neighbor graph of the texture features;
Step 405: projecting each graph Laplacian matrix onto the Grassmann manifold, and determining the distance between every two graph Laplacian matrices on the Grassmann manifold;
Step 406: determining the category matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the categories of the training samples;
Step 407: determining the category of each sample of the hyperspectral image according to the category matrix.
Specifically, the samples in the hyperspectral image are divided into three classes, wheat field, river and house, and each sample is assigned the category corresponding to one of wheat field, river and house.
A hyperspectral image differs from an ordinary image: it provides rich spectral information, covering several hundred continuous spectral bands from visible light to the near infrared. In addition, a hyperspectral image provides rich spatial information: substances that are geographically adjacent usually have the same or similar categories, i.e., the content of the image does not vary abruptly. Hyperspectral image classification can be applied in many fields, including military, surveillance, agriculture, chemistry and mining.
In embodiments of the present invention, features are extracted from the hyperspectral image from different aspects, including the spectral features, the spatial position feature and the texture features of the image; a nearest-neighbor graph (KNN-Graph, K-Nearest Neighbor Graph) is established for each feature, and the graph Laplacian matrix is calculated; the graph Laplacian matrices are projected onto the Grassmann manifold, the distances between the graph Laplacians are calculated on the manifold, and the fusion weights of the nearest-neighbor graphs are optimized according to the calculated distances; by way of category propagation, category information is passed from the training samples with known categories to the samples with unknown categories, thereby improving classification accuracy.
In embodiments of the present invention, multi-feature fusion is adopted and different features are extracted from the hyperspectral image. In particular, for the spectral features, clustering over the spectra yields spectrum blocks that are more discriminative; fused with the other features, they describe the hyperspectral image from many aspects and make the classification more accurate.
In embodiments of the present invention, a graph model is established for each feature. Graph models effectively preserve the structural relations between data. In multi-graph learning, the final fused graph model contains the information provided by all features; during fusion, calculating the distances between graph Laplacian matrices further quantifies the strength of the expressive ability of the different types of features, so the fusion result is more reasonable.
As shown in FIG. 5 and FIG. 6, an embodiment of the present invention provides a device of hyperspectral image classification. The device embodiment may be implemented by software, or by hardware or a combination of hardware and software. From a hardware perspective, FIG. 5 is a hardware structure diagram of the equipment where a device of hyperspectral image classification provided by an embodiment of the present invention is located. Besides the processor, memory, network interface and nonvolatile memory shown in FIG. 5, the equipment where the device is located may usually further include other hardware, such as a forwarding chip responsible for processing messages. Taking software implementation as an example, as shown in FIG. 6, as a device in a logical sense, the device is formed by the CPU of the equipment reading the corresponding computer program instructions from the nonvolatile memory into memory and running them. A device of hyperspectral image classification provided by this embodiment includes:
a sample determination unit 601, configured to determine training samples from a hyperspectral image and set a corresponding category for each training sample;
an extraction unit 602, configured to extract at least two image features of the hyperspectral image;
a similarity matrix determination unit 603, configured to determine the similarity matrix of the nearest-neighbor graph of each image feature;
a graph Laplacian matrix determination unit 604, configured to determine the graph Laplacian matrix of each similarity matrix;
a distance determining unit 605, configured to project each graph Laplacian matrix onto the Grassmann manifold and determine, on the Grassmann manifold, the distance between every two graph Laplacian matrices;
a category matrix determination unit 606, configured to determine the category matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the training samples;
a category determination unit 607, configured to determine the category of each sample of the hyperspectral image according to the category matrix.
In an embodiment of the present invention, the category matrix determination unit includes:
a first determining subunit, configured to determine the initial category matrix of the hyperspectral image according to the category of each training sample;
a second determining subunit, configured to determine, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
a third determining subunit, configured to determine the initial value of the fusion weight corresponding to each graph Laplacian matrix in an objective function, wherein the objective function is:
where Q is the category matrix, Y is the initial category matrix of the hyperspectral image, L^(m) is the m-th graph Laplacian matrix, U is a preset matrix, γ_α is a first preset constraint-strength parameter, γ_s is a second preset constraint-strength parameter, S^(m) is the sum of the distances of the m-th graph Laplacian matrix to the other graph Laplacian matrices, α^(m) is the fusion weight corresponding to the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
a category matrix update subunit, configured to update the category matrix according to the fusion weight corresponding to each graph Laplacian matrix and formula one, wherein formula one is:
a fusion weight update subunit, configured to update the fusion weight corresponding to each graph Laplacian matrix according to the category matrix determined by the category matrix update subunit and formula two, wherein formula two is:
where g^(m) = Tr((Q − Y)^T U(Q − Y));
a judgment subunit, configured to judge, according to the fusion weight corresponding to each graph Laplacian matrix determined by the fusion weight update subunit and the category matrix determined by the category matrix update subunit, whether the value of the objective function has converged; if so, the category matrix is output; otherwise, processing returns to the category matrix update subunit.
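The alternating scheme the subunits describe, updating the category matrix Q given the fusion weights α, then re-weighting each graph, can be sketched as follows. The exact formulas one and two are not reproduced in the text, so the updates below are one common choice made for illustration, not the patented ones; the function name and the softmax re-weighting are this sketch's assumptions.

```python
import numpy as np

def fused_label_propagation(laplacians, S, Y, U, gamma_a=1.0, gamma_s=1.0, n_iter=50):
    """Hypothetical alternating optimization of the category matrix Q and
    the per-graph fusion weights alpha, given graph Laplacians, distance
    sums S[m], initial category matrix Y and preset matrix U."""
    M = len(laplacians)
    alpha = np.full(M, 1.0 / M)                 # initial fusion weights
    Q = Y.astype(float).copy()
    for _ in range(n_iter):
        # Q-update: minimize Tr(Q^T L_fused Q) + Tr((Q - Y)^T U (Q - Y)),
        # whose stationary point solves (L_fused + U) Q = U Y.
        L_fused = sum(a * L for a, L in zip(alpha, laplacians))
        Q = np.linalg.solve(L_fused + U, U @ Y)
        # alpha-update: down-weight graphs that fit Q poorly or whose
        # Laplacian lies far from the others (large S[m]); a softmax
        # keeps the weights positive and summing to one.
        fit = np.array([np.trace(Q.T @ L @ Q) for L in laplacians])
        logits = -(fit + gamma_s * np.asarray(S, dtype=float)) / gamma_a
        alpha = np.exp(logits - logits.max())
        alpha /= alpha.sum()
    return Q, alpha
```

Convergence would be judged, as the judgment subunit describes, by monitoring the change of the objective value across iterations.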
In an embodiment of the present invention, the distance determining unit includes:
an eigenvalue decomposition subunit, configured to perform eigenvalue decomposition on each graph Laplacian matrix to obtain the eigenvalues of each graph Laplacian matrix;
a spatial eigenvalue determining subunit, configured to select, for each graph Laplacian matrix, a second preset number of largest eigenvalues as spatial eigenvalues from the eigenvalues of each graph Laplacian matrix;
a feature space generating subunit, configured to generate, from the eigenvectors corresponding to the spatial eigenvalues of each graph Laplacian matrix, the feature space corresponding to each graph Laplacian matrix;
a geodesic distance determining subunit, configured to determine the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
a matrix distance determining subunit, configured to take the geodesic distance corresponding to every two graph Laplacian matrices as the distance between the two graph Laplacian matrices.
In an embodiment of the present invention, the device further includes an auxiliary sample unit configured to:
B1: judge whether the number of training samples is less than a first preset value; if so, execute B2 to B9 in sequence;
B2: perform principal component analysis on the spectra of the hyperspectral image to extract the principal components of the spectra of the hyperspectral image;
B3: determine the gradient matrix of each principal component;
B4: superpose the gradient matrices of the principal components to generate the gradient matrix of the hyperspectral image;
B5: perform watershed segmentation on the gradient matrix of the hyperspectral image to obtain at least one segmented region;
B6: perform clustering processing on each segmented region, determine the cluster center of each segmented region, and take the cluster center of each segmented region as an auxiliary sample;
B7: determine the most similar training sample corresponding to each auxiliary sample;
B8: set the category of the most similar training sample corresponding to each auxiliary sample as the category of that auxiliary sample;
B9: take each auxiliary sample that has been assigned a category as a training sample, and return to B1.
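Two of the steps above, the gradient matrix of B3 and the nearest-training-sample labeling of B7-B8, can be sketched as follows (NumPy illustration; function names are this sketch's, and the watershed segmentation of B5 would require an image-processing library and is omitted):

```python
import numpy as np

def gradient_matrix(pc):
    """B3: gradient magnitude of one principal-component image."""
    gy, gx = np.gradient(pc.astype(float))
    return np.sqrt(gx ** 2 + gy ** 2)

def label_auxiliary_samples(aux, train, train_labels):
    """B7-B8: give each auxiliary sample the category of its most
    similar (smallest Euclidean distance) training sample."""
    labels = []
    for a in aux:
        d = np.linalg.norm(train - a, axis=1)   # distances to all training samples
        labels.append(train_labels[int(np.argmin(d))])
    return labels
```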
In an embodiment of the present invention, the image features include spectral features;
the extraction unit is configured to:
perform clustering processing on the hyperspectral image according to the mutual information between the spectra in the hyperspectral image, to generate the spectrum blocks of the hyperspectral image;
take each spectrum block as one spectral feature.
In an embodiment of the present invention, the image features include spectral features;
the similarity matrix determination unit is configured to:
determine, according to the spectral features of the hyperspectral image, the spectral value of each sample;
determine, according to the spectral value of each sample, the Euclidean distance between every two samples in the spectral feature;
determine, using a Gaussian kernel function and according to the Euclidean distance between every two samples, the initial spectral similarity between every two samples;
select for each sample, according to the initial spectral similarities between every two samples, a third preset number of first nearest samples with the largest initial spectral similarity;
generate, according to the third preset number of first nearest samples corresponding to each sample, the nearest-neighbor sample set of each sample;
determine the neighbor-set similarity between the nearest-neighbor sample sets of every two samples;
multiply the initial spectral similarity of every two samples by their neighbor-set similarity, to obtain the first final similarity of the two samples;
obtain, according to the third preset number of largest first final similarities of each sample, the similarity matrix of the nearest-neighbor graph of the spectral feature.
In an embodiment of the present invention, the image features include a spatial position feature;
the similarity matrix determination unit is configured to:
determine, according to the spatial position feature of the hyperspectral image, the spatial position of each sample;
determine, according to the spatial position of each sample, the distance between every two samples of the hyperspectral image;
determine, using a Gaussian kernel function and according to the distance between every two samples, the initial spatial position similarity between every two samples;
select for each sample, according to the initial spatial position similarities between every two samples, a fourth preset number of second nearest samples with the largest initial spatial position similarity;
determine, using a Gaussian kernel function and taking the spectrum of each sample as the feature, the intermediate similarity between each sample and each of its second nearest samples;
multiply the initial spatial position similarity of each sample and each of its second nearest samples by their intermediate similarity, to obtain the second final similarity of each sample and each of its second nearest samples;
obtain, according to the second final similarities of each sample and its second nearest samples, the similarity matrix of the nearest-neighbor graph of the spatial position feature.
In an embodiment of the present invention, the image features include texture features;
the similarity matrix determination unit is configured to:
determine, according to the texture features of the hyperspectral image, the texture features of each sample;
determine, using a Gaussian kernel function and according to the texture features of each sample, the texture similarity between every two samples;
obtain, according to the fifth preset number of largest texture similarities of each sample, the similarity matrix of the nearest-neighbor graph of the texture features.
In an embodiment of the present invention, the graph Laplacian matrix determination unit is configured to:
determine, according to formula three, the graph Laplacian matrix of each similarity matrix, wherein formula three is:
L = I − D^(−1/2) W D^(−1/2)
where L is the graph Laplacian matrix, W is the similarity matrix, I is the identity matrix, and D is a diagonal matrix whose i-th diagonal element is Σ_j W_ij.
For the information exchange between the units in the above device and their execution processes, since they are based on the same conception as the method embodiments of the present invention, please refer to the description in the method embodiments of the present invention for details, which are not repeated here.
In addition, the classification effect of the scheme provided by the embodiments of the present invention is assessed by the following simulation experiments.
Simulation conditions: the simulation experiments of the embodiments of the present invention were carried out with MATLAB 2015b software on a machine with an Intel(R) Core i5-4460 3.2 GHz CPU, 16 GB of memory and the Windows 7 operating system.
Simulation content: the simulation experiments of the present invention use the Indian Pines database, the PaviaU database and the SalinasA database.
SalinasA database: the hyperspectral image was captured over the Salinas Valley, California, USA, in 1998. The whole image contains 512*217 pixels at a spatial resolution of 3.7 m/pixel. A subset of the image is taken, ranging over [591-678]*[158-240] of the original hyperspectral image and containing 224 spectral bands in total; after removing noisy bands, 204 bands remain.
Indian Pines database: the hyperspectral image was captured over northwestern Indiana. It contains 145*145 pixels in total and 220 bands over the 0.4 to 2.5 μm wavelength range, at a spatial resolution of 20 m; after removing noisy bands, 200 bands remain.
PaviaU database: the hyperspectral image was captured at the University of Pavia, Italy. The database contains 610*340 pixels in total; its bands comprise 103 spectral bands from 0.43 to 0.86 μm, with a higher spatial resolution of 1.3 m/pixel.
Evaluation standard of this simulation experiment: performance is evaluated using the overall classification accuracy (Overall Accuracy) and the Kappa coefficient as indices. The overall classification accuracy (OA) equals the sum of correctly classified pixels divided by the total number of pixels. The numbers of correctly classified pixels are distributed along the diagonal of the confusion matrix, and the total number of pixels equals the total number of pixels of all true references. Kappa is calculated by the following formula:
κ = (N Σ_i x_ii − Σ_i x_i+ x_+i) / (N² − Σ_i x_i+ x_+i)
where N is the total number of pixels with true references, x_ii is the number on the diagonal of the confusion matrix, x_i+ is the true total number of ground-surface pixels in a given class, and x_+i is the total number of pixels classified into that class. The comparison and experimental results are shown in the tables described below.
Table 1: comparison of the overall classification accuracy of the four methods on the SalinasA database
Table 2: comparison of the Kappa coefficient of the four methods on the SalinasA database
Table 3: comparison of the overall classification accuracy of the four methods on the Indian Pines database
Table 4: comparison of the Kappa coefficient of the four methods on the Indian Pines database
Table 5: comparison of the overall classification accuracy of the four methods on the PaviaU database
Table 6: comparison of the Kappa coefficient of the four methods on the PaviaU database
Tables 1 to 6 respectively show the results of hyperspectral image classification experiments on the SalinasA, Indian Pines and PaviaU databases. In the tables, SVM denotes the hyperspectral image classification method based on support vector machines, Cross+Stacked denotes a graph-based semi-supervised hyperspectral image classification method, GCK+MLR denotes a hyperspectral image classification method based on multi-feature learning, and GMMGL denotes the hyperspectral image classification method of one embodiment of the present invention.
Across Tables 1 to 6, the classification accuracy of the method provided by the embodiments of the present invention is closer to 1 than that of the three control methods, so the method provided by the embodiments of the present invention performs best among the four methods. This is because the method provided by the embodiments of the present invention fully considers multiple features, describing the same hyperspectral image from many aspects, and finally uses graph models to pass the categories to the samples with unknown categories. It thereby avoids classification mistakes and achieves recognition accuracy better than the other image recognition methods, further demonstrating the advance of the method provided by the embodiments of the present invention.
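The two metrics reported in Tables 1 to 6 can be computed from a confusion matrix as in the following sketch (the function name is this sketch's; the formulas are the OA and Kappa definitions given above):

```python
import numpy as np

def oa_and_kappa(confusion):
    """Overall accuracy and Kappa coefficient from a confusion matrix
    whose rows are true classes and columns are predicted classes."""
    C = np.asarray(confusion, dtype=float)
    N = C.sum()                        # total number of reference pixels
    diag = np.trace(C)                 # correctly classified pixels
    oa = diag / N
    row = C.sum(axis=1)                # x_{i+}: true totals per class
    col = C.sum(axis=0)                # x_{+i}: predicted totals per class
    chance = np.sum(row * col)
    kappa = (N * diag - chance) / (N ** 2 - chance)
    return oa, kappa
```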
The embodiments of the present invention have at least the following beneficial effects:
1. In the embodiments of the present invention, at least two image features of the hyperspectral image are extracted, the graph Laplacian matrix corresponding to each image feature is determined, the category matrix of the hyperspectral image is obtained in combination with the categories of the training samples on the hyperspectral image, and the category of each sample of the hyperspectral image is determined according to the category matrix. The embodiments of the present invention classify based on at least two image features, improving the accuracy of classification.
2. In the embodiments of the present invention, when the number of training samples is small, auxiliary samples can be generated and used as training samples, increasing the number of training samples. More training samples then improve the accuracy of classification and yield a better classification effect, realizing high-precision classification with only a small number of training samples and reducing the difficulty of manual labeling for hyperspectral image classification.
3. In the embodiments of the present invention, multi-feature fusion is adopted and different features are extracted from the hyperspectral image. In particular, for the spectral features, clustering over the spectra yields spectrum blocks that are more discriminative; fused with the other features, they describe the hyperspectral image from many aspects and make the classification more accurate.
4. In the embodiments of the present invention, a graph model is established for each feature. Graph models effectively preserve the structural relations between data. In multi-graph learning, the final fused graph model contains the information provided by all features; during fusion, calculating the distances between graph Laplacian matrices further quantifies the strength of the expressive ability of the different types of features, so the fusion result is more reasonable.
5. In the embodiments of the present invention, by dividing the hyperspectral image into spectrum blocks, each sample can be described from different spectrum blocks, and each spectrum block is used for classification as an individual spectral feature. Spectrum blocks are more discriminative and allow samples to be classified from different spectral ranges, so the classification results are more accurate.
It should be noted that, in this document, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. In the absence of further restrictions, an element defined by the phrase "including a/an ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes that element.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be implemented by hardware instructed by a program; the aforementioned program can be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks, or optical disks.
Finally, it should be noted that the foregoing are merely preferred embodiments of the present invention, intended only to illustrate the technical solution of the present invention and not to limit its scope. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.
Claims (8)
1. A method for hyperspectral image classification, characterized by comprising:
S1: determining training samples from a hyperspectral image in advance, and setting a corresponding category for each training sample;
S2: extracting at least two image features of the hyperspectral image;
S3: determining a similarity matrix of the nearest-neighbor graph of each of the image features;
S4: determining a graph Laplacian matrix of each of the similarity matrices;
S5: projecting each of the graph Laplacian matrices onto a Grassmann manifold, and determining the distance between every two graph Laplacian matrices on the Grassmann manifold;
S6: determining a category matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the categories of the training samples;
S7: determining the category of each sample of the hyperspectral image according to the category matrix;
wherein S6 comprises:
A1: determining an initial category matrix of the hyperspectral image according to the category of each training sample;
A2: determining, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
A3: determining the initial value of the fusion weight corresponding to each graph Laplacian matrix in an objective function, wherein the objective function is:
wherein Q is the category matrix, Y is the initial category matrix of the hyperspectral image, L(m) is the m-th graph Laplacian matrix, U is a preset matrix, γα is a first preset constraint-degree parameter, γs is a second preset constraint-degree parameter, S(m) is the sum of the distances between the m-th graph Laplacian matrix and the other graph Laplacian matrices, α(m) is the fusion weight corresponding to the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
A4: updating the category matrix according to the fusion weight corresponding to each graph Laplacian matrix and formula one, wherein formula one is:
A5: updating the fusion weight corresponding to each graph Laplacian matrix according to the category matrix and formula two, wherein formula two is:
wherein g(m) = Tr((Q - Y)^T U (Q - Y));
A6: judging, according to the fusion weight corresponding to each graph Laplacian matrix and the category matrix, whether the value of the objective function converges; if so, outputting the category matrix; otherwise, returning to A4.
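The alternating iteration of steps A1 to A6 can be sketched as follows. The bodies of the objective function, formula one, and formula two are not reproduced in this text, so the Q-update below uses a standard closed-form label-propagation solution under the fused Laplacian, and the weight update uses a softmax-style rule; both are assumptions that only mirror the structure of the claimed iteration (fuse the graphs with weights α, update the category matrix Q, update the weights, repeat until the objective converges).

```python
import numpy as np

def fuse_and_classify(laplacians, Y, U, gamma_a=1.0, n_iter=50, tol=1e-6):
    """Alternating-optimization sketch for steps A1-A6 of claim 1.
    The exact update formulas are not reproduced in the source text;
    this uses assumed stand-ins with the same iterative structure."""
    M = len(laplacians)
    alpha = np.full(M, 1.0 / M)                # A3: uniform initial fusion weights
    prev_obj = np.inf
    for _ in range(n_iter):
        L_fused = sum(a * L for a, L in zip(alpha, laplacians))
        # A4 (assumed form): closed-form label propagation under the fused graph
        Q = np.linalg.solve(L_fused + U, U @ Y)
        # A5 (assumed form): reweight each graph by its smoothness fit
        fit = np.array([np.trace(Q.T @ L @ Q) for L in laplacians])
        alpha = np.exp(-fit / gamma_a)
        alpha /= alpha.sum()
        # A6: check convergence of the (assumed) objective value
        obj = float(alpha @ fit + np.trace((Q - Y).T @ U @ (Q - Y)))
        if abs(prev_obj - obj) < tol:
            break
        prev_obj = obj
    return Q, alpha
```

In step S7, each sample would then be assigned the category whose column of Q attains the largest value in that sample's row.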
2. The method according to claim 1, wherein after S1 and before S6, the method further comprises:
B1: judging whether the number of training samples is less than a first preset value; if so, performing B2 to B9 in sequence;
B2: performing principal component analysis on the spectra of the hyperspectral image to extract the principal components of the spectra of the hyperspectral image;
B3: determining the gradient matrix of each principal component;
B4: superposing the gradient matrices of the principal components to generate the gradient matrix of the hyperspectral image;
B5: performing watershed segmentation on the gradient matrix of the hyperspectral image to obtain at least one segmentation region;
B6: performing clustering on each segmentation region, determining the cluster center of each segmentation region, and taking the cluster center of each segmentation region as an auxiliary sample;
B7: determining, for each auxiliary sample, the most similar training sample;
B8: setting the category of each auxiliary sample to the category of its most similar training sample;
B9: taking each auxiliary sample whose category has been set as a training sample, and returning to B1.
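Steps B2 to B4 above (principal component analysis followed by superposed gradient magnitudes) can be sketched as below for a hyperspectral cube of shape H x W x B. The watershed segmentation of step B5 is not shown; it could be performed on the returned gradient matrix with, e.g., `skimage.segmentation.watershed`. Function and parameter names are illustrative, not from the source.

```python
import numpy as np

def accumulated_gradient(cube, n_components=3):
    """Steps B2-B4 of claim 2: PCA over the spectral bands of a
    hyperspectral cube (H x W x B), then sum of the gradient magnitudes
    of the leading principal-component images. The result is the
    gradient matrix that step B5 feeds to watershed segmentation."""
    H, W, B = cube.shape
    X = cube.reshape(-1, B).astype(float)
    X -= X.mean(axis=0)                          # center before PCA (B2)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = (X @ Vt[:n_components].T).reshape(H, W, n_components)
    grad = np.zeros((H, W))
    for c in range(n_components):                # B3/B4: superpose per-PC gradients
        gy, gx = np.gradient(pcs[:, :, c])
        grad += np.hypot(gx, gy)
    return grad
```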
3. The method according to claim 1, wherein S5 comprises:
performing eigenvalue decomposition on each graph Laplacian matrix to obtain the eigenvalues of each graph Laplacian matrix;
for each graph Laplacian matrix, selecting, from its eigenvalues, a second-preset-value number of largest eigenvalues as space eigenvalues;
generating the feature space corresponding to each graph Laplacian matrix according to the eigenvectors of its space eigenvalues;
determining the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
taking the geodesic distance corresponding to every two graph Laplacian matrices as the distance between those two graph Laplacian matrices.
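The projection and distance of claim 3 can be illustrated as follows: the selected eigenvectors of each Laplacian span a subspace, i.e., a point on the Grassmann manifold, and a geodesic distance between two such subspaces can be computed from their principal angles. The square-root-of-squared-angles formula below is the standard Grassmann geodesic metric, used here as an assumption since the source does not reproduce its exact distance formula.

```python
import numpy as np

def grassmann_geodesic_distance(L1, L2, k):
    """Geodesic distance on the Grassmann manifold between the subspaces
    spanned by the eigenvectors of the k largest eigenvalues of two graph
    Laplacians (the selection rule stated in claim 3)."""
    def top_k_subspace(L, k):
        vals, vecs = np.linalg.eigh(L)    # eigenvalues in ascending order
        return vecs[:, -k:]               # orthonormal basis of k largest
    U1 = top_k_subspace(L1, k)
    U2 = top_k_subspace(L2, k)
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    s = np.clip(s, -1.0, 1.0)             # numerical safety for arccos
    theta = np.arccos(s)                  # principal angles between subspaces
    return float(np.sqrt((theta ** 2).sum()))
```

The distance of a subspace to itself is zero, and the metric is symmetric in its two arguments, as required for the pairwise distances S(m) used in claim 1.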
4. The method according to claim 1, wherein
the image features comprise: a spectral feature; and
S2 comprises:
performing clustering on the hyperspectral image according to the mutual information between spectra in the hyperspectral image, to generate spectral blocks of the hyperspectral image;
taking each spectral block as a spectral feature.
5. The method according to claim 1, wherein S4 comprises:
determining the graph Laplacian matrix of each similarity matrix according to formula three, wherein formula three is:
wherein L is the graph Laplacian matrix, W is the similarity matrix, I is the identity matrix, and D is a diagonal matrix whose i-th diagonal element is ∑jWij.
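The body of formula three is not reproduced in this text. Given that L is defined in terms of the similarity matrix W, the identity matrix I, and the degree matrix D with D_ii = Σ_j W_ij, a natural reconstruction is the symmetric normalized Laplacian L = I - D^(-1/2) W D^(-1/2), sketched here as an assumption:

```python
import numpy as np

def normalized_laplacian(W):
    """Symmetric normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2},
    with D diagonal and D_ii = sum_j W_ij (claim 5's definition of D).
    An assumed reconstruction of formula three."""
    d = W.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)  # guard isolated nodes
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(W.shape[0]) - D_inv_sqrt @ W @ D_inv_sqrt
```

This form is symmetric and positive semi-definite, which is what the eigenvalue decomposition and Grassmann projection of claims 1 and 3 rely on.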
6. The method according to any one of claims 1-5, wherein
the image features comprise: a spectral feature; and S3 comprises:
determining the spectral value of each sample according to the spectral feature of the hyperspectral image;
determining the Euclidean distance between every two samples in the spectral feature according to the spectral values of the samples;
determining the initial spectral similarity between every two samples using a Gaussian kernel function, according to the Euclidean distance between every two samples;
selecting, for each sample, a third-preset-value number of first nearest-neighbor samples with the largest initial spectral similarity, according to the initial spectral similarities between every two samples;
generating the neighbor sample set of each sample according to its third-preset-value number of first nearest-neighbor samples;
determining the neighbor-set similarity between the neighbor sample sets of every two samples;
multiplying the initial spectral similarity of every two samples by their neighbor-set similarity, to obtain the first final similarity of every two samples;
obtaining the similarity matrix of the nearest-neighbor graph of the spectral feature according to the third-preset-value number of largest first final similarities of each sample;
and/or
the image features comprise: a spatial position feature; and S3 comprises:
determining the spatial position of each sample according to the spatial position feature of the hyperspectral image;
determining the distance between every two samples of the hyperspectral image according to the spatial positions of the samples;
determining the initial spatial-position-information similarity between every two samples using a Gaussian kernel function, according to the distance between every two samples;
selecting, for each sample, a fourth-preset-value number of second nearest-neighbor samples with the largest initial spatial-position-information similarity, according to the initial spatial-position-information similarities between every two samples;
determining, with the spectra of the samples as features and using a Gaussian kernel function, the intermediate similarity between each sample and each of its second nearest-neighbor samples;
multiplying the initial spatial-position-information similarity and the intermediate similarity of each sample and each of its second nearest-neighbor samples, to obtain the second final similarity of each sample and each of its second nearest-neighbor samples;
obtaining the similarity matrix of the nearest-neighbor graph of the spatial position feature according to the second final similarities of each sample and its second nearest-neighbor samples;
and/or
the image features comprise: a texture feature; and S3 comprises:
determining the texture feature of each sample according to the texture feature of the hyperspectral image;
determining the texture similarity between every two samples using a Gaussian kernel function, according to the texture features of the samples;
obtaining the similarity matrix of the nearest-neighbor graph of the texture feature according to the fifth-preset-value number of largest texture similarities of each sample.
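The spectral-feature branch of claim 6 (Gaussian-kernel similarity, k-nearest-neighbor selection, neighbor-set similarity, and a sparsified final similarity matrix) can be sketched as follows. The claim does not specify how the neighbor-set similarity is measured, so the Jaccard overlap of the two neighbor sets used below is an assumption, as are the function and parameter names.

```python
import numpy as np

def spectral_knn_similarity(X, k, sigma=1.0):
    """Nearest-neighbor-graph similarity matrix for claim 6's spectral
    feature: Gaussian-kernel similarity reweighted by the overlap
    (Jaccard, an assumed choice) of the two samples' k-nearest-neighbor
    sets, then sparsified to each sample's k strongest final entries."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    S = np.exp(-d2 / (2 * sigma ** 2))                   # initial spectral similarity
    np.fill_diagonal(S, 0)
    nbrs = [set(np.argsort(-S[i])[:k]) for i in range(n)]  # k nearest per sample
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            inter = len(nbrs[i] & nbrs[j])
            union = len(nbrs[i] | nbrs[j])
            W[i, j] = S[i, j] * (inter / union if union else 0.0)
    for i in range(n):                    # keep each row's k largest final similarities
        W[i, np.argsort(-W[i])[k:]] = 0.0
    return np.maximum(W, W.T)             # symmetrize the nearest-neighbor graph
```

The spatial-position and texture branches follow the same pattern with different base similarities.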
7. A device for hyperspectral image classification, characterized by comprising:
a sample determination unit, configured to determine training samples from a hyperspectral image and set a corresponding category for each training sample;
an extraction unit, configured to extract at least two image features of the hyperspectral image;
a similarity matrix determination unit, configured to determine a similarity matrix of the nearest-neighbor graph of each of the image features;
a graph Laplacian matrix determination unit, configured to determine a graph Laplacian matrix of each of the similarity matrices;
a distance determination unit, configured to project each of the graph Laplacian matrices onto a Grassmann manifold and determine the distance between every two graph Laplacian matrices on the Grassmann manifold;
a category matrix determination unit, configured to determine a category matrix of the hyperspectral image according to the distances between every two graph Laplacian matrices and the training samples;
a category determination unit, configured to determine the category of each sample of the hyperspectral image according to the category matrix;
wherein the category matrix determination unit comprises:
a first determination subunit, configured to determine an initial category matrix of the hyperspectral image according to the category of each training sample;
a second determination subunit, configured to determine, for each graph Laplacian matrix, the sum of its distances to the other graph Laplacian matrices;
a third determination subunit, configured to determine the initial value of the fusion weight corresponding to each graph Laplacian matrix in an objective function, wherein the objective function is:
wherein Q is the category matrix, Y is the initial category matrix of the hyperspectral image, L(m) is the m-th graph Laplacian matrix, U is a preset matrix, γα is a first preset constraint-degree parameter, γs is a second preset constraint-degree parameter, S(m) is the sum of the distances between the m-th graph Laplacian matrix and the other graph Laplacian matrices, α(m) is the fusion weight corresponding to the m-th graph Laplacian matrix, and M is the number of graph Laplacian matrices;
a category matrix update subunit, configured to update the category matrix according to the fusion weight corresponding to each graph Laplacian matrix and formula one, wherein formula one is:
a fusion weight update subunit, configured to update the fusion weight corresponding to each graph Laplacian matrix according to the category matrix determined by the category matrix update subunit and formula two, wherein formula two is:
wherein g(m) = Tr((Q - Y)^T U (Q - Y));
a judgment subunit, configured to judge, according to the fusion weights corresponding to the graph Laplacian matrices determined by the fusion weight update subunit and the category matrix determined by the category matrix update subunit, whether the value of the objective function converges; if so, output the category matrix; otherwise, return to the category matrix update subunit for processing.
8. The device according to claim 7, wherein the distance determination unit comprises:
an eigenvalue decomposition subunit, configured to perform eigenvalue decomposition on each graph Laplacian matrix to obtain the eigenvalues of each graph Laplacian matrix;
a space eigenvalue determination subunit, configured to select, for each graph Laplacian matrix, a second-preset-value number of largest eigenvalues from its eigenvalues as space eigenvalues;
a feature space generation subunit, configured to generate the feature space corresponding to each graph Laplacian matrix according to the eigenvectors of its space eigenvalues;
a geodesic distance determination subunit, configured to determine the geodesic distance between the feature spaces corresponding to every two graph Laplacian matrices;
a matrix distance determination subunit, configured to take the geodesic distance corresponding to every two graph Laplacian matrices as the distance between those two graph Laplacian matrices.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610872312.0A CN106503727B (en) | 2016-09-30 | 2016-09-30 | A kind of method and device of classification hyperspectral imagery |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106503727A CN106503727A (en) | 2017-03-15 |
CN106503727B true CN106503727B (en) | 2019-09-24 |
Family
ID=58293703
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06, PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB03 | Change of inventor or designer information | Inventors after: Li Chao; Ban Miaoju; Deng Cheng; Xue Yumeng; Yang Yanhua; Li Zeyu. Inventors before: Li Chao; Deng Cheng; Xue Yumeng; Yang Yanhua; Li Zeyu. |
| GR01 | Patent grant | |